Sample records for simple descriptive statistics

  1. Applying Descriptive Statistics to Teaching the Regional Classification of Climate.

    ERIC Educational Resources Information Center

    Lindquist, Peter S.; Hammel, Daniel J.

    1998-01-01

    Describes an exercise for college and high school students that relates descriptive statistics to the regional climatic classification. The exercise introduces students to simple calculations of central tendency and dispersion, the construction and interpretation of scatterplots, and the definition of climatic regions. Forces students to engage…

  2. Statistical Tutorial | Center for Cancer Research

    Cancer.gov

    Recent advances in cancer biology have resulted in the need for increased statistical analysis of research data. ST is designed as a follow-up to Statistical Analysis of Research Data (SARD), held in April 2018. The tutorial will apply the general principles of statistical analysis of research data, including descriptive statistics, z- and t-tests of means and mean differences, simple and multiple linear regression, ANOVA tests, and the chi-squared distribution.

  3. Quantitation & Case-Study-Driven Inquiry to Enhance Yeast Fermentation Studies

    ERIC Educational Resources Information Center

    Grammer, Robert T.

    2012-01-01

    We propose a procedure for the assay of fermentation in yeast in microcentrifuge tubes that is simple and rapid, permitting assay replicates, descriptive statistics, and the preparation of line graphs that indicate reproducibility. Using regression and simple derivatives to determine initial velocities, we suggest methods to compare the effects of…

  4. IT Control Deficiencies That Affect the Financial Reporting of Companies since the Enactment of the Sarbanes Oxley Act

    ERIC Educational Resources Information Center

    Harper, Roosevelt

    2014-01-01

    This research study examined the specific categories of IT control deficiencies and their related effects on financial reporting. The approach to this study was considered non-experimental, an approach sometimes called descriptive. Descriptive statistics are used to describe the basic features of the data in a study, providing simple summaries…

  5. Estimation of descriptive statistics for multiply censored water quality data

    USGS Publications Warehouse

    Helsel, Dennis R.; Cohn, Timothy A.

    1988-01-01

    This paper extends the work of Gilliom and Helsel (1986) on procedures for estimating descriptive statistics of water quality data that contain “less than” observations. Previously, procedures were evaluated when only one detection limit was present. Here we investigate the performance of estimators for data that have multiple detection limits. Probability plotting and maximum likelihood methods perform substantially better than the simple substitution procedures now commonly in use. Therefore, simple substitution procedures (e.g., substitution of the detection limit) should be avoided. Probability plotting methods are more robust than maximum likelihood methods to misspecification of the parent distribution, and their use should be encouraged in the typical situation where the parent distribution is unknown. When utilized correctly, “less than” values frequently contain nearly as much information for estimating population moments and quantiles as would the same observations had the detection limit been below them.
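    The contrast drawn above between simple substitution and probability plotting can be sketched with a stdlib-only Python example. This is a minimal illustration, assuming lognormal data and a single detection limit for simplicity (the paper's focus is multiple limits); all parameters are made up.

```python
import math
import random
from statistics import NormalDist

random.seed(0)
n, dl = 200, 0.5
# hypothetical lognormal "concentrations"; values below dl are non-detects
true = [math.exp(random.gauss(0.0, 1.0)) for _ in range(n)]
detected = sorted(x for x in true if x >= dl)
n_cens = n - len(detected)

# (a) simple substitution: replace every non-detect with the detection limit
sub_mean = (sum(detected) + n_cens * dl) / n

# (b) probability plotting (regression on order statistics): fit
# log-concentration vs. normal quantile for the detected (upper) ranks,
# then impute each non-detect from the fitted line
pp = [(r - 0.375) / (n + 0.25) for r in range(n_cens + 1, n + 1)]  # Blom positions
z = [NormalDist().inv_cdf(p) for p in pp]
y = [math.log(x) for x in detected]
zbar, ybar = sum(z) / len(z), sum(y) / len(y)
slope = sum((zi - zbar) * (yi - ybar) for zi, yi in zip(z, y)) / sum(
    (zi - zbar) ** 2 for zi in z
)
intercept = ybar - slope * zbar
z_cens = [NormalDist().inv_cdf((r - 0.375) / (n + 0.25)) for r in range(1, n_cens + 1)]
imputed = [math.exp(intercept + slope * zc) for zc in z_cens]
ros_mean = (sum(detected) + sum(imputed)) / n
```

    Substituting the detection limit treats every non-detect as if it sat exactly at the limit, which biases the mean upward; the plotting-position fit spreads the imputed values below the limit, which is why the abstract recommends it over substitution.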

  6. New American Undergraduates: Enrollment Trends and Age at Arrival of Immigrant and Second-Generation Students. Stats in Brief. NCES 2017-414

    ERIC Educational Resources Information Center

    Arbeit, Caren A.; Staklis, Sandra; Horn, Laura

    2016-01-01

    Statistics in Brief publications present descriptive data in tabular formats to provide useful information to a broad audience, including members of the general public. They address simple and topical issues and questions. This Statistics in Brief profiles the demographic and enrollment characteristics of undergraduates who are immigrants or…

  7. Evaluation of wind field statistics near and inside clouds using a coherent Doppler lidar

    NASA Astrophysics Data System (ADS)

    Lottman, Brian Todd

    1998-09-01

    This work proposes advanced techniques for measuring the spatial wind field statistics near and inside clouds using a vertically pointing solid-state coherent Doppler lidar on a fixed ground-based platform. The coherent Doppler lidar is an ideal instrument for high spatial and temporal resolution velocity estimates. The basic parameters of lidar are discussed, including a complete statistical description of the Doppler lidar signal. This description is extended to cases with simple functional forms for aerosol backscatter and velocity. An estimate for the mean velocity over a sensing volume is produced by estimating the mean spectra. There are many traditional spectral estimators, which are useful for conditions with slowly varying velocity and backscatter. A new class of estimators is introduced that produces reliable velocity estimates for conditions with large variations in aerosol backscatter and velocity with range, such as cloud conditions. Performance of traditional and novel estimators is computed for a variety of deterministic atmospheric conditions using computer-simulated data. Wind field statistics are produced from actual data for a cloud deck and for multilayer clouds. Unique results include detection of possible spectral signatures for rain, estimates for the structure function inside a cloud deck, reliable velocity estimation techniques near and inside thin clouds, and estimates for simple wind field statistics between cloud layers.

  8. Use of Private Loans by Postsecondary Students: Selected Years 2003-04 through 2011-12. Stats in Brief. NCES 2017-420

    ERIC Educational Resources Information Center

    Woo, Jennie H.; Velez, Erin Dunlop

    2016-01-01

    Statistics in Brief publications present descriptive data in tabular formats to provide useful information to a broad audience, including members of the general public. They address simple and topical issues and questions. Using data from 2011-12, this Statistics in Brief updates and expands on a previous National Center for Education Statistics…

  9. Statistics of the geomagnetic secular variation for the past 5Ma

    NASA Technical Reports Server (NTRS)

    Constable, C. G.; Parker, R. L.

    1986-01-01

    A new statistical model is proposed for the geomagnetic secular variation over the past 5Ma. Unlike previous models, the model makes use of statistical characteristics of the present day geomagnetic field. The spatial power spectrum of the non-dipole field is consistent with a white source near the core-mantle boundary with Gaussian distribution. After a suitable scaling, the spherical harmonic coefficients may be regarded as statistical samples from a single giant Gaussian process; this is the model of the non-dipole field. The model can be combined with an arbitrary statistical description of the dipole and probability density functions and cumulative distribution functions can be computed for declination and inclination that would be observed at any site on Earth's surface. Global paleomagnetic data spanning the past 5Ma are used to constrain the statistics of the dipole part of the field. A simple model is found to be consistent with the available data. An advantage of specifying the model in terms of the spherical harmonic coefficients is that it is a complete statistical description of the geomagnetic field, enabling us to test specific properties for a general description. Both intensity and directional data distributions may be tested to see if they satisfy the expected model distributions.

  10. Statistics of the geomagnetic secular variation for the past 5 m.y

    NASA Technical Reports Server (NTRS)

    Constable, C. G.; Parker, R. L.

    1988-01-01

    A new statistical model is proposed for the geomagnetic secular variation over the past 5Ma. Unlike previous models, the model makes use of statistical characteristics of the present day geomagnetic field. The spatial power spectrum of the non-dipole field is consistent with a white source near the core-mantle boundary with Gaussian distribution. After a suitable scaling, the spherical harmonic coefficients may be regarded as statistical samples from a single giant Gaussian process; this is the model of the non-dipole field. The model can be combined with an arbitrary statistical description of the dipole and probability density functions and cumulative distribution functions can be computed for declination and inclination that would be observed at any site on Earth's surface. Global paleomagnetic data spanning the past 5Ma are used to constrain the statistics of the dipole part of the field. A simple model is found to be consistent with the available data. An advantage of specifying the model in terms of the spherical harmonic coefficients is that it is a complete statistical description of the geomagnetic field, enabling us to test specific properties for a general description. Both intensity and directional data distributions may be tested to see if they satisfy the expected model distributions.

  11. Professional Development, Promotion, and Pay Differences Between Women and Men in Educational Psychology.

    ERIC Educational Resources Information Center

    Ekstrom, Ruth B.

    A questionnaire about graduate school and professional experiences was completed by 235 white females, 10 minority females, 198 white males, and 12 minority males who hold the doctoral degree and are members of the American Psychological Association, Division of Educational Psychology. Tentative findings, based on simple descriptive statistics,…

  12. A new SAS program for behavioral analysis of Electrical Penetration Graph (EPG) data

    USDA-ARS?s Scientific Manuscript database

    A new program is introduced that uses SAS software to duplicate output of descriptive statistics from the Sarria Excel workbook for EPG waveform analysis. Not only are publishable means and standard errors or deviations output, the user also is guided through four relatively simple sub-programs for ...

  13. Statistical representation of multiphase flow

    NASA Astrophysics Data System (ADS)

    Subramaniam

    2000-11-01

    The relationship between two common statistical representations of multiphase flow, namely, the single-point Eulerian statistical representation of two-phase flow (D. A. Drew, Ann. Rev. Fluid Mech. (15), 1983) and the Lagrangian statistical representation of a spray using the droplet distribution function (F. A. Williams, Phys. Fluids 1 (6), 1958), is established for spherical dispersed-phase elements. This relationship is based on recent work which relates the droplet distribution function to single-droplet pdfs starting from a Liouville description of a spray (Subramaniam, Phys. Fluids 10 (12), 2000). The Eulerian representation, which is based on a random-field model of the flow, is shown to contain different statistical information from the Lagrangian representation, which is based on a point-process model. The two descriptions are shown to be simply related for spherical, monodisperse elements in statistically homogeneous two-phase flow, whereas such a simple relationship is precluded by the inclusion of polydispersity and statistical inhomogeneity. The common origin of these two representations is traced to a more fundamental statistical representation of a multiphase flow, whose concepts derive from a theory for dense sprays recently proposed by Edwards (Atomization and Sprays 10 (3-5), 2000). The issue of what constitutes a minimally complete statistical representation of a multiphase flow is resolved.

  14. Simple and Multivariate Relationships Between Spiritual Intelligence with General Health and Happiness.

    PubMed

    Amirian, Mohammad-Elyas; Fazilat-Pour, Masoud

    2016-08-01

    The present study examined simple and multivariate relationships of spiritual intelligence with general health and happiness. The method employed was descriptive and correlational. King's Spiritual Quotient scale, the GHQ-28, and the Oxford Happiness Inventory were completed by a sample of 384 students, selected by stratified random sampling from the students of Shahid Bahonar University of Kerman. Data were subjected to descriptive and inferential statistics, including correlations and multivariate regressions. Bivariate correlations support a positive and significant predictive value of spiritual intelligence for general health and happiness. Further analysis showed that among the spiritual intelligence subscales, existential critical thinking negatively predicted general health and happiness. In addition, happiness was positively predicted by generation of personal meaning and transcendental awareness. The findings are discussed in line with previous studies and the relevant theoretical background.

  15. CAN'T MISS--conquer any number task by making important statistics simple. Part 2. Probability, populations, samples, and normal distributions.

    PubMed

    Hansen, John P

    2003-01-01

    Healthcare quality improvement professionals need to understand and use inferential statistics to interpret sample data from their organizations. In quality improvement and healthcare research studies all the data from a population often are not available, so investigators take samples and make inferences about the population by using inferential statistics. This three-part series will give readers an understanding of the concepts of inferential statistics as well as the specific tools for calculating confidence intervals for samples of data. This article, Part 2, describes probability, populations, and samples. The uses of descriptive and inferential statistics are outlined. The article also discusses the properties and probability of normal distributions, including the standard normal distribution.
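    The standard-normal properties the article reviews can be checked directly with Python's stdlib; a minimal sketch:

```python
from statistics import NormalDist

std = NormalDist(mu=0.0, sigma=1.0)  # the standard normal distribution

# probability mass within one and two standard deviations of the mean
within_1sd = std.cdf(1.0) - std.cdf(-1.0)  # ~0.683
within_2sd = std.cdf(2.0) - std.cdf(-2.0)  # ~0.954
```

    These are the familiar 68% and 95% figures that underpin the confidence-interval calculations covered later in the series.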

  16. Statistics for nuclear engineers and scientists. Part 1. Basic statistical inference

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beggs, W.J.

    1981-02-01

    This report is intended for the use of engineers and scientists working in the nuclear industry, especially at the Bettis Atomic Power Laboratory. It serves as the basis for several Bettis in-house statistics courses. The objectives of the report are to introduce the reader to the language and concepts of statistics and to provide a basic set of techniques to apply to problems of the collection and analysis of data. Part 1 covers subjects of basic inference. The subjects include: descriptive statistics; probability; simple inference for normally distributed populations, and for non-normal populations as well; comparison of two populations; the analysis of variance; quality control procedures; and linear regression analysis.

  17. Nursing education: contradictions and challenges of pedagogical practice.

    PubMed

    Pinto, Joelma Batista Tebaldi; Pepe, Alda Muniz

    2007-01-01

    This study deals with the nursing curriculum, pedagogical practice, and education. Nowadays, this theme has taken up considerable space in academic debates. Thus, this study aimed to gain empirical knowledge and provide an analytical description of the academic reality of nursing education in the undergraduate nursing course at Santa Cruz State University. This is a descriptive study, which may provide a new view of the problem, with careful observation, description, and exploration of aspects of the situation, interpreting the reality without interfering in it and, consequently, remaining open to new studies. Descriptive statistics with simple frequency and percentage calculations were applied. In summary, results indicate that professors and students have difficulties evaluating the curriculum. In addition, the curriculum under study is characterized as a collection curriculum, with a pedagogical practice predominantly directed at the traditional model. Hence, nursing education still shows features of the biomedical-technical model.
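    Descriptive statistics of the simple frequency-and-percentage kind used above reduce to counting; a minimal Python sketch with hypothetical questionnaire responses (the study's actual items are not shown here):

```python
from collections import Counter

# hypothetical categorical responses; labels are illustrative only
responses = ["traditional", "critical", "traditional", "mixed", "traditional", "critical"]
counts = Counter(responses)  # simple frequency of each category
percentages = {k: round(100 * v / len(responses), 1) for k, v in counts.items()}
```
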

  18. New statistical scission-point model to predict fission fragment observables

    NASA Astrophysics Data System (ADS)

    Lemaître, Jean-François; Panebianco, Stefano; Sida, Jean-Luc; Hilaire, Stéphane; Heinrich, Sophie

    2015-09-01

    The development of high performance computing facilities makes possible a massive production of nuclear data in a full microscopic framework. Taking advantage of the individual potential calculations of more than 7000 nuclei, a new statistical scission-point model, called SPY, has been developed. It gives access to the absolute available energy at the scission point, which allows the use of a parameter-free microcanonical statistical description to calculate the distributions and the mean values of all fission observables. SPY uses the richness of microscopy in a rather simple theoretical framework, without any parameter except the scission-point definition, to draw clear answers based on perfect knowledge of the ingredients involved in the model, with very limited computing cost.

  19. Neuronal Correlation Parameter and the Idea of Thermodynamic Entropy of an N-Body Gravitationally Bounded System.

    PubMed

    Haranas, Ioannis; Gkigkitzis, Ioannis; Kotsireas, Ilias; Austerlitz, Carlos

    2017-01-01

    Understanding how the brain encodes information and performs computation requires statistical and functional analysis. Given the complexity of the human brain, simple methods that facilitate the interpretation of statistical correlations among different brain regions can be very useful. In this report we introduce a numerical correlation measure that may serve the interpretation of correlational neuronal data and may assist in the evaluation of different brain states. The description of the dynamical brain system through a global numerical measure may indicate the presence of an action principle, which may facilitate an application of physics principles in the study of the human brain and cognition.

  20. Common inputs in subthreshold membrane potential: The role of quiescent states in neuronal activity

    NASA Astrophysics Data System (ADS)

    Montangie, Lisandro; Montani, Fernando

    2018-06-01

    Experiments in certain regions of the cerebral cortex suggest that the spiking activity of neuronal populations is regulated by common non-Gaussian inputs across neurons. We model these deviations from random-walk processes with q-Gaussian distributions in simple threshold neurons, and investigate the scaling properties in large neural populations. We show that deviations from Gaussian statistics provide a natural framework to regulate population statistics such as sparsity, entropy, and specific heat. This type of description allows us to provide an adequate strategy to explain information encoding in the case of low neuronal activity and its possible implications for information transmission.

  21. Fast Quantum Algorithm for Predicting Descriptive Statistics of Stochastic Processes

    NASA Technical Reports Server (NTRS)

    Williams, Colin P.

    1999-01-01

    Stochastic processes are used as a modeling tool in several sub-fields of physics, biology, and finance. Analytic understanding of the long term behavior of such processes is only tractable for very simple types of stochastic processes, such as Markovian processes. However, in real world applications more complex stochastic processes often arise. In physics, the complicating factor might be nonlinearities; in biology it might be memory effects; and in finance it might be the non-random intentional behavior of participants in a market. In the absence of analytic insight, one is forced to understand these more complex stochastic processes via numerical simulation techniques. In this paper we present a quantum algorithm for performing such simulations. In particular, we show how a quantum algorithm can predict arbitrary descriptive statistics (moments) of N-step stochastic processes in just O(square root of N) time. That is, the quantum complexity is the square root of the classical complexity for performing such simulations. This is a significant speedup in comparison to the current state of the art.
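    For context, the classical baseline against which the quadratic quantum speedup is claimed can be sketched as a plain Monte Carlo estimate of the moments of an N-step random walk (a Markovian toy case; the parameters are illustrative, not from the paper):

```python
import random
import statistics

random.seed(0)
n_steps, n_paths = 100, 2000

# classical Monte Carlo: simulate many N-step random walks and estimate
# descriptive statistics (moments) of the endpoint distribution
endpoints = []
for _ in range(n_paths):
    x = 0
    for _ in range(n_steps):
        x += random.choice((-1, 1))  # one +/-1 step
    endpoints.append(x)

mean_end = statistics.fmean(endpoints)      # first moment, ~0
var_end = statistics.pvariance(endpoints)   # second central moment, ~n_steps
```

    Each moment estimate here costs O(N) work per sample path; the paper's claim is that a quantum algorithm can reach the same descriptive statistics in O(sqrt(N)) time.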

  22. Brief communication: Skeletal biology past and present: Are we moving in the right direction?

    PubMed

    Hens, Samantha M; Godde, Kanya

    2008-10-01

    In 1982, Spencer's edited volume A History of American Physical Anthropology: 1930-1980 allowed numerous authors to document the state of our science, including a critical examination of skeletal biology. Some authors argued that the first 50 years of skeletal biology were characterized by the descriptive-historical approach with little regard for processual problems and that technological and statistical analyses were not rooted in theory. In an effort to determine whether Spencer's landmark volume impacted the field of skeletal biology, a content analysis was carried out for the American Journal of Physical Anthropology from 1980 to 2004. The percentage of skeletal biology articles is similar to that of previous decades. Analytical articles averaged only 32% and are defined by three criteria: statistical analysis, hypothesis testing, and broader explanatory context. However, when these criteria were scored individually, nearly 80% of papers attempted a broader theoretical explanation, 44% tested hypotheses, and 67% used advanced statistics, suggesting that the skeletal biology papers in the journal have an analytical emphasis. Considerable fluctuation exists between subfields; trends toward a more analytical approach are witnessed in the subfields of age/sex/stature/demography, skeletal maturation, anatomy, and nonhuman primate studies, which also increased in frequency, while paleontology and pathology were largely descriptive. Comparisons to the International Journal of Osteoarchaeology indicate that there are statistically significant differences between the two journals in terms of analytical criteria. These data indicate a positive shift in theoretical thinking, i.e., an attempt by most to explain processes rather than present a simple description of events.

  23. Pupil Size in Outdoor Environments

    DTIC Science & Technology

    2007-04-06

    Includes a list of tables: descriptive statistics for pupils measured over the luminance range; N in each stratum for all pupil measurements; and descriptive statistics stratified by eye color and by gender.

  24. A Recommended Procedure for Estimating the Cosmic-Ray Spectral Parameter of a Simple Power Law With Applications to Detector Design

    NASA Technical Reports Server (NTRS)

    Howell, L. W.

    2001-01-01

    A simple power law model consisting of a single spectral index alpha-1 is believed to be an adequate description of the galactic cosmic-ray (GCR) proton flux at energies below 10(exp 13) eV. Two procedures for estimating alpha-1, the method of moments and maximum likelihood (ML), are developed and their statistical performance compared. It is concluded that the ML procedure attains the most desirable statistical properties and is hence the recommended statistical estimation procedure for estimating alpha-1. The ML procedure is then generalized for application to a set of real cosmic-ray data and thereby makes this approach applicable to existing cosmic-ray data sets. Several other important results, such as the relationship between collecting power and detector energy resolution, as well as inclusion of a non-Gaussian detector response function, are presented. These results have many practical benefits in the design phase of a cosmic-ray detector, as they permit instrument developers to make important trade studies in design parameters as a function of one of the science objectives. This is particularly important for space-based detectors, where physical parameters such as dimension and weight impose rigorous practical limits on the design envelope.
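    The ML idea can be illustrated with the textbook closed-form estimator for a pure continuous power law above a threshold. Note this is the generic estimator, not the paper's detector-specific procedure, and every parameter below is made up:

```python
import math
import random

random.seed(1)
alpha_true, e_min, n = 2.7, 1.0, 50_000

# inverse-CDF sampling from p(E) proportional to E^(-alpha) for E >= e_min:
# F(E) = 1 - (E/e_min)^(-(alpha-1)), so E = e_min * (1-u)^(-1/(alpha-1))
sample = [
    e_min * (1.0 - random.random()) ** (-1.0 / (alpha_true - 1.0)) for _ in range(n)
]

# closed-form maximum-likelihood estimate of the spectral index
alpha_hat = 1.0 + n / sum(math.log(e / e_min) for e in sample)
```

    With 50,000 events the ML estimate recovers the true index to within a few thousandths, reflecting the estimator's low variance that motivates recommending ML over the method of moments.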

  25. Using Three-Dimensional Printing to Fabricate a Tubing Connector for Dilation and Evacuation.

    PubMed

    Stitely, Michael L; Paterson, Helen

    2016-02-01

    This is a proof-of-concept study to show that simple instrumentation problems encountered in surgery can be solved by fabricating devices using a three-dimensional printer. The device used in the study is a simple tubing connector fashioned to connect two segments of suction tubing used in a surgical procedure where no commercially available product for this use is available through our usual suppliers in New Zealand. A cylindrical tubing connector was designed using three-dimensional printing design software. The tubing connector was fabricated using the Makerbot Replicator 2X three-dimensional printer. The connector was used in 15 second-trimester dilation and evacuation procedures. Data forms were completed by the primary operating surgeon. Descriptive statistics were used with the expectation that the device would function as intended in all cases. The three-dimensional printed tubing connector functioned as intended in all 15 instances. Commercially available three-dimensional printing technology can be used to overcome simple instrumentation problems encountered during gynecologic surgical procedures.

  26. [Effect of somatostatin-14 in simple mechanical obstruction of the small intestine].

    PubMed

    Jimenez-Garcia, A; Ahmad Araji, O; Balongo Garcia, R; Nogales Munoz, A; Salguero Villadiego, M; Cantillana Martinez, J

    1994-02-01

    In order to investigate the properties of somatostatin-14, we studied an experimental model of simple mechanical and closed-loop occlusion. Forty-eight New Zealand rabbits were assigned randomly to three groups of 16: group C (controls) was operated on and treated with saline solution (4 cc/kg/h); group A was operated on and initially treated with saline solution and an equal dose of somatostatin-14 (3.5 micrograms/kg/h); and group B was operated on and treated in the same manner as group A, but starting 8 hours after the laparotomy. The animals were sacrificed 24 hours later; intestinal secretion was quantified, blood and intestinal fluid chemistries were performed, and specimens of the intestine were prepared for histological examination. Descriptive statistical analysis of the results was performed with ANOVA, a semi-quantitative test, and the covariance test. Somatostatin-14 produced an improvement in the volume of intestinal secretion in the treated groups compared with the control group. The results were statistically significant in group B, treated after an 8-hour delay: closed loop (ml): 6.40 +/- 1.12, 2.50 +/- 0.94, 1.85 +/- 0.83; simple mechanical occlusion (ml): 175 +/- 33.05, 89.50 +/- 9.27, 57.18 +/- 21.23; p < 0.01 for groups C, A, and B, respectively. Net secretion of Cl and Na ions was also improved, p < 0.01. (ABSTRACT TRUNCATED AT 250 WORDS)

  27. Toward an International Classification of Functioning, Disability and Health clinical data collection tool: the Italian experience of developing simple, intuitive descriptions of the Rehabilitation Set categories.

    PubMed

    Selb, Melissa; Gimigliano, Francesca; Prodinger, Birgit; Stucki, Gerold; Pestelli, Germano; Iocco, Maurizio; Boldrini, Paolo

    2017-04-01

    As part of international efforts to develop and implement national models, including the specification of ICF-based clinical data collection tools, the Italian rehabilitation community initiated a project to develop simple, intuitive descriptions of the ICF Rehabilitation Set, highlighting the core concept of each category in user-friendly language. This paper outlines the Italian experience in developing simple, intuitive descriptions of the ICF Rehabilitation Set as an ICF-based clinical data collection tool for Italy. Consensus process. Expert conference. Multidisciplinary group of rehabilitation professionals. The first of a two-stage consensus process involved developing an initial proposal for simple, intuitive descriptions of each ICF Rehabilitation Set category, based on descriptions generated in a similar process in China. Stage two involved a consensus conference. Divided into three working groups, participants discussed and voted (vote A) on whether the initially proposed description of each ICF Rehabilitation Set category was simple and intuitive enough for use in daily practice. Afterwards, the categories with descriptions considered ambiguous, i.e. not simple and intuitive enough, were divided among the working groups, which were asked to propose a new description for each allocated category. These proposals were then voted on (vote B) in a plenary session. The last step of the consensus conference required each working group to develop a new proposal for the same remaining categories whose descriptions were still considered ambiguous. Participants then voted (final vote) for whichever of the three proposed descriptions they preferred. Nineteen clinicians from diverse rehabilitation disciplines and various regions of Italy participated in the consensus process. Three ICF categories already achieved consensus in vote A, while 20 ICF categories were accepted in vote B. The remaining 7 categories were decided in the final vote.
The findings were discussed in light of current efforts toward developing strategies for ICF implementation, specifically for the application of an ICF-based clinical data collection tool, not only for Italy but also for the rest of Europe. Promising as minimal standards for monitoring the impact of interventions and for standardized reporting of functioning as a relevant outcome in rehabilitation.

  28. Assuring reliability program effectiveness.

    NASA Technical Reports Server (NTRS)

    Ball, L. W.

    1973-01-01

    An attempt is made to provide simple identification and description of techniques that have proved to be most useful either in developing a new product or in improving reliability of an established product. The first reliability task is obtaining and organizing parts failure rate data. Other tasks are parts screening, tabulation of general failure rates, preventive maintenance, prediction of new product reliability, and statistical demonstration of achieved reliability. Five principal tasks for improving reliability involve the physics of failure research, derating of internal stresses, control of external stresses, functional redundancy, and failure effects control. A final task is the training and motivation of reliability specialist engineers.

  29. Entry trajectory and atmosphere reconstruction methodologies for the Mars Exploration Rover mission

    NASA Astrophysics Data System (ADS)

    Desai, Prasun N.; Blanchard, Robert C.; Powell, Richard W.

    2004-02-01

    The Mars Exploration Rover (MER) mission will land two landers on the surface of Mars, arriving in January 2004. Both landers will deliver the rovers to the surface by decelerating with the aid of an aeroshell, a supersonic parachute, retro-rockets, and air bags for safely landing on the surface. The reconstruction of the MER descent trajectory and atmosphere profile will be performed for all the phases from hypersonic flight through landing. A description of multiple methodologies for the flight reconstruction is presented from simple parameter identification methods through a statistical Kalman filter approach.

  30. Invariant approach to the character classification

    NASA Astrophysics Data System (ADS)

    Šariri, Kristina; Demoli, Nazif

    2008-04-01

    Image moments analysis is a very useful tool that allows image description invariant to translation, rotation, scale change, and some types of image distortion. The aim of this work was the development of a simple method for fast and reliable classification of characters using Hu's and affine moment invariants. Euclidean distance was used as a discrimination feature, with statistical parameters estimated. The method was tested on classification of Times New Roman letters as well as sets of handwritten characters. It is shown that using all Hu's invariants and three affine invariants as the discrimination set improves the recognition rate by 30%.

  11. Preferences of orthopedic surgeons for treating midshaft clavicle fracture in adults

    PubMed Central

    de Oliveira, Adilson Sanches; Roberto, Bruno Braga; Lenza, Mario; Pintan, Guilherme Figueiredo; Ejnisman, Benno; Schor, Breno; Carrera, Eduardo da Frota; Murachovsky, Joel

    2017-01-01

    ABSTRACT Objective To determine current clinical practice in Latin America for treating midshaft clavicle fractures, including surgical and non-surgical approaches. Methods A cross-sectional study using a descriptive questionnaire. Shoulder and elbow surgeons from the Brazilian Society of Shoulder and Elbow Surgery and from the Latin American Society of Shoulder and Elbow were contacted and asked to complete a short questionnaire (SurveyMonkey®) on the management of midshaft fractures of the clavicle. Incomplete or inconsistent answers were excluded. Results The radiographic classification most often used was based on the description of fracture morphology, according to 41% of participants; the Allman classification ranked second, used by 24.1%. Among the indications for surgical treatment, only shortening and imminent skin exposure were statistically significant. Conservative treatment was chosen when cortical contact was present. Regarding immobilization, the simple sling was preferred, with treatment lasting 4 to 6 weeks. Although the result was not statistically significant, the locking plate was the preferred option in surgical cases. Conclusion The treatment of midshaft clavicle fractures in Latin America is in accordance with the current literature. PMID:29091151

  12. Quality of reporting statistics in two Indian pharmacology journals.

    PubMed

    Jaykaran; Yadav, Preeti

    2011-04-01

    To evaluate the reporting of statistical methods in articles published in two Indian pharmacology journals. All original articles published since 2002 were downloaded from the websites of the journals (Indian Journal of Pharmacology (IJP) and Indian Journal of Physiology and Pharmacology (IJPP)). These articles were evaluated for the appropriateness of their descriptive and inferential statistics. Descriptive statistics were evaluated on the basis of the reported method of description and measures of central tendency; inferential statistics were evaluated on whether the assumptions of the statistical methods were fulfilled and the tests appropriate. Values are described as frequencies, percentages, and 95% confidence intervals (CI) around the percentages. Inappropriate descriptive statistics were observed in 150 (78.1%, 95% CI 71.7-83.3%) articles, most commonly the use of mean ± SEM in place of "mean (SD)" or "mean ± SD." The most common statistical method used was one-way ANOVA (58.4%). Checking of the assumptions of a statistical test was mentioned in only two articles. An inappropriate statistical test was observed in 61 (31.7%, 95% CI 25.6-38.6%) articles, most commonly the use of a two-group test for three or more groups. Articles published in these two Indian pharmacology journals are not devoid of statistical errors.
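
    The 95% CIs around percentages reported in this abstract appear consistent with a Wilson score interval. As a hedged illustration (the abstract does not state the denominator; 192 articles is inferred from 150 being 78.1% and is only an assumption), a minimal sketch:

```python
from math import sqrt

def wilson_ci(successes, n, z=1.96):
    """Wilson score confidence interval for a proportion (95% for z=1.96)."""
    p = successes / n
    denom = 1 + z * z / n
    centre = (p + z * z / (2 * n)) / denom
    margin = z * sqrt(p * (1 - p) / n + z * z / (4 * n * n)) / denom
    return centre - margin, centre + margin

# Illustrative only: 150 of an assumed 192 articles gives ~78.1%,
# with a Wilson interval close to the 71.7-83.3% reported above.
lo, hi = wilson_ci(150, 192)
print(f"{150/192:.1%}  ({lo:.1%} to {hi:.1%})")
```

Unlike the simpler Wald interval, the Wilson interval stays inside [0, 1] and behaves well for proportions near the extremes.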

  13. Entropy production in mesoscopic stochastic thermodynamics: nonequilibrium kinetic cycles driven by chemical potentials, temperatures, and mechanical forces

    NASA Astrophysics Data System (ADS)

    Qian, Hong; Kjelstrup, Signe; Kolomeisky, Anatoly B.; Bedeaux, Dick

    2016-04-01

    Nonequilibrium thermodynamics (NET) investigates processes in systems out of global equilibrium. On a mesoscopic level, it provides a statistical dynamic description of various complex phenomena such as chemical reactions, ion transport, diffusion, thermochemical, thermomechanical and mechanochemical fluxes. In the present review, we introduce a mesoscopic stochastic formulation of NET by analyzing entropy production in several simple examples. The fundamental role of nonequilibrium steady-state cycle kinetics is emphasized. The statistical mechanics of Onsager’s reciprocal relations in this context is elucidated. Chemomechanical, thermomechanical, and enzyme-catalyzed thermochemical energy transduction processes are discussed. It is argued that mesoscopic stochastic NET in phase space provides a rigorous mathematical basis of fundamental concepts needed for understanding complex processes in chemistry, physics and biology. This theory is also relevant for nanoscale technological advances.

  14. Patch-Based Generative Shape Model and MDL Model Selection for Statistical Analysis of Archipelagos

    NASA Astrophysics Data System (ADS)

    Ganz, Melanie; Nielsen, Mads; Brandt, Sami

    We propose a statistical generative shape model for archipelago-like structures. These kinds of structures occur, for instance, in medical images, where our intention is to model the appearance and shapes of calcifications in x-ray radiographs. The generative model is constructed by (1) learning a patch-based dictionary for possible shapes, (2) building up a time-homogeneous Markov model to model the neighbourhood correlations between the patches, and (3) automatic selection of the model complexity by the minimum description length principle. The generative shape model is proposed as a probability distribution of a binary image, where the model is intended to facilitate sequential simulation. Our results show that a relatively simple model is able to generate structures visually similar to calcifications. Furthermore, we used the shape model as a shape prior in the statistical segmentation of calcifications, where the area overlap with the ground truth shapes improved significantly compared to the case where the prior was not used.
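
    The minimum description length principle used in step (3) can be sketched on a much simpler model selection problem than the paper's patch dictionary. Below, a crude two-part code length (a BIC-like approximation, applied to hypothetical data) chooses between a constant and a linear model:

```python
from math import log
import random

def rss_constant(ys):
    m = sum(ys) / len(ys)
    return sum((y - m)**2 for y in ys)

def rss_linear(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx)**2 for x in xs)
    a = my - b * mx
    return sum((y - (a + b * x))**2 for x, y in zip(xs, ys))

def description_length(rss, n, k):
    # Crude two-part code: data cost under a Gaussian model + parameter cost.
    return 0.5 * n * log(rss / n) + 0.5 * k * log(n)

random.seed(0)
xs = [i / 10 for i in range(100)]
ys = [2.0 + 1.5 * x + random.gauss(0, 0.3) for x in xs]  # truly linear data
dl_const = description_length(rss_constant(ys), 100, 1)
dl_lin = description_length(rss_linear(xs, ys), 100, 2)
print("linear wins" if dl_lin < dl_const else "constant wins")
```

The extra parameter is only accepted when it shortens the total code length, which is the same trade-off that limits the complexity of the patch-based shape model.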

  15. Entropy generation in Gaussian quantum transformations: applying the replica method to continuous-variable quantum information theory

    NASA Astrophysics Data System (ADS)

    Gagatsos, Christos N.; Karanikas, Alexandros I.; Kordas, Georgios; Cerf, Nicolas J.

    2016-02-01

    In spite of their simple description in terms of rotations or symplectic transformations in phase space, quadratic Hamiltonians such as those modelling the most common Gaussian operations on bosonic modes remain poorly understood in terms of entropy production. For instance, determining the quantum entropy generated by a Bogoliubov transformation is notably a hard problem, with generally no known analytical solution, while it is vital to the characterisation of quantum communication via bosonic channels. Here we overcome this difficulty by adapting the replica method, a tool borrowed from statistical physics and quantum field theory. We exhibit a first application of this method to continuous-variable quantum information theory, where it enables accessing entropies in an optical parametric amplifier. As an illustration, we determine the entropy generated by amplifying a binary superposition of the vacuum and a Fock state, which yields a surprisingly simple, yet unknown analytical expression.

  16. The Argumentative Introduction in Oral Interpretation.

    ERIC Educational Resources Information Center

    Mills, Daniel; Gaer, David C.

    A study examined introductions used in competitive oral interpretation events. A total of 97 introductions (from four oral interpretation events at a nationally recognized Midwestern intercollegiate forensic tournament) were analyzed using four categories: Descriptive, Simple Theme, Descriptive and Simple Theme, and Argumentative Theme. Results…

  17. Construction of social value or utility-based health indices: the usefulness of factorial experimental design plans.

    PubMed

    Cadman, D; Goldsmith, C

    1986-01-01

    Global indices, which aggregate multiple health or function attributes into a single summary indicator, are useful measures in health research. Two key issues must be addressed in the initial stages of index construction: from the universe of possible health and function attributes, which ones should be included in a new index? And how simple can the statistical model that combines attributes into a single numeric index value be? Factorial experimental designs were used in the initial stages of developing a function index for evaluating a program for the care of young handicapped children. Beginning with eight attributes judged important to the goals of the program by clinicians, social preference values for different function states were obtained from 32 parents of handicapped children and 32 members of the community. Using category rating methods, each rater scored 16 written multi-attribute case descriptions containing information about a child's status on all eight attributes. Each case described either a good or a poor level of each function attribute at age 3 or 5 years; thus 2^8 = 256 different cases were rated. Two factorial design plans were selected and used to allocate case descriptions to raters. Analysis of variance determined that seven of the eight clinician-selected attributes were required in a social-value-based index for handicapped children. Most importantly, the subsequent steps of index construction could be greatly simplified by the finding that a simple additive statistical model without complex attribute interaction terms was adequate for the index. We conclude that factorial experimental designs are an efficient, feasible and powerful tool for the initial stages of constructing a multi-attribute health index.
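
    The additive-model finding can be illustrated with a toy full factorial. The attribute names and weights below are hypothetical, the design is reduced to four 2-level attributes, and the ratings are noiseless; the point is only that an orthogonal 2-level design lets each main effect be estimated from simple differences of means:

```python
from itertools import product

# Hypothetical additive preference weights for four attributes
# (level coding: good = +1, poor = -1).
weights = {"mobility": 6.0, "communication": 4.0, "self_care": 3.0, "mood": 1.0}
baseline = 50.0

def rating(levels):
    # Additive value model with no interaction terms.
    return baseline + sum(w * lv for w, lv in zip(weights.values(), levels))

cases = list(product([-1, +1], repeat=4))   # 2^4 = 16 multi-attribute cases
scores = [rating(c) for c in cases]

# In an orthogonal 2-level design, each main effect is half the difference
# between the mean score at the high and low level of that factor.
effects = {}
for j, name in enumerate(weights):
    high = [s for c, s in zip(cases, scores) if c[j] == +1]
    low = [s for c, s in zip(cases, scores) if c[j] == -1]
    effects[name] = (sum(high) / len(high) - sum(low) / len(low)) / 2

print(effects)
```

With noiseless additive ratings the weights are recovered exactly; with real raters, the same contrasts feed into the ANOVA described above.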

  18. Quality of reporting statistics in two Indian pharmacology journals

    PubMed Central

    Jaykaran; Yadav, Preeti

    2011-01-01

    Objective: To evaluate the reporting of statistical methods in articles published in two Indian pharmacology journals. Materials and Methods: All original articles published since 2002 were downloaded from the websites of the journals (Indian Journal of Pharmacology (IJP) and Indian Journal of Physiology and Pharmacology (IJPP)). These articles were evaluated for the appropriateness of their descriptive and inferential statistics. Descriptive statistics were evaluated on the basis of the reported method of description and measures of central tendency; inferential statistics were evaluated on whether the assumptions of the statistical methods were fulfilled and the tests appropriate. Values are described as frequencies, percentages, and 95% confidence intervals (CI) around the percentages. Results: Inappropriate descriptive statistics were observed in 150 (78.1%, 95% CI 71.7–83.3%) articles, most commonly the use of mean ± SEM in place of “mean (SD)” or “mean ± SD.” The most common statistical method used was one-way ANOVA (58.4%). Checking of the assumptions of a statistical test was mentioned in only two articles. An inappropriate statistical test was observed in 61 (31.7%, 95% CI 25.6–38.6%) articles, most commonly the use of a two-group test for three or more groups. Conclusion: Articles published in these two Indian pharmacology journals are not devoid of statistical errors. PMID:21772766

  19. Descriptive data analysis.

    PubMed

    Thompson, Cheryl Bagley

    2009-01-01

    This 13th article of the Basics of Research series is the first in a short series on statistical analysis. These articles will discuss creating your statistical analysis plan, levels of measurement, descriptive statistics, probability theory, inferential statistics, and general considerations for interpretation of the results of a statistical analysis.

  20. Measuring effective temperatures in a generalized Gibbs ensemble

    NASA Astrophysics Data System (ADS)

    Foini, Laura; Gambassi, Andrea; Konik, Robert; Cugliandolo, Leticia F.

    2017-05-01

    The local physical properties of an isolated quantum statistical system in the stationary state reached long after a quench are generically described by the Gibbs ensemble, which involves only its Hamiltonian and the temperature as a parameter. If the system is instead integrable, additional quantities conserved by the dynamics intervene in the description of the stationary state. The resulting generalized Gibbs ensemble involves a number of temperature-like parameters, the determination of which is practically difficult. Here we argue that in a number of simple models these parameters can be effectively determined by using fluctuation-dissipation relationships between response and correlation functions of natural observables, quantities which are accessible in experiments.

  1. Employee resourcing strategies and universities' corporate image: A survey dataset.

    PubMed

    Falola, Hezekiah Olubusayo; Oludayo, Olumuyiwa Akinrole; Olokundun, Maxwell Ayodele; Salau, Odunayo Paul; Ibidunni, Ayodotun Stephen; Igbinoba, Ebe

    2018-06-01

    The data examine the effect of employee resourcing strategies on corporate image. They were generated from a total of 500 copies of a questionnaire administered to the academic staff of six (6) selected private universities in Southwest Nigeria, of which four hundred and forty-three (443) were retrieved. Stratified and simple random sampling techniques were used to select the respondents for this study. Descriptive statistics and linear regression were used for the presentation of the data, with the mean score as the statistical tool of analysis. The data presented in this article are made available to facilitate further and more comprehensive investigation of the subject matter.

  2. Descriptive statistics.

    PubMed

    Nick, Todd G

    2007-01-01

    Statistics is defined by the Medical Subject Headings (MeSH) thesaurus as the science and art of collecting, summarizing, and analyzing data that are subject to random variation. The two broad categories of summarizing and analyzing data are referred to as descriptive and inferential statistics. This chapter considers the science and art of summarizing data where descriptive statistics and graphics are used to display data. In this chapter, we discuss the fundamentals of descriptive statistics, including describing qualitative and quantitative variables. For describing quantitative variables, measures of location and spread, for example the standard deviation, are presented along with graphical presentations. We also discuss distributions of statistics, for example the variance, as well as the use of transformations. The concepts in this chapter are useful for uncovering patterns within the data and for effectively presenting the results of a project.
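
    A minimal sketch of such descriptive summaries, using hypothetical data; note how a single high value pulls the mean above the median, which is why both measures of location are worth reporting:

```python
import statistics as st

# Hypothetical sample of a quantitative variable (e.g. lab values);
# the last value is a deliberate outlier.
data = [4.1, 4.7, 5.0, 5.2, 5.3, 5.6, 5.8, 6.1, 6.4, 9.8]

location = {"mean": st.mean(data), "median": st.median(data)}
spread = {"sd": st.stdev(data), "variance": st.variance(data)}
q1, q2, q3 = st.quantiles(data, n=4)   # quartiles; IQR = q3 - q1

print(location)
print(spread, "IQR:", q3 - q1)
```

Here the mean (5.8) sits above the median (5.45) because of the outlier, while the IQR summarizes spread without being dominated by it.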

  3. Science and Facebook: The same popularity law!

    PubMed

    Néda, Zoltán; Varga, Levente; Biró, Tamás S

    2017-01-01

    The distributions of scientific citations for publications selected with different rules (author, topic, institution, country, journal, etc.) collapse onto a single curve if one plots the citations relative to their mean value. We find that the distribution of "shares" for Facebook posts rescales in the same manner onto the very same curve as the scientific citations. This finding suggests that citations are subject to the same growth mechanism as Facebook popularity measures, being influenced by a statistically similar social environment and selection mechanism. In a simple master-equation approach, the exponential growth of the number of publications and a preferential selection mechanism lead to a Tsallis-Pareto distribution, offering an excellent description of the observed statistics. Based on our model and on data derived from PubMed, we predict that, if the present trend continues, the average number of citations per scientific publication will relax exponentially to about 4.

  4. Theoretical predictor for candidate structure assignment from IMS data of biomolecule-related conformational space.

    PubMed

    Schenk, Emily R; Nau, Frederic; Fernandez-Lima, Francisco

    2015-06-01

    The ability to correlate experimental ion mobility data with candidate structures from theoretical modeling provides a powerful analytical and structural tool for the characterization of biomolecules. In the present paper, a theoretical workflow is described to generate and assign candidate structures for experimental trapped ion mobility and H/D exchange (HDX-TIMS-MS) data following molecular dynamics simulations and statistical filtering. The applicability of the theoretical predictor is illustrated for a peptide and protein example with multiple conformations and kinetic intermediates. The described methodology yields a low computational cost and a simple workflow by incorporating statistical filtering and molecular dynamics simulations. The workflow can be adapted to different IMS scenarios and CCS calculators for a more accurate description of the IMS experimental conditions. For the case of the HDX-TIMS-MS experiments, molecular dynamics in the "TIMS box" accounts for a better sampling of the molecular intermediates and local energy minima.

  5. Science and Facebook: The same popularity law!

    PubMed Central

    Varga, Levente; Biró, Tamás S.

    2017-01-01

    The distributions of scientific citations for publications selected with different rules (author, topic, institution, country, journal, etc.) collapse onto a single curve if one plots the citations relative to their mean value. We find that the distribution of “shares” for Facebook posts rescales in the same manner onto the very same curve as the scientific citations. This finding suggests that citations are subject to the same growth mechanism as Facebook popularity measures, being influenced by a statistically similar social environment and selection mechanism. In a simple master-equation approach, the exponential growth of the number of publications and a preferential selection mechanism lead to a Tsallis-Pareto distribution, offering an excellent description of the observed statistics. Based on our model and on data derived from PubMed, we predict that, if the present trend continues, the average number of citations per scientific publication will relax exponentially to about 4. PMID:28678796

  6. Colloquium: Hierarchy of scales in language dynamics

    NASA Astrophysics Data System (ADS)

    Blythe, Richard A.

    2015-11-01

    Methods and insights from statistical physics are finding an increasing variety of applications where one seeks to understand the emergent properties of a complex interacting system. One such area concerns the dynamics of language at a variety of levels of description, from the behaviour of individual agents learning simple artificial languages from each other, up to changes in the structure of languages shared by large groups of speakers over historical timescales. In this Colloquium, we survey a hierarchy of scales at which language and linguistic behaviour can be described, along with the main progress in understanding that has been made at each of them - much of which has come from the statistical physics community. We argue that future developments may arise by linking the different levels of the hierarchy together in a more coherent fashion, in particular where this allows more effective use of rich empirical data sets.

  7. [Prevalence of postmenopausal simple ovarian cyst diagnosed by ultrasound].

    PubMed

    Luján Irastorza, Jesús E; Hernández Marín, Imelda; Figueroa Preciado, Gudelia; Ayala, Aquiles R

    2006-10-01

    High-resolution ultrasound has led to the discovery of small ovarian cysts in asymptomatic postmenopausal women that would otherwise not have been detected; these cysts frequently disappear spontaneously and rarely develop into cancer, yet they are often treated aggressively. Our objective was to determine the prevalence, evolution, and treatment of simple ovarian cysts in postmenopausal women in our department, since no studies in our country had analyzed these data. We performed a retrospective, descriptive study in the Service of Biology of Human Reproduction of the Hospital Juárez de México over a four-year period (2000-2003) that included 1,010 postmenopausal women. The statistical analysis was performed with SPSS, from which we obtained descriptive measures of location and dispersion together with a graphical analysis. We found a simple cyst prevalence of 8.2% (n = 83); the mean age at diagnosis was 50.76 years with a standard deviation of 5.55 years; cyst diameter ranged from 0.614 to 12.883 cm, with a mean of 2.542 cm and a standard deviation of 1.91 cm. In 27.71% of cases (n = 23) the cysts disappeared spontaneously during follow-up of 3 to 36 months (mean 14.1). Surgery was indicated in 16.46% (n = 13): for an increase in cyst size in 9 patients (11.64%) and for a change in morphology from simple to complex in 4 (4.82%). Tumor markers were measured in only 37 patients (44.57%) and were within normal ranges; no carcinoma was found in this group. The prevalence of simple ovarian cysts was similar to that reported in the literature. The risk of cancer in these cysts is extremely low when a suitable evaluation is performed, so conservative treatment is suggested for simple cysts smaller than 5 cm with CA-125 levels within normal ranges. We recommend follow-up every 3-6 months with color Doppler ultrasound and tumor markers for five years.

  8. 40 CFR 60.103a - Design, equipment, work practice or operational standards.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ...) Description and simple process flow diagram showing the interconnection of the following components of the... rate. (iv) Description and simple process flow diagram showing all gas lines (including flare, purge... which lines are monitored and identify on the process flow diagram the location and type of each monitor...

  9. 40 CFR 60.103a - Design, equipment, work practice or operational standards.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ...) Description and simple process flow diagram showing the interconnection of the following components of the... rate. (iv) Description and simple process flow diagram showing all gas lines (including flare, purge... which lines are monitored and identify on the process flow diagram the location and type of each monitor...

  10. Characterizing microstructural features of biomedical samples by statistical analysis of Mueller matrix images

    NASA Astrophysics Data System (ADS)

    He, Honghui; Dong, Yang; Zhou, Jialing; Ma, Hui

    2017-03-01

    As one of the salient features of light, polarization contains abundant structural and optical information about media. Recently, as a comprehensive description of polarization properties, Mueller matrix polarimetry has been applied to various biomedical studies, such as the detection of cancerous tissues. Previous works have found that the structural information encoded in 2D Mueller matrix images can be represented by transformed parameters with more explicit relationships to certain microstructural features. In this paper, we present a statistical analysis method that transforms the 2D Mueller matrix images into frequency distribution histograms (FDHs) and their central moments to reveal the dominant structural features of samples quantitatively. Experimental results on porcine heart, intestine, stomach, and liver tissues demonstrate that the transformation parameters and central moments based on the statistical analysis of Mueller matrix elements have simple relationships to the dominant microstructural properties of biomedical samples, including the density and orientation of fibrous structures and the depolarization power, diattenuation, and absorption abilities. The statistical analysis of 2D images of Mueller matrix elements may thus provide quantitative or semi-quantitative criteria for biomedical diagnosis.
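
    The transformation of an element image into a frequency distribution histogram and central moments can be sketched as follows. The 4x4 "image" below is hypothetical and the code is a generic illustration of the statistics involved, not the authors' pipeline:

```python
def central_moments(values, max_order=4):
    """Mean plus central moments up to max_order for a flat list of values,
    e.g. the pixels of one Mueller matrix element image."""
    n = len(values)
    mean = sum(values) / n
    moments = {1: 0.0}  # first central moment is zero by construction
    for k in range(2, max_order + 1):
        moments[k] = sum((v - mean)**k for v in values) / n
    return mean, moments

def histogram(values, bins=8):
    """Frequency distribution histogram with equal-width bins."""
    lo, hi = min(values), max(values)
    width = (hi - lo) / bins or 1.0
    counts = [0] * bins
    for v in values:
        counts[min(int((v - lo) / width), bins - 1)] += 1
    return counts

# Hypothetical 4x4 "image" of one Mueller matrix element.
img = [[0.91, 0.95, 0.93, 0.90],
       [0.88, 0.97, 0.94, 0.92],
       [0.90, 0.93, 0.96, 0.89],
       [0.92, 0.94, 0.91, 0.95]]
flat = [v for row in img for v in row]
mean, mom = central_moments(flat)
print(mean, mom[2], histogram(flat))
```

The second moment (variance) summarizes spread, while the third and fourth capture the asymmetry and peakedness of the FDH used as discriminating features.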

  11. On representing the prognostic value of continuous gene expression biomarkers with the restricted mean survival curve.

    PubMed

    Eng, Kevin H; Schiller, Emily; Morrell, Kayla

    2015-11-03

    Researchers developing biomarkers for cancer prognosis from quantitative gene expression data are often faced with an odd methodological discrepancy: while Cox's proportional hazards model, the appropriate and popular technique, produces a continuous and relative risk score, it is hard to cast the estimate in clear clinical terms like median months of survival and percent of patients affected. To produce a familiar Kaplan-Meier plot, researchers commonly make the decision to dichotomize a continuous (often unimodal and symmetric) score. It is well known in the statistical literature that this procedure induces significant bias. We illustrate the liabilities of common techniques for categorizing a risk score and discuss alternative approaches. We promote the use of the restricted mean survival (RMS) and the corresponding RMS curve that may be thought of as an analog to the best fit line from simple linear regression. Continuous biomarker workflows should be modified to include the more rigorous statistical techniques and descriptive plots described in this article. All statistics discussed can be computed via standard functions in the Survival package of the R statistical programming language. Example R language code for the RMS curve is presented in the appendix.
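
    The restricted mean survival is the area under the Kaplan-Meier step curve up to a horizon tau. The abstract points to the R Survival package; the sketch below is a minimal Python illustration with hypothetical follow-up data, not that package's implementation:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival curve. times: follow-up times (e.g. months);
    events: 1 = death observed, 0 = censored."""
    pts = sorted(zip(times, events))
    n_at_risk = len(pts)
    s = 1.0
    curve = [(0.0, 1.0)]
    i = 0
    while i < len(pts):
        t = pts[i][0]
        tied = [e for tt, e in pts if tt == t]
        deaths = sum(tied)
        if deaths:
            s *= 1 - deaths / n_at_risk
            curve.append((t, s))
        n_at_risk -= len(tied)
        i += len(tied)
    return curve

def restricted_mean_survival(curve, tau):
    """Area under the KM step function up to tau: expected survival time
    within the window [0, tau], in clinically interpretable units."""
    rms, prev_t, prev_s = 0.0, 0.0, 1.0
    for t, s in curve[1:]:
        if t >= tau:
            break
        rms += prev_s * (t - prev_t)
        prev_t, prev_s = t, s
    return rms + prev_s * (tau - prev_t)

# Hypothetical follow-up data (months).
times = [5, 8, 12, 16, 23, 27, 30, 34]
events = [1, 0, 1, 1, 0, 1, 0, 0]
curve = kaplan_meier(times, events)
print(round(restricted_mean_survival(curve, 24), 2))
```

Unlike a dichotomized risk score, the RMS reads directly as "expected months of survival within the first tau months", which is the clinical framing the article advocates.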

  12. Quantitative methods used in Australian health promotion research: a review of publications from 1992-2002.

    PubMed

    Smith, Ben J; Zehle, Katharina; Bauman, Adrian E; Chau, Josephine; Hawkshaw, Barbara; Frost, Steven; Thomas, Margaret

    2006-04-01

    This study examined the use of quantitative methods in Australian health promotion research in order to identify methodological trends and priorities for strengthening the evidence base for health promotion. Australian health promotion articles were identified by hand searching publications from 1992-2002 in six journals: Health Promotion Journal of Australia, Australian and New Zealand Journal of Public Health, Health Promotion International, Health Education Research, Health Education and Behavior and the American Journal of Health Promotion. The study designs and statistical methods used in articles presenting quantitative research were recorded. Of the 1,025 articles, 591 (57.7%) used quantitative methods. Cross-sectional designs were used in the majority (54.3%) of studies, with pre- and post-test (14.6%) and post-test only (9.5%) the next most common designs. Bivariate statistical methods were used in 45.9% of papers, multivariate methods in 27.1% and simple numbers and proportions in 25.4%. Few studies used higher-level statistical techniques. While most studies used quantitative methods, the majority were descriptive in nature. The study designs and statistical methods used provided limited scope for demonstrating intervention effects or understanding the determinants of change.

  13. Effect of a stress management program on subjects with neck pain: A pilot randomized controlled trial.

    PubMed

    Metikaridis, T Damianos; Hadjipavlou, Alexander; Artemiadis, Artemios; Chrousos, George; Darviri, Christina

    2016-05-20

    Studies have shown that stress is implicated in the cause of neck pain (NP). The purpose of this study is to examine the effect of a simple, zero-cost stress management program on patients suffering from NP. This is a parallel-group randomized clinical trial. People suffering from chronic non-specific NP were randomly assigned to an eight-week stress management program (N = 28) (including diaphragmatic breathing and progressive muscle relaxation) or to a no-intervention control condition (N = 25). Self-report measures were used to evaluate the variables of interest at the beginning and at the end of the eight-week monitoring period. Descriptive and inferential statistical methods were used for the analysis. At the end of the monitoring period, the intervention group showed a statistically significant reduction in stress and anxiety (p = 0.03, p = 0.01), reported stress-related symptoms (p = 0.003), percentage of disability due to NP (p < 0.001), and NP intensity (p = 0.002). At the same time, satisfaction with the daily routine rose (p = 0.019). No statistically significant difference was observed in cortisol measurements. Stress management has positive effects on NP patients.

  14. Accurate Identification of MCI Patients via Enriched White-Matter Connectivity Network

    NASA Astrophysics Data System (ADS)

    Wee, Chong-Yaw; Yap, Pew-Thian; Brownyke, Jeffery N.; Potter, Guy G.; Steffens, David C.; Welsh-Bohmer, Kathleen; Wang, Lihong; Shen, Dinggang

    Mild cognitive impairment (MCI), often a prodromal phase of Alzheimer's disease (AD), is frequently considered a good target for early diagnosis and therapeutic intervention in AD. The recent emergence of reliable network characterization techniques has made it possible to understand neurological disorders at the whole-brain connectivity level. Accordingly, we propose a network-based multivariate classification algorithm, using a collection of measures derived from white-matter (WM) connectivity networks, to accurately identify MCI patients from normal controls. An enriched description of WM connections, utilizing six physiological parameters, i.e., fiber penetration count, fractional anisotropy (FA), mean diffusivity (MD), and the principal diffusivities (λ1, λ2, λ3), results in six connectivity networks for each subject, accounting for both the connection topology and the biophysical properties of the connections. Upon parcellating the brain into 90 regions-of-interest (ROIs), the average statistics of each ROI in relation to the remaining ROIs are extracted as features for classification. These features are then sieved to select the most discriminant subset for building an MCI classifier via support vector machines (SVMs). Cross-validation results indicate better diagnostic power for the proposed enriched WM connection description than for a simple description with any single physiological parameter.

  15. Learning in Structured Connectionist Networks

    DTIC Science & Technology

    1988-04-01

    …the structure is too rigid and learning too difficult for cognitive modeling. Two algorithms for learning simple, feature-based concept descriptions were also implemented. … Recent progress in connectionist research has been encouraging; networks have successfully modeled human performance for various cognitive …

  16. Generalized statistical mechanics approaches to earthquakes and tectonics.

    PubMed

    Vallianatos, Filippos; Papadakis, Giorgos; Michas, Georgios

    2016-12-01

    Despite the extreme complexity that characterizes the mechanism of the earthquake generation process, simple empirical scaling relations apply to the collective properties of earthquakes and faults in a variety of tectonic environments and scales. The physical characterization of those properties and the scaling relations that describe them attract a wide scientific interest and are incorporated in the probabilistic forecasting of seismicity in local, regional and planetary scales. Considerable progress has been made in the analysis of the statistical mechanics of earthquakes, which, based on the principle of entropy, can provide a physical rationale to the macroscopic properties frequently observed. The scale-invariant properties, the (multi) fractal structures and the long-range interactions that have been found to characterize fault and earthquake populations have recently led to the consideration of non-extensive statistical mechanics (NESM) as a consistent statistical mechanics framework for the description of seismicity. The consistency between NESM and observations has been demonstrated in a series of publications on seismicity, faulting, rock physics and other fields of geosciences. The aim of this review is to present in a concise manner the fundamental macroscopic properties of earthquakes and faulting and how these can be derived by using the notions of statistical mechanics and NESM, providing further insights into earthquake physics and fault growth processes.
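
    A central object in NESM descriptions of seismicity is the Tsallis q-exponential, which generalizes the exponential of Boltzmann-Gibbs statistics and produces the heavier, power-law-like tails observed in earthquake and fault populations. A minimal sketch (the parameter values are illustrative only):

```python
from math import exp

def q_exp(x, q):
    """Tsallis q-exponential: reduces to exp(x) in the limit q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return exp(x)
    base = 1 + (1 - q) * x
    # Outside the support, the q-exponential is defined to be zero.
    return base ** (1 / (1 - q)) if base > 0 else 0.0

# For q > 1 the tail decays as a power law, much more slowly than the
# ordinary exponential, as used in NESM fits to, e.g., inter-event time
# or magnitude distributions.
x = 5.0
print(q_exp(-x, 1.0), q_exp(-x, 1.5))
```

At the same argument, the q = 1.5 value is an order of magnitude larger than the exponential one, which is exactly the heavy-tail behavior that motivates NESM over standard statistical mechanics for seismicity.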

  18. Descriptive statistics: the specification of statistical measures and their presentation in tables and graphs. Part 7 of a series on evaluation of scientific publications.

    PubMed

    Spriestersbach, Albert; Röhrig, Bernd; du Prel, Jean-Baptist; Gerhold-Ay, Aslihan; Blettner, Maria

    2009-09-01

    Descriptive statistics are an essential part of biometric analysis and a prerequisite for the understanding of further statistical evaluations, including the drawing of inferences. When data are well presented, it is usually obvious whether the author has collected and evaluated them correctly and in keeping with accepted practice in the field. Statistical variables in medicine may be of either the metric (continuous, quantitative) or categorical (nominal, ordinal) type. Easily understandable examples are given. Basic techniques for the statistical description of collected data are presented and illustrated with examples. The goal of a scientific study must always be clearly defined. The definition of the target value or clinical endpoint determines the level of measurement of the variables in question. Nearly all variables, whatever their level of measurement, can be usefully presented graphically and numerically. The level of measurement determines what types of diagrams and statistical values are appropriate. There are also different ways of presenting combinations of two independent variables graphically and numerically. The description of collected data is indispensable. If the data are of good quality, valid and important conclusions can already be drawn when they are properly described. Furthermore, data description provides a basis for inferential statistics.
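
    The level-of-measurement rule described above can be illustrated in a few lines (data and variable names invented for illustration): a metric variable gets location and spread measures, a categorical variable gets absolute and relative frequencies.

```python
# Sketch, not from the article: descriptive summaries chosen by level of measurement.
import statistics

# Metric (continuous) variable: mean, SD, median, range.
systolic_bp = [118, 121, 135, 142, 128, 119, 131, 125]
metric_summary = {
    "mean": statistics.mean(systolic_bp),
    "sd": statistics.stdev(systolic_bp),
    "median": statistics.median(systolic_bp),
    "range": (min(systolic_bp), max(systolic_bp)),
}

# Categorical (nominal) variable: absolute and relative frequencies.
smoking_status = ["never", "former", "current", "never",
                  "never", "former", "never", "current"]
counts = {level: smoking_status.count(level) for level in set(smoking_status)}
rel_freq = {level: n / len(smoking_status) for level, n in counts.items()}
```

    The metric summary feeds box plots or histograms; the frequency table feeds bar charts, matching the paper's advice that the level of measurement determines the appropriate diagrams.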

  19. A statistical method to estimate low-energy hadronic cross sections

    NASA Astrophysics Data System (ADS)

    Balassa, Gábor; Kovács, Péter; Wolf, György

    2018-02-01

    In this article we propose a model based on the Statistical Bootstrap approach to estimate the cross sections of different hadronic reactions up to a few GeV in c.m.s. energy. The method is based on the idea that, when two particles collide, a so-called fireball is formed, which after a short time decays statistically into a specific final state. To calculate the probabilities we use a phase space description extended with quark combinatorial factors and the possibility of more than one fireball formation. In a few simple cases the probability of a specific final state can be calculated analytically, and we show that the model is able to reproduce the ratios of the considered cross sections. We also show that the model is able to describe proton-antiproton annihilation at rest. In the latter case we used a numerical method to calculate the more complicated final state probabilities. Additionally, we examined the formation of strange and charmed mesons as well, where we used existing data to fit the relevant model parameters.

  20. Extended q-Gaussian and q-exponential distributions from gamma random variables

    NASA Astrophysics Data System (ADS)

    Budini, Adrián A.

    2015-05-01

    The family of q-Gaussian and q-exponential probability densities fit the statistical behavior of diverse complex self-similar nonequilibrium systems. These distributions, independently of the underlying dynamics, can rigorously be obtained by maximizing Tsallis "nonextensive" entropy under appropriate constraints, as well as from superstatistical models. In this paper we provide an alternative and complementary scheme for deriving these objects. We show that q-Gaussian and q-exponential random variables can always be expressed as a function of two statistically independent gamma random variables with the same scale parameter. Their shape index determines the complexity q parameter. This result also allows us to define an extended family of asymmetric q-Gaussian and modified q-exponential densities, which reduce to the standard ones when the shape parameters are the same. Furthermore, we demonstrate that a simple change of variables always allows relating any of these distributions to a beta stochastic variable. The extended distributions are applied in the statistical description of different complex dynamics such as log-return signals in financial markets and motion of point defects in a fluid flow.

  1. Analyzing Hidden Semantics in Social Bookmarking of Open Educational Resources

    NASA Astrophysics Data System (ADS)

    Minguillón, Julià

    Web 2.0 services such as social bookmarking allow users to manage and share the links they find interesting, adding their own tags for describing them. This is especially interesting in the field of open educational resources, as delicious is a simple way to bridge the institutional point of view (i.e. learning object repositories) with the individual one (i.e. personal collections), thus promoting the discovery and sharing of such resources by other users. In this paper we propose a methodology for analyzing such tags in order to discover hidden semantics (i.e. taxonomies and vocabularies) that can be used to improve descriptions of learning objects and make learning object repositories more visible and discoverable. We propose the use of a simple statistical analysis tool such as principal component analysis to discover which tags create clusters that can be semantically interpreted. We will compare the obtained results with a collection of resources related to open educational resources, in order to better understand the real needs of people searching for open educational resources.
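
    The tag-clustering step can be sketched with a toy resource-by-tag matrix and SVD-based principal component analysis (the data, tag names, and matrix are invented for illustration, not the paper's delicious corpus):

```python
# Toy PCA sketch: rows are bookmarked resources, columns are tags (1 = tag applied).
import numpy as np

X = np.array([
    [1, 1, 0, 0],   # tags: "oer", "learning"
    [1, 1, 1, 0],   # "oer", "learning", "repository"
    [0, 0, 1, 1],   # "repository", "metadata"
    [0, 0, 1, 1],
    [1, 1, 0, 0],
], dtype=float)

Xc = X - X.mean(axis=0)              # center each tag column
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = s**2 / np.sum(s**2)      # variance share per principal component
loadings = Vt                        # rows: components; columns: tags
```

    Tags with large same-sign loadings on a leading component are candidates for a semantic cluster (here, an "oer"/"learning" group versus a "repository"/"metadata" group).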

  2. Energy flow in non-equilibrium conformal field theory

    NASA Astrophysics Data System (ADS)

    Bernard, Denis; Doyon, Benjamin

    2012-09-01

    We study the energy current and its fluctuations in quantum gapless 1d systems far from equilibrium modeled by conformal field theory, where two separated halves are prepared at distinct temperatures and glued together at a point contact. We prove that these systems converge towards steady states, and give a general description of such non-equilibrium steady states in terms of quantum field theory data. We compute the large deviation function, also called the full counting statistics, of energy transfer through the contact. These are universal and satisfy fluctuation relations. We provide a simple representation of these quantum fluctuations in terms of classical Poisson processes whose intensities are proportional to Boltzmann weights.

  3. Parametric control in coupled fermionic oscillators

    NASA Astrophysics Data System (ADS)

    Ghosh, Arnab

    2014-10-01

    A simple model of parametric coupling between two fermionic oscillators is considered. Statistical properties, in particular the mean and variance of quanta for a single mode, are described by means of a time-dependent reduced density operator for the system and the associated P function. The density operator for fermionic fields as introduced by Cahill and Glauber [K. E. Cahill and R. J. Glauber, Phys. Rev. A 59, 1538 (1999), 10.1103/PhysRevA.59.1538] thus can be shown to provide a quantum mechanical description of the fields closely resembling their bosonic counterpart. In doing so, special emphasis is given to population trapping, and quantum control over the states of the system.

  4. Measuring effective temperatures in a generalized Gibbs ensemble

    DOE PAGES

    Foini, Laura; Gambassi, Andrea; Konik, Robert; ...

    2017-05-11

    The local physical properties of an isolated quantum statistical system in the stationary state reached long after a quench are generically described by the Gibbs ensemble, which involves only its Hamiltonian and the temperature as a parameter. Additional quantities conserved by the dynamics intervene in the description of the stationary state, if the system is instead integrable. The resulting generalized Gibbs ensemble involves a number of temperature-like parameters, the determination of which is practically difficult. We argue that in a number of simple models these parameters can be effectively determined by using fluctuation-dissipation relationships between response and correlation functions of natural observables, quantities which are accessible in experiments.

  5. Coagulation-Fragmentation Model for Animal Group-Size Statistics

    NASA Astrophysics Data System (ADS)

    Degond, Pierre; Liu, Jian-Guo; Pego, Robert L.

    2017-04-01

    We study coagulation-fragmentation equations inspired by a simple model proposed in fisheries science to explain data for the size distribution of schools of pelagic fish. Although the equations lack detailed balance and admit no H-theorem, we are able to develop a rather complete description of equilibrium profiles and large-time behavior, based on recent developments in complex function theory for Bernstein and Pick functions. In the large-population continuum limit, a scaling-invariant regime is reached in which all equilibria are determined by a single scaling profile. This universal profile exhibits power-law behavior crossing over from exponent -2/3 for small size to -3/2 for large size, with an exponential cutoff.

  6. Quantum formalism for classical statistics

    NASA Astrophysics Data System (ADS)

    Wetterich, C.

    2018-06-01

    In static classical statistical systems the problem of information transport from a boundary to the bulk finds a simple description in terms of wave functions or density matrices. While the transfer matrix formalism is a type of Heisenberg picture for this problem, we develop here the associated Schrödinger picture that keeps track of the local probabilistic information. The transport of the probabilistic information between neighboring hypersurfaces obeys a linear evolution equation, and therefore the superposition principle for the possible solutions. Operators are associated to local observables, with rules for the computation of expectation values similar to quantum mechanics. We discuss how non-commutativity naturally arises in this setting. Also other features characteristic of quantum mechanics, such as complex structure, change of basis or symmetry transformations, can be found in classical statistics once formulated in terms of wave functions or density matrices. We construct for every quantum system an equivalent classical statistical system, such that time in quantum mechanics corresponds to the location of hypersurfaces in the classical probabilistic ensemble. For suitable choices of local observables in the classical statistical system one can, in principle, compute all expectation values and correlations of observables in the quantum system from the local probabilistic information of the associated classical statistical system. Realizing a static memory material as a quantum simulator for a given quantum system is not a matter of principle, but rather of practical simplicity.

  7. Biostatistics primer: part I.

    PubMed

    Overholser, Brian R; Sowinski, Kevin M

    2007-12-01

    Biostatistics is the application of statistics to biologic data. The field of statistics can be broken down into 2 fundamental parts: descriptive and inferential. Descriptive statistics are commonly used to categorize, display, and summarize data. Inferential statistics can be used to make predictions based on a sample obtained from a population or some large body of information. It is these inferences that are used to test specific research hypotheses. This 2-part review will outline important features of descriptive and inferential statistics as they apply to commonly conducted research studies in the biomedical literature. Part 1 in this issue will discuss fundamental topics of statistics and data analysis. Additionally, some of the most commonly used statistical tests found in the biomedical literature will be reviewed in Part 2 in the February 2008 issue.
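
    The split between the two parts can be made concrete with a small worked example (illustrative numbers; Welch's t statistic computed by hand): the descriptive step summarizes each group, the inferential step tests the difference in means.

```python
# Hedged sketch: descriptive vs inferential statistics on made-up data.
import math
import statistics

treatment = [5.1, 4.8, 5.6, 5.3, 4.9, 5.4]
control   = [4.2, 4.5, 4.1, 4.7, 4.3, 4.4]

# Descriptive step: location and spread of each group.
m1, m2 = statistics.mean(treatment), statistics.mean(control)
v1, v2 = statistics.variance(treatment), statistics.variance(control)

# Inferential step: Welch's t statistic for the difference in means.
se = math.sqrt(v1 / len(treatment) + v2 / len(control))
t_stat = (m1 - m2) / se
```

    A large t statistic (compared against the t distribution with the Welch-Satterthwaite degrees of freedom) is what licenses the inference that the groups differ, which the descriptive summaries alone cannot do.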

  8. Statistical analysis and interpretation of prenatal diagnostic imaging studies, Part 2: descriptive and inferential statistical methods.

    PubMed

    Tuuli, Methodius G; Odibo, Anthony O

    2011-08-01

    The objective of this article is to discuss the rationale for common statistical tests used for the analysis and interpretation of prenatal diagnostic imaging studies. Examples from the literature are used to illustrate descriptive and inferential statistics. The uses and limitations of linear and logistic regression analyses are discussed in detail.

  9. Comparative Research of Navy Voluntary Education at Operational Commands

    DTIC Science & Technology

    2017-03-01

    return on investment, ROI, logistic regression, multivariate analysis, descriptive statistics, Markov, time-series, linear programming. [Table-of-contents excerpt: Descriptive Statistics Tables; Privacy Considerations; Variables and Descriptions, adapted from NETC (2016).]

  10. Structured Natural-Language Descriptions for Semantic Content Retrieval of Visual Materials.

    ERIC Educational Resources Information Center

    Tam, A. M.; Leung, C. H. C.

    2001-01-01

    Proposes a structure for natural language descriptions of the semantic content of visual materials that requires descriptions to be (modified) keywords, phrases, or simple sentences, with components that are grammatical relations common to many languages. This structure makes it easy to implement a collection's descriptions as a relational…

  11. Analysis of Professional and Pre-Accession Characteristics and Junior Naval Officer Performance

    DTIC Science & Technology

    2018-03-01

    [Table-of-contents excerpt: Literature Review; Navy Performance Evaluation System; Professional...; Data Description; Summary Statistics; Descriptive Statistics.]

  12. Descriptive Statistical Techniques for Librarians. 2nd Edition.

    ERIC Educational Resources Information Center

    Hafner, Arthur W.

    A thorough understanding of the uses and applications of statistical techniques is integral in gaining support for library funding or new initiatives. This resource is designed to help practitioners develop and manipulate descriptive statistical information in evaluating library services, tracking and controlling limited resources, and analyzing…

  13. Knowledge and Acceptability of Human Papillomavirus Vaccination among Women Attending the Gynaecological Outpatient Clinics of a University Teaching Hospital in Lagos, Nigeria.

    PubMed

    Okunade, Kehinde S; Sunmonu, Oyebola; Osanyin, Gbemisola E; Oluwole, Ayodeji A

    2017-01-01

    This study was aimed at determining the knowledge and acceptability of HPV vaccine among women attending the gynaecology clinics of the Lagos University Teaching Hospital (LUTH). This was a descriptive cross-sectional study involving 148 consecutively selected women attending the gynaecology clinic of LUTH. Relevant information was obtained from these women using an interviewer-administered questionnaire. The data were analysed and then presented by simple descriptive statistics using tables and charts. Chi-square statistics were used to test the association between sociodemographic variables and acceptance of HPV vaccination. All significance values were reported at P < 0.05. The mean age of the respondents was 35.7 ± 9.7 years. The study showed that 36.5% of the respondents had heard about HPV infection while only 18.9% had knowledge about the existence of HPV vaccines. Overall, 81.8% of the respondents accepted that the vaccines could be administered to their teenage girls, with the level of education of the mothers being the major determinant of their acceptability (P = 0.013). Awareness of HPV infections and existence of HPV vaccines is low. However, the acceptance of HPV vaccines is generally high. Efforts should be made to increase the awareness about cervical cancer, its aetiologies, and prevention via HPV vaccination.
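
    A minimal sketch of the kind of chi-square test of association used in the study, with made-up counts (not the paper's data): vaccine acceptance cross-tabulated against maternal education level.

```python
# Illustrative 2x2 chi-square test computed by hand.
table = [[40, 10],   # higher education: accept / decline
         [28, 22]]   # lower education:  accept / decline

row_totals = [sum(r) for r in table]
col_totals = [sum(c) for c in zip(*table)]
n = sum(row_totals)

chi2 = 0.0
for i, row in enumerate(table):
    for j, observed in enumerate(row):
        expected = row_totals[i] * col_totals[j] / n
        chi2 += (observed - expected) ** 2 / expected

# Compare against the critical value 3.841 (df = 1, alpha = 0.05).
significant = chi2 > 3.841
```

    With these toy counts the statistic exceeds the 5% critical value, so the association would be reported as significant at P < 0.05.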

  14. Intensity of interprofessional collaboration among intensive care nurses at a tertiary hospital.

    PubMed

    Serrano-Gemes, G; Rich-Ruiz, M

    To measure the intensity of interprofessional collaboration (IPC) in nurses of an intensive care unit (ICU) at a tertiary hospital, to check differences between the dimensions of the Intensity of Interprofessional Collaboration Questionnaire, and to identify the influence of personal variables. A cross-sectional descriptive study was conducted with 63 intensive care nurses selected by simple random sampling. Explanatory variables: age, sex, years of experience in nursing, years of experience in critical care, workday type and work shift type; outcome variable: IPC. The IPC was measured by the Intensity of Interprofessional Collaboration Questionnaire. Descriptive and bivariate statistical analysis (IPC and its dimensions with explanatory variables). 73.8% were women, with a mean age of 46.54 (±6.076) years. The average years of experience in nursing and critical care were 23.03 (±6.24) and 14.25 (±8.532), respectively. 77% worked full time and 95.1% worked a rotating shift. 62.3% obtained average IPC values. Statistically significant differences were found (P<.05) between IPC (overall score) and overall assessment with years of experience in critical care. This study shows average levels of IPC; the nurses with less experience in critical care obtained higher IPC and overall assessment scores. Copyright © 2016 Sociedad Española de Enfermería Intensiva y Unidades Coronarias (SEEIUC). Publicado por Elsevier España, S.L.U. All rights reserved.

  15. The Brandeis Dice Problem and Statistical Mechanics

    NASA Astrophysics Data System (ADS)

    van Enk, Steven J.

    2014-11-01

    Jaynes invented the Brandeis Dice Problem as a simple illustration of the MaxEnt (Maximum Entropy) procedure that he had demonstrated to work so well in Statistical Mechanics. I construct here two alternative solutions to his toy problem. One, like Jaynes' solution, uses MaxEnt and yields an analog of the canonical ensemble, but at a different level of description. The other uses Bayesian updating and yields an analog of the micro-canonical ensemble. Both, unlike Jaynes' solution, yield error bars, whose operational merits I discuss. These two alternative solutions are not equivalent for the original Brandeis Dice Problem, but become so in what must, therefore, count as the analog of the thermodynamic limit, M-sided dice with M → ∞. Whereas the mathematical analogies between the dice problem and Stat Mech are quite close, there are physical properties that the former lacks but that are crucial to the workings of the latter. Stat Mech is more than just MaxEnt.
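
    The MaxEnt solution to Jaynes' toy problem is easy to compute numerically. This sketch (my own, using bisection) finds the distribution over faces 1-6 that maximizes entropy subject to a mean of 4.5; the maximizer has the exponential-family form p_i ∝ exp(lam·i).

```python
# Numeric MaxEnt solution of the Brandeis dice problem.
import math

faces = range(1, 7)
target_mean = 4.5

def mean_for(lam):
    w = [math.exp(lam * i) for i in faces]
    return sum(i * wi for i, wi in zip(faces, w)) / sum(w)

# mean_for is increasing in lam, so bisection finds the unique root.
lo, hi = -10.0, 10.0
for _ in range(200):
    mid = (lo + hi) / 2
    if mean_for(mid) < target_mean:
        lo = mid
    else:
        hi = mid
lam = (lo + hi) / 2

w = [math.exp(lam * i) for i in faces]
p = [wi / sum(w) for wi in w]    # MaxEnt distribution, skewed toward high faces
```

    The resulting probabilities increase monotonically with face value, the analog of the canonical ensemble that Jaynes obtained.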

  16. Epidemiologic programs for computers and calculators. A microcomputer program for multiple logistic regression by unconditional and conditional maximum likelihood methods.

    PubMed

    Campos-Filho, N; Franco, E L

    1989-02-01

    A frequent procedure in matched case-control studies is to report results from the multivariate unmatched analyses if they do not differ substantially from the ones obtained after conditioning on the matching variables. Although conceptually simple, this rule requires that an extensive series of logistic regression models be evaluated by both the conditional and unconditional maximum likelihood methods. Most computer programs for logistic regression employ only one maximum likelihood method, which requires that the analyses be performed in separate steps. This paper describes a Pascal microcomputer (IBM PC) program that performs multiple logistic regression by both maximum likelihood estimation methods, which obviates the need for switching between programs to obtain relative risk estimates from both matched and unmatched analyses. The program calculates most standard statistics and allows factoring of categorical or continuous variables by two distinct methods of contrast. A built-in, descriptive statistics option allows the user to inspect the distribution of cases and controls across categories of any given variable.
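
    The Pascal program itself is not reproduced here; the following NumPy sketch (synthetic data, my own implementation) shows the unconditional maximum-likelihood fit that such programs perform, via Newton-Raphson iterations on the logistic log-likelihood.

```python
# Unconditional ML logistic regression by Newton-Raphson (illustrative sketch).
import numpy as np

rng = np.random.default_rng(0)
n = 500
x = rng.normal(size=n)
X = np.column_stack([np.ones(n), x])          # intercept + one covariate
true_beta = np.array([-0.5, 1.0])
p_true = 1 / (1 + np.exp(-X @ true_beta))
y = rng.binomial(1, p_true)                   # simulated case/control outcomes

beta = np.zeros(2)
for _ in range(25):                           # Newton-Raphson iterations
    p = 1 / (1 + np.exp(-X @ beta))
    W = p * (1 - p)                           # weights for the Hessian
    grad = X.T @ (y - p)
    hess = X.T @ (X * W[:, None])
    beta = beta + np.linalg.solve(hess, grad)

odds_ratio = np.exp(beta[1])                  # exponentiated coefficient
```

    The exponentiated slope is the odds-ratio estimate that such analyses report; the conditional (matched) likelihood requires a different objective and is not shown.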

  17. Detection and Estimation of an Optical Image by Photon-Counting Techniques. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Wang, Lily Lee

    1973-01-01

    Statistical description of a photoelectric detector is given. The photosensitive surface of the detector is divided into many small areas, and the moment generating function of the photo-counting statistic is derived for large time-bandwidth product. The detection of a specified optical image in the presence of the background light by using the hypothesis test is discussed. The ideal detector based on the likelihood ratio from a set of numbers of photoelectrons ejected from many small areas of the photosensitive surface is studied and compared with the threshold detector and a simple detector which is based on the likelihood ratio by counting the total number of photoelectrons from a finite area of the surface. The intensity of the image is assumed to be Gaussian distributed spatially against the uniformly distributed background light. The numerical approximation by the method of steepest descent is used, and the calculations of the reliabilities for the detectors are carried out by a digital computer.
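
    A hedged sketch of the per-pixel likelihood-ratio detector described above, for Poisson photocounts: under H0 each small area sees only background b; under H1 a Gaussian-profile signal s_i is added. All rates, sizes, and the threshold are illustrative assumptions, not the thesis' values.

```python
# Toy per-pixel likelihood-ratio detector for Poisson photocounts.
import math
import random

random.seed(1)
npix = 21
b = 2.0                                        # mean background counts per pixel
s = [3.0 * math.exp(-((i - npix // 2) ** 2) / 8.0) for i in range(npix)]

def poisson(mean):
    # Simple multiplicative (Knuth) Poisson sampler, adequate for small means.
    L, k, p = math.exp(-mean), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

counts = [poisson(b + s[i]) for i in range(npix)]   # one realization under H1

# Poisson log-likelihood ratio: sum_i n_i*log(1 + s_i/b) - sum_i s_i.
llr = sum(n * math.log(1 + s[i] / b) for i, n in enumerate(counts)) - sum(s)
decide_signal = llr > 0.0        # threshold 0 = equal priors (toy choice)
```

    Unlike the simple detector that thresholds only the total count, this statistic weights each pixel's count by how much signal is expected there, which is the source of its advantage.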

  18. Comparing Serum Follicle-Stimulating Hormone (FSH) Level with Vaginal pH in Women with Menopausal Symptoms.

    PubMed

    Vahidroodsari, Fatemeh; Ayati, Seddigheh; Yousefi, Zohreh; Saeed, Shohreh

    2010-01-01

    Despite the important implication for women's health and reproduction, very few studies have focused on vaginal pH for menopausal diagnosis. Recent studies have suggested vaginal pH as a simple, noninvasive and inexpensive method for this purpose. The aim of this study is to compare serum FSH level with vaginal pH in menopause. This is a cross-sectional, descriptive study, conducted on 103 women (aged 31-95 yrs) with menopausal symptoms who were referred to the Menopausal Clinic at Ghaem Hospital during 2006. Vaginal pH was measured using pH meter strips and serum FSH levels were measured using immunoassay methods. The data was analyzed using SPSS software (version 11.5) and results were evaluated statistically by the Chi-square and Kappa tests. p≤0.05 was considered statistically significant. According to this study, in the absence of vaginal infection, the average vaginal pH in these 103 menopausal women was 5.33±0.53. If the menopausal hallmark was considered as vaginal pH>4.5, and serum FSH as ≥20 mIU/ml, then the sensitivity of vaginal pH for menopausal diagnosis was 97%. The mean of FSH levels in this population was 80.79 mIU/ml. Vaginal pH is a simple, accurate, and cost effective tool that can be suggested as a suitable alternative to serum FSH measurement for the diagnosis of menopause.

  19. Descriptive and inferential statistical methods used in burns research.

    PubMed

    Al-Benna, Sammy; Al-Ajam, Yazan; Way, Benjamin; Steinstraesser, Lars

    2010-05-01

    Burns research articles utilise a variety of descriptive and inferential methods to present and analyse data. The aim of this study was to determine the descriptive methods (e.g. mean, median, SD, range, etc.) and survey the use of inferential methods (statistical tests) used in articles in the journal Burns. This study defined its population as all original articles published in the journal Burns in 2007. Letters to the editor, brief reports, reviews, and case reports were excluded. Study characteristics, use of descriptive statistics and the number and types of statistical methods employed were evaluated. Of the 51 articles analysed, 11(22%) were randomised controlled trials, 18(35%) were cohort studies, 11(22%) were case control studies and 11(22%) were case series. The study design and objectives were defined in all articles. All articles made use of continuous and descriptive data. Inferential statistics were used in 49(96%) articles. Data dispersion was calculated by standard deviation in 30(59%). Standard error of the mean was quoted in 19(37%). The statistical software product was named in 33(65%). Of the 49 articles that used inferential statistics, the tests were named in 47(96%). The 6 most common tests used (Student's t-test (53%), analysis of variance/co-variance (33%), chi(2) test (27%), Wilcoxon & Mann-Whitney tests (22%), Fisher's exact test (12%)) accounted for the majority (72%) of statistical methods employed. A specified significance level was named in 43(88%) and the exact significance levels were reported in 28(57%). Descriptive analysis and basic statistical techniques account for most of the statistical tests reported. This information should prove useful in deciding which tests should be emphasised in educating burn care professionals. These results highlight the need for burn care professionals to have a sound understanding of basic statistics, which is crucial in interpreting and reporting data. 
Advice should be sought from professionals in the fields of biostatistics and epidemiology when using more advanced statistical techniques. Copyright 2009 Elsevier Ltd and ISBI. All rights reserved.

  20. Weighted statistical parameters for irregularly sampled time series

    NASA Astrophysics Data System (ADS)

    Rimoldini, Lorenzo

    2014-01-01

    Unevenly spaced time series are common in astronomy because of the day-night cycle, weather conditions, dependence on the source position in the sky, allocated telescope time and corrupt measurements, for example, or inherent to the scanning law of satellites like Hipparcos and the forthcoming Gaia. Irregular sampling often causes clumps of measurements and gaps with no data which can severely disrupt the values of estimators. This paper aims at improving the accuracy of common statistical parameters when linear interpolation (in time or phase) can be considered an acceptable approximation of a deterministic signal. A pragmatic solution is formulated in terms of a simple weighting scheme, adapting to the sampling density and noise level, applicable to large data volumes at minimal computational cost. Tests on time series from the Hipparcos periodic catalogue led to significant improvements in the overall accuracy and precision of the estimators with respect to the unweighted counterparts and those weighted by inverse-squared uncertainties. Automated classification procedures employing statistical parameters weighted by the suggested scheme confirmed the benefits of the improved input attributes. The classification of eclipsing binaries, Mira, RR Lyrae, Delta Cephei and Alpha2 Canum Venaticorum stars employing exclusively weighted descriptive statistics achieved an overall accuracy of 92 per cent, about 6 per cent higher than with unweighted estimators.
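
    The idea of sampling-adaptive weights can be illustrated with a simple gap-based scheme (a stand-in for the paper's formulation, not its exact weights): each observation is weighted by half the time span to its neighbours, so clumped points no longer dominate the estimators.

```python
# Gap-based weighting for unevenly sampled data (illustrative scheme and data).
import numpy as np

t = np.array([0.0, 0.1, 0.2, 0.3, 5.0, 10.0])   # clumped, then sparse, sampling
y = np.array([1.0, 1.1, 0.9, 1.0, 3.0, 3.2])

gaps = np.diff(t)
w = np.empty_like(t)
w[0] = gaps[0] / 2                 # end points cover half their single gap
w[-1] = gaps[-1] / 2
w[1:-1] = (gaps[:-1] + gaps[1:]) / 2
w /= w.sum()                       # normalize weights

weighted_mean = np.sum(w * y)
unweighted_mean = y.mean()
weighted_var = np.sum(w * (y - weighted_mean) ** 2)
```

    Here the unweighted mean is pulled toward the four clumped early points, while the weighted mean better reflects the signal over the full time span.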

  1. Writing to Learn Statistics in an Advanced Placement Statistics Course

    ERIC Educational Resources Information Center

    Northrup, Christian Glenn

    2012-01-01

    This study investigated the use of writing in a statistics classroom to learn if writing provided a rich description of problem-solving processes of students as they solved problems. Through analysis of 329 written samples provided by students, it was determined that writing provided a rich description of problem-solving processes and enabled…

  2. The Greyhound Strike: Using a Labor Dispute to Teach Descriptive Statistics.

    ERIC Educational Resources Information Center

    Shatz, Mark A.

    1985-01-01

    A simulation exercise of a labor-management dispute is used to teach psychology students some of the basics of descriptive statistics. Using comparable data sets generated by the instructor, students work in small groups to develop a statistical presentation that supports their particular position in the dispute. (Author/RM)

  3. Green Function Calculations of Properties for the Magnetocaloric Layered Structures Based Upon FeMnAsP

    NASA Astrophysics Data System (ADS)

    Schilling, Osvaldo F.

    2016-11-01

    The alternating Fe-Mn layered structures of the compounds FeMnAsxP1-x display properties which have been demonstrated experimentally as very promising as far as commercial applications of the magnetocaloric effect are concerned. However, the theoretical literature on this and other families of magnetocaloric compounds still adopts simple molecular-field models in the description of important statistical mechanical properties like the entropy variation that accompanies applied isothermal magnetic field cycling, as well as the temperature variation following adiabatic magnetic field cycles. In the present paper, a random phase approximation Green function theoretical treatment is applied to such structures. The advantages of such approach are well known since the details of the crystal structure are easily incorporated in the model, as well as a precise description of correlations between neighbor spins can be obtained. We focus on a simple one-exchange parameter Heisenberg model, and the observed first-order phase transitions are reproduced by the introduction of a biquadratic term in the Hamiltonian whose origin is related both to the magnetoelastic coupling with the phonon spectrum in these compounds as well as with the values of spins in the Fe and Mn ions. The calculations are compared with experimental magnetocaloric data for the FeMnAsxP1-x compounds. In particular, the magnetic field dependence for the entropy variation at the transition temperature predicted from the Landau theory of continuous phase transitions is reproduced even in the case of discontinuous transitions.

  4. New approach in the quantum statistical parton distribution

    NASA Astrophysics Data System (ADS)

    Sohaily, Sozha; Vaziri (Khamedi), Mohammad

    2017-12-01

    An attempt to find simple parton distribution functions (PDFs) based on quantum statistical approach is presented. The PDFs described by the statistical model have very interesting physical properties which help to understand the structure of partons. The longitudinal portion of distribution functions are given by applying the maximum entropy principle. An interesting and simple approach to determine the statistical variables exactly without fitting and fixing parameters is surveyed. Analytic expressions of the x-dependent PDFs are obtained in the whole x region [0, 1], and the computed distributions are consistent with the experimental observations. The agreement with experimental data, gives a robust confirm of our simple presented statistical model.

  5. Survival of mutations arising during invasions.

    PubMed

    Miller, Judith R

    2010-03-01

    When a neutral mutation arises in an invading population, it quickly either dies out or 'surfs', i.e. it comes to occupy almost all the habitat available at its time of origin. Beneficial mutations can also surf, as can deleterious mutations over finite time spans. We develop descriptive statistical models that quantify the relationship between the probability that a mutation will surf and demographic parameters for a cellular automaton model of surfing. We also provide a simple analytic model that performs well at predicting the probability of surfing for neutral and beneficial mutations in one dimension. The results suggest that factors - possibly including even abiotic factors - that promote invasion success may also increase the probability of surfing and associated adaptive genetic change, conditioned on such success.

  6. Auto- and Crosscorrelograms for the Spike Response of Leaky Integrate-and-Fire Neurons with Slow Synapses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moreno-Bote, Ruben; Parga, Nestor; Center for Theoretical Neuroscience, Center for Neurobiology and Behavior, Columbia University, New York 10032-2695

    2006-01-20

    An analytical description of the response properties of simple but realistic neuron models in the presence of noise is still lacking. We determine completely, up to second order, the firing statistics of a single and of a pair of leaky integrate-and-fire neurons receiving some common slowly filtered white noise. In particular, the auto- and cross-correlation functions of the output spike trains of pairs of cells are obtained from an improvement of the adiabatic approximation introduced previously by Moreno-Bote and Parga [Phys. Rev. Lett. 92, 028102 (2004)]. These two functions define the firing variability and firing synchronization between neurons, and are of much importance for understanding neuron communication.

  7. Geostatistics applied to gas reservoirs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meunier, G.; Coulomb, C.; Laille, J.P.

    1989-09-01

    The spatial distribution of many of the physical parameters connected with a gas reservoir is of primary interest to both engineers and geologists throughout the study, development, and operation of a field. It is therefore desirable for the distribution to be capable of statistical interpretation, to have a simple graphical representation, and to allow data to be entered from either two- or three-dimensional grids. To satisfy these needs while dealing with the geographical variables, new methods have been developed under the name of geostatistics. This paper briefly describes the theory of geostatistics and its most recent improvements for the specific problem of subsurface description. The external-drift technique is emphasized in particular, and in addition, four case studies related to gas reservoirs are presented.

  8. Data survey on the effect of product features on competitive advantage of selected firms in Nigeria.

    PubMed

    Olokundun, Maxwell; Iyiola, Oladele; Ibidunni, Stephen; Falola, Hezekiah; Salau, Odunayo; Amaihian, Augusta; Peter, Fred; Borishade, Taiye

    2018-06-01

    The main objective of this study was to present a data article that investigates the effect of product features on a firm's competitive advantage. Few studies have examined how the features of a product could help drive the competitive advantage of a firm. A descriptive research method was used. The Statistical Package for the Social Sciences (SPSS 22) was used to analyze one hundred and fifty (150) valid questionnaires completed by small business owners registered with the Small and Medium Enterprises Development Agency of Nigeria (SMEDAN). Stratified and simple random sampling techniques were employed; reliability and validity procedures were also confirmed. The field data set is made publicly available to enable critical or extended analysis.

  9. Sports and nutritional supplement use in USMC recruits: a pilot study.

    PubMed

    Young, Colin R; Stephens, Mark B

    2009-02-01

    This is a pilot study to describe patterns of nutritional supplement use by recruits entering the U.S. Marine Corps (USMC). A survey asking USMC recruits to self-report nutritional supplement use was administered upon entry to basic training. Simple descriptive statistics and prevalence ratios were used to describe patterns of supplement use. The response rate was 65%. Half of respondents reported nutritional sports supplement use at some point before boot camp. The five most commonly used supplements were: protein powder (43%), post-workout recovery drinks (36%), vitamin supplements (26%), creatine (26%), and nitric oxide (16%). Nutritional supplement use is frequent among recruits entering the USMC. The impact of supplement use on recruit fitness, training, and injury rates is not known.

  10. How does the past of a soccer match influence its future? Concepts and statistical analysis.

    PubMed

    Heuer, Andreas; Rubner, Oliver

    2012-01-01

    Scoring goals in a soccer match can be interpreted as a stochastic process. In the simplest description of a soccer match, one assumes that scoring goals follows from independent rate processes of both teams, which would imply simple Poissonian and Markovian behavior. Deviations from this behavior would imply that the previous course of the match has an impact on the present match behavior. Here a general framework for identifying deviations from this behavior is presented. For this endeavor it is essential to formulate an a priori estimate of the expected number of goals per team in a specific match, which can be done based on our previous work on the estimation of team strengths. Furthermore, the well-known general increase in the number of goals over the course of a soccer match has to be removed by appropriate normalization. In general, three different types of deviation from a simple rate process can exist: first, the goal rate may depend on the exact times of the previous goals; second, it may be influenced by the time passed since the previous goal; and third, it may reflect the present score. We show that the Poissonian scenario is fulfilled quite well for the German Bundesliga. However, a detailed analysis reveals significant deviations for the second and third aspects. Dramatic effects are observed if the away team leads by one or two goals in the final part of the match. This analysis allows one to identify generic features of soccer matches and to learn about the hidden complexities behind scoring goals. Among other things, it identifies the reason why the number of draws is larger than statistically expected.
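    The baseline described above, goals as independent rate processes, can be sketched with a tiny Monte Carlo simulation. This is an illustration only; the rates of 1.5 and 1.1 goals per match are invented values, not figures from the paper.

```python
import math
import random

def poisson_draw(lam, rng):
    # Knuth's method for a Poisson-distributed variate with mean lam
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def simulate_match(rate_home, rate_away, rng):
    # Independent rate processes: each team's goal count is its own Poisson draw
    return poisson_draw(rate_home, rng), poisson_draw(rate_away, rng)

rng = random.Random(42)
results = [simulate_match(1.5, 1.1, rng) for _ in range(10_000)]
draw_fraction = sum(h == a for h, a in results) / len(results)
```

    Under this independence assumption, roughly a quarter of matches with these rates end in a draw; the paper's point is that real leagues show systematically more draws than such a model predicts.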

  11. How Does the Past of a Soccer Match Influence Its Future? Concepts and Statistical Analysis

    PubMed Central

    Heuer, Andreas; Rubner, Oliver

    2012-01-01

    Scoring goals in a soccer match can be interpreted as a stochastic process. In the simplest description of a soccer match, one assumes that scoring goals follows from independent rate processes of both teams, which would imply simple Poissonian and Markovian behavior. Deviations from this behavior would imply that the previous course of the match has an impact on the present match behavior. Here a general framework for identifying deviations from this behavior is presented. For this endeavor it is essential to formulate an a priori estimate of the expected number of goals per team in a specific match, which can be done based on our previous work on the estimation of team strengths. Furthermore, the well-known general increase in the number of goals over the course of a soccer match has to be removed by appropriate normalization. In general, three different types of deviation from a simple rate process can exist: first, the goal rate may depend on the exact times of the previous goals; second, it may be influenced by the time passed since the previous goal; and third, it may reflect the present score. We show that the Poissonian scenario is fulfilled quite well for the German Bundesliga. However, a detailed analysis reveals significant deviations for the second and third aspects. Dramatic effects are observed if the away team leads by one or two goals in the final part of the match. This analysis allows one to identify generic features of soccer matches and to learn about the hidden complexities behind scoring goals. Among other things, it identifies the reason why the number of draws is larger than statistically expected. PMID:23226200

  12. Statistical Properties of Maximum Likelihood Estimators of Power Law Spectra Information

    NASA Technical Reports Server (NTRS)

    Howell, L. W., Jr.

    2003-01-01

    A simple power law model consisting of a single spectral index, sigma(sub 1), is believed to be an adequate description of the galactic cosmic-ray (GCR) proton flux at energies below 10(exp 13) eV, with a transition at the knee energy, E(sub k), to a steeper spectral index sigma(sub 2) greater than sigma(sub 1) above E(sub k). The maximum likelihood (ML) procedure was developed for estimating the single parameter sigma(sub 1) of a simple power law energy spectrum and generalized to estimate the three spectral parameters of the broken power law energy spectrum from simulated detector responses and real cosmic-ray data. The statistical properties of the ML estimator were investigated and shown to have three desirable properties: (P1) consistency (asymptotically unbiased), (P2) efficiency (asymptotically attains the Cramer-Rao minimum variance bound), and (P3) asymptotic normality, under a wide range of potential detector response functions. Attainment of these properties necessarily implies that the ML estimation procedure provides the best unbiased estimator possible. While simulation studies can easily determine whether a given estimation procedure provides an unbiased estimate of the spectral information, and whether or not the estimator is approximately normally distributed, attainment of the Cramer-Rao bound (CRB) can only be ascertained by calculating the CRB for an assumed energy spectrum-detector response function combination, which can be quite formidable in practice. However, the effort in calculating the CRB is very worthwhile because it provides the necessary means to compare the efficiency of competing estimation techniques and, furthermore, provides a stopping rule in the search for the best unbiased estimator. Consequently, the CRBs for both the simple and broken power law energy spectra are derived herein, and the conditions under which they are attained in practice are investigated.
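    For the single-index case, the ML estimator has a well-known closed form: with p(E) proportional to E^(-sigma) for E >= E_min, the estimate is sigma-hat = 1 + n / sum(ln(E_i / E_min)). A minimal sketch on synthetic data follows; the true index 2.7 and the sample size are arbitrary illustration values, not the paper's GCR analysis.

```python
import math
import random

def ml_spectral_index(energies, e_min):
    """Closed-form ML estimate of sigma for p(E) ~ E^-sigma, E >= e_min."""
    logs = [math.log(e / e_min) for e in energies if e >= e_min]
    return 1.0 + len(logs) / sum(logs)

# Check on synthetic data: inverse-transform sampling from a power law,
# E = e_min * (1 - u)^(-1 / (sigma - 1)) with u uniform on [0, 1).
rng = random.Random(0)
sigma_true, e_min = 2.7, 1.0
sample = [e_min * (1.0 - rng.random()) ** (-1.0 / (sigma_true - 1.0))
          for _ in range(50_000)]
sigma_hat = ml_spectral_index(sample, e_min)
```

    With 50,000 events the estimate lands within a few hundredths of the true index, consistent with the asymptotic unbiasedness (P1) noted in the abstract.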

  13. Rear-End Crashes: Problem Size Assessment And Statistical Description

    DOT National Transportation Integrated Search

    1993-05-01

    KEYWORDS : RESEARCH AND DEVELOPMENT OR R&D, ADVANCED VEHICLE CONTROL & SAFETY SYSTEMS OR AVCSS, INTELLIGENT VEHICLE INITIATIVE OR IVI : THIS DOCUMENT PRESENTS PROBLEM SIZE ASSESSMENTS AND STATISTICAL CRASH DESCRIPTION FOR REAR-END CRASHES, INC...

  14. Statistics in the pharmacy literature.

    PubMed

    Lee, Charlene M; Soin, Herpreet K; Einarson, Thomas R

    2004-09-01

    Research in statistical methods is essential for maintenance of high quality in the published literature. To update previous reports of the types and frequencies of statistical terms and procedures in research studies of selected professional pharmacy journals, we obtained all research articles published in 2001 in 6 journals: American Journal of Health-System Pharmacy, The Annals of Pharmacotherapy, Canadian Journal of Hospital Pharmacy, Formulary, Hospital Pharmacy, and Journal of the American Pharmaceutical Association. Two independent reviewers identified and recorded descriptive and inferential statistical terms/procedures found in the methods, results, and discussion sections of each article. Results were determined by tallying the total number of times, as well as the percentage, that each statistical term or procedure appeared in the articles. One hundred forty-four articles were included. Ninety-eight percent employed descriptive statistics; of these, 28% used only descriptive statistics. The most common descriptive statistical terms were percentage (90%), mean (74%), standard deviation (58%), and range (46%). Sixty-nine percent of the articles used inferential statistics, the most frequent being chi-squared (33%), Student's t-test (26%), Pearson's correlation coefficient r (18%), ANOVA (14%), and logistic regression (11%). Statistical terms and procedures were found in nearly all of the research articles published in pharmacy journals. Thus, pharmacy education should aim to provide current and future pharmacists with an understanding of the common statistical terms and procedures identified, to facilitate appropriate appraisal and subsequent use of the information available in research articles.
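    The descriptive terms the survey found most common (percentage, mean, standard deviation, range) are straightforward to compute; a minimal illustration with invented data values:

```python
import statistics

def describe(values):
    """Compute the descriptive terms most often reported: mean, SD, range."""
    return {
        "mean": statistics.mean(values),
        "sd": statistics.stdev(values),  # sample SD, n - 1 denominator
        "range": (min(values), max(values)),
    }

summary = describe([34, 41, 29, 50, 38, 45])

# The "percentage" term applies to counts (hypothetical numbers):
n_articles, used_inferential = 144, 99
pct_inferential = 100 * used_inferential / n_articles  # 68.75
```

    Note that `statistics.stdev` is the sample standard deviation (n - 1 denominator); `statistics.pstdev` would give the population version, a distinction the reviewed articles rarely state explicitly.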

  15. Bootstrap Methods: A Very Leisurely Look.

    ERIC Educational Resources Information Center

    Hinkle, Dennis E.; Winstead, Wayland H.

    The Bootstrap method, a computer-intensive statistical method of estimation, is illustrated using a simple and efficient Statistical Analysis System (SAS) routine. The utility of the method for generating unknown parameters, including standard errors for simple statistics, regression coefficients, discriminant function coefficients, and factor…
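    The abstract describes a SAS routine; the same resampling idea can be sketched in a few lines of Python (the data values here are invented for illustration): draw samples with replacement, recompute the statistic on each resample, and take the standard deviation of the replicates as the standard error.

```python
import random
import statistics

def bootstrap_se(data, stat, n_boot=2000, seed=1):
    """Bootstrap standard error: resample with replacement, recompute stat."""
    rng = random.Random(seed)
    n = len(data)
    replicates = [stat([rng.choice(data) for _ in range(n)])
                  for _ in range(n_boot)]
    return statistics.stdev(replicates)

data = [23, 45, 12, 67, 34, 29, 51, 40, 18, 36]
se_mean = bootstrap_se(data, statistics.mean)
```

    For the mean, the bootstrap answer should land close to the analytic s/sqrt(n); the method's appeal, as the abstract notes, is that the same loop works unchanged for statistics with no simple analytic standard error, such as regression or discriminant function coefficients.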

  16. Statistical limitations in functional neuroimaging. I. Non-inferential methods and statistical models.

    PubMed Central

    Petersson, K M; Nichols, T E; Poline, J B; Holmes, A P

    1999-01-01

    Functional neuroimaging (FNI) provides experimental access to the intact living brain, making it possible to study higher cognitive functions in humans. In this review and in a companion paper in this issue, we discuss some common methods used to analyse FNI data. The emphasis in both papers is on the assumptions and limitations of the methods reviewed. Several methods are available to analyse FNI data, indicating that none is optimal for all purposes. In order to make optimal use of the methods available, it is important to know their limits of applicability. For the interpretation of FNI results it is also important to take into account the assumptions, approximations, and inherent limitations of the methods used. This paper gives a brief overview of some non-inferential descriptive methods and common statistical models used in FNI. Issues relating to the complex problem of model selection are discussed. In general, proper model selection is a necessary prerequisite for the validity of the subsequent statistical inference. The non-inferential section describes methods that, combined with inspection of parameter estimates and other simple measures, can aid in the process of model selection and verification of assumptions. The section on statistical models covers approaches to global normalization and some aspects of univariate, multivariate, and Bayesian models. Finally, approaches to functional connectivity and effective connectivity are discussed. In the companion paper we review issues related to signal detection and statistical inference. PMID:10466149

  17. Synthetic Earthquake Statistics From Physical Fault Models for the Lower Rhine Embayment

    NASA Astrophysics Data System (ADS)

    Brietzke, G. B.; Hainzl, S.; Zöller, G.

    2012-04-01

    As of today, seismic risk and hazard estimates mostly use purely empirical, stochastic models of earthquake fault systems, tuned specifically to the vulnerable areas of interest. Although such models allow for reasonable risk estimates, they fail to provide a link between the observed seismicity and the underlying physical processes. Solving a state-of-the-art, fully dynamic description of all relevant physical processes in an earthquake fault system is likely not useful, since it comes with a large number of degrees of freedom, poor constraints on its model parameters, and a huge computational effort. Here, quasi-static and quasi-dynamic physical fault simulators provide a compromise between physical completeness and computational affordability, and aim at providing a link between basic physical concepts and the statistics of seismicity. Within this framework we investigate a model of the Lower Rhine Embayment (LRE) that is based upon seismological and geological data. We present and discuss statistics of the spatio-temporal behavior of the generated synthetic earthquake catalogs with respect to simplification (e.g. simple two-fault cases) as well as complication (e.g. hidden faults, geometric complexity, heterogeneities of constitutive parameters).

  18. Simple Parametric Model for Airfoil Shape Description

    NASA Astrophysics Data System (ADS)

    Ziemkiewicz, David

    2017-12-01

    We show a simple, analytic equation describing a class of two-dimensional shapes well suited for representation of aircraft airfoil profiles. Our goal was to create a description characterized by a small number of parameters with easily understandable meaning, providing a tool to alter the shape with optimization procedures as well as manual tweaks by the designer. The generated shapes are well suited for numerical analysis with 2D flow solving software such as XFOIL.

  19. A Case Study on Teaching the Topic "Experimental Unit" and How It Is Presented in Advanced Placement Statistics Textbooks

    ERIC Educational Resources Information Center

    Perrett, Jamis J.

    2012-01-01

    This article demonstrates how textbooks differ in their description of the term "experimental unit". Advanced Placement Statistics teachers and students are often limited in their statistical knowledge by the information presented in their classroom textbook. Definitions and descriptions differ among textbooks as well as among different…

  20. Accession Medical Standards Analysis and Research Activity (AMSARA) 2014, Annual Report, and four Supplemental Applicants and Accessions Tables for: Army, Air Force, Marine, and Navy

    DTIC Science & Technology

    2016-02-02

    Descriptive Statistics for Enlisted Service Applicants and Accessions … Summary Statistics for Applicants and Accessions for Enlisted Service … Applicants and … utilization among Soldiers screened using TAPAS. Section 2 of this report includes the descriptive statistics AMSARA compiles and publishes

  1. Statistical methods used in the public health literature and implications for training of public health professionals

    PubMed Central

    Hayat, Matthew J.; Powell, Amanda; Johnson, Tessa; Cadwell, Betsy L.

    2017-01-01

    Statistical literacy and knowledge are needed to read and understand the public health literature. The purpose of this study was to quantify the basic and advanced statistical methods used in public health research. We randomly sampled 216 published articles from seven top-tier general public health journals. Studies were reviewed by two readers, and a standardized data collection form was completed for each article. Data were analyzed with descriptive statistics and frequency distributions. Results were summarized for statistical methods used in the literature, including descriptive and inferential statistics, modeling, advanced statistical techniques, and statistical software used. Approximately 81.9% of articles reported an observational study design, and 93.1% of articles were substantively focused. Descriptive statistics in table or graphical form were reported in more than 95% of the articles, and statistical inference was reported in more than 76% of the studies reviewed. These results reveal the types of statistical methods currently used in the public health literature. Although this study did not obtain information on what should be taught, information on the statistical methods being used is useful for curriculum development in graduate health sciences education, as well as for making informed decisions about continuing education for public health professionals. PMID:28591190

  2. Statistical methods used in the public health literature and implications for training of public health professionals.

    PubMed

    Hayat, Matthew J; Powell, Amanda; Johnson, Tessa; Cadwell, Betsy L

    2017-01-01

    Statistical literacy and knowledge are needed to read and understand the public health literature. The purpose of this study was to quantify the basic and advanced statistical methods used in public health research. We randomly sampled 216 published articles from seven top-tier general public health journals. Studies were reviewed by two readers, and a standardized data collection form was completed for each article. Data were analyzed with descriptive statistics and frequency distributions. Results were summarized for statistical methods used in the literature, including descriptive and inferential statistics, modeling, advanced statistical techniques, and statistical software used. Approximately 81.9% of articles reported an observational study design, and 93.1% of articles were substantively focused. Descriptive statistics in table or graphical form were reported in more than 95% of the articles, and statistical inference was reported in more than 76% of the studies reviewed. These results reveal the types of statistical methods currently used in the public health literature. Although this study did not obtain information on what should be taught, information on the statistical methods being used is useful for curriculum development in graduate health sciences education, as well as for making informed decisions about continuing education for public health professionals.

  3. Assessment of health literacy of municipal employees in Shemiranat, Iran.

    PubMed

    Solhi, Mahnaz; Jormand, Hanieh

    2017-12-01

    Health literacy is one of the major determinants of health promotion among individuals and within society. The present study aimed to determine the health literacy status of office employees in Shemiranat using the native instrument of health literacy for Iranian adults (HELIA). This descriptive-analytical cross-sectional study was conducted in 2016-17 on 360 office employees in Shemiranat. The samples were selected using a multi-stage simple random sampling method. Data collection tools included the HELIA questionnaire. The data were imported into SPSS v.18 software and analyzed using descriptive statistical indices (mean, SD, number, and percentage) and inferential statistics (Chi-square, Pearson's correlation coefficient, Spearman's correlation coefficient, and Kruskal-Wallis test). Written informed consent was obtained from the employees participating in the study, and they were assured about confidentiality and informed that participation was voluntary. The mean and standard deviation of the total health literacy score among the studied individuals was 125.99±16.01. The mean scores of health literacy in the areas of reading (15.36±2.89) and evaluation (5.01±2.8) were lower than those of the other dimensions of health literacy. Based on the Chi-square test, there was a statistically significant relationship between health literacy and education level, occupational rank, work place, and work experience (p=0.0001 in all cases). Individuals with medium and good levels of health literacy acquired most of their health-related information through the Internet, friends, relatives, physicians, and health staff. Health literacy status was not sufficient among the studied staff; thus, promotional interventions are recommended to improve the health literacy status and its dimensions among these staff.

  4. A Classical Phase Space Framework For the Description of Supercooled Liquids and an Apparent Universal Viscosity Collapse

    NASA Astrophysics Data System (ADS)

    Weingartner, Nicholas; Pueblo, Chris; Nogueira, Flavio; Kelton, Kenneth; Nussinov, Zohar

    A fundamental understanding of the phenomenology of the metastable supercooled liquid state remains elusive. Two of the most pressing questions in this field are how to describe the temperature dependence of the viscosity, and determine whether or not the dynamical behaviors are universal. To address these questions, we have devised a simple first-principles classical phase space description of supercooled liquids that (along with a complementary quantum approach) predicts a unique functional form for the viscosity which relies on only a single parameter. We tested this form for 45 liquids of all types and fragilities, and have demonstrated that it provides a statistically significant fit to all liquids. Additionally, by scaling the viscosity of all studied liquids using the single parameter, we have observed a complete collapse of the data of all 45 liquids to a single scaling curve over 16 decades, suggesting an underlying universality in the dynamics of supercooled liquids. In this talk I will outline the basic approach of our model, as well as demonstrate the quality of the model performance and collapse of the data.

  5. C-statistic fitting routines: User's manual and reference guide

    NASA Technical Reports Server (NTRS)

    Nousek, John A.; Farwana, Vida

    1991-01-01

    The computer program discussed can read several input files and provide a best-fit set of values for the functions provided by the user, using either the C-statistic or the chi-squared statistic method. The program consists of one main routine and several functions and subroutines. Detailed descriptions of each function and subroutine are presented. A brief description of the C-statistic and the rationale for its application is also presented.

  6. Validating Future Force Performance Measures (Army Class): Concluding Analyses

    DTIC Science & Technology

    2016-06-01

    Table 3.10. Descriptive Statistics and Intercorrelations for LV Final Predictor Factor Scores … Table 4.7. Descriptive Statistics for Analysis Criteria … Soldier attrition and performance: Dependability (Non-Delinquency), Adjustment, Physical Conditioning, Leadership, Work Orientation, and Agreeableness

  7. Sample size considerations for clinical research studies in nuclear cardiology.

    PubMed

    Chiuzan, Cody; West, Erin A; Duong, Jimmy; Cheung, Ken Y K; Einstein, Andrew J

    2015-12-01

    Sample size calculation is an important element of research design that investigators need to consider in the planning stage of the study. Funding agencies and research review panels request a power analysis, for example, to determine the minimum number of subjects needed for an experiment to be informative. Calculating the right sample size is crucial to gaining accurate information and ensures that research resources are used efficiently and ethically. The simple question "How many subjects do I need?" does not always have a simple answer. Before calculating the sample size requirements, a researcher must address several aspects, such as purpose of the research (descriptive or comparative), type of samples (one or more groups), and data being collected (continuous or categorical). In this article, we describe some of the most frequent methods for calculating the sample size with examples from nuclear cardiology research, including for t tests, analysis of variance (ANOVA), non-parametric tests, correlation, Chi-squared tests, and survival analysis. For the ease of implementation, several examples are also illustrated via user-friendly free statistical software.
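    For the two-group comparison of means mentioned above, the standard normal-approximation formula is n = 2(z_{1-alpha/2} + z_{power})^2 (sd/delta)^2 per group. A minimal sketch using only the standard library (Python 3.8+ for `statistics.NormalDist`); the effect size and SD are illustrative values:

```python
import math
from statistics import NormalDist

def n_per_group(delta, sd, alpha=0.05, power=0.80):
    """Approximate per-group n for a two-sided, two-sample comparison of means:
    n = 2 * (z_{1-alpha/2} + z_{power})^2 * (sd / delta)^2, rounded up."""
    z = NormalDist()
    za = z.inv_cdf(1 - alpha / 2)  # e.g. 1.96 for alpha = 0.05
    zb = z.inv_cdf(power)          # e.g. 0.8416 for 80% power
    return math.ceil(2 * (za + zb) ** 2 * (sd / delta) ** 2)

# Detect a 5-unit difference with SD 10 at alpha = 0.05 and 80% power
n = n_per_group(delta=5, sd=10)  # 63 per group
```

    This gives 63 subjects per group, matching standard sample-size tables; exact t-test-based calculations (as in dedicated power software) add a handful of subjects on top of this normal approximation.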

  8. Dynamics of market structure driven by the degree of consumer’s rationality

    NASA Astrophysics Data System (ADS)

    Yanagita, Tatsuo; Onozaki, Tamotsu

    2010-03-01

    We study a simple model of market share dynamics with boundedly rational consumers and firms interacting with each other. As the number of consumers is large, we employ a statistical description to represent firms’ distribution of consumer share, which is characterized by a single parameter representing how rationally the mass of consumers pursue higher utility. As the boundedly rational firm does not know the shape of demand function it faces, it revises production and price so as to raise its profit with the aid of a simple reinforcement learning rule. Simulation results show that (1) three phases of market structure, i.e. the uniform share phase, the oligopolistic phase, and the monopolistic phase, appear depending upon how rational consumers are, and (2) in an oligopolistic phase, the market share distribution of firms follows Zipf’s law and the growth-rate distribution of firms follows Gibrat’s law, and (3) an oligopolistic phase is the best state of market in terms of consumers’ utility but brings the minimum profit to the firms because of severe competition based on the moderate rationality of consumers.

  9. Predicting Subsequent Myopia in Initially Pilot-Qualified USAFA Cadets.

    DTIC Science & Technology

    1985-12-27

    Refraction Measurement … 4.0 RESULTS … 4.1 Descriptive Statistics … 4.2 Predictive Statistics … mentioned), and three were missing a status. The data of the subject who was commissionable were dropped from the statistical analyses. Of the 91 … relatively equal numbers of participants from all classes will become obvious within the results. 4.1 Descriptive Statistics In the original plan

  10. Evidence-based orthodontics. Current statistical trends in published articles in one journal.

    PubMed

    Law, Scott V; Chudasama, Dipak N; Rinchuse, Donald J

    2010-09-01

    To ascertain the number, type, and overall usage of statistics in American Journal of Orthodontics and Dentofacial Orthopedics (AJODO) articles for 2008; these data were then compared to data from three previous years: 1975, 1985, and 2003. The original AJODO articles for 2008 were dichotomized into those using statistics and those not using statistics. Statistical procedures were then broadly divided into descriptive statistics (mean, standard deviation, range, percentage) and inferential statistics (t-test, analysis of variance). Descriptive statistics were used to make comparisons. In 1975, 1985, 2003, and 2008, AJODO published 72, 87, 134, and 141 original articles, respectively. The percentage of original articles using statistics was 43.1% in 1975, 75.9% in 1985, 94.0% in 2003, and 92.9% in 2008; the proportion stayed relatively constant from 2003 to 2008, with only a small 1.1% decrease. The percentage of articles using inferential statistical analyses was 23.7% in 1975, 74.2% in 1985, 92.9% in 2003, and 84.4% in 2008. Comparing AJODO publications in 2003 and 2008, there was an 8.5% increase in articles using only descriptive statistics (from 7.1% to 15.6%) and a corresponding 8.5% decrease in articles using inferential statistics (from 92.9% to 84.4%).

  11. Job Satisfaction DEOCS 4.1 Construct Validity Summary

    DTIC Science & Technology

    2017-08-01

    focuses more specifically on satisfaction with the job. Included is a review of the 4.0 description and items, followed by the proposed modifications to...the factor. The DEOCS 4.0 description provided for job satisfaction is “the perception of personal fulfillment in a specific vocation, and sense of...piloting items on the DEOCS; (4) examining the descriptive statistics, exploratory factor analysis results, and aggregation statistics; and (5

  12. Crop identification technology assessment for remote sensing. (CITARS) Volume 9: Statistical analysis of results

    NASA Technical Reports Server (NTRS)

    Davis, B. J.; Feiveson, A. H.

    1975-01-01

    Results are presented of CITARS data processing in raw form. Tables of descriptive statistics are given along with descriptions and results of inferential analyses. The inferential results are organized by questions which CITARS was designed to answer.

  13. Prison Radicalization: The New Extremist Training Grounds?

    DTIC Science & Technology

    2007-09-01

    distributing and collecting survey data , and the data analysis. The analytical methodology includes descriptive and inferential statistical methods, in... statistical analysis of the responses to identify significant correlations and relationships. B. SURVEY DATA COLLECTION To effectively access a...Q18, Q19, Q20, and Q21. Due to the exploratory nature of this small survey, data analyses were confined mostly to descriptive statistics and

  14. Experimenting with Impacts in a Conceptual Physics or Descriptive Astronomy Laboratory

    ERIC Educational Resources Information Center

    LoPresto, Michael C.

    2016-01-01

    What follows is a description of the procedure for and results of a simple experiment on the formation of impact craters designed for the laboratory portions of lower mathematical-level general education science courses such as conceptual physics or descriptive astronomy. The experiment provides necessary experience with data collection and…

  15. A Systematic Study of Simple Combinatorial Configurations.

    ERIC Educational Resources Information Center

    Dubois, Jean-Guy

    1984-01-01

    A classification of the simple combinatorial configurations which correspond to various cases of distribution and ordering of objects into boxes is given (in French). Concrete descriptions, structured relations, translations, and formalizations are discussed. (MNS)

  16. Health belief model and reasoned action theory in predicting water saving behaviors in yazd, iran.

    PubMed

    Morowatisharifabad, Mohammad Ali; Momayyezi, Mahdieh; Ghaneian, Mohammad Taghi

    2012-01-01

    People's behaviors and intentions about healthy behaviors depend on their beliefs, values, and knowledge about the issue. Various models of health education are used in determining predictors of different healthy behaviors, but their efficacy for cultural behaviors, such as water saving behaviors, has not been studied. The study was conducted to explain water saving behaviors in Yazd, Iran on the basis of the Health Belief Model and Reasoned Action Theory. The cross-sectional study used random cluster sampling to recruit 200 heads of households to collect the data. The survey questionnaire was tested for its content validity and reliability. Analysis of data included descriptive statistics, simple correlation, and hierarchical multiple regression. Simple correlations between water saving behaviors and Reasoned Action Theory and Health Belief Model constructs were statistically significant. Health Belief Model and Reasoned Action Theory constructs explained 20.80% and 8.40% of the variances in water saving behaviors, respectively. Perceived barriers were the strongest predictor. Additionally, there was a statistically positive correlation between water saving behaviors and intention. In designing interventions aimed at water waste prevention, barriers to water saving behaviors should be addressed first, followed by people's attitude towards water saving. Health Belief Model constructs, with the exception of perceived severity and benefits, are more powerful than Reasoned Action Theory in predicting water saving behavior and may be used as a framework for educational interventions aimed at improving water saving behaviors.
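
    The hierarchical-regression logic this abstract relies on (entering predictor blocks and comparing explained variance) can be sketched with the standard two-predictor R-squared formula; the data below are invented and nothing here comes from the study itself.

```python
# Minimal sketch of a two-step hierarchical regression with one predictor per
# block, using pairwise Pearson correlations and the standard formula for
# multiple R^2 with two predictors. All numbers are illustrative.
import math

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def r2_two_predictors(y, x1, x2):
    """Multiple R^2 for y on (x1, x2), computed from pairwise correlations."""
    ry1, ry2, r12 = pearson(y, x1), pearson(y, x2), pearson(x1, x2)
    return (ry1 ** 2 + ry2 ** 2 - 2 * ry1 * ry2 * r12) / (1 - r12 ** 2)

# Step 1 enters x1 alone; step 2 adds x2 and reports the change in R^2.
y  = [3.0, 5.0, 4.0, 8.0, 9.0]
x1 = [1.0, 2.0, 2.5, 4.0, 5.0]
x2 = [2.0, 1.0, 3.0, 3.5, 4.0]
step1 = pearson(y, x1) ** 2
step2 = r2_two_predictors(y, x1, x2)
print(step1, step2 - step1)  # R^2 at step 1, increment from adding x2
```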

  17. Health Belief Model and Reasoned Action Theory in Predicting Water Saving Behaviors in Yazd, Iran

    PubMed Central

    Morowatisharifabad, Mohammad Ali; Momayyezi, Mahdieh; Ghaneian, Mohammad Taghi

    2012-01-01

    Background: People's behaviors and intentions about healthy behaviors depend on their beliefs, values, and knowledge about the issue. Various models of health education are used in determining predictors of different healthy behaviors, but their efficacy for cultural behaviors, such as water saving behaviors, has not been studied. The study was conducted to explain water saving behaviors in Yazd, Iran on the basis of the Health Belief Model and Reasoned Action Theory. Methods: The cross-sectional study used random cluster sampling to recruit 200 heads of households to collect the data. The survey questionnaire was tested for its content validity and reliability. Analysis of data included descriptive statistics, simple correlation, and hierarchical multiple regression. Results: Simple correlations between water saving behaviors and Reasoned Action Theory and Health Belief Model constructs were statistically significant. Health Belief Model and Reasoned Action Theory constructs explained 20.80% and 8.40% of the variances in water saving behaviors, respectively. Perceived barriers were the strongest predictor. Additionally, there was a statistically positive correlation between water saving behaviors and intention. Conclusion: In designing interventions aimed at water waste prevention, barriers to water saving behaviors should be addressed first, followed by people's attitude towards water saving. Health Belief Model constructs, with the exception of perceived severity and benefits, are more powerful than Reasoned Action Theory in predicting water saving behavior and may be used as a framework for educational interventions aimed at improving water saving behaviors. PMID:24688927

  18. Survival of mutations arising during invasions

    PubMed Central

    Miller, Judith R

    2010-01-01

    When a neutral mutation arises in an invading population, it quickly either dies out or ‘surfs’, i.e. it comes to occupy almost all the habitat available at its time of origin. Beneficial mutations can also surf, as can deleterious mutations over finite time spans. We develop descriptive statistical models that quantify the relationship between the probability that a mutation will surf and demographic parameters for a cellular automaton model of surfing. We also provide a simple analytic model that performs well at predicting the probability of surfing for neutral and beneficial mutations in one dimension. The results suggest that factors – possibly including even abiotic factors – that promote invasion success may also increase the probability of surfing and associated adaptive genetic change, conditioned on such success. PMID:25567912

  19. Efficacy of Exclusive Lingual Nerve Block versus Conventional Inferior Alveolar Nerve Block in Achieving Lingual Soft-tissue Anesthesia.

    PubMed

    Balasubramanian, Sasikala; Paneerselvam, Elavenil; Guruprasad, T; Pathumai, M; Abraham, Simin; Krishnakumar Raja, V B

    2017-01-01

    The aim of this randomized clinical trial was to assess the efficacy of exclusive lingual nerve block (LNB) in achieving selective lingual soft-tissue anesthesia in comparison with conventional inferior alveolar nerve block (IANB). A total of 200 patients indicated for the extraction of lower premolars were recruited for the study. The samples were allocated by randomization into control and study groups. Lingual soft-tissue anesthesia was achieved by IANB and exclusive LNB in the control and study group, respectively. The primary outcome variable studied was anesthesia of ipsilateral lingual mucoperiosteum, floor of mouth and tongue. The secondary variables assessed were (1) taste sensation immediately following administration of local anesthesia and (2) mouth opening and lingual nerve paresthesia on the first postoperative day. Data analysis for descriptive and inferential statistics was performed using SPSS (IBM SPSS Statistics for Windows, Version 22.0, Armonk, NY: IBM Corp. Released 2013) and a P < 0.05 was considered statistically significant. In comparison with the control group, the study group (LNB) showed statistically significant anesthesia of the lingual gingiva of incisors, molars, anterior floor of the mouth, and anterior tongue. Exclusive LNB is superior to IAN nerve block in achieving selective anesthesia of lingual soft tissues. It is technically simple and associated with minimal complications as compared to IAN block.

  20. Efficacy of Exclusive Lingual Nerve Block versus Conventional Inferior Alveolar Nerve Block in Achieving Lingual Soft-tissue Anesthesia

    PubMed Central

    Balasubramanian, Sasikala; Paneerselvam, Elavenil; Guruprasad, T; Pathumai, M; Abraham, Simin; Krishnakumar Raja, V. B.

    2017-01-01

    Objective: The aim of this randomized clinical trial was to assess the efficacy of exclusive lingual nerve block (LNB) in achieving selective lingual soft-tissue anesthesia in comparison with conventional inferior alveolar nerve block (IANB). Materials and Methods: A total of 200 patients indicated for the extraction of lower premolars were recruited for the study. The samples were allocated by randomization into control and study groups. Lingual soft-tissue anesthesia was achieved by IANB and exclusive LNB in the control and study group, respectively. The primary outcome variable studied was anesthesia of ipsilateral lingual mucoperiosteum, floor of mouth and tongue. The secondary variables assessed were (1) taste sensation immediately following administration of local anesthesia and (2) mouth opening and lingual nerve paresthesia on the first postoperative day. Results: Data analysis for descriptive and inferential statistics was performed using SPSS (IBM SPSS Statistics for Windows, Version 22.0, Armonk, NY: IBM Corp. Released 2013) and a P < 0.05 was considered statistically significant. In comparison with the control group, the study group (LNB) showed statistically significant anesthesia of the lingual gingiva of incisors, molars, anterior floor of the mouth, and anterior tongue. Conclusion: Exclusive LNB is superior to IAN nerve block in achieving selective anesthesia of lingual soft tissues. It is technically simple and associated with minimal complications as compared to IAN block. PMID:29264294

  1. The use of mechanistic descriptions of algal growth and zooplankton grazing in an estuarine eutrophication model

    NASA Astrophysics Data System (ADS)

    Baird, M. E.; Walker, S. J.; Wallace, B. B.; Webster, I. T.; Parslow, J. S.

    2003-03-01

    A simple model of estuarine eutrophication is built on biomechanical (or mechanistic) descriptions of a number of the key ecological processes in estuaries. Mechanistically described processes include the nutrient uptake and light capture of planktonic and benthic autotrophs, and the encounter rates of planktonic predators and prey. Other more complex processes, such as sediment biogeochemistry, detrital processes and phosphate dynamics, are modelled using empirical descriptions from the Port Phillip Bay Environmental Study (PPBES) ecological model. A comparison is made between the mechanistically determined rates of ecological processes and the analogous empirically determined rates in the PPBES ecological model. The rates generally agree, with a few significant exceptions. Model simulations were run at a range of estuarine depths and nutrient loads, with outputs presented as the annually averaged biomass of autotrophs. The simulations followed a simple conceptual model of eutrophication, suggesting a simple biomechanical understanding of estuarine processes can provide a predictive tool for ecological processes in a wide range of estuarine ecosystems.

  2. Paediatric Refractive Errors in an Eye Clinic in Osogbo, Nigeria.

    PubMed

    Michaeline, Isawumi; Sheriff, Agboola; Bimbo, Ayegoro

    2016-03-01

    Paediatric ophthalmology is an emerging subspecialty in Nigeria, and as such there is a paucity of data on refractive errors in the country. This study set out to determine the pattern of refractive errors in children attending an eye clinic in South West Nigeria. A descriptive study of 180 consecutive subjects seen over a 2-year period. Presenting complaints, presenting visual acuity (PVA), age and sex were recorded. Clinical examination of the anterior and posterior segments of the eyes, extraocular muscle assessment and refraction were done. The types of refractive errors and their grades were determined. Corrected VA was obtained. Data was analysed using descriptive statistics in proportions and chi-square, with p value <0.05. The age range of subjects was between 3 and 16 years, with mean age = 11.7 and SD = 0.51; males made up 33.9%. The commonest presenting complaint was blurring of distant vision (40%) and the commonest presenting visual acuity was 6/9 (33.9%); normal vision constituted >75.0%, visual impairment 20%, and low vision 23.3%. Low grade spherical and cylindrical errors occurred most frequently (35.6% and 59.9% respectively). Regular astigmatism was significantly more common, P < 0.001. The commonest diagnosis was simple myopic astigmatism (41.1%). Four cases of strabismus were seen. Simple spherical and cylindrical errors were the commonest types of refractive errors seen. Visual impairment and low vision occurred and could be a cause of absenteeism from school. A low-cost spectacle production or dispensing unit and health education are advocated for the prevention of visual impairment in a hospital set-up.
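
    The chi-square test of proportions mentioned in this abstract reduces to a simple statistic; the counts below are invented for illustration, not the study's data.

```python
# Sketch of the Pearson chi-square goodness-of-fit statistic: observed
# category counts versus an equal-frequency expectation under the null.
def chi_square_stat(observed, expected):
    """Pearson chi-square statistic: sum of (O - E)^2 / E."""
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

observed = [74, 60, 46]          # e.g. three hypothetical diagnosis categories
total = sum(observed)
expected = [total / 3] * 3       # equal proportions under the null
stat = chi_square_stat(observed, expected)
print(stat)  # compare against the chi-square critical value at 2 df
```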

  3. Longitudinal Assessment of Self-Reported Recent Back Pain and Combat Deployment in the Millennium Cohort Study

    DTIC Science & Technology

    2016-11-15

    participants who were followed for the development of back pain for an average of 3.9 years. Methods. Descriptive statistics and longitudinal...health, military personnel, occupational health, outcome assessment, statistics, survey methodology. Level of Evidence: 3. Spine 2016;41:1754-1763. ...based on the National Health and Nutrition Examination Survey.21 Statistical Analysis: Descriptive and univariate analyses compared characteristics

  4. Rebuilding Government Legitimacy in Post-conflict Societies: Case Studies of Nepal and Afghanistan

    DTIC Science & Technology

    2015-09-09

    administered via the verbal scales due to reduced time spent explaining the visual show cards. Statistical results corresponded with observations from...a three-step strategy for dealing with item non-response. First, basic descriptive statistics are calculated to determine the extent of item...descriptive statistics for all items in the survey), however this section of the report highlights just some of the findings. Thus, the results

  5. Evaluating statistical cloud schemes: What can we gain from ground-based remote sensing?

    NASA Astrophysics Data System (ADS)

    Grützun, V.; Quaas, J.; Morcrette, C. J.; Ament, F.

    2013-09-01

    Statistical cloud schemes with prognostic probability distribution functions have become more important in atmospheric modeling, especially since they are in principle scale adaptive and capture cloud physics in more detail. While in theory the schemes have a great potential, their accuracy is still questionable. High-resolution three-dimensional observational data of water vapor and cloud water, which could be used for testing them, are missing. We explore the potential of ground-based remote sensing such as lidar, microwave, and radar to evaluate prognostic distribution moments using the "perfect model approach." This means that we employ a high-resolution weather model as virtual reality and retrieve full three-dimensional atmospheric quantities and virtual ground-based observations. We then use statistics from the virtual observation to validate the modeled 3-D statistics. Since the data are entirely consistent, any discrepancy occurring is due to the method. Focusing on total water mixing ratio, we find that the mean ratio can be evaluated decently but that it strongly depends on the meteorological conditions as to whether the variance and skewness are reliable. Using some simple schematic description of different synoptic conditions, we show how statistics obtained from point or line measurements can be poor at representing the full three-dimensional distribution of water in the atmosphere. We argue that a careful analysis of measurement data and detailed knowledge of the meteorological situation is necessary to judge whether we can use the data for an evaluation of higher moments of the humidity distribution used by a statistical cloud scheme.

  6. Effects of inhibitory neurons on the quorum percolation model and dynamical extension with the Brette-Gerstner model

    NASA Astrophysics Data System (ADS)

    Fardet, Tanguy; Bottani, Samuel; Métens, Stéphane; Monceau, Pascal

    2018-06-01

    The Quorum Percolation model (QP) has been designed in the context of neurobiology to describe the initiation of activity bursts occurring in neuronal cultures from the point of view of statistical physics rather than from a dynamical synchronization approach. This paper aims at investigating an extension of the original QP model by taking into account the presence of inhibitory neurons in the cultures (IQP model). The first part of this paper is focused on an equivalence between the presence of inhibitory neurons and a reduction of the network connectivity. By relying on a simple topological argument, we show that the mean activation behavior of networks containing a fraction η of inhibitory neurons can be mapped onto purely excitatory networks with an appropriately modified wiring, provided that η remains in the range usually observed in neuronal cultures, namely η ⪅ 20%. As a striking result, we show that such a mapping makes it possible to predict the evolution of the critical point of the IQP model with the fraction of inhibitory neurons. In a second part, we bridge the gap between the description of bursts in the framework of percolation and the temporal description of neural network activity by showing how dynamical simulations of bursts with an adaptive exponential integrate-and-fire model lead to a mean description of burst activation which is captured by Quorum Percolation.

  7. Wheelchair accessibility to public buildings in Istanbul.

    PubMed

    Evcil, A Nilay

    2009-03-01

    Accessibility to the public environment is a human right and a basic need of every citizen, and is one of the fundamental considerations in urban planning. The aim of this study is to determine the compliance of public buildings in central business districts (CBD) of Istanbul, Turkey, with the wheelchair-accessibility guidelines of the instrument, and to identify architectural barriers faced by wheelchair users. This is a descriptive study of 26 public buildings in the CBD of Istanbul. The instrument used is the adapted Useh, Moyo and Munyonga questionnaire, with data collected by direct observation and measurement. Descriptive statistics of simple percentages and means are used to describe compliance with the guidelines of the instrument and wheelchair accessibility. The descriptive survey results indicate that wheelchair users experience many accessibility problems in the public environment of the most urbanised city (cultural capital of Europe in 2010) in a developing country. It was found that the major architectural barrier is the public transportation items, which had the lowest mean compliance (25%). The items most compliant with the instrument were the entrance-to-building items, with a mean of 79%. It was also found that there is an intention to improve accessibility when the building construction period is investigated. This article describes an example of public-building accessibility compliance in a country that has legislation, but lacks regulations, on accessibility for wheelchair users.

  8. Success rates of a skeletal anchorage system in orthodontics: A retrospective analysis.

    PubMed

    Lam, Raymond; Goonewardene, Mithran S; Allan, Brent P; Sugawara, Junji

    2018-01-01

    To evaluate the premise that skeletal anchorage system (SAS) miniplates are highly successful and predictable for a range of complex orthodontic movements. This retrospective cross-sectional analysis consisted of 421 bone plates placed by one clinician in 163 patients (95 female, 68 male, mean age 29.4 years ± 12.02). Simple descriptive statistics were computed for a wide range of malocclusions and desired movements to obtain success, complication, and failure rates. The success rate of SAS miniplates was 98.6%, although approximately 40% of cases experienced mild complications. The most common complication was soft tissue inflammation, which was amenable to focused oral hygiene and antiseptic rinses. Infection occurred in approximately 15% of patients, with a statistically significant correlation with poor oral hygiene. The most common movements were distalization and intrusion of teeth. More than a third of the cases involved complex movements in more than one plane of space. The success rate of SAS miniplates is high and predictable for a wide range of complex orthodontic movements.
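
    The descriptive summary used here (a success rate) is often reported with a normal-approximation confidence interval; the counts below are hypothetical, chosen only to echo the reported 98.6% of 421 plates.

```python
# Sketch of a proportion with a Wald (normal-approximation) confidence
# interval; counts are illustrative, not taken from the study's tables.
import math

def rate_with_ci(successes, total, z=1.96):
    """Return (proportion, (lower, upper)) at the given z (default 95%)."""
    p = successes / total
    se = math.sqrt(p * (1 - p) / total)
    return p, (p - z * se, p + z * se)

p, (lo, hi) = rate_with_ci(415, 421)
print(round(100 * p, 1), (round(100 * lo, 1), round(100 * hi, 1)))
```

    Note that the Wald interval is a rough approximation near 0% or 100%; exact (Clopper-Pearson) or Wilson intervals behave better for rates this close to 1.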

  9. Effective model approach to the dense state of QCD matter

    NASA Astrophysics Data System (ADS)

    Fukushima, Kenji

    2011-12-01

    The first-principle approach to the dense state of QCD matter, i.e. the lattice-QCD simulation at finite baryon density, is not under theoretical control for the moment. The effective model study based on QCD symmetries is a practical alternative. However, the model parameters that are fixed by hadronic properties in the vacuum may have unknown dependence on the baryon chemical potential. We propose a new prescription to constrain the effective model parameters by the matching condition with the thermal Statistical Model. In the transitional region where thermal quantities blow up in the Statistical Model, deconfined quarks and gluons should smoothly take over the relevant degrees of freedom from hadrons and resonances. We use the Polyakov-loop coupled Nambu-Jona-Lasinio (PNJL) model as an effective description on the quark side and show how the matching condition is satisfied by a simple ansatz on the Polyakov loop potential. Our results favor a phase diagram with the chiral phase transition located at slightly higher temperature than deconfinement, which stays close to the chemical freeze-out points.

  10. Transport and Lagrangian Statistics in Rotating Stratified Turbulence

    NASA Astrophysics Data System (ADS)

    Rosenberg, D. L.

    2015-12-01

    Transport plays a crucial role in geophysical flows, both in the atmosphere and in the ocean. Transport in such flows is ultimately controlled by small-scale turbulence, although the large scales are in geostrophic balance between pressure gradient, gravity and Coriolis forces. As a result of the seemingly random nature of the flow, single particles are dispersed by the flow and, on time scales significantly longer than the eddy turn-over time, they undergo a diffusive motion whose diffusion coefficient is the integral of the velocity correlation function. On intermediate time scales in homogeneous, isotropic turbulence (HIT), the separation between particle pairs has been argued to grow with time according to the Richardson law: <(Δx)²(t)> ~ t³, with a proportionality constant that depends on the initial particle separation. The description of the phenomena associated with the dispersion of single particles, or of particle pairs, ultimately rests on relatively simple statistical properties of the flow velocity transporting the particles, in particular on its temporal correlation function. In this work, we investigate particle dispersion in the anisotropic case of rotating stratified turbulence, examining whether the dependence on initial particle separation differs from HIT, particularly in the presence of an inverse cascade.
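
    The t³ scaling in the Richardson law is typically checked by fitting a log-log slope to measured pair separations; the toy data below follow the scaling exactly and are purely illustrative.

```python
# Toy check of the Richardson t^3 scaling: fit the log-log slope of
# idealised pair-separation data by ordinary least squares.
import math

def loglog_slope(ts, ys):
    """Least-squares slope of log(y) against log(t)."""
    xs = [math.log(t) for t in ts]
    zs = [math.log(y) for y in ys]
    n = len(xs)
    mx, mz = sum(xs) / n, sum(zs) / n
    return (sum((x - mx) * (z - mz) for x, z in zip(xs, zs))
            / sum((x - mx) ** 2 for x in xs))

ts = [1.0, 2.0, 4.0, 8.0, 16.0]
ys = [0.1 * t ** 3 for t in ts]   # exact t^3 scaling, for illustration
print(loglog_slope(ts, ys))       # the slope recovers the exponent 3
```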

  11. An accurate behavioral model for single-photon avalanche diode statistical performance simulation

    NASA Astrophysics Data System (ADS)

    Xu, Yue; Zhao, Tingchen; Li, Ding

    2018-01-01

    An accurate behavioral model is presented to simulate important statistical performance of single-photon avalanche diodes (SPADs), such as dark count and after-pulsing noise. The derived simulation model takes into account all important generation mechanisms of the two kinds of noise. For the first time, thermal agitation, trap-assisted tunneling and band-to-band tunneling mechanisms are simultaneously incorporated in the simulation model to evaluate dark count behavior of SPADs fabricated in deep sub-micron CMOS technology. Meanwhile, a complete carrier trapping and de-trapping process is considered in the after-pulsing model and a simple analytical expression is derived to estimate after-pulsing probability. In particular, the key model parameters of avalanche triggering probability and electric field dependence of excess bias voltage are extracted from Geiger-mode TCAD simulation, and this behavioral simulation model does not include any empirical parameters. The developed SPAD model is implemented in the Verilog-A behavioral hardware description language and successfully operated on the commercial Cadence Spectre simulator, showing good universality and compatibility. The model simulation results are in good agreement with the test data, validating the high simulation accuracy.

  12. Correlations between human mobility and social interaction reveal general activity patterns.

    PubMed

    Mollgaard, Anders; Lehmann, Sune; Mathiesen, Joachim

    2017-01-01

    A day in the life of a person involves a broad range of activities which are common across many people. Going beyond diurnal cycles, a central question is: to what extent do individuals act according to patterns shared across an entire population? Here we investigate the interplay between different activity types, namely communication, motion, and physical proximity by analyzing data collected from smartphones distributed among 638 individuals. We explore two central questions: Which underlying principles govern the formation of the activity patterns? Are the patterns specific to each individual or shared across the entire population? We find that statistics of the entire population allows us to successfully predict 71% of the activity and 85% of the inactivity involved in communication, mobility, and physical proximity. Surprisingly, individual level statistics only result in marginally better predictions, indicating that a majority of activity patterns are shared across our sample population. Finally, we predict short-term activity patterns using a generalized linear model, which suggests that a simple linear description might be sufficient to explain a wide range of actions, whether they be of social or of physical character.

  13. A Simple Illustration for the Need of Multiple Comparison Procedures

    ERIC Educational Resources Information Center

    Carter, Rickey E.

    2010-01-01

    Statistical adjustments to accommodate multiple comparisons are routinely covered in introductory statistical courses. The fundamental rationale for such adjustments, however, may not be readily understood. This article presents a simple illustration to help remedy this.
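
    The rationale this record addresses can be made concrete with one line of probability: under independence, the chance of at least one false positive grows quickly with the number of tests, and the Bonferroni correction caps it. This is a generic illustration, not the article's own example.

```python
# Minimal illustration of familywise error inflation and the Bonferroni fix.
def familywise_error(alpha, m):
    """P(at least one false positive) over m independent tests at level alpha."""
    return 1 - (1 - alpha) ** m

def bonferroni(alpha, m):
    """Per-test level that keeps the familywise rate at or below alpha."""
    return alpha / m

print(familywise_error(0.05, 10))                  # roughly 0.40, not 0.05
print(familywise_error(bonferroni(0.05, 10), 10))  # back at or below 0.05
```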

  14. Comparing and combining process-based crop models and statistical models with some implications for climate change

    NASA Astrophysics Data System (ADS)

    Roberts, Michael J.; Braun, Noah O.; Sinclair, Thomas R.; Lobell, David B.; Schlenker, Wolfram

    2017-09-01

    We compare predictions of a simple process-based crop model (Soltani and Sinclair 2012), a simple statistical model (Schlenker and Roberts 2009), and a combination of both models to actual maize yields on a large, representative sample of farmer-managed fields in the Corn Belt region of the United States. After statistical post-model calibration, the process model (Simple Simulation Model, or SSM) predicts actual outcomes slightly better than the statistical model, but the combined model performs significantly better than either model. The SSM, statistical model and combined model all show similar relationships with precipitation, while the SSM better accounts for temporal patterns of precipitation, vapor pressure deficit and solar radiation. The statistical and combined models show a more negative impact associated with extreme heat for which the process model does not account. Due to the extreme heat effect, predicted impacts under uniform climate change scenarios are considerably more severe for the statistical and combined models than for the process-based model.
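
    One simple way to combine two models' predictions, sketched below, is to fit least-squares weights on the two prediction series against observed yields. The data and weights are invented; the study's own combination method may well differ.

```python
# Hedged sketch of a two-model combiner: solve the 2x2 normal equations for
# weights (a, b) minimising sum((a*p1 + b*p2 - y)^2). Illustrative data only.
def fit_weights(p1, p2, y):
    s11 = sum(x * x for x in p1)
    s22 = sum(x * x for x in p2)
    s12 = sum(a * b for a, b in zip(p1, p2))
    s1y = sum(a * b for a, b in zip(p1, y))
    s2y = sum(a * b for a, b in zip(p2, y))
    det = s11 * s22 - s12 * s12
    return ((s22 * s1y - s12 * s2y) / det,
            (s11 * s2y - s12 * s1y) / det)

process_pred = [1.0, 2.0, 3.0]      # hypothetical process-model yields
statistical_pred = [1.0, 1.0, 2.0]  # hypothetical statistical-model yields
observed = [1.0, 1.5, 2.5]          # an equal-weight blend, for illustration
print(fit_weights(process_pred, statistical_pred, observed))
```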

  15. Bridging stylized facts in finance and data non-stationarities

    NASA Astrophysics Data System (ADS)

    Camargo, Sabrina; Duarte Queirós, Sílvio M.; Anteneodo, Celia

    2013-04-01

    Employing a recent technique which allows the representation of nonstationary data by means of a juxtaposition of locally stationary paths of different length, we introduce a comprehensive analysis of the key observables in a financial market: the trading volume and the price fluctuations. From the segmentation procedure we are able to introduce a quantitative description of statistical features of these two quantities, which are often named stylized facts, namely the tails of the distribution of trading volume and price fluctuations and a dynamics compatible with the U-shaped profile of the volume in a trading session and the slow decay of the autocorrelation function. The segmentation of the trading volume series provides evidence of slow evolution of the fluctuating parameters of each patch, pointing to the mixing scenario. Assuming that long-term features are the outcome of a statistical mixture of simple local forms, we test and compare different probability density functions to provide the long-term distribution of the trading volume, concluding that the log-normal gives the best agreement with the empirical distribution. Moreover, the segmentation results for the magnitude of price fluctuations are quite different from those for the trading volume, indicating that changes in the statistics of price fluctuations occur on a faster scale than in the case of trading volume.

  16. Statistics in three biomedical journals.

    PubMed

    Pilcík, T

    2003-01-01

    In this paper we analyze the use of statistics, and associated problems, in three Czech biological journals in the year 2000. We investigated 23 articles in Folia Biologica, 60 articles in Folia Microbiologica, and 88 articles in Physiological Research. The statistical procedures used most frequently in publications with statistical content were descriptive statistics and the t-test. The most common mistakes were the absence of any reference to the statistical software used and an insufficient description of the data. We compared our results with the results of similar studies of other medical journals. The use of important statistical methods is comparable with that in most medical journals, and the proportion of articles in which the applied method is described insufficiently is moderately low.

  17. Unlawful Discrimination DEOCS 4.1 Construct Validity Summary

    DTIC Science & Technology

    2017-08-01

    Included is a review of the 4.0 description and items, followed by the proposed modifications to the factor. The current DEOCS (4.0) contains multiple...Officer (E7 – E9) 586 10.8% Junior Officer (O1 – O3) 474 9% Senior Officer (O4 and above) 391 6.1% Descriptive Statistics and Reliability This section...displays descriptive statistics for the items on the Unlawful Discrimination scale. All items had a range from 1 to 7 (strongly disagree to strongly

  18. Loser! On the combined impact of emotional and person-descriptive word meanings in communicative situations.

    PubMed

    Rohr, Lana; Abdel Rahman, Rasha

    2018-07-01

    Humans have a unique capacity to induce intense emotional states in others by simple acts of verbal communication, and simple messages such as bad can elicit strong emotions in the addressee. However, up to now, research has mainly focused on general emotional meaning aspects and paradigms of low personal relevance (e.g., word reading), thereby possibly underestimating the impact of verbal emotion. In the present study, we recorded ERPs while presenting emotional words differing in word-inherent person descriptiveness (in that they may or may not refer to or describe a person; e.g., winner vs. sunflower). We predicted stronger emotional responses to person-descriptive words. Additionally, we enhanced the relevance of the words by embedding them in social-communicative contexts. We observed strong parallels in the characteristics of emotion and descriptiveness effects, suggesting a common underlying motivational basis. Furthermore, word-inherent person descriptiveness affected emotion processing at late elaborate stages reflected in the late positive potential, with emotion effects found only for descriptive words. The present findings underline the importance of factors determining the personal relevance of emotional words. © 2018 Society for Psychophysiological Research.

  19. Using Microsoft Excel[R] to Calculate Descriptive Statistics and Create Graphs

    ERIC Educational Resources Information Center

    Carr, Nathan T.

    2008-01-01

    Descriptive statistics and appropriate visual representations of scores are important for all test developers, whether they are experienced testers working on large-scale projects, or novices working on small-scale local tests. Many teachers put in charge of testing projects do not know "why" they are important, however, and are utterly convinced…
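The central-tendency and dispersion measures the article has teachers compute in Excel can also be sketched in a few lines with Python's standard `statistics` module; the test scores below are purely illustrative, not data from the article.

```python
import statistics

# Hypothetical test scores, used purely for illustration.
scores = [72, 85, 90, 66, 78, 85, 93, 70, 81, 88]

mean = statistics.mean(scores)       # central tendency
median = statistics.median(scores)
mode = statistics.mode(scores)
stdev = statistics.stdev(scores)     # sample standard deviation (dispersion)
spread = max(scores) - min(scores)   # range

print(f"mean={mean:.1f} median={median} mode={mode} sd={stdev:.2f} range={spread}")
```

Spreadsheet functions such as AVERAGE, MEDIAN, MODE and STDEV correspond one-to-one to these calls.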

  20. Self-Esteem and Academic Achievement of High School Students

    ERIC Educational Resources Information Center

    Moradi Sheykhjan, Tohid; Jabari, Kamran; Rajeswari, K.

    2014-01-01

    The primary purpose of this study was to determine the influence of self-esteem on academic achievement among high school students in Miandoab City of Iran. The methodology of the research is descriptive and correlation that descriptive and inferential statistics were used to analyze the data. Statistical Society includes male and female high…

  1. Type of body fat distribution in postmenopausal women and its related factors.

    PubMed

    Noroozi, Mahnaz; Rastegari, Zahra; Paknahad, Zamzam

    2010-01-01

The type of body fat distribution plays an important role in identifying the risk of disease. One simple anthropometric index for estimating the type of body fat distribution is waist circumference. This study aimed to determine the type of body fat distribution in postmenopausal women and its related factors. This is a cross-sectional descriptive-analytical study. The sample comprised 278 postmenopausal women in Isfahan who were selected by stratified sampling and invited to 64 health centers of Isfahan. Data were gathered using a questionnaire and a standard measuring tape, and were analyzed using SPSS software with descriptive and inferential statistics. Results showed that the mean waist circumference of postmenopausal women was 93.63 cm (SD 10.66), with a range of 54 to 119 cm. There was a significant relation between job, educational status, total pregnancies, total deliveries, age at first pregnancy, lactation history and menopausal age and waist circumference. Results showed that the body fat distribution of postmenopausal women is of the android type. Considering the side effects of this kind of distribution, the necessary teaching about healthy eating, movement and exercise must be given to women of these ages.

  2. Failure of Breit-Wigner and success of dispersive descriptions of the τ- → K-ηντ decays

    NASA Astrophysics Data System (ADS)

    Roig, Pablo

    2015-11-01

The τ- → K-ηντ decays have been studied using Chiral Perturbation Theory extended by including resonances as active fields. We have found that the treatment of final-state interactions is crucial to provide a good description of the data. The Breit-Wigner approximation does not resum them and neglects the real part of the corresponding chiral loop functions, which violates analyticity and leads to a failure in the confrontation with the data. On the contrary, their resummation by means of an Omnès-like exponentiation through a dispersive representation provides a successful explanation of the measurements. These results illustrate the fact that Breit-Wigner parametrizations of hadronic data, although simple and easy to handle, lack a link with the underlying strong interaction theory and should be avoided. As a result of our analysis we determine the properties of the K*(1410) resonance with a precision competitive with its traditional extraction using τ- → (Kπ)-ντ decays, despite the much more limited statistics accumulated for the τ- → K-ηντ channel. We also predict the imminent discovery of the τ- → K-η'ντ decays.

  3. [A new model for the evaluation of measurements of the neurocranium].

    PubMed

    Seidler, H; Wilfing, H; Weber, G; Traindl-Prohazka, M; zur Nedden, D; Platzer, W

    1993-12-01

    A simple and user-friendly model for trigonometric description of the neurocranium based on newly defined points of measurement is presented. This model not only provides individual description, but also allows for an evaluation of developmental and phylogenetic aspects.

  4. Satisfaction of active duty soldiers with family dental care.

    PubMed

    Chisick, M C

    1997-02-01

    In the fall of 1992, a random, worldwide sample of 6,442 married and single parent soldiers completed a self-administered survey on satisfaction with 22 attributes of family dental care. Simple descriptive statistics for each attribute were derived, as was a composite overall satisfaction score using factor analysis. Composite scores were regressed on demographics, annual dental utilization, and access barriers to identify those factors having an impact on a soldier's overall satisfaction with family dental care. Separate regression models were constructed for single parents, childless couples, and couples with children. Results show below-average satisfaction with nearly all attributes of family dental care, with access attributes having the lowest average satisfaction scores. Factors influencing satisfaction with family dental care varied by family type with one exception: dependent dental utilization within the past year contributed positively to satisfaction across all family types.

  5. Measurement of Muon Neutrino Quasielastic Scattering on Carbon

    NASA Astrophysics Data System (ADS)

    Aguilar-Arevalo, A. A.; Bazarko, A. O.; Brice, S. J.; Brown, B. C.; Bugel, L.; Cao, J.; Coney, L.; Conrad, J. M.; Cox, D. C.; Curioni, A.; Djurcic, Z.; Finley, D. A.; Fleming, B. T.; Ford, R.; Garcia, F. G.; Garvey, G. T.; Green, C.; Green, J. A.; Hart, T. L.; Hawker, E.; Imlay, R.; Johnson, R. A.; Kasper, P.; Katori, T.; Kobilarcik, T.; Kourbanis, I.; Koutsoliotas, S.; Laird, E. M.; Link, J. M.; Liu, Y.; Liu, Y.; Louis, W. C.; Mahn, K. B. M.; Marsh, W.; Martin, P. S.; McGregor, G.; Metcalf, W.; Meyers, P. D.; Mills, F.; Mills, G. B.; Monroe, J.; Moore, C. D.; Nelson, R. H.; Nienaber, P.; Ouedraogo, S.; Patterson, R. B.; Perevalov, D.; Polly, C. C.; Prebys, E.; Raaf, J. L.; Ray, H.; Roe, B. P.; Russell, A. D.; Sandberg, V.; Schirato, R.; Schmitz, D.; Shaevitz, M. H.; Shoemaker, F. C.; Smith, D.; Sorel, M.; Spentzouris, P.; Stancu, I.; Stefanski, R. J.; Sung, M.; Tanaka, H. A.; Tayloe, R.; Tzanov, M.; van de Water, R.; Wascko, M. O.; White, D. H.; Wilking, M. J.; Yang, H. J.; Zeller, G. P.; Zimmerman, E. D.

    2008-01-01

    The observation of neutrino oscillations is clear evidence for physics beyond the standard model. To make precise measurements of this phenomenon, neutrino oscillation experiments, including MiniBooNE, require an accurate description of neutrino charged current quasielastic (CCQE) cross sections to predict signal samples. Using a high-statistics sample of νμ CCQE events, MiniBooNE finds that a simple Fermi gas model, with appropriate adjustments, accurately characterizes the CCQE events observed in a carbon-based detector. The extracted parameters include an effective axial mass, MAeff=1.23±0.20GeV, that describes the four-momentum dependence of the axial-vector form factor of the nucleon, and a Pauli-suppression parameter, κ=1.019±0.011. Such a modified Fermi gas model may also be used by future accelerator-based experiments measuring neutrino oscillations on nuclear targets.

  6. Random walk, diffusion and mixing in simulations of scalar transport in fluid flows

    NASA Astrophysics Data System (ADS)

    Klimenko, A. Y.

    2008-12-01

    Physical similarity and mathematical equivalence of continuous diffusion and particle random walk form one of the cornerstones of modern physics and the theory of stochastic processes. In many applied models used in simulation of turbulent transport and turbulent combustion, mixing between particles is used to reflect the influence of the continuous diffusion terms in the transport equations. We show that the continuous scalar transport and diffusion can be accurately specified by means of mixing between randomly walking Lagrangian particles with scalar properties and assess errors associated with this scheme. This gives an alternative formulation for the stochastic process which is selected to represent the continuous diffusion. This paper focuses on statistical errors and deals with relatively simple cases, where one-particle distributions are sufficient for a complete description of the problem.
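The equivalence the abstract builds on, continuous diffusion reproduced by randomly walking Lagrangian particles, can be illustrated with a minimal sketch (all parameters assumed): Gaussian steps of variance 2D·dt make the ensemble mean-square displacement track the continuum prediction ⟨x²⟩ = 2Dt.

```python
import random

# Minimal sketch (assumed parameters): n independent 1-D random walkers with
# Gaussian steps of variance 2*D*dt reproduce the diffusive spreading <x^2> = 2*D*t.
random.seed(0)
D, dt, steps, n = 0.5, 0.01, 500, 2000
sigma = (2 * D * dt) ** 0.5
positions = [0.0] * n
for _ in range(steps):
    positions = [x + random.gauss(0.0, sigma) for x in positions]

msd = sum(x * x for x in positions) / n   # ensemble mean-square displacement
expected = 2 * D * steps * dt             # continuum diffusion prediction
print(msd, expected)
```

The statistical error of `msd` shrinks as the number of particles grows, which is the kind of sampling error the paper assesses for mixing schemes.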

  7. Self-diffusion in periodic porous media: a comparison of numerical simulation and eigenvalue methods.

    PubMed

    Schwartz, L M; Bergman, D J; Dunn, K J; Mitra, P P

    1996-01-01

    Random walk computer simulations are an important tool in understanding magnetic resonance measurements in porous media. In this paper we focus on the description of pulsed field gradient spin echo (PGSE) experiments that measure the probability, P(R,t), that a diffusing water molecule will travel a distance R in a time t. Because PGSE simulations are often limited by statistical considerations, we will see that valuable insight can be gained by working with simple periodic geometries and comparing simulation data to the results of exact eigenvalue expansions. In this connection, our attention will be focused on (1) the wavevector, k, and time dependent magnetization, M(k, t); and (2) the normalized probability, Ps(delta R, t), that a diffusing particle will return to within delta R of the origin after time t.

  8. The Value of Data and Metadata Standardization for Interoperability in Giovanni Or: Why Your Product's Metadata Causes Us Headaches!

    NASA Technical Reports Server (NTRS)

    Smit, Christine; Hegde, Mahabaleshwara; Strub, Richard; Bryant, Keith; Li, Angela; Petrenko, Maksym

    2017-01-01

    Giovanni is a data exploration and visualization tool at the NASA Goddard Earth Sciences Data Information Services Center (GES DISC). It has been around in one form or another for more than 15 years. Giovanni calculates simple statistics and produces 22 different visualizations for more than 1600 geophysical parameters from more than 90 satellite and model products. Giovanni relies on external data format standards to ensure interoperability, including the NetCDF CF Metadata Conventions. Unfortunately, these standards were insufficient to make Giovanni's internal data representation truly simple to use. Finding and working with dimensions can be convoluted with the CF Conventions. Furthermore, the CF Conventions are silent on machine-friendly descriptive metadata such as the parameter's source product and product version. In order to simplify analyzing disparate earth science data parameters in a unified way, we developed Giovanni's internal standard. First, the format standardizes parameter dimensions and variables so they can be easily found. Second, the format adds all the machine-friendly metadata Giovanni needs to present our parameters to users in a consistent and clear manner. At a glance, users can grasp all the pertinent information about parameters both during parameter selection and after visualization.

  9. Random Evolution of Idiotypic Networks: Dynamics and Architecture

    NASA Astrophysics Data System (ADS)

    Brede, Markus; Behn, Ulrich

The paper deals with modelling a subsystem of the immune system, the so-called idiotypic network (INW). INWs, conceived by N.K. Jerne in 1974, are functional networks of interacting antibodies and B cells. In principle, Jerne's framework provides solutions to many issues in immunology, such as immunological memory, mechanisms for antigen recognition and self/non-self discrimination. Explaining the interconnection between the elementary components, local dynamics, network formation and architecture, and possible modes of global system function appears to be an ideal playground for statistical mechanics. We present a simple cellular automaton model based on a graph representation of the system. From a simplified description of idiotypic interactions, rules for the random evolution of networks of occupied and empty sites on these graphs are derived. In certain biologically relevant parameter ranges the resulting dynamics leads to stationary states. A stationary state is found to correspond to a specific pattern of network organization. It turns out that even these very simple rules give rise to a multitude of different kinds of patterns. We characterize these networks by classifying `static' and `dynamic' network patterns. A type of `dynamic' network is found to display many features of real INWs.

  10. Simple model for multiple-choice collective decision making

    NASA Astrophysics Data System (ADS)

    Lee, Ching Hua; Lucas, Andrew

    2014-11-01

    We describe a simple model of heterogeneous, interacting agents making decisions between n ≥2 discrete choices. For a special class of interactions, our model is the mean field description of random field Potts-like models and is effectively solved by finding the extrema of the average energy E per agent. In these cases, by studying the propagation of decision changes via avalanches, we argue that macroscopic dynamics is well captured by a gradient flow along E . We focus on the permutation symmetric case, where all n choices are (on average) the same, and spontaneous symmetry breaking (SSB) arises purely from cooperative social interactions. As examples, we show that bimodal heterogeneity naturally provides a mechanism for the spontaneous formation of hierarchies between decisions and that SSB is a preferred instability to discontinuous phase transitions between two symmetric points. Beyond the mean field limit, exponentially many stable equilibria emerge when we place this model on a graph of finite mean degree. We conclude with speculation on decision making with persistent collective oscillations. Throughout the paper, we emphasize analogies between methods of solution to our model and common intuition from diverse areas of physics, including statistical physics and electromagnetism.

  11. Statistical description and transport in stochastic magnetic fields

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vanden Eijnden, E.; Balescu, R.

    1996-03-01

The statistical description of particle motion in a stochastic magnetic field is presented. Starting from the stochastic Liouville equation (or hybrid kinetic equation) associated with the equations of motion of a test particle, the probability distribution function of the system is obtained for various magnetic fields and collisional processes. The influence of these two ingredients on the statistics of the particle dynamics is stressed. In all cases, transport properties of the system are discussed. © 1996 American Institute of Physics.

  12. Stress among Academic Staff and Students' Satisfaction of Their Performances in Payame Noor University of Miandoab

    ERIC Educational Resources Information Center

    Jabari, Kamran; Moradi Sheykhjan, Tohid

    2015-01-01

    Present study examined the relationship between stress among academic staff and students' satisfaction of their performances in Payame Noor University (PNU) of Miandoab City, Iran in 2014. The methodology of the research is descriptive and correlation that descriptive and inferential statistics were used to analyze the data. Statistical Society…

  13. Quantitative Methods in Library and Information Science Literature: Descriptive vs. Inferential Statistics.

    ERIC Educational Resources Information Center

    Brattin, Barbara C.

    Content analysis was performed on the top six core journals for 1990 in library and information science to determine the extent of research in the field. Articles (n=186) were examined for descriptive or inferential statistics and separately for the presence of mathematical models. Results show a marked (14%) increase in research for 1990,…

  14. Statistical complexity without explicit reference to underlying probabilities

    NASA Astrophysics Data System (ADS)

    Pennini, F.; Plastino, A.

    2018-06-01

We show that extremely simple systems of a not too large number of particles can be simultaneously thermally stable and complex. To this end, we extend the notion of statistical complexity to simple configurations of non-interacting particles, without appeal to probabilities, and discuss configurational properties.

  15. Statistical process control: separating signal from noise in emergency department operations.

    PubMed

    Pimentel, Laura; Barrueto, Fermin

    2015-05-01

    Statistical process control (SPC) is a visually appealing and statistically rigorous methodology very suitable to the analysis of emergency department (ED) operations. We demonstrate that the control chart is the primary tool of SPC; it is constructed by plotting data measuring the key quality indicators of operational processes in rationally ordered subgroups such as units of time. Control limits are calculated using formulas reflecting the variation in the data points from one another and from the mean. SPC allows managers to determine whether operational processes are controlled and predictable. We review why the moving range chart is most appropriate for use in the complex ED milieu, how to apply SPC to ED operations, and how to determine when performance improvement is needed. SPC is an excellent tool for operational analysis and quality improvement for these reasons: 1) control charts make large data sets intuitively coherent by integrating statistical and visual descriptions; 2) SPC provides analysis of process stability and capability rather than simple comparison with a benchmark; 3) SPC allows distinction between special cause variation (signal), indicating an unstable process requiring action, and common cause variation (noise), reflecting a stable process; and 4) SPC keeps the focus of quality improvement on process rather than individual performance. Because data have no meaning apart from their context, and every process generates information that can be used to improve it, we contend that SPC should be seriously considered for driving quality improvement in emergency medicine. Copyright © 2015 Elsevier Inc. All rights reserved.
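As a hedged sketch of the individuals/moving-range method described above (not the authors' own code), the control limits of an XmR chart can be computed from the mean and the average moving range; the constants 2.66 and 3.267 are the standard subgroup-of-two factors, and the ED wait times are invented for illustration.

```python
# Minimal XmR (individuals and moving-range) chart sketch. The factors 2.66 and
# 3.267 are the standard d2-based constants for moving ranges of size two.
def xmr_limits(data):
    n = len(data)
    mean = sum(data) / n
    moving_ranges = [abs(b - a) for a, b in zip(data, data[1:])]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    return {
        "center": mean,
        "ucl": mean + 2.66 * mr_bar,   # upper control limit for individuals
        "lcl": mean - 2.66 * mr_bar,   # lower control limit for individuals
        "mr_ucl": 3.267 * mr_bar,      # upper limit for the moving-range chart
    }

# Hypothetical daily door-to-doctor times (minutes), for illustration only.
times = [32, 28, 35, 30, 31, 29, 34, 33, 27, 36]
limits = xmr_limits(times)
# Points outside the limits are special-cause "signal"; the rest is common-cause noise.
signal = [t for t in times if not limits["lcl"] <= t <= limits["ucl"]]
print(limits, signal)
```

With these stable, invented data every point falls inside the limits, i.e. the process shows only common-cause variation.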

  16. [Study of quality of life in adults with common variable immunodeficiency by using the Questionnaire SF-36].

    PubMed

    López-Pérez, Patricia; Miranda-Novales, Guadalupe; Segura-Méndez, Nora Hilda; Del Rivero-Hernández, Leonel; Cambray-Gutiérrez, Cesar; Chávez-García, Aurora

    2014-01-01

Quality of life is a multidimensional concept that includes the physical, emotional and social components associated with a disease. The use of tools to assess health-related quality of life (HRQOL) has increased in recent decades. Common variable immunodeficiency (CVID) is the most commonly diagnosed primary immunodeficiency. The objective was to evaluate the quality of life of patients with CVID using the SF-36 questionnaire. A descriptive cross-sectional survey included 23 patients diagnosed with CVID, belonging to the Immunodeficiency Clinic of the Service of Allergology and Clinical Immunology at CMN Siglo XXI, IMSS. The SF-36 questionnaire validated in Spanish was applied. Descriptive statistics used simple frequencies and percentages; inferential statistics used Fisher's exact test and ANOVA to compare means. The study involved 23 patients, 14 women (60%) and 9 men (40%), with a mean age of 38.6 ± 14.7 years. The highest score (83%) was obtained in the emotional-role dimension. The dimensions with the greatest deterioration in both genders were general health (54%), vitality (59%) and physical performance (72%). No differences were found regarding gender. The only item in which a statistically significant difference was found for patients with more than 3 comorbidities was the change in health status in the past year (p=0.007). Patients with severe comorbidities, such as haematological-oncological (leukaemias, lymphomas, neoplasms) and pulmonary (severe bronchiectasis) conditions, showed greater deterioration in physical performance (73%) and emotional role (64%). An improvement in health status in the last year was reported by 65% of patients. Adult patients with CVID show deterioration in several dimensions, particularly general health, vitality and physical performance. Patients with severe comorbidities such as leukaemias, lymphomas, malignancies and severe bronchiectasis show further deterioration in some aspects of quality of life, especially physical performance and emotional role. A higher number of comorbidities was significantly associated with a lower score for change in health. The SF-36 questionnaire is useful for evaluating the quality of life of our patients with CVID.

  17. Revisiting liver anatomy and terminology of hepatectomies.

    PubMed

    Bismuth, Henri

    2013-03-01

    Since the development of liver surgery, several descriptions of liver anatomical division and hepatectomies have been made, causing some confusion among surgeons. The initial anatomical description according to Couinaud is reviewed and corrected taking into account the descriptions made in the following decades. It seems that by reviewing the description of the different authors, a precise anatomical division of the liver may be achieved and a simple terminology of hepatectomies may be proposed. It is hoped that the proposal of this anatomical description and this terminology of hepatectomies may find a consensus among the liver surgical community from America, Asia, and Europe.

  18. 75 FR 4323 - Additional Quantitative Fit-testing Protocols for the Respiratory Protection Standard

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-27

... respirators (500 and 1000 for protocols 1 and 2, respectively). However, OSHA could not evaluate the results... the values of these descriptive statistics for revised PortaCount® QNFT protocols 1 (at RFFs of 100 and 500) and 2 (at RFFs of 200 and 1000). Table 2--Descriptive Statistics for RFFs of 100 and 200...

  19. Exploring Marine Corps Officer Quality: An Analysis of Promotion to Lieutenant Colonel

    DTIC Science & Technology

    2017-03-01

[Table-of-contents excerpt] G. Descriptive Statistics, 44; 1. Dependent Variable Summary Statistics, 44; 2. Performance ...; 4. Further Research, 88; Appendix A. Summary Statistics of FITREP and ...

  20. Towards system-wide implementation of the International Classification of Functioning, Disability and Health (ICF) in routine practice: Developing simple, intuitive descriptions of ICF categories in the ICF Generic and Rehabilitation Set.

    PubMed

    Prodinger, Birgit; Reinhardt, Jan D; Selb, Melissa; Stucki, Gerold; Yan, Tiebin; Zhang, Xia; Li, Jianan

    2016-06-13

    A national, multi-phase, consensus process to develop simple, intuitive descriptions of International Classification of Functioning, Disability and Health (ICF) categories contained in the ICF Generic and Rehabilitation Sets, with the aim of enhancing the utility of the ICF in routine clinical practice, is presented in this study. A multi-stage, national, consensus process was conducted. The consensus process involved 3 expert groups and consisted of a preparatory phase, a consensus conference with consecutive working groups and 3 voting rounds (votes A, B and C), followed by an implementation phase. In the consensus conference, participants first voted on whether they agreed that an initially developed proposal for simple, intuitive descriptions of an ICF category was in fact simple and intuitive. The consensus conference was held in August 2014 in mainland China. Twenty-one people with a background in physical medicine and rehabilitation participated in the consensus process. Four ICF categories achieved consensus in vote A, 16 in vote B, and 8 in vote C. This process can be seen as part of a larger effort towards the system-wide implementation of the ICF in routine clinical and rehabilitation practice to allow for the regular and comprehensive evaluation of health outcomes most relevant for the monitoring of quality of care.

  1. A simple rain attenuation model for earth-space radio links operating at 10-35 GHz

    NASA Technical Reports Server (NTRS)

    Stutzman, W. L.; Yon, K. M.

    1986-01-01

    The simple attenuation model has been improved from an earlier version and now includes the effect of wave polarization. The model is for the prediction of rain attenuation statistics on earth-space communication links operating in the 10-35 GHz band. Simple calculations produce attenuation values as a function of average rain rate. These together with rain rate statistics (either measured or predicted) can be used to predict annual rain attenuation statistics. In this paper model predictions are compared to measured data from a data base of 62 experiments performed in the U.S., Europe, and Japan. Comparisons are also made to predictions from other models.

  2. A Multidisciplinary Approach for Teaching Statistics and Probability

    ERIC Educational Resources Information Center

    Rao, C. Radhakrishna

    1971-01-01

    The author presents a syllabus for an introductory (first year after high school) course in statistics and probability and some methods of teaching statistical techniques. The description comes basically from the procedures used at the Indian Statistical Institute, Calcutta. (JG)

  3. The Performance and Retention of Female Navy Officers with a Military Spouse

    DTIC Science & Technology

    2017-03-01

[Table-of-contents excerpt] 2. Female Officer Retention and Dual-Military Couples, 7; 3. Demographic Statistics, 23; III. Data Description and Statistics; 2. Independent Variables, 31; C. Summary Statistics ...

  4. Maximum Likelihood Time-of-Arrival Estimation of Optical Pulses via Photon-Counting Photodetectors

    NASA Technical Reports Server (NTRS)

    Erkmen, Baris I.; Moision, Bruce E.

    2010-01-01

    Many optical imaging, ranging, and communications systems rely on the estimation of the arrival time of an optical pulse. Recently, such systems have been increasingly employing photon-counting photodetector technology, which changes the statistics of the observed photocurrent. This requires time-of-arrival estimators to be developed and their performances characterized. The statistics of the output of an ideal photodetector, which are well modeled as a Poisson point process, were considered. An analytical model was developed for the mean-square error of the maximum likelihood (ML) estimator, demonstrating two phenomena that cause deviations from the minimum achievable error at low signal power. An approximation was derived to the threshold at which the ML estimator essentially fails to provide better than a random guess of the pulse arrival time. Comparing the analytic model performance predictions to those obtained via simulations, it was verified that the model accurately predicts the ML performance over all regimes considered. There is little prior art that attempts to understand the fundamental limitations to time-of-arrival estimation from Poisson statistics. This work establishes both a simple mathematical description of the error behavior, and the associated physical processes that yield this behavior. Previous work on mean-square error characterization for ML estimators has predominantly focused on additive Gaussian noise. This work demonstrates that the discrete nature of the Poisson noise process leads to a distinctly different error behavior.
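A minimal sketch of the setup described above, under assumed parameters (Gaussian pulse shape, background rate, true delay, all illustrative): photon arrivals are drawn from an inhomogeneous Poisson process by thinning, and the ML arrival time is found by a grid search over the log-likelihood. For a pulse well inside the observation window the integral term of the Poisson log-likelihood is effectively constant in the delay, so maximizing the sum of log-intensities suffices.

```python
import math
import random

# Hedged sketch of ML time-of-arrival estimation from Poisson photon counts.
# All numbers (pulse shape, rates, true delay) are illustrative assumptions.
random.seed(1)
true_tau, width, signal_rate, bg = 0.40, 0.05, 400.0, 5.0

def intensity(t, tau):
    # Background plus a Gaussian optical pulse centered at the delay tau.
    return bg + signal_rate * math.exp(-0.5 * ((t - tau) / width) ** 2)

# Generate photon arrival times on [0, 1) by thinning a homogeneous process.
lam_max = bg + signal_rate
arrivals, t = [], 0.0
while True:
    t += random.expovariate(lam_max)
    if t >= 1.0:
        break
    if random.random() < intensity(t, true_tau) / lam_max:
        arrivals.append(t)

def log_likelihood(tau):
    # The integral term of the Poisson log-likelihood barely depends on tau
    # for a pulse well inside the window, so the log-intensity sum dominates.
    return sum(math.log(intensity(ti, tau)) for ti in arrivals)

grid = [i / 1000 for i in range(100, 900)]
tau_hat = max(grid, key=log_likelihood)   # grid-search ML estimate
print(tau_hat)
```

At low signal power the estimate degrades toward a random guess, which is the threshold behavior the analytical model in the abstract characterizes.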

  5. Statistical Paradigm for Organic Optoelectronic Devices: Normal Force Testing for Adhesion of Organic Photovoltaics and Organic Light-Emitting Diodes.

    PubMed

    Vasilak, Lindsay; Tanu Halim, Silvie M; Das Gupta, Hrishikesh; Yang, Juan; Kamperman, Marleen; Turak, Ayse

    2017-04-19

In this study, we assess the utility of a normal-force (pull-test) approach to measuring adhesion in organic solar cells and organic light-emitting diodes. This approach is a simple and practical method of monitoring the impact of systematic changes in materials, processing conditions, or environmental exposure on interfacial strength and electrode delamination. The ease of measurement enables a statistical description with numerous samples, variant geometry, and minimal preparation. After examining over 70 samples, using the Weibull modulus and the characteristic breaking strength as metrics, we were able to successfully differentiate the adhesion values between tris(8-hydroxyquinoline) aluminum (Alq3) and poly(3-hexylthiophene):[6,6]-phenyl-C61-butyric acid methyl ester (P3HT:PCBM) interfaces with Al, and between two annealing times for the bulk-heterojunction polymer blends. Additionally, the Weibull modulus, a relative measure of the range of flaw sizes at the fracture plane, can be correlated with the roughness of the organic surface. Finite element modeling of the delamination process suggests that the out-of-plane elastic modulus for Alq3 is lower than the reported in-plane elastic values. We suggest that a statistical treatment of a large volume of tests be part of the standard protocol for investigating adhesion, to accommodate the unavoidable variability in morphology and interfacial structure found in most organic devices.
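The Weibull metrics the study relies on can be estimated from a set of breaking strengths with the standard linearized Weibull plot (median-rank regression); the synthetic strengths below are assumptions for illustration, not the paper's measurements.

```python
import math
import random

# Sketch: estimate the Weibull modulus m (slope) and characteristic strength
# from breaking-strength data via the linearized Weibull plot. The synthetic
# data are drawn from an assumed Weibull(m=8, scale=50) distribution.
random.seed(2)
m_true, scale = 8.0, 50.0
strengths = sorted(scale * (-math.log(1 - random.random())) ** (1 / m_true)
                   for _ in range(70))

n = len(strengths)
xs = [math.log(s) for s in strengths]
# Median-rank plotting positions: F_i ~ (i - 0.3) / (n + 0.4).
ys = [math.log(-math.log(1 - (i + 1 - 0.3) / (n + 0.4))) for i in range(n)]

# Ordinary least squares: slope = Weibull modulus, intercept gives the scale.
xbar, ybar = sum(xs) / n, sum(ys) / n
slope = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
         / sum((x - xbar) ** 2 for x in xs))
char_strength = math.exp(-(ybar - slope * xbar) / slope)
print(slope, char_strength)
```

A high modulus means a narrow spread of breaking strengths (uniform flaw sizes); a low modulus signals wide scatter, which is why it can track surface roughness.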

  6. Formalized Conflicts Detection Based on the Analysis of Multiple Emails: An Approach Combining Statistics and Ontologies

    NASA Astrophysics Data System (ADS)

    Zakaria, Chahnez; Curé, Olivier; Salzano, Gabriella; Smaïli, Kamel

In Computer Supported Cooperative Work (CSCW), it is crucial for project leaders to detect conflicting situations as early as possible. Generally, this task is performed manually by studying a set of documents exchanged between team members. In this paper, we propose a full-fledged automatic solution that identifies documents, subjects and actors involved in relational conflicts. Our approach detects conflicts in emails, probably the most popular type of document in CSCW, but the methods used can handle other text-based documents. These methods rely on the combination of statistical and ontological operations. The proposed solution is decomposed into several steps: (i) we enrich a simple negative-emotion ontology with terms occurring in the corpus of emails, (ii) we categorize each conflicting email according to the concepts of this ontology and (iii) we identify emails, subjects and team members involved in conflicting emails using possibilistic description logic and a set of proposed measures. Each of these steps is evaluated and validated on concrete examples. Moreover, the approach's framework is generic and can be easily adapted to domains other than conflicts, e.g. security issues, and extended with operations making use of our proposed set of measures.

  7. PhyloExplorer: a web server to validate, explore and query phylogenetic trees

    PubMed Central

    Ranwez, Vincent; Clairon, Nicolas; Delsuc, Frédéric; Pourali, Saeed; Auberval, Nicolas; Diser, Sorel; Berry, Vincent

    2009-01-01

    Background Many important problems in evolutionary biology require molecular phylogenies to be reconstructed. Phylogenetic trees must then be manipulated for subsequent inclusion in publications or analyses such as supertree inference and tree comparisons. However, no tool is currently available to facilitate the management of tree collections providing, for instance: standardisation of taxon names among trees with respect to a reference taxonomy; selection of relevant subsets of trees or sub-trees according to a taxonomic query; or simply computation of descriptive statistics on the collection. Moreover, although several databases of phylogenetic trees exist, there is currently no easy way to find trees that are both relevant and complementary to a given collection of trees. Results We propose a tool to facilitate assessment and management of phylogenetic tree collections. Given an input collection of rooted trees, PhyloExplorer provides facilities for obtaining statistics describing the collection, correcting invalid taxon names, extracting taxonomically relevant parts of the collection using a dedicated query language, and identifying related trees in the TreeBASE database. Conclusion PhyloExplorer is a simple and interactive website implemented through underlying Python libraries and MySQL databases. It is available at: and the source code can be downloaded from: . PMID:19450253

  8. The role of shape complexity in the detection of closed contours.

    PubMed

    Wilder, John; Feldman, Jacob; Singh, Manish

    2016-09-01

    The detection of contours in noise has been extensively studied, but the detection of closed contours, such as the boundaries of whole objects, has received relatively little attention. Closed contours pose substantial challenges not present in the simple (open) case, because they form the outlines of whole shapes and thus take on a range of potentially important configural properties. In this paper we consider the detection of closed contours in noise as a probabilistic decision problem. Previous work on open contours suggests that contour complexity, quantified as the negative log probability (Description Length, DL) of the contour under a suitably chosen statistical model, impairs contour detectability; more complex (statistically surprising) contours are harder to detect. In this study we extended this result to closed contours, developing a suitable probabilistic model of whole shapes that gives rise to several distinct though interrelated measures of shape complexity. We asked subjects to detect either natural shapes (Exp. 1) or experimentally manipulated shapes (Exp. 2) embedded in noise fields. We found systematic effects of global shape complexity on detection performance, demonstrating how aspects of global shape and form influence the basic process of object detection. Copyright © 2015 Elsevier Ltd. All rights reserved.

  9. Characteristic effects of stochastic oscillatory forcing on neural firing: analytical theory and comparison to paddlefish electroreceptor data.

    PubMed

    Bauermeister, Christoph; Schwalger, Tilo; Russell, David F; Neiman, Alexander B; Lindner, Benjamin

    2013-01-01

    Stochastic signals with pronounced oscillatory components are frequently encountered in neural systems. Input currents to a neuron in the form of stochastic oscillations could be of exogenous origin, e.g. sensory input or synaptic input from a network rhythm. They shape spike firing statistics in a characteristic way, which we explore theoretically in this report. We consider a perfect integrate-and-fire neuron that is stimulated by a constant base current (to drive regular spontaneous firing), along with Gaussian narrow-band noise (a simple example of stochastic oscillations), and a broadband noise. We derive expressions for the nth-order interval distribution, its variance, and the serial correlation coefficients of the interspike intervals (ISIs) and confirm these analytical results by computer simulations. The theory is then applied to experimental data from electroreceptors of paddlefish, which have two distinct types of internal noisy oscillators, one forcing the other. The theory provides an analytical description of their afferent spiking statistics during spontaneous firing, and replicates a pronounced dependence of ISI serial correlation coefficients on the relative frequency of the driving oscillations, and furthermore allows extraction of certain parameters of the intrinsic oscillators embedded in these electroreceptors.
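    A minimal numerical sketch of the baseline setup (a perfect integrate-and-fire neuron driven by a base current plus broadband noise only; all parameter values are assumed, and this is a simulation, not the authors' analytical theory):

    ```python
    import math
    import random

    random.seed(1)

    # Perfect integrate-and-fire: dv = mu*dt + noise; spike and reset at threshold.
    mu, threshold, dt = 1.0, 1.0, 0.001   # assumed parameter values
    sigma = 0.5                           # broadband (white) noise intensity, assumed
    v, t, last_spike, isis = 0.0, 0.0, 0.0, []

    while len(isis) < 2000:
        v += mu * dt + sigma * math.sqrt(dt) * random.gauss(0.0, 1.0)
        t += dt
        if v >= threshold:
            isis.append(t - last_spike)
            last_spike, v = t, 0.0

    # Lag-1 serial correlation coefficient of the interspike intervals (ISIs)
    m = sum(isis) / len(isis)
    num = sum((isis[i] - m) * (isis[i + 1] - m) for i in range(len(isis) - 1))
    den = sum((x - m) ** 2 for x in isis)
    rho1 = num / den
    ```

    With broadband noise alone the ISIs come out essentially uncorrelated (rho1 near zero); the pronounced serial correlations analyzed in the paper arise from the additional narrow-band oscillatory forcing, which is omitted here.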

  10. Quantum weak turbulence with applications to semiconductor lasers

    NASA Astrophysics Data System (ADS)

    Lvov, Yuri Victorovich

    Based on a model Hamiltonian appropriate for the description of fermionic systems such as semiconductor lasers, we describe a natural asymptotic closure of the BBGKY hierarchy in complete analogy with that derived for classical weak turbulence. The main features of the interaction Hamiltonian are the inclusion of full Fermi statistics containing Pauli blocking and a simple, phenomenological, uniformly weak two-particle interaction potential equivalent to the static screening approximation. The resulting asymptotic closure and quantum kinetic Boltzmann equation are derived in a self-consistent manner without resorting to a priori statistical hypotheses or cumulant discard assumptions. We find a new class of solutions to the quantum kinetic equation which are analogous to the Kolmogorov spectra of hydrodynamics and classical weak turbulence. They involve finite fluxes of particles and energy across momentum space and are particularly relevant for describing the behavior of systems containing sources and sinks. We explore these solutions by using a differential approximation to the collision integral. We make a prima facie case that these finite flux solutions can be important in the context of semiconductor lasers. We show that semiconductor laser output efficiency can be improved by exciting these finite flux solutions. Numerical simulations of the semiconductor Maxwell-Bloch equations support the claim.

  11. What is expected from a facial trauma caused by violence?

    PubMed

    Goulart, Douglas Rangel; Colombo, Lucas do Amaral; de Moraes, Márcio; Asprino, Luciana

    2014-01-01

    The aim of this retrospective study was to compare the peculiarities of maxillofacial injuries caused by interpersonal violence with other etiologic factors. Medical records of 3,724 patients with maxillofacial injuries in São Paulo state (Brazil) were retrospectively analyzed. The data were submitted to statistical analysis (simple descriptive statistics and Chi-squared test) using SPSS 18.0 software. Data of 612 patients with facial injuries caused by violence were analyzed. The majority of the patients were male (81%; n = 496), with a mean age of 31.28 years (standard deviation of 13.33 years). These patients were more affected by mandibular and nose fractures, when compared with all other patients (P < 0.01), although fewer injuries were recorded in other body parts (χ(2) = 17.54; P < 0.01). Victims of interpersonal violence exhibited more injuries when the neurocranium was analyzed in isolation (χ(2) = 6.85; P < 0.01). Facial trauma due to interpersonal violence seems to be related to a higher rate of facial fractures and lacerations when compared to all patients with facial injuries. Prominent areas of the face and neurocranium were more affected by injuries.
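    The Chi-squared test of association used in such comparisons can be sketched by hand; the 2×2 counts below are hypothetical, not the study's data:

    ```python
    # Hypothetical 2x2 table: fracture present/absent (rows) x violence/other etiology
    observed = [[80, 120],
                [150, 450]]

    row_totals = [sum(r) for r in observed]          # [200, 600]
    col_totals = [sum(c) for c in zip(*observed)]    # [230, 570]
    grand = sum(row_totals)                          # 800

    # Chi-squared statistic: sum of (observed - expected)^2 / expected over cells,
    # where expected = row_total * col_total / grand_total under independence
    chi2 = 0.0
    for i, row in enumerate(observed):
        for j, o in enumerate(row):
            e = row_totals[i] * col_totals[j] / grand
            chi2 += (o - e) ** 2 / e
    ```

    The statistic is then compared against the Chi-squared distribution with (rows − 1)(columns − 1) degrees of freedom, which is what SPSS reports as the P value.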

  12. Impact of waste disposal on health of a poor urban community in Zimbabwe.

    PubMed

    Makoni, F S; Ndamba, J; Mbati, P A; Manase, G

    2004-08-01

    To assess excreta and waste disposal facilities available and their impact on sanitation related diseases in Epworth, an informal settlement on the outskirts of Harare. Descriptive cross-sectional survey. This was a community based study of Epworth informal settlement. A total of 308 households were interviewed. Participating households were randomly selected from the three communities of Epworth. Secondary medical archival data on diarrhoeal disease prevalence were collected from local clinics and district health offices in the study areas. Only 7% of households were connected to the sewer system. The study revealed that in Zinyengere extension 13% had no toilet facilities, 48% had simple pits and 37% had Blair VIP latrines. In Overspill 2% had no toilet facilities, 28% had simple latrines and 36% had Blair VIP latrines, while in New Gada 20% had no toilet facilities, 24% had simple pits and 23% had Blair VIP latrines. Although a significant percentage had latrines (83.2%), over 50% of the population were not satisfied with the toilet facilities they were using. All the respondents expressed dissatisfaction with their domestic waste disposal practices, with 46.6% admitting to having dumped waste indiscriminately. According to the community, diarrhoeal diseases were the most prevalent diseases (50%) related to poor sanitation. Health statistics also indicated that diarrhoea was a major problem in this community. It is recommended that households and the local authorities concentrate on improving the provision of toilets, water and waste disposal facilities as a way of improving the health state of the community.

  13. Back to basics: an introduction to statistics.

    PubMed

    Halfens, R J G; Meijers, J M M

    2013-05-01

    In the second in the series, Professor Ruud Halfens and Dr Judith Meijers give an overview of statistics, both descriptive and inferential. They describe the first principles of statistics, including some relevant inferential tests.

  14. Students' attitudes towards learning statistics

    NASA Astrophysics Data System (ADS)

    Ghulami, Hassan Rahnaward; Hamid, Mohd Rashid Ab; Zakaria, Roslinazairimah

    2015-05-01

    A positive attitude towards learning is vital in order to master the core content of the subject matter under study. This is no exception in learning statistics, especially at the university level. Therefore, this study investigates students' attitudes towards learning statistics. Six variables or constructs were identified: affect, cognitive competence, value, difficulty, interest, and effort. The instrument used for the study is a questionnaire adopted and adapted from the reliable Survey of Attitudes towards Statistics (SATS©) instrument. The study was conducted with engineering undergraduate students at a university on the East Coast of Malaysia. The respondents consisted of students from different faculties who were taking the applied statistics course. The results are analysed by descriptive analysis, contributing to a descriptive understanding of students' attitudes towards the teaching and learning process of statistics.

  15. Aggregation and folding phase transitions of RNA molecules

    NASA Astrophysics Data System (ADS)

    Bundschuh, Ralf

    2007-03-01

    RNA is a biomolecule that is involved in nearly all aspects of cellular functions. In order to perform many of these functions, RNA molecules have to fold into specific secondary structures. This folding is driven by the tendency of the bases to form Watson-Crick base pairs. Beyond the biological importance of RNA, the relatively simple rules for structure formation of RNA make it a very interesting system from the statistical physics point of view. We will present examples of phase transitions in RNA secondary structure formation that are amenable to analytical descriptions. A special focus will be on aggregation between several RNA molecules which is important for some regulatory circuits based on RNA structure, triplet repeat diseases like Huntington's, and as a model for prion diseases. We show that depending on the relative strength of the intramolecular and the intermolecular base pairing, RNA molecules undergo a transition into an aggregated phase and quantitatively characterize this transition.

  16. Event-Ready Bell Test Using Entangled Atoms Simultaneously Closing Detection and Locality Loopholes

    NASA Astrophysics Data System (ADS)

    Rosenfeld, Wenjamin; Burchardt, Daniel; Garthoff, Robert; Redeker, Kai; Ortegel, Norbert; Rau, Markus; Weinfurter, Harald

    2017-07-01

    An experimental test of Bell's inequality allows ruling out any local-realistic description of nature by measuring correlations between distant systems. While such tests are conceptually simple, there are strict requirements concerning the detection efficiency of the involved measurements, as well as the enforcement of spacelike separation between the measurement events. Only very recently could both loopholes be closed simultaneously. Here we present a statistically significant, event-ready Bell test based on combining heralded entanglement of atoms separated by 398 m with fast and efficient measurements of the atomic spin states closing essential loopholes. We obtain a violation with S =2.221 ±0.033 (compared to the maximal value of 2 achievable with models based on local hidden variables) which allows us to refute the hypothesis of local realism with a significance level P <2.57 ×10-9.

  17. SNDR enhancement in noisy sinusoidal signals by non-linear processing elements

    NASA Astrophysics Data System (ADS)

    Martorell, Ferran; McDonnell, Mark D.; Abbott, Derek; Rubio, Antonio

    2007-06-01

    We investigate the possibility of building linear amplifiers capable of enhancing the Signal-to-Noise and Distortion Ratio (SNDR) of sinusoidal input signals using simple non-linear elements. Other works have proven that it is possible to enhance the Signal-to-Noise Ratio (SNR) by using limiters. In this work we study a soft limiter non-linear element with and without hysteresis. We show that the SNDR of sinusoidal signals can be enhanced by 0.94 dB using a wideband soft limiter and up to 9.68 dB using a wideband soft limiter with hysteresis. These results indicate that linear amplifiers could be constructed using non-linear circuits with hysteresis. This paper presents mathematical descriptions for the non-linear elements using statistical parameters. Using these models, the input-output SNDR enhancement is obtained by optimizing the non-linear transfer function parameters to maximize the output SNDR.

  18. Quantum weak turbulence with applications to semiconductor lasers

    NASA Astrophysics Data System (ADS)

    Lvov, Y. V.; Binder, R.; Newell, A. C.

    1998-10-01

    Based on a model Hamiltonian appropriate for the description of fermionic systems such as semiconductor lasers, we describe a natural asymptotic closure of the BBGKY hierarchy in complete analogy with that derived for classical weak turbulence. The main features of the interaction Hamiltonian are the inclusion of full Fermi statistics containing Pauli blocking and a simple, phenomenological, uniformly weak two-particle interaction potential equivalent to the static screening approximation. We find a new class of solutions to the quantum kinetic equation which are analogous to the Kolmogorov spectra of hydrodynamics and classical weak turbulence. They involve finite fluxes of particles and energy in momentum space and are particularly relevant for describing the behavior of systems containing sources and sinks. We make a prima facie case that these finite flux solutions can be important in the context of semiconductor lasers and show how they might be used to enhance laser performance.

  19. Kinetic field theory: exact free evolution of Gaussian phase-space correlations

    NASA Astrophysics Data System (ADS)

    Fabis, Felix; Kozlikin, Elena; Lilow, Robert; Bartelmann, Matthias

    2018-04-01

    In recent work we developed a description of cosmic large-scale structure formation in terms of non-equilibrium ensembles of classical particles, with time evolution obtained in the framework of a statistical field theory. In these works, the initial correlations between particles sampled from random Gaussian density and velocity fields have so far been treated perturbatively or restricted to pure momentum correlations. Here we treat the correlations between all phase-space coordinates exactly by adopting a diagrammatic language for the different forms of correlations, directly inspired by the Mayer cluster expansion. We will demonstrate that explicit expressions for phase-space density cumulants of arbitrary n-point order, which fully capture the non-linear coupling of free streaming kinematics due to initial correlations, can be obtained from a simple set of Feynman rules. These cumulants will be the foundation for future investigations of perturbation theory in particle interactions.

  20. Refinement of the probability density function model for preferential concentration of aerosol particles in isotropic turbulence

    NASA Astrophysics Data System (ADS)

    Zaichik, Leonid I.; Alipchenkov, Vladimir M.

    2007-11-01

    The purposes of the paper are threefold: (i) to refine the statistical model of preferential particle concentration in isotropic turbulence that was previously proposed by Zaichik and Alipchenkov [Phys. Fluids 15, 1776 (2003)], (ii) to investigate the effect of clustering of low-inertia particles using the refined model, and (iii) to advance a simple model for predicting the collision rate of aerosol particles. The model developed is based on a kinetic equation for the two-point probability density function of the relative velocity distribution of particle pairs. Improvements in predicting the preferential concentration of low-inertia particles are attained by refining the description of the turbulent velocity field of the carrier fluid to include a difference between the time scales of the strain rate and rotation rate correlations. The refined model results in a better agreement with direct numerical simulations for aerosol particles.

  1. Ion specific correlations in bulk and at biointerfaces.

    PubMed

    Kalcher, I; Horinek, D; Netz, R R; Dzubiella, J

    2009-10-21

    Ion specific effects are ubiquitous in any complex colloidal or biological fluid in bulk or at interfaces. The molecular origins of these 'Hofmeister effects' are not well understood and their theoretical description poses a formidable challenge to the modeling and simulation community. On the basis of the combination of atomistically resolved molecular dynamics (MD) computer simulations and statistical mechanics approaches, we present a few selected examples of specific electrolyte effects in bulk, at simple neutral and charged interfaces, and on a short α-helical peptide. The structural complexity in these strongly Coulomb-correlated systems is highlighted and analyzed in the light of available experimental data. While in general the comparison of MD simulations to experiments often lacks quantitative agreement, mostly because molecular force fields and coarse-graining procedures remain to be optimized, the consensus as regards trends provides important insights into microscopic hydration and binding mechanisms.

  2. Student's Conceptions in Statistical Graph's Interpretation

    ERIC Educational Resources Information Center

    Kukliansky, Ida

    2016-01-01

    Histograms, box plots and cumulative distribution graphs are popular graphic representations for statistical distributions. The main research question that this study focuses on is how college students deal with interpretation of these statistical graphs when translating graphical representations into analytical concepts in descriptive statistics.…

  3. The Feasibility of Real-Time Intraoperative Performance Assessment With SIMPL (System for Improving and Measuring Procedural Learning): Early Experience From a Multi-institutional Trial.

    PubMed

    Bohnen, Jordan D; George, Brian C; Williams, Reed G; Schuller, Mary C; DaRosa, Debra A; Torbeck, Laura; Mullen, John T; Meyerson, Shari L; Auyang, Edward D; Chipman, Jeffrey G; Choi, Jennifer N; Choti, Michael A; Endean, Eric D; Foley, Eugene F; Mandell, Samuel P; Meier, Andreas H; Smink, Douglas S; Terhune, Kyla P; Wise, Paul E; Soper, Nathaniel J; Zwischenberger, Joseph B; Lillemoe, Keith D; Dunnington, Gary L; Fryer, Jonathan P

    Intraoperative performance assessment of residents is of growing interest to trainees, faculty, and accreditors. Current approaches to collect such assessments are limited by low participation rates and long delays between procedure and evaluation. We deployed an innovative, smartphone-based tool, SIMPL (System for Improving and Measuring Procedural Learning), to make real-time intraoperative performance assessment feasible for every case in which surgical trainees participate, and hypothesized that SIMPL could be feasibly integrated into surgical training programs. Between September 1, 2015 and February 29, 2016, 15 U.S. general surgery residency programs were enrolled in an institutional review board-approved trial. SIMPL was made available after 70% of faculty and residents completed a 1-hour training session. Descriptive and univariate statistics analyzed multiple dimensions of feasibility, including training rates, volume of assessments, response rates/times, and dictation rates. The 20 most active residents and attendings were evaluated in greater detail. A total of 90% of eligible users (1267/1412) completed training. Further, 13/15 programs began using SIMPL. In total, 6024 assessments were completed by 254 categorical general surgery residents (n = 3555 assessments) and 259 attendings (n = 2469 assessments), and 3762 unique operations were assessed. There was significant heterogeneity in participation within and between programs. The mean percentages (ranges) of users who completed ≥1, 5, and 20 assessments were 62% (21%-96%), 34% (5%-75%), and 10% (0%-32%) across all programs, and 96%, 75%, and 32% in the most active program. Overall, response rate was 70%, dictation rate was 24%, and mean response time was 12 hours. Assessments increased from 357 (September 2015) to 1146 (February 2016). The 20 most active residents each received a mean of 46 assessments by 10 attendings for 20 different procedures. 
SIMPL can be feasibly integrated into surgical training programs to enhance the frequency and timeliness of intraoperative performance assessment. We believe SIMPL could help facilitate a national competency-based surgical training system, although local and systemic challenges still need to be addressed. Copyright © 2016. Published by Elsevier Inc.

  4. [Comparison of simple pooling and bivariate model used in meta-analyses of diagnostic test accuracy published in Chinese journals].

    PubMed

    Huang, Yuan-sheng; Yang, Zhi-rong; Zhan, Si-yan

    2015-06-18

    To investigate the use of simple pooling and the bivariate model in meta-analyses of diagnostic test accuracy (DTA) published in Chinese journals (January to November, 2014), compare the differences in results between these two models, and explore the impact of between-study variability of sensitivity and specificity on those differences. DTA meta-analyses were searched through the Chinese Biomedical Literature Database (January to November, 2014). Details of the models and the fourfold-table data were extracted. Descriptive analysis was conducted to investigate the prevalence of the simple pooling method and the bivariate model in the included literature. Data were re-analyzed with the two models respectively. Differences in the results were examined by the Wilcoxon signed rank test. How these differences were affected by the between-study variability of sensitivity and specificity, expressed by I2, was also explored. A total of 55 systematic reviews, containing 58 DTA meta-analyses, were included, and 25 DTA meta-analyses were eligible for re-analysis. Simple pooling was used in 50 (90.9%) systematic reviews and the bivariate model in 1 (1.8%). The remaining 4 (7.3%) articles used other models to pool sensitivity and specificity or pooled neither of them. Of the reviews simply pooling sensitivity and specificity, 41 (82.0%) were at risk of wrongly using the Meta-DiSc software. The differences in the medians of sensitivity and specificity between the two models were both 0.011 (P<0.001, P=0.031 respectively). Greater differences could be found as the I2 of sensitivity or specificity became larger, especially when I2>75%. Most DTA meta-analyses published in Chinese journals (January to November, 2014) combine sensitivity and specificity by simple pooling. The Meta-DiSc software can pool sensitivity and specificity only through a fixed-effect model, but a high proportion of authors believe it can implement a random-effects model. Simple pooling tends to underestimate the results compared with the bivariate model. 
The greater the between-study variance, the larger the deviation of simple pooling is likely to be. It is necessary to raise the level of knowledge of statistical methods and software for meta-analyses of DTA data.
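    A minimal sketch of the simple-pooling approach the review criticizes, using hypothetical fourfold-table counts (a bivariate model would instead jointly model logit-sensitivity and logit-specificity with random effects, which requires more machinery than fits here):

    ```python
    # Hypothetical fourfold (2x2) counts per study: (TP, FP, FN, TN)
    studies = [
        (45, 5, 5, 45),
        (30, 10, 15, 45),
        (60, 8, 10, 70),
    ]

    # Simple pooling: collapse all studies into one 2x2 table,
    # ignoring between-study heterogeneity (the practice criticized above).
    tp = sum(s[0] for s in studies)
    fp = sum(s[1] for s in studies)
    fn = sum(s[2] for s in studies)
    tn = sum(s[3] for s in studies)

    pooled_sensitivity = tp / (tp + fn)   # 135 / 165
    pooled_specificity = tn / (tn + fp)   # 160 / 183
    ```

    Because collapsing tables discards the between-study variability, the larger that variability (the I2 above), the more the simply pooled estimates diverge from the bivariate-model estimates.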

  5. The change and development of statistical methods used in research articles in child development 1930-2010.

    PubMed

    Køppe, Simo; Dammeyer, Jesper

    2014-09-01

    The evolution of developmental psychology has been characterized by the use of different quantitative and qualitative methods and procedures. But how does the use of methods and procedures change over time? This study explores the change and development of statistical methods used in articles published in Child Development from 1930 to 2010. The methods used in every article in the first issue of every volume were categorized into four categories. Until 1980, relatively simple statistical methods were used. During the last 30 years there has been an explosive growth in the use of more advanced statistical methods. The absence of statistical methods, or the use of only simple ones, has been all but eliminated.

  6. Descriptive Statistics: Reporting the Answers to the 5 Basic Questions of Who, What, Why, When, Where, and a Sixth, So What?

    PubMed

    Vetter, Thomas R

    2017-11-01

    Descriptive statistics are specific methods basically used to calculate, describe, and summarize collected research data in a logical, meaningful, and efficient way. Descriptive statistics are reported numerically in the manuscript text and/or in its tables, or graphically in its figures. This basic statistical tutorial discusses a series of fundamental concepts about descriptive statistics and their reporting. The mean, median, and mode are 3 measures of the center or central tendency of a set of data. In addition to a measure of its central tendency (mean, median, or mode), another important characteristic of a research data set is its variability or dispersion (ie, spread). In simplest terms, variability is how much the individual recorded scores or observed values differ from one another. The range, standard deviation, and interquartile range are 3 measures of variability or dispersion. The standard deviation is typically reported for a mean, and the interquartile range for a median. Testing for statistical significance, along with calculating the observed treatment effect (or the strength of the association between an exposure and an outcome), and generating a corresponding confidence interval are 3 tools commonly used by researchers (and their collaborating biostatistician or epidemiologist) to validly make inferences and more generalized conclusions from their collected data and descriptive statistics. A number of journals, including Anesthesia & Analgesia, strongly encourage or require the reporting of pertinent confidence intervals. A confidence interval can be calculated for virtually any variable or outcome measure in an experimental, quasi-experimental, or observational research study design. Generally speaking, in a clinical trial, the confidence interval is the range of values within which the true treatment effect in the population likely resides. 
In an observational study, the confidence interval is the range of values within which the true strength of the association between the exposure and the outcome (eg, the risk ratio or odds ratio) in the population likely resides. There are many possible ways to graphically display or illustrate different types of data. While there is often latitude as to the choice of format, ultimately, the simplest and most comprehensible format is preferred. Common examples include a histogram, bar chart, line chart or line graph, pie chart, scatterplot, and box-and-whisker plot. Valid and reliable descriptive statistics can answer basic yet important questions about a research data set, namely: "Who, What, Why, When, Where, How, How Much?"
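    The measures discussed above can be illustrated with Python's standard library; the scores below are hypothetical:

    ```python
    import math
    import statistics

    scores = [4, 7, 7, 8, 9, 10, 12, 14, 15, 21]    # hypothetical data set

    mean = statistics.mean(scores)                  # central tendency: 10.7
    median = statistics.median(scores)              # central tendency: 9.5
    mode = statistics.mode(scores)                  # central tendency: 7

    rng = max(scores) - min(scores)                 # dispersion: range (17)
    sd = statistics.stdev(scores)                   # dispersion: sample SD
    q1, _, q3 = statistics.quantiles(scores, n=4)   # quartiles (exclusive method)
    iqr = q3 - q1                                   # dispersion: interquartile range

    # Approximate 95% confidence interval for the mean (normal approximation)
    half_width = 1.96 * sd / math.sqrt(len(scores))
    ci = (mean - half_width, mean + half_width)
    ```

    As the tutorial notes, the standard deviation would typically be reported alongside the mean, and the interquartile range alongside the median; the confidence interval here uses the simple normal approximation rather than a t-distribution, which matters for small samples.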

  7. The Anatomy of American Football: Evidence from 7 Years of NFL Game Data

    PubMed Central

    Papalexakis, Evangelos

    2016-01-01

    How much does a fumble affect the probability of winning an American football game? How balanced should your offense be in order to increase the probability of winning by 10%? These are questions for which the coaching staff of National Football League teams have a clear qualitative answer. Turnovers are costly; turn the ball over several times and you will certainly lose. Nevertheless, what does “several” mean? How “certain” is certainly? In this study, we collected play-by-play data from the past 7 NFL seasons, i.e., 2009–2015, and we build a descriptive model for the probability of winning a game. Despite the fact that our model incorporates simple box score statistics, such as total offensive yards, number of turnovers etc., its overall cross-validation accuracy is 84%. Furthermore, we combine this descriptive model with a statistical bootstrap module to build FPM (short for Football Prediction Matchup) for predicting future match-ups. The contribution of FPM is pertinent to its simplicity and transparency, which however does not sacrifice the system’s performance. In particular, our evaluations indicate that our prediction engine performs on par with the current state-of-the-art systems (e.g., ESPN’s FPI and Microsoft’s Cortana). The latter are typically proprietary but based on their components described publicly they are significantly more complicated than FPM. Moreover, their proprietary nature does not allow for a head-to-head comparison in terms of the core elements of the systems but it should be evident that the features incorporated in FPM are able to capture a large percentage of the observed variance in NFL games. PMID:28005971
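    The bootstrap component mentioned above can be sketched minimally; the game outcomes below are invented, and FPM's actual procedure is more elaborate:

    ```python
    import random

    random.seed(7)

    # Hypothetical per-game win indicators for a simulated matchup (1 = win)
    outcomes = [1, 0, 1, 1, 0, 1, 1, 1, 0, 1, 0, 1, 1, 0, 1, 1]

    # Bootstrap: resample with replacement many times to estimate an
    # uncertainty interval around the win probability
    estimates = []
    for _ in range(10000):
        sample = [random.choice(outcomes) for _ in outcomes]
        estimates.append(sum(sample) / len(sample))

    estimates.sort()
    lo, hi = estimates[249], estimates[9750]   # ~95% percentile interval
    ```

    The appeal of the percentile bootstrap here is the same transparency the authors claim for FPM: it turns a point estimate from simple box-score statistics into an interval without any distributional assumptions.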

  8. 26 CFR 25.6019-4 - Description of property listed on return.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ..., amount of principal unpaid, rate of interest and whether simple or compound, and date to which interest..., rate of interest, date or dates on which interest is payable, series number where there is more than... amount of such accrued income shall be separately set forth. Description of the seller's interest in land...

  9. Crowd Sourcing Data Collection through Amazon Mechanical Turk

    DTIC Science & Technology

    2013-09-01

    The first recognition study consisted of a Panel Study using a simple detection protocol, in which participants were presented with vignettes and, for...variability than the crowdsourcing data set, hewing more closely to the year 1 verbs of interest and simple description grammar . The DT:PS data were...Study RT: PS Recognition Task: Panel Study RT: RT Recognition Task: Round Table S3 Amazon Simple Storage Service SVPA Single Verb Present /Absent

  10. Significant Pre-Accession Factors Predicting Success or Failure During a Marine Corps Officer’s Initial Service Obligation

    DTIC Science & Technology

    2015-12-01

    WAIVERS ..............................................................................................49  APPENDIX C. DESCRIPTIVE STATISTICS ... Statistics of Dependent Variables. .............................................23  Table 6.  Summary Statistics of Academics Variables...24  Table 7.  Summary Statistics of Application Variables ............................................25  Table 8

  11. 21 CFR 314.50 - Content and format of an application.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... the protocol and a description of the statistical analyses used to evaluate the study. If the study... application: (i) Three copies of the analytical procedures and related descriptive information contained in... the samples and to validate the applicant's analytical procedures. The related descriptive information...

  12. 21 CFR 314.50 - Content and format of an application.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... the protocol and a description of the statistical analyses used to evaluate the study. If the study... application: (i) Three copies of the analytical procedures and related descriptive information contained in... the samples and to validate the applicant's analytical procedures. The related descriptive information...

  13. 21 CFR 314.50 - Content and format of an application.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... the protocol and a description of the statistical analyses used to evaluate the study. If the study... application: (i) Three copies of the analytical procedures and related descriptive information contained in... the samples and to validate the applicant's analytical procedures. The related descriptive information...

  14. 21 CFR 314.50 - Content and format of an application.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... the protocol and a description of the statistical analyses used to evaluate the study. If the study... application: (i) Three copies of the analytical procedures and related descriptive information contained in... the samples and to validate the applicant's analytical procedures. The related descriptive information...

  15. 21 CFR 314.50 - Content and format of an application.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... the protocol and a description of the statistical analyses used to evaluate the study. If the study... application: (i) Three copies of the analytical procedures and related descriptive information contained in... the samples and to validate the applicant's analytical procedures. The related descriptive information...

  16. Nurses’ commitment to respecting patient dignity

    PubMed Central

    Raee, Zahra; Abedi, Heidarali; Shahriari, Mohsen

    2017-01-01

    Background: Although respecting human dignity is a cornerstone of all nursing practices, industrialization has gradually decreased the attention paid to this subject in nursing care. Therefore, the present study aimed to investigate nurses’ commitment to respecting patient dignity in hospitals of Isfahan, Iran. Methods: This descriptive-analytical study was conducted in hospitals of Isfahan. Overall, 401 inpatients were selected by cluster sampling followed by simple random sampling from different wards. Data were collected through a questionnaire containing the components of patient dignity, that is, patient-nurse relationships, privacy, and independence. All items were scored based on a five-point Likert scale. The collected data were analyzed using descriptive statistics and Chi-square tests. P values < 0.05 were considered significant in all analyses. Findings: Most patients (91%) scored their relationships with nurses as good. Moreover, 91.8% of the participants described privacy protection as moderate/good. Only 6.5% of the subjects rated it as excellent. The majority of the patients (84.4%) believed their independence was maintained. These subjects also approved of taking part in decision-making. Conclusion: According to our findings, nurses respected patient dignity to an acceptable level. However, the conditions were less favorable in public hospitals and emergency departments. Nursing authorities and policy makers are thus required to introduce appropriate measures to improve the existing conditions. PMID:28546981

  17. Nurses' commitment to respecting patient dignity.

    PubMed

    Raee, Zahra; Abedi, Heidarali; Shahriari, Mohsen

    2017-01-01

    Although respecting human dignity is a cornerstone of all nursing practices, industrialization has gradually decreased the attention paid to this subject in nursing care. Therefore, the present study aimed to investigate nurses' commitment to respecting patient dignity in hospitals of Isfahan, Iran. This descriptive-analytical study was conducted in hospitals of Isfahan. Overall, 401 inpatients were selected by cluster sampling followed by simple random sampling from different wards. Data were collected through a questionnaire containing the components of patient dignity, that is, patient-nurse relationships, privacy, and independence. All items were scored based on a five-point Likert scale. The collected data were analyzed using descriptive statistics and Chi-square tests. P values < 0.05 were considered significant in all analyses. Most patients (91%) scored their relationships with nurses as good. Moreover, 91.8% of the participants described privacy protection as moderate/good. Only 6.5% of the subjects rated it as excellent. The majority of the patients (84.4%) believed their independence was maintained. These subjects also approved of taking part in decision-making. According to our findings, nurses respected patient dignity to an acceptable level. However, the conditions were less favorable in public hospitals and emergency departments. Nursing authorities and policy makers are thus required to introduce appropriate measures to improve the existing conditions.

  18. Collective Intelligence: Aggregation of Information from Neighbors in a Guessing Game.

    PubMed

    Pérez, Toni; Zamora, Jordi; Eguíluz, Víctor M

    2016-01-01

    Complex systems show the capacity to aggregate information and to display coordinated activity. In the case of social systems the interaction of different individuals leads to the emergence of norms, trends in political positions, opinions, cultural traits, and even scientific progress. Examples of collective behavior can be observed in projects like Wikipedia and Linux, where individuals aggregate their knowledge for the benefit of the community, and citizen science, where the potential of collectives to solve complex problems is exploited. Here, we conducted an online experiment to investigate the performance of a collective when solving a guessing problem in which each actor is endowed with partial information and placed as the nodes of an interaction network. We measure the performance of the collective in terms of the temporal evolution of the accuracy, finding no statistical difference in the performance for two classes of networks, regular lattices and random networks. We also determine that a Bayesian description captures the behavior pattern the individuals follow in aggregating information from neighbors to make decisions. In comparison with other simple decision models, the strategy followed by the players reveals a suboptimal performance of the collective. Our contribution provides the basis for the micro-macro connection between individual-based descriptions and collective phenomena.
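
    The Bayesian aggregation the abstract refers to can be illustrated with a naive log-odds update; the binary setting and the accuracy parameter q below are illustrative assumptions, not the paper's fitted model.

```python
import math

# Hedged sketch: naive Bayesian aggregation of a binary guess. Each
# neighbor's guess is treated as an independent signal of accuracy q
# (q is an illustrative assumption, not a value fitted in the study).
def bayesian_guess(own_signal, neighbor_guesses, q=0.7):
    # log-likelihood ratio contributed by one binary signal, flat prior
    def llr(s):
        return math.log(q / (1 - q)) if s == 1 else math.log((1 - q) / q)

    total = llr(own_signal) + sum(llr(g) for g in neighbor_guesses)
    return 1 if total > 0 else 0

# Three agreeing neighbors outweigh a contrary private signal
print(bayesian_guess(0, [1, 1, 1]))  # -> 1
```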

  19. Collective Intelligence: Aggregation of Information from Neighbors in a Guessing Game

    PubMed Central

    Pérez, Toni; Zamora, Jordi; Eguíluz, Víctor M.

    2016-01-01

    Complex systems show the capacity to aggregate information and to display coordinated activity. In the case of social systems the interaction of different individuals leads to the emergence of norms, trends in political positions, opinions, cultural traits, and even scientific progress. Examples of collective behavior can be observed in projects like Wikipedia and Linux, where individuals aggregate their knowledge for the benefit of the community, and citizen science, where the potential of collectives to solve complex problems is exploited. Here, we conducted an online experiment to investigate the performance of a collective when solving a guessing problem in which each actor is endowed with partial information and placed as the nodes of an interaction network. We measure the performance of the collective in terms of the temporal evolution of the accuracy, finding no statistical difference in the performance for two classes of networks, regular lattices and random networks. We also determine that a Bayesian description captures the behavior pattern the individuals follow in aggregating information from neighbors to make decisions. In comparison with other simple decision models, the strategy followed by the players reveals a suboptimal performance of the collective. Our contribution provides the basis for the micro-macro connection between individual-based descriptions and collective phenomena. PMID:27093274

  20. Prevalence of clinical trial status discrepancies: A cross-sectional study of 10,492 trials registered on both ClinicalTrials.gov and the European Union Clinical Trials Register.

    PubMed

    Fleminger, Jessica; Goldacre, Ben

    2018-01-01

    Trial registries are a key source of information for clinicians and researchers. While building OpenTrials, an open database of public trial information, we identified errors and omissions in registries, including discrepancies between descriptions of the same trial in different registries. We set out to ascertain the prevalence of discrepancies in trial completion status using a cohort of trials registered on both the European Union Clinical Trials Register (EUCTR) and ClinicalTrials.gov. We used matching titles and registry IDs provided by both registries to build a cohort of dual-registered trials. Completion statuses were compared; we calculated descriptive statistics on the prevalence of discrepancies. 11,988 dual-registered trials were identified. 1,496 did not provide a comparable completion status, leaving 10,492 trials. 16.2% were discrepant on completion status. The majority of discrepancies (90.5%) were a 'completed' trial on ClinicalTrials.gov inaccurately marked as 'ongoing' on EUCTR. Overall, 33.9% of dual-registered trials described as 'ongoing' on EUCTR were listed as 'completed' on ClinicalTrials.gov. Completion status on registries is commonly inaccurate. Previous work on publication bias may underestimate non-reporting. We describe simple steps registry owners and trialists could take to improve accuracy.
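
    The prevalence figure reported above comes from a straightforward status comparison over the matched cohort; a minimal sketch with made-up records (the field names are hypothetical, not the registries' actual schemas):

```python
# Minimal sketch, with made-up records: counting completion-status
# discrepancies across two registries for dual-registered trials.
trials = [
    {"id": "A", "euctr": "ongoing",   "ctgov": "completed"},
    {"id": "B", "euctr": "completed", "ctgov": "completed"},
    {"id": "C", "euctr": "ongoing",   "ctgov": "ongoing"},
    {"id": "D", "euctr": "ongoing",   "ctgov": "completed"},
]
discrepant = [t for t in trials if t["euctr"] != t["ctgov"]]
prevalence = len(discrepant) / len(trials)
print(f"{prevalence:.1%} of dual-registered trials discrepant")  # 50.0%
```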

  1. Job satisfaction among public health nurses: a national survey.

    PubMed

    Curtis, Elizabeth A; Glacken, Michele

    2014-07-01

    Despite increasing interest in nurses' job satisfaction, relatively few studies have investigated job satisfaction among public health nurses. To establish the current level of job satisfaction among public health nurses and identify the main variables/factors contributing to job satisfaction among this population. Quantitative descriptive design. A survey of a simple random sample of 1000 public health nurses was conducted, yielding a response rate of 35.1% (n = 351). Data were collected using the Index of Work Satisfaction Questionnaire. Descriptive and inferential statistics were deployed. Low levels of job satisfaction among public health nurses emerged. Professional status, interaction and autonomy contributed most to job satisfaction while pay and task-related activities contributed least. Age and tenure were the only biographic factors that correlated significantly with job satisfaction. Public health nurse managers/leaders need to find creative ways of improving the factors that contribute to job satisfaction and address robustly those factors that result in low job satisfaction. The critical issue for public health nurse managers is to determine how job satisfaction can be improved. Greater collaboration and consultation between managers and public health nurses can be regarded as a useful way to begin this process, especially if contemporary nursing is to embrace a responsive approach within the profession. © 2012 John Wiley & Sons Ltd.

  2. One Yard Below: Education Statistics from a Different Angle.

    ERIC Educational Resources Information Center

    Education Intelligence Agency, Carmichael, CA.

    This report offers a different perspective on education statistics by highlighting rarely used "stand-alone" statistics on public education, inputs, outputs, and descriptions, and it uses interactive statistics that combine two or more statistics in an unusual way. It is a report that presents much evidence, but few conclusions. It is not intended…

  3. A Bibliography of Statistical Applications in Geography, Technical Paper No. 9.

    ERIC Educational Resources Information Center

    Greer-Wootten, Bryn; And Others

    Included in this bibliography are resource materials available to both college instructors and students on statistical applications in geographic research. Two stages of statistical development are treated in the bibliography. They are 1) descriptive statistics, in which the sample is the focus of interest, and 2) analytical statistics, in which…

  4. A simple stochastic weather generator for ecological modeling

    Treesearch

    A.G. Birt; M.R. Valdez-Vivas; R.M. Feldman; C.W. Lafon; D. Cairns; R.N. Coulson; M. Tchakerian; W. Xi; Jim Guldin

    2010-01-01

    Stochastic weather generators are useful tools for exploring the relationship between organisms and their environment. This paper describes a simple weather generator that can be used in ecological modeling projects. We provide a detailed description of methodology, and links to full C++ source code (http://weathergen.sourceforge.net) required to implement or modify...
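
    As a flavor of what such a generator involves (the full C++ source is linked above), here is a minimal first-order Markov wet/dry day sketch; the transition probabilities are illustrative, not values from the paper.

```python
import random

# Hedged sketch: a first-order Markov-chain wet/dry day generator, the kind
# of component stochastic weather generators typically include. The
# transition probabilities below are illustrative, not fitted values.
def simulate_wet_days(days=365, p_wet_after_wet=0.60, p_wet_after_dry=0.25,
                      seed=42):
    rng = random.Random(seed)
    wet, sequence = False, []
    for _ in range(days):
        p = p_wet_after_wet if wet else p_wet_after_dry  # depends on yesterday
        wet = rng.random() < p
        sequence.append(wet)
    return sequence

year = simulate_wet_days()
wet_fraction = sum(year) / len(year)   # near the stationary fraction ~0.38
```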

  5. A Simple Treatment of the Liquidity Trap for Intermediate Macroeconomics Courses

    ERIC Educational Resources Information Center

    Buttet, Sebastien; Roy, Udayan

    2014-01-01

    Several leading undergraduate intermediate macroeconomics textbooks now include a simple reduced-form New Keynesian model of short-run dynamics (alongside the IS-LM model). Unfortunately, there is no accompanying description of how the zero lower bound on nominal interest rates affects the model. In this article, the authors show how the…

  6. A Writing Intervention to Teach Simple Sentences and Descriptive Paragraphs to Adolescents with Writing Difficulties

    ERIC Educational Resources Information Center

    Datchuk, Shawn M.; Kubina, Richard M., Jr.

    2017-01-01

    The present study used a multiple-baseline, single-case experimental design to investigate the effects of a multicomponent intervention on construction of simple sentences and word sequences. The intervention entailed sequential delivery of sentence instruction and frequency building to a performance criterion and paragraph instruction.…

  7. Research Education in Undergraduate Occupational Therapy Programs.

    ERIC Educational Resources Information Center

    Petersen, Paul; And Others

    1992-01-01

    Of 63 undergraduate occupational therapy programs surveyed, the 38 responses revealed some common areas covered: elementary descriptive statistics, validity, reliability, and measurement. Areas underrepresented include statistical analysis with or without computers, research design, and advanced statistics. (SK)

  8. Policy Safeguards and the Legitimacy of Highway Interdiction

    DTIC Science & Technology

    2016-12-01

    B. BIAS WITHIN LAW ENFORCEMENT ... C. STATISTICAL DATA GATHERING ... 3. Controlling Discretion ... 4. Statistical Data Collection for Traffic Stops ... A. DESCRIPTION OF STATISTICAL DATA COLLECTED ... B. DATA ORGANIZATION AND ANALYSIS

  9. A Simple and Practical Index to Measure Dementia-Related Quality of Life.

    PubMed

    Arons, Alexander M M; Schölzel-Dorenbos, Carla J M; Olde Rikkert, Marcel G M; Krabbe, Paul F M

    2016-01-01

    Research on new treatments for dementia is gaining pace worldwide in an effort to alleviate this growing health care problem. The optimal evaluation of such interventions, however, calls for a practical and credible patient-reported outcome measure. To describe the refinement of the Dementia Quality-of-life Instrument (DQI) and present its revised version. A prototype of the DQI was adapted to cover a broader range of health-related quality of life (HRQOL) and to improve consistency in the descriptions of its domains. A valuation study was then conducted to assign meaningful numbers to all DQI health states. Pairs of DQI states were presented to a sample of professionals working with people with dementia and a representative sample of the Dutch population. They had to repeatedly select the best DQI state, and their responses were statistically modeled to obtain values for each health state. In total, 207 professionals working with people with dementia and 631 members of the general population completed the paired comparison tasks. Statistically significant differences between the two samples were found for the domains of social functioning, mood, and memory. Severe problems with physical health and severe memory problems were deemed most important by the general population. In contrast, severe mood problems were considered most important by professionals working with people with dementia. The DQI is a simple and feasible measurement instrument that expresses the overall HRQOL of people suffering from dementia in a single meaningful number. Current results suggest that revisiting the discussion of using values from the general population might be warranted in the dementia context. Copyright © 2016 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
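
    The paired-comparison valuation described above can be illustrated with a Bradley-Terry-style model; the choice of model and the win counts below are assumptions for illustration only, not the study's actual method or data.

```python
# Hedged sketch: a Bradley-Terry paired-comparison model of the general kind
# used to scale states from repeated "which is better" choices. Win counts
# are made up; fitting uses the classic minorization-maximization update.
states = ["A", "B", "C"]
wins = {("A", "B"): 7, ("B", "A"): 3,
        ("A", "C"): 8, ("C", "A"): 2,
        ("B", "C"): 6, ("C", "B"): 4}

strength = {s: 1.0 for s in states}
for _ in range(200):
    new = {}
    for i in states:
        total_wins = sum(wins.get((i, j), 0) for j in states if j != i)
        denom = sum((wins.get((i, j), 0) + wins.get((j, i), 0))
                    / (strength[i] + strength[j])
                    for j in states if j != i)
        new[i] = total_wins / denom
    norm = sum(new.values()) / len(states)       # normalize mean strength to 1
    strength = {s: v / norm for s, v in new.items()}

# Fitted preference probability for state A over state B
p_ab = strength["A"] / (strength["A"] + strength["B"])
```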

  10. Fish: A New Computer Program for Friendly Introductory Statistics Help

    ERIC Educational Resources Information Center

    Brooks, Gordon P.; Raffle, Holly

    2005-01-01

    All introductory statistics students must master certain basic descriptive statistics, including means, standard deviations and correlations. Students must also gain insight into such complex concepts as the central limit theorem and standard error. This article introduces and describes the Friendly Introductory Statistics Help (FISH) computer…
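
    The basic descriptive statistics mentioned (means, standard deviations, correlations) reduce to a few standard formulas; a minimal sketch on made-up data:

```python
import math
import statistics

# The kind of basic descriptive statistics such a tool covers, computed on
# illustrative data: mean, sample standard deviation, Pearson correlation.
x = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]
y = [1.0, 3.0, 2.0, 5.0, 4.0, 6.0, 6.0, 8.0]

mean_x, mean_y = statistics.mean(x), statistics.mean(y)
sd_x = statistics.stdev(x)                       # sample (n - 1) definition

sxy = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
sxx = sum((a - mean_x) ** 2 for a in x)
syy = sum((b - mean_y) ** 2 for b in y)
r = sxy / math.sqrt(sxx * syy)                   # Pearson correlation
```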

  11. Meta-analyses and Forest plots using a microsoft excel spreadsheet: step-by-step guide focusing on descriptive data analysis.

    PubMed

    Neyeloff, Jeruza L; Fuchs, Sandra C; Moreira, Leila B

    2012-01-20

    Meta-analyses are necessary to synthesize data obtained from primary research, and in many situations reviews of observational studies are the only available alternative. General purpose statistical packages can meta-analyze data, but usually require external macros or coding. Commercial specialist software is available, but may be expensive and focused on a particular type of primary data. Most available software packages have limitations in dealing with descriptive data, and the graphical display of summary statistics such as incidence and prevalence is unsatisfactory. Analyses can be conducted using Microsoft Excel, but there was no previous guide available. We constructed a step-by-step guide to perform a meta-analysis in a Microsoft Excel spreadsheet, using either fixed-effect or random-effects models. We have also developed a second spreadsheet capable of producing customized forest plots. It is possible to conduct a meta-analysis using only Microsoft Excel. More importantly, to our knowledge this is the first description of a method for producing a statistically adequate but graphically appealing forest plot summarizing descriptive data, using widely available software.
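
    The fixed-effect computation such a spreadsheet implements is standard inverse-variance pooling; a minimal sketch with made-up study estimates:

```python
import math

# Hedged sketch of fixed-effect (inverse-variance) pooling; the effect
# sizes and standard errors below are made up for illustration.
effects = [0.30, 0.45, 0.20]          # per-study effect estimates
ses     = [0.10, 0.15, 0.08]          # their standard errors

weights   = [1 / se ** 2 for se in ses]          # inverse-variance weights
pooled    = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
pooled_se = math.sqrt(1 / sum(weights))
ci_low, ci_high = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
```

    A forest plot then draws each study's estimate with its confidence interval, plus the pooled diamond from these quantities.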

  12. Meta-analyses and Forest plots using a microsoft excel spreadsheet: step-by-step guide focusing on descriptive data analysis

    PubMed Central

    2012-01-01

    Background Meta-analyses are necessary to synthesize data obtained from primary research, and in many situations reviews of observational studies are the only available alternative. General purpose statistical packages can meta-analyze data, but usually require external macros or coding. Commercial specialist software is available, but may be expensive and focused on a particular type of primary data. Most available software packages have limitations in dealing with descriptive data, and the graphical display of summary statistics such as incidence and prevalence is unsatisfactory. Analyses can be conducted using Microsoft Excel, but there was no previous guide available. Findings We constructed a step-by-step guide to perform a meta-analysis in a Microsoft Excel spreadsheet, using either fixed-effect or random-effects models. We have also developed a second spreadsheet capable of producing customized forest plots. Conclusions It is possible to conduct a meta-analysis using only Microsoft Excel. More importantly, to our knowledge this is the first description of a method for producing a statistically adequate but graphically appealing forest plot summarizing descriptive data, using widely available software. PMID:22264277

  13. Football fever: goal distributions and non-Gaussian statistics

    NASA Astrophysics Data System (ADS)

    Bittner, E.; Nußbaumer, A.; Janke, W.; Weigel, M.

    2009-02-01

    Analyzing football score data with statistical techniques, we investigate how the not purely random, but highly co-operative nature of the game is reflected in averaged properties such as the probability distributions of scored goals for the home and away teams. As it turns out, especially the tails of the distributions are not well described by the Poissonian or binomial model resulting from the assumption of uncorrelated random events. Instead, a good effective description of the data is provided by less basic distributions such as the negative binomial one or the probability densities of extreme value statistics. To understand this behavior from a microscopic point of view, however, no waiting time problem or extremal process need be invoked. Instead, modifying the Bernoulli random process underlying the Poissonian model to include a simple component of self-affirmation seems to describe the data surprisingly well and allows us to understand the observed deviation from Gaussian statistics. The phenomenological distributions used before can be understood as special cases within this framework. We analyzed historical football score data from many leagues in Europe as well as from international tournaments, including data from all past tournaments of the “FIFA World Cup” series, and found the proposed models to be applicable rather universally. In particular, here we analyze the results of the German women’s premier football league and consider the two separate German men’s premier leagues in the East and West during the cold war times as well as the unified league after 1990 to see how scoring in football and the component of self-affirmation depend on cultural and political circumstances.
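
    The self-affirmation idea can be sketched as a modified per-minute Bernoulli process; the parameter values here are illustrative, not the paper's fitted ones.

```python
import random

# Hedged sketch of the modeling idea: a per-minute Bernoulli scoring process
# with a "self-affirmation" component that raises the scoring probability
# after each goal. Parameter values are illustrative, not fitted.
def simulate_goals(rng, p0=0.015, boost=1.25, minutes=90):
    p, goals = p0, 0
    for _ in range(minutes):
        if rng.random() < p:
            goals += 1
            p *= boost          # success breeds success
    return goals

rng = random.Random(1)
sample = [simulate_goals(rng) for _ in range(10_000)]
mean = sum(sample) / len(sample)
var = sum((g - mean) ** 2 for g in sample) / len(sample)
# self-affirmation tends to inflate the variance relative to a pure
# Poisson process, fattening the tail of the goal distribution
```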

  14. Acceptability of HIV/AIDS testing among pre-marital couples in Iran (2012)

    PubMed Central

    Ayatollahi, Jamshid; Nasab Sarab, Mohammad Ali Bagheri; Sharifi, Mohammad Reza; Shahcheraghi, Seyed Hossein

    2014-01-01

    Background: Human immunodeficiency virus (HIV)/acquired immune deficiency syndrome (AIDS) is a lifestyle-related disease. This disease is transmitted through unprotected sex, contaminated needles, transfusion of infected blood and from mother to child during pregnancy and delivery. Prevention of infection with HIV, mainly through safe sex and needle exchange programmes, is a solution to prevent the spread of the disease. Knowledge about HIV status helps to prevent and subsequently reduce the harm to the later generation. The purpose of this study was to assess the willingness rate of couples referred to the family regulation pre-marital counselling centre for performing an HIV test before marriage in Yazd. Patients and Methods: In this descriptive study, a simple random sampling was done among people referred to Akbari clinic. The couples were 1000 men and 1000 women referred to the premarital counselling centre for pre-marital HIV testing in Yazd in the year 2012. They were in situations of pregnancy, delivery, or breastfeeding. The data were analyzed using Statistical Package for the Social Sciences (SPSS) software and the chi-square test. Results: There was a significant statistical difference between the age groups about willingness for HIV testing before marriage (P < 0.001) and also positive comments about HIV testing in asymptomatic individuals (P < 0.001). This study also showed a significant statistical difference between the two gender groups about willingness to marry after an HIV positive test of their wives. Conclusion: The willingness rate of couples to undergo HIV testing before marriage was considerable. Therefore, HIV testing before marriage was suggested as a routine test. PMID:25114363

  15. Acceptability of HIV/AIDS testing among pre-marital couples in Iran (2012).

    PubMed

    Ayatollahi, Jamshid; Nasab Sarab, Mohammad Ali Bagheri; Sharifi, Mohammad Reza; Shahcheraghi, Seyed Hossein

    2014-07-01

    Human immunodeficiency virus (HIV)/acquired immune deficiency syndrome (AIDS) is a lifestyle-related disease. This disease is transmitted through unprotected sex, contaminated needles, transfusion of infected blood and from mother to child during pregnancy and delivery. Prevention of infection with HIV, mainly through safe sex and needle exchange programmes, is a solution to prevent the spread of the disease. Knowledge about HIV status helps to prevent and subsequently reduce the harm to the later generation. The purpose of this study was to assess the willingness rate of couples referred to the family regulation pre-marital counselling centre for performing an HIV test before marriage in Yazd. In this descriptive study, a simple random sampling was done among people referred to Akbari clinic. The couples were 1000 men and 1000 women referred to the premarital counselling centre for pre-marital HIV testing in Yazd in the year 2012. They were in situations of pregnancy, delivery, or breastfeeding. The data were analyzed using Statistical Package for the Social Sciences (SPSS) software and the chi-square test. There was a significant statistical difference between the age groups about willingness for HIV testing before marriage (P < 0.001) and also positive comments about HIV testing in asymptomatic individuals (P < 0.001). This study also showed a significant statistical difference between the two gender groups about willingness to marry after an HIV positive test of their wives. The willingness rate of couples to undergo HIV testing before marriage was considerable. Therefore, HIV testing before marriage was suggested as a routine test.

  16. How authors did it - a methodological analysis of recent engineering education research papers in the European Journal of Engineering Education

    NASA Astrophysics Data System (ADS)

    Malmi, Lauri; Adawi, Tom; Curmi, Ronald; de Graaff, Erik; Duffy, Gavin; Kautz, Christian; Kinnunen, Päivi; Williams, Bill

    2018-03-01

    We investigated research processes applied in recent publications in the European Journal of Engineering Education (EJEE), exploring how papers link to theoretical work and how research processes have been designed and reported. We analysed all 155 papers published in EJEE in 2009, 2010 and 2013, classifying the papers using a taxonomy of research processes in engineering education research (EER) (Malmi et al. 2012). The majority of the papers presented either empirical work (59%) or were case reports (27%). Our main findings are as follows: (1) EJEE papers build moderately on a wide selection of theoretical work; (2) a great majority of papers have a clear research strategy, but data analysis methods are mostly simple descriptive statistics or simple/undocumented qualitative research methods; and (3) there are significant shortcomings in reporting research questions, methodology and limitations of studies. Our findings are consistent with and extend analyses of EER papers in other publishing venues; they help to build a clearer picture of the research currently published in EJEE and allow us to make recommendations for consideration by the editorial team of the journal. Our employed procedure also provides a framework that can be applied to monitor future global evolution of this and other EER journals.

  17. Nomogram for sample size calculation on a straightforward basis for the kappa statistic.

    PubMed

    Hong, Hyunsook; Choi, Yunhee; Hahn, Seokyung; Park, Sue Kyung; Park, Byung-Joo

    2014-09-01

    Kappa is a widely used measure of agreement. However, it may not be straightforward in some situations, such as sample size calculation, due to the kappa paradox: high agreement but low kappa. Hence, in sample size calculation it seems reasonable to consider the level of agreement under a certain marginal prevalence in terms of a simple proportion of agreement rather than a kappa value. Therefore, sample size formulae and nomograms using a simple proportion of agreement rather than a kappa under certain marginal prevalences are proposed. A sample size formula was derived using the kappa statistic under the common correlation model and goodness-of-fit statistic. The nomogram for the sample size formula was developed using SAS 9.3. Sample size formulae using a simple proportion of agreement instead of a kappa statistic, and nomograms that eliminate the inconvenience of using a mathematical formula, were produced. A nomogram for sample size calculation with a simple proportion of agreement should be useful in the planning stages when the focus of interest is on testing the hypothesis of interobserver agreement involving two raters and nominal outcome measures. Copyright © 2014 Elsevier Inc. All rights reserved.
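
    The kappa paradox motivating the nomogram is easy to reproduce from the standard definition of Cohen's kappa (a sketch for two raters and a binary outcome, with equal marginals assumed for simplicity):

```python
# Hedged sketch of the relationship the nomogram builds on: Cohen's kappa as
# a function of the simple proportion of agreement p_o and the marginal
# prevalence (two raters, binary outcome, equal marginals assumed).
def kappa_from_agreement(p_o, prevalence):
    p_e = prevalence ** 2 + (1 - prevalence) ** 2   # chance agreement
    return (p_o - p_e) / (1 - p_e)

# The "kappa paradox": the same 90% agreement yields very different kappas
balanced = kappa_from_agreement(0.90, 0.50)   # ~0.80
skewed   = kappa_from_agreement(0.90, 0.95)   # negative despite high agreement
```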

  18. [Comparative study of the repair of full thickness tear of the supraspinatus by means of "single row" or "suture bridge" techniques].

    PubMed

    Arroyo-Hernández, M; Mellado-Romero, M A; Páramo-Díaz, P; Martín-López, C M; Cano-Egea, J M; Vilá Y Rico, J

    2015-01-01

    The purpose of this study is to analyze whether there is any difference between arthroscopic repair of full-thickness supraspinatus tears with the single-row technique versus the suture-bridge technique. We conducted a retrospective study of 123 patients with full-thickness supraspinatus tears treated between January 2009 and January 2013 in our hospital. There were 60 single-row repairs and 63 suture-bridge repairs. The mean age was 62.9 years in the single-row group and 63.3 years in the suture-bridge group. There were more women than men in both groups (67%). All patients were evaluated using the Constant test. The mean Constant score was 76.7 in the suture-bridge group and 72.4 in the single-row group. We also performed a statistical analysis of each Constant item. Strength was higher in the suture-bridge group, with a statistically significant difference (p = 0.04). The range of movement was also greater in the suture-bridge group, but the difference was not statistically significant. The suture-bridge technique showed better clinical results than single-row repair, but the difference was not statistically significant (p = 0.298).

  19. A Study of Strengths and Weaknesses of Descriptive Assessment from Principals, Teachers and Experts Points of View in Chaharmahal and Bakhteyari Primary Schools

    ERIC Educational Resources Information Center

    Sharief, Mostafa; Naderi, Mahin; Hiedari, Maryam Shoja; Roodbari, Omolbanin; Jalilvand, Mohammad Reza

    2012-01-01

    The aim of current study is to determine the strengths and weaknesses of descriptive evaluation from the viewpoint of principals, teachers and experts of Chaharmahal and Bakhtiari province. A descriptive survey was performed. Statistical population includes 208 principals, 303 teachers, and 100 executive experts of descriptive evaluation scheme in…

  20. Intercomparison of textural parameters of intertidal sediments generated by different statistical procedures, and implications for a unifying descriptive nomenclature

    NASA Astrophysics Data System (ADS)

    Fan, Daidu; Tu, Junbiao; Cai, Guofu; Shang, Shuai

    2015-06-01

    Grain-size analysis is a basic routine in sedimentology and related fields, but diverse methods of sample collection, processing and statistical analysis often make direct comparisons and interpretations difficult or even impossible. In this paper, 586 published grain-size datasets from the Qiantang Estuary (East China Sea) sampled and analyzed by the same procedures were merged and their textural parameters calculated by a percentile and two moment methods. The aim was to explore which of the statistical procedures performed best in the discrimination of three distinct sedimentary units on the tidal flats of the middle Qiantang Estuary. A Gaussian curve-fitting method served to simulate mixtures of two normal populations having different modal sizes, sorting values and size distributions, enabling a better understanding of the impact of finer tail components on textural parameters, as well as the proposal of a unifying descriptive nomenclature. The results show that percentile and moment procedures yield almost identical results for mean grain size, and that sorting values are also highly correlated. However, more complex relationships exist between percentile and moment skewness (kurtosis), changing from positive to negative correlations when the proportions of the finer populations decrease below 35% (10%). This change results from the overweighting of tail components in moment statistics, which stands in sharp contrast to the underweighting or complete amputation of small tail components by the percentile procedure. Intercomparisons of bivariate plots suggest an advantage of the Friedman & Johnson moment procedure over the McManus moment method in terms of the description of grain-size distributions, and over the percentile method by virtue of a greater sensitivity to small variations in tail components. 
The textural parameter scalings of Folk & Ward were translated into their Friedman & Johnson moment counterparts by application of mathematical functions derived by regression analysis of measured and modeled grain-size data, or by determining the abscissa values of intersections between auxiliary lines running parallel to the x-axis and vertical lines corresponding to the descriptive percentile limits along the ordinate of representative bivariate plots. Twofold limits were extrapolated for the moment statistics in relation to single descriptive terms in the cases of skewness and kurtosis by considering both positive and negative correlations between percentile and moment statistics. The extrapolated descriptive scalings were further validated by examining entire size-frequency distributions simulated by mixing two normal populations of designated modal size and sorting values, but varying in mixing ratios. These were found to match well in most of the proposed scalings, although platykurtic and very platykurtic categories were questionable when the proportion of the finer population was below 5%. Irrespective of the statistical procedure, descriptive nomenclatures should therefore be cautiously used when tail components contribute less than 5% to grain-size distributions.
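
    The percentile-versus-moment contrast at the heart of this record can be sketched numerically. The following is an illustrative computation only; the function names and the synthetic single-population example are assumptions, not taken from the paper:

```python
import numpy as np

def folk_ward_mean(phi, cum_pct):
    """Graphic (percentile) mean of Folk & Ward: (phi16 + phi50 + phi84) / 3,
    with the percentiles read off the cumulative curve by interpolation."""
    p16, p50, p84 = (np.interp(p, cum_pct, phi) for p in (16.0, 50.0, 84.0))
    return (p16 + p50 + p84) / 3.0

def moment_mean(phi_mid, weight_pct):
    """First moment (weighted mean grain size) over class midpoints."""
    w = np.asarray(weight_pct, dtype=float) / 100.0
    return float(np.sum(w * np.asarray(phi_mid, dtype=float)))
```

    For a symmetric single population the two means nearly coincide, consistent with the record's finding that the procedures agree on mean grain size; the divergences it reports arise in skewness and kurtosis when a finer tail population is mixed in, because the moment statistics weight tail classes that the percentile method truncates.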

  1. The Attentional Drift Diffusion Model of Simple Perceptual Decision-Making.

    PubMed

    Tavares, Gabriela; Perona, Pietro; Rangel, Antonio

    2017-01-01

    Perceptual decisions requiring the comparison of spatially distributed stimuli that are fixated sequentially might be influenced by fluctuations in visual attention. We used two psychophysical tasks with human subjects to investigate the extent to which visual attention influences simple perceptual choices, and to test the extent to which the attentional Drift Diffusion Model (aDDM) provides a good computational description of how attention affects the underlying decision processes. We find evidence for sizable attentional choice biases and that the aDDM provides a reasonable quantitative description of the relationship between fluctuations in visual attention, choices and reaction times. We also find that exogenous manipulations of attention induce choice biases consistent with the predictions of the model.
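
    The aDDM mechanism described above can be sketched as a biased random walk; this is a minimal simulation in which all parameter values (d, theta, sigma, bound) are illustrative assumptions, not the fitted values from the paper:

```python
import numpy as np

def addm_choice(v_left, v_right, attend_left=True, d=0.002, theta=0.3,
                sigma=0.02, bound=1.0, max_steps=20000, rng=None):
    """Simulate one aDDM trial with attention fixed on a single item.
    The attended value enters the drift at full weight; the unattended
    value is discounted by theta (parameter values are illustrative)."""
    if rng is None:
        rng = np.random.default_rng()
    # Drift of the relative decision value (positive favors "left").
    if attend_left:
        mu = d * (v_left - theta * v_right)
    else:
        mu = d * (theta * v_left - v_right)
    rdv = 0.0
    for step in range(max_steps):
        rdv += mu + sigma * rng.standard_normal()
        if rdv >= bound:
            return "left", step + 1
        if rdv <= -bound:
            return "right", step + 1
    return "timeout", max_steps
```

    Even with equal stimulus values, fixing attention on one item biases choices toward it (theta < 1 discounts the unattended value), which is the attentional choice bias the paper tests.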

  2. Statistical Tutorial | Center for Cancer Research

    Cancer.gov

    Recent advances in cancer biology have resulted in the need for increased statistical analysis of research data.  ST is designed as a follow up to Statistical Analysis of Research Data (SARD) held in April 2018.  The tutorial will apply the general principles of statistical analysis of research data including descriptive statistics, z- and t-tests of means and mean differences, simple and multiple linear regression, ANOVA tests, and Chi-Squared distribution.

  3. Truth and probability in evolutionary games

    NASA Astrophysics Data System (ADS)

    Barrett, Jeffrey A.

    2017-01-01

    This paper concerns two composite Lewis-Skyrms signalling games. Each consists in a base game that evolves a language descriptive of nature and a metagame that coevolves a language descriptive of the base game and its evolving language. The first composite game shows how a pragmatic notion of truth might coevolve with a simple descriptive language. The second shows how a pragmatic notion of probability might similarly coevolve. Each of these pragmatic notions is characterised by the particular game and role that it comes to play in the game.

  4. Patients with persistent medically unexplained physical symptoms: a descriptive study from Norwegian general practice.

    PubMed

    Aamland, Aase; Malterud, Kirsti; Werner, Erik L

    2014-05-29

    Further research on effective interventions for patients with persistent Medically Unexplained Physical Symptoms (MUPS) in general practice is needed. Prevalence estimates of such patients are conflicting, and other descriptive knowledge is needed for the development and evaluation of effective future interventions. In this study, we aimed to estimate the consultation prevalence of patients with persistent MUPS in general practice, including patients' characteristics and symptom pattern, employment status and use of social benefits, and the general practitioners' (GPs) management strategy. During a four-week period the participating Norwegian GPs (n=84) registered all consultations with patients who met a strict definition of MUPS (>3 months duration and function loss), using a questionnaire with simple tick-off questions. Analyses were performed with descriptive statistics for all variables and split analysis on gender and age. The GPs registered 526 patients among their total of 17 688 consultations, giving a consultation prevalence of persistent MUPS of 3%. The mean age of patients was 46 years, and 399 (76%) were women. The most frequent group of symptoms was musculoskeletal problems, followed by asthenia/fatigue. There was no significant gender difference in symptom pattern. Almost half of the patients were currently working (45%), men significantly more often than women. The major GP management strategy was supportive counseling. A consultation prevalence rate of 3% implies that patients with persistent MUPS are common in general practice. Our study disclosed heterogeneity among the patients, such as differences in employment status, which emphasizes the importance of a personalized focus rather than unsubstantiated stereotyping of "MUPS patients" as a group.

  5. Class and Home Problems: Humidification, a True "Home" Problem for the Chemical Engineer

    ERIC Educational Resources Information Center

    Condoret, Jean-Stephane

    2012-01-01

    The problem of maintaining hygrothermal comfort in a house is addressed using the chemical engineer's toolbox. A simple dynamic model proved to give a good description of the humidification of the house in winter, using a domestic humidifier. Parameters of the model were identified from a simple experiment. Surprising results, especially…

  6. Sexual Assault Prevention and Response Climate DEOCS 4.1 Construct Validity Summary

    DTIC Science & Technology

    2017-08-01

    DEOCS, (7) examining variance and descriptive statistics, (8) examining the relationship among items/areas to reduce multicollinearity, and (9...selecting items that demonstrate the strongest scale properties. Included is a review of the 4.0 description and items, followed by the proposed...Tables 1 – 7 for the description of each measure and corresponding items. Table 1. DEOCS 4.0 Perceptions of Safety Measure Description

  7. Radiation from quantum weakly dynamical horizons in loop quantum gravity.

    PubMed

    Pranzetti, Daniele

    2012-07-06

    We provide a statistical mechanical analysis of quantum horizons near equilibrium in the grand canonical ensemble. By matching the description of the nonequilibrium phase in terms of weakly dynamical horizons with a local statistical framework, we implement loop quantum gravity dynamics near the boundary. The resulting radiation process provides a quantum gravity description of the horizon evaporation. For large black holes, the spectrum we derive presents a discrete structure which could be potentially observable.

  8. A Technology-Based Statistical Reasoning Assessment Tool in Descriptive Statistics for Secondary School Students

    ERIC Educational Resources Information Center

    Chan, Shiau Wei; Ismail, Zaleha

    2014-01-01

    The focus of assessment in statistics has gradually shifted from traditional assessment towards alternative assessment where more attention has been paid to the core statistical concepts such as center, variability, and distribution. In spite of this, there are comparatively few assessments that combine the significant three types of statistical…

  9. Understanding common statistical methods, Part I: descriptive methods, probability, and continuous data.

    PubMed

    Skinner, Carl G; Patel, Manish M; Thomas, Jerry D; Miller, Michael A

    2011-01-01

    Statistical methods are pervasive in medical research and general medical literature. Understanding general statistical concepts will enhance our ability to critically appraise the current literature and ultimately improve the delivery of patient care. This article intends to provide an overview of the common statistical methods relevant to medicine.

  10. 34 CFR 668.49 - Institutional fire safety policies and fire statistics.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 34 Education 3 2010-07-01 2010-07-01 false Institutional fire safety policies and fire statistics... fire statistics. (a) Additional definitions that apply to this section. Cause of fire: The factor or... statistics described in paragraph (c) of this section. (2) A description of each on-campus student housing...

  11. Statistical Properties of Real-Time Amplitude Estimate of Harmonics Affected by Frequency Instability

    NASA Astrophysics Data System (ADS)

    Bellan, Diego; Pignari, Sergio A.

    2016-07-01

    This work deals with the statistical characterization of real-time digital measurement of the amplitude of harmonics affected by frequency instability. In modern power systems both harmonics and frequency instability are well-known and widespread phenomena, mainly due to nonlinear loads and distributed generation, respectively. As a result, real-time monitoring of voltage/current frequency spectra is of paramount importance as far as power quality issues are addressed. Within this framework, a key point is that in many cases real-time continuous monitoring prevents the application of sophisticated algorithms to extract all the information from the digitized waveforms because of the required computational burden. In those cases only simple evaluations, such as a peak search of the discrete Fourier transform, are implemented. It is well known, however, that a slight change in waveform frequency results in a lack of sampling synchronism and uncertainty in the amplitude estimate. The impact of this phenomenon, of course, increases with the order of the harmonic to be measured. In this paper an approximate analytical approach is proposed to describe the statistical properties of the measured magnitude of harmonics affected by frequency instability. By providing a simplified description of the frequency behavior of the windows used against spectral leakage, analytical expressions for the mean value, variance, cumulative distribution function, and probability density function of the measured harmonic magnitude are derived in closed form as functions of the waveform frequency, treated as a random variable.
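
    The amplitude bias that motivates this record (loss of sampling synchronism when the waveform frequency drifts off a DFT bin) can be reproduced in a few lines. A sketch assuming a rectangular window and a unit-amplitude sinusoid; the function name and parameter values are illustrative:

```python
import numpy as np

def dft_peak_amplitude(freq_hz, fs=1000.0, n=1000):
    """Amplitude of a unit sinusoid as estimated by a DFT peak search
    (rectangular window, single-sided amplitude spectrum)."""
    t = np.arange(n) / fs
    x = np.sin(2 * np.pi * freq_hz * t)
    return float(np.abs(np.fft.rfft(x)).max() * 2.0 / n)
```

    With 1 Hz bin spacing, an on-bin tone (50 Hz) is estimated exactly, while a half-bin offset (50.5 Hz) reduces the rectangular-window peak to about 2/π ≈ 0.64 of the true amplitude; it is this kind of offset-dependent bias that the paper characterizes statistically when the frequency offset is random.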

  12. Automated detection of hospital outbreaks: A systematic review of methods.

    PubMed

    Leclère, Brice; Buckeridge, David L; Boëlle, Pierre-Yves; Astagneau, Pascal; Lepelletier, Didier

    2017-01-01

    Several automated algorithms for epidemiological surveillance in hospitals have been proposed. However, the usefulness of these methods to detect nosocomial outbreaks remains unclear. The goal of this review was to describe outbreak detection algorithms that have been tested within hospitals, consider how they were evaluated, and synthesize their results. We developed a search query using keywords associated with hospital outbreak detection and searched the MEDLINE database. To ensure the highest sensitivity, no limitations were initially imposed on publication languages and dates, although we subsequently excluded studies published before 2000. Every study that described a method to detect outbreaks within hospitals was included, without any exclusion based on study design. Additional studies were identified through citations in retrieved studies. Twenty-nine studies were included. The detection algorithms were grouped into 5 categories: simple thresholds (n = 6), statistical process control (n = 12), scan statistics (n = 6), traditional statistical models (n = 6), and data mining methods (n = 4). The evaluation of the algorithms was often solely descriptive (n = 15), but more complex epidemiological criteria were also investigated (n = 10). The performance measures varied widely between studies: e.g., the sensitivity of an algorithm in a real world setting could vary between 17 and 100%. Even if outbreak detection algorithms are useful complementary tools for traditional surveillance, the heterogeneity in results among published studies does not support quantitative synthesis of their performance. A standardized framework should be followed when evaluating outbreak detection methods to allow comparison of algorithms across studies and synthesis of results.
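
    Of the algorithm families catalogued above, statistical process control is the simplest to sketch. A minimal Shewhart-style chart on weekly case counts; the baseline length and threshold are illustrative assumptions, not drawn from any of the reviewed studies:

```python
import numpy as np

def spc_alarms(counts, baseline_weeks=52, z=3.0):
    """Flag time points whose count exceeds the Shewhart-style upper
    control limit (baseline mean + z standard deviations)."""
    counts = np.asarray(counts, dtype=float)
    base = counts[:baseline_weeks]
    ucl = base.mean() + z * base.std(ddof=1)
    return [i for i in range(baseline_weeks, len(counts)) if counts[i] > ucl]
```

    Real systems refine this in many ways (seasonality adjustment, Poisson-based limits, scan windows), which is one reason the review found such heterogeneous performance across methods.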

  13. Statistical Properties of Maximum Likelihood Estimators of Power Law Spectra Information

    NASA Technical Reports Server (NTRS)

    Howell, L. W.

    2002-01-01

    A simple power law model consisting of a single spectral index, alpha(sub 1), is believed to be an adequate description of the galactic cosmic-ray (GCR) proton flux at energies below 10(exp 13) eV, with a transition at the knee energy, E(sub k), to a steeper spectral index alpha(sub 2) greater than alpha(sub 1) above E(sub k). The maximum likelihood (ML) procedure was developed for estimating the single parameter alpha(sub 1) of a simple power law energy spectrum and generalized to estimate the three spectral parameters of the broken power law energy spectrum from simulated detector responses and real cosmic-ray data. The statistical properties of the ML estimator were investigated and shown to have three desirable properties: (P1) consistency (asymptotically unbiased), (P2) efficiency (asymptotically attains the Cramer-Rao minimum variance bound), and (P3) asymptotic normality, under a wide range of potential detector response functions. Attainment of these properties necessarily implies that the ML estimation procedure provides the best unbiased estimator possible. While simulation studies can easily determine if a given estimation procedure provides an unbiased estimate of the spectral information, and whether or not the estimator is approximately normally distributed, attainment of the Cramer-Rao bound (CRB) can only be ascertained by calculating the CRB for an assumed energy spectrum-detector response function combination, which can be quite formidable in practice. However, the effort in calculating the CRB is very worthwhile because it provides the necessary means to compare the efficiency of competing estimation techniques and, furthermore, provides a stopping rule in the search for the best unbiased estimator. Consequently, the CRBs for both the simple and broken power law energy spectra are derived herein and the conditions under which they are attained in practice are investigated.
The ML technique is then extended to estimate spectral information from an arbitrary number of astrophysics data sets produced by vastly different science instruments. This theory and its successful implementation will facilitate the interpretation of spectral information from multiple astrophysics missions and thereby permit the derivation of superior spectral parameter estimates based on the combination of data sets.
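
    For the simple (unbroken) power law, the ML estimator referred to above has a well-known closed form in the continuous case. A sketch only; the Pareto sampling in the test is illustrative and does not reflect the detector-response setting of the paper:

```python
import numpy as np

def ml_spectral_index(x, xmin):
    """ML estimate of alpha for p(x) proportional to x**(-alpha), x >= xmin
    (continuous case): alpha_hat = 1 + n / sum(ln(x_i / xmin))."""
    x = np.asarray(x, dtype=float)
    x = x[x >= xmin]
    return 1.0 + x.size / float(np.sum(np.log(x / xmin)))
```

    This closed-form estimator is consistent and asymptotically efficient under standard regularity conditions, in line with properties (P1)-(P3) discussed in the record for the more general detector-response setting.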

  14. On the use of Lagrangian variables in descriptions of unsteady boundary-layer separation

    NASA Technical Reports Server (NTRS)

    Cowley, Stephen J.; Vandommelen, Leon L.; Lam, Shui T.

    1990-01-01

    The Lagrangian description of unsteady boundary layer separation is reviewed from both analytical and numerical perspectives. It is explained in simple terms how particle distortion gives rise to unsteady separation, and why a theory centered on Lagrangian coordinates provides the clearest description of this phenomenon. Some of the more recent results for unsteady three dimensional compressible separation are included. The different forms of separation that can arise from symmetries are emphasized. A possible description of separation is also included when the detaching vorticity layer exits the classical boundary layer region, but still remains much closer to the surface than a typical body-lengthscale.

  15. FAST COGNITIVE AND TASK ORIENTED, ITERATIVE DATA DISPLAY (FACTOID)

    DTIC Science & Technology

    2017-06-01

    approaches. As a result, the following assumptions guided our efforts in developing modeling and descriptive metrics for evaluation purposes...Application Evaluation. Our analytic workflow for evaluation is to first provide descriptive statistics about applications across metrics (performance...distributions for evaluation purposes because the goal of evaluation is accurate description, not inference (e.g., prediction). Outliers depicted

  16. Mathematical and Statistical Software Index. Final Report.

    ERIC Educational Resources Information Center

    Black, Doris E., Comp.

    Brief descriptions are provided of general-purpose mathematical and statistical software, including 27 "stand-alone" programs, three subroutine systems, and two nationally recognized statistical packages, which are available in the Air Force Human Resources Laboratory (AFHRL) software library. This index was created to enable researchers…

  17. Education Statistics Quarterly, Spring 2001.

    ERIC Educational Resources Information Center

    Education Statistics Quarterly, 2001

    2001-01-01

    The "Education Statistics Quarterly" gives a comprehensive overview of work done across all parts of the National Center for Education Statistics (NCES). Each issue contains short publications, summaries, and descriptions that cover all NCES publications, data products and funding opportunities developed over a 3-month period. Each issue…

  18. The attentional drift-diffusion model extends to simple purchasing decisions.

    PubMed

    Krajbich, Ian; Lu, Dingchao; Camerer, Colin; Rangel, Antonio

    2012-01-01

    How do we make simple purchasing decisions (e.g., whether or not to buy a product at a given price)? Previous work has shown that the attentional drift-diffusion model (aDDM) can provide accurate quantitative descriptions of the psychometric data for binary and trinary value-based choices, and of how the choice process is guided by visual attention. Here we extend the aDDM to the case of purchasing decisions, and test it using an eye-tracking experiment. We find that the model also provides a reasonably accurate quantitative description of the relationship between choice, reaction time, and visual fixations using parameters that are very similar to those that best fit the previous data. The only critical difference is that the choice biases induced by the fixations are about half as big in purchasing decisions as in binary choices. This suggests that a similar computational process is used to make binary choices, trinary choices, and simple purchasing decisions.

  19. The Attentional Drift-Diffusion Model Extends to Simple Purchasing Decisions

    PubMed Central

    Krajbich, Ian; Lu, Dingchao; Camerer, Colin; Rangel, Antonio

    2012-01-01

    How do we make simple purchasing decisions (e.g., whether or not to buy a product at a given price)? Previous work has shown that the attentional drift-diffusion model (aDDM) can provide accurate quantitative descriptions of the psychometric data for binary and trinary value-based choices, and of how the choice process is guided by visual attention. Here we extend the aDDM to the case of purchasing decisions, and test it using an eye-tracking experiment. We find that the model also provides a reasonably accurate quantitative description of the relationship between choice, reaction time, and visual fixations using parameters that are very similar to those that best fit the previous data. The only critical difference is that the choice biases induced by the fixations are about half as big in purchasing decisions as in binary choices. This suggests that a similar computational process is used to make binary choices, trinary choices, and simple purchasing decisions. PMID:22707945

  20. Projectiles, pendula, and special relativity

    NASA Astrophysics Data System (ADS)

    Price, Richard H.

    2005-05-01

    The kind of flat-earth gravity used in introductory physics appears in an accelerated reference system in special relativity. From this viewpoint, we work out the special relativistic description of a ballistic projectile and a simple pendulum, two examples of simple motion driven by earth-surface gravity. The analysis uses only the basic mathematical tools of special relativity typical of a first-year university course.

  1. Radar derived spatial statistics of summer rain. Volume 1: Experiment description

    NASA Technical Reports Server (NTRS)

    Katz, I.; Arnold, A.; Goldhirsh, J.; Konrad, T. G.; Vann, W. L.; Dobson, E. B.; Rowland, J. R.

    1975-01-01

    An experiment was performed at Wallops Island, Virginia, to obtain a statistical description of summer rainstorms. Its purpose was to obtain information needed for design of earth and space communications systems in which precipitation in the earth's atmosphere scatters or attenuates the radio signal. Rainstorms were monitored with the high resolution SPANDAR radar and the 3-dimensional structures of the storms were recorded on digital tape. The equipment, the experiment, and tabulated data obtained during the experiment are described.

  2. [Bayesian statistics in medicine -- part II: main applications and inference].

    PubMed

    Montomoli, C; Nichelatti, M

    2008-01-01

    Bayesian statistics is not only used when one is dealing with 2-way tables; it can also be used for inferential purposes. Using the basic concepts presented in the first part, this paper aims to give a simple overview of Bayesian methods by introducing their foundation (Bayes' theorem) and then applying this rule to a very simple practical example; whenever possible, the elementary steps of the analysis are compared to those of frequentist (classical) statistical analysis. Bayesian reasoning is naturally connected to medical activity, since it appears to be quite similar to a diagnostic process.
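
    The diagnostic analogy mentioned above reduces to a single application of Bayes' theorem. A minimal sketch with illustrative test characteristics (the numbers are assumptions, not from the paper):

```python
def posterior_prob(prior, sensitivity, specificity):
    """P(disease | positive test) by Bayes' theorem:
    posterior = P(+|D) P(D) / [P(+|D) P(D) + P(+|not D) P(not D)]."""
    true_pos = sensitivity * prior
    false_pos = (1.0 - specificity) * (1.0 - prior)
    return true_pos / (true_pos + false_pos)

# A test with 90% sensitivity and 95% specificity at 1% prevalence:
ppv = posterior_prob(prior=0.01, sensitivity=0.90, specificity=0.95)
```

    Even a fairly accurate test yields a posterior of only about 15% at this low prevalence, the classic illustration of why the prior matters in the diagnostic process.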

  3. Deriving Vegetation Dynamics of Natural Terrestrial Ecosystems from MODIS NDVI/EVI Data over Turkey.

    PubMed

    Evrendilek, Fatih; Gulbeyaz, Onder

    2008-09-01

    The 16-day composite MODIS vegetation indices (VIs) at 500-m resolution for the period between 2000 and 2007 were seasonally averaged on the basis of the estimated distribution of 16 potential natural terrestrial ecosystems (NTEs) across Turkey. Graphical and statistical analyses of the time-series VIs for the NTEs spatially disaggregated in terms of biogeoclimate zones and land cover types included descriptive statistics, correlations, discrete Fourier transform (DFT), time-series decomposition, and simple linear regression (SLR) models. Our spatio-temporal analyses revealed that both MODIS VIs, on average, depicted similar seasonal variations for the NTEs, with the NDVI values having higher mean and SD values. The seasonal VIs were most correlated in decreasing order for: barren/sparsely vegetated land > grassland > shrubland/woodland > forest; (sub)nival > warm temperate > alpine > cool temperate > boreal = Mediterranean; and summer > spring > autumn > winter. The most pronounced differences between the MODIS VI responses over Turkey occurred in boreal and Mediterranean climate zones and forests, and in winter (the senescence phase of the growing season). Our results showed the potential of the time-series MODIS VI datasets for estimating and monitoring seasonal and interannual ecosystem dynamics over Turkey, a capability that needs to be further improved and refined through systematic and extensive field measurements and validations across various biomes.

  4. Distinguishing Man from Molecules: The Distinctiveness of Medical Concepts at Different Levels of Description

    PubMed Central

    Cole, William G.; Michael, Patricia; Blois, Marsden S.

    1987-01-01

    A computer program was created to use information about the statistical distribution of words in journal abstracts to make probabilistic judgments about the level of description (e.g. molecular, cell, organ) of medical text. Statistical analysis of 7,409 journal abstracts taken from three medical journals representing distinct levels of description revealed that many medical words seem to be highly specific to one or another level of description. For example, the word adrenoreceptors occurred only in the American Journal of Physiology, never in the Journal of Biological Chemistry or the Journal of the American Medical Association. Such highly specific words occurred so frequently that the automatic classification program was able to classify correctly 45 out of 45 test abstracts, with 100% confidence. These findings are interpreted in terms of both a theory of the structure of medical knowledge and the pragmatics of automatic classification.

  5. 78 FR 34101 - Agency Information Collection Activities: Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-06

    ... and basic descriptive statistics on the quantity and type of consumer-reported patient safety events... conduct correlations, cross tabulations of responses and other statistical analysis. Estimated Annual...

  6. Research design and statistical methods in Pakistan Journal of Medical Sciences (PJMS).

    PubMed

    Akhtar, Sohail; Shah, Syed Wadood Ali; Rafiq, M; Khan, Ajmal

    2016-01-01

    This article compares the study designs and statistical methods used in the 2005, 2010 and 2015 volumes of the Pakistan Journal of Medical Sciences (PJMS). Only original articles of PJMS were considered for the analysis. The articles were carefully reviewed for statistical methods and designs, and then recorded accordingly. The frequency of each statistical method and research design was estimated and compared with previous years. A total of 429 articles were evaluated (n=74 in 2005, n=179 in 2010, n=176 in 2015), of which 171 (40%) were cross-sectional and 116 (27%) were prospective study designs. A variety of statistical methods were found in the analysis. The most frequent methods included descriptive statistics (n=315, 73.4%), chi-square/Fisher's exact tests (n=205, 47.8%) and Student's t-test (n=186, 43.4%). There was a significant increase over the time period in the use of several statistical methods: t-test, chi-square/Fisher's exact test, logistic regression, epidemiological statistics, and non-parametric tests. This study shows that a diverse variety of statistical methods have been used in the research articles of PJMS and that their frequency increased from 2005 to 2015. However, descriptive statistics was the most frequent method of statistical analysis in the published articles, while the cross-sectional study design was the most common design.

  7. Statistics without Tears: Complex Statistics with Simple Arithmetic

    ERIC Educational Resources Information Center

    Smith, Brian

    2011-01-01

    One of the often overlooked aspects of modern statistics is the analysis of time series data. Modern introductory statistics courses tend to rush to probabilistic applications involving risk and confidence. Rarely does the first level course linger on such useful and fascinating topics as time series decomposition, with its practical applications…

  8. Applied statistics in ecology: common pitfalls and simple solutions

    Treesearch

    E. Ashley Steel; Maureen C. Kennedy; Patrick G. Cunningham; John S. Stanovick

    2013-01-01

    The most common statistical pitfalls in ecological research are those associated with data exploration, the logic of sampling and design, and the interpretation of statistical results. Although one can find published errors in calculations, the majority of statistical pitfalls result from incorrect logic or interpretation despite correct numerical calculations. There...

  9. The Statistics of wood assays for preservative retention

    Treesearch

    Patricia K. Lebow; Scott W. Conklin

    2011-01-01

    This paper covers general statistical concepts that apply to interpreting wood assay retention values. In particular, since wood assays are typically obtained from a single composited sample, the statistical aspects, including advantages and disadvantages, of simple compositing are covered.

  10. 2012 aerospace medical certification statistical handbook.

    DOT National Transportation Integrated Search

    2013-12-01

    The annual Aerospace Medical Certification Statistical Handbook reports descriptive characteristics of all active U.S. civil aviation airmen and the aviation medical examiners (AMEs) that perform the required medical examinations. The 2012 annual...

  11. Teaching Statistics Using Classic Psychology Research: An Activities-Based Approach

    ERIC Educational Resources Information Center

    Holmes, Karen Y.; Dodd, Brett A.

    2012-01-01

    In this article, we discuss a collection of active learning activities derived from classic psychology studies that illustrate the appropriate use of descriptive and inferential statistics. (Contains 2 tables.)

  12. Introduction

    USDA-ARS?s Scientific Manuscript database

    The introduction to the second edition of the Compendium of Apple and Pear Diseases contains a general description of genus and species of commercial importance, some general information about growth and fruiting habits as well as recent production statistics. A general description of major scion c...

  13. Estimating trends in the global mean temperature record

    NASA Astrophysics Data System (ADS)

    Poppick, Andrew; Moyer, Elisabeth J.; Stein, Michael L.

    2017-06-01

    Given uncertainties in physical theory and numerical climate simulations, the historical temperature record is often used as a source of empirical information about climate change. Many historical trend analyses appear to de-emphasize physical and statistical assumptions: examples include regression models that treat time rather than radiative forcing as the relevant covariate, and time series methods that account for internal variability in nonparametric rather than parametric ways. However, given a limited data record and the presence of internal variability, estimating radiatively forced temperature trends in the historical record necessarily requires some assumptions. Ostensibly empirical methods can also involve an inherent conflict in assumptions: they require data records that are short enough for naive trend models to be applicable, but long enough for long-timescale internal variability to be accounted for. In the context of global mean temperatures, empirical methods that appear to de-emphasize assumptions can therefore produce misleading inferences, because the trend over the twentieth century is complex and the scale of temporal correlation is long relative to the length of the data record. We illustrate here how a simple but physically motivated trend model can provide better-fitting and more broadly applicable trend estimates and can allow for a wider array of questions to be addressed. In particular, the model allows one to distinguish, within a single statistical framework, between uncertainties in the shorter-term vs. longer-term response to radiative forcing, with implications not only on historical trends but also on uncertainties in future projections. We also investigate the consequence on inferred uncertainties of the choice of a statistical description of internal variability. 
While nonparametric methods may seem to avoid making explicit assumptions, we demonstrate how even misspecified parametric statistical methods, if attuned to the important characteristics of internal variability, can result in more accurate uncertainty statements about trends.
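
    The point about statistical descriptions of internal variability can be illustrated with the simplest case: an OLS trend whose naive standard error ignores serial correlation, versus one inflated for AR(1) residuals. This is a sketch under assumed parameter values; the inflation factor is the standard large-sample AR(1) correction, not the authors' trend model:

```python
import numpy as np

def trend_with_ar1_se(y):
    """OLS trend slope with a naive SE and an SE inflated for AR(1)
    residuals. The factor sqrt((1 + rho) / (1 - rho)) is the standard
    large-sample correction for lag-1 autocorrelation rho."""
    y = np.asarray(y, dtype=float)
    n = y.size
    X = np.column_stack([np.ones(n), np.arange(n, dtype=float)])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    s2 = resid @ resid / (n - 2)
    cov = s2 * np.linalg.inv(X.T @ X)
    se_naive = float(np.sqrt(cov[1, 1]))
    rho = float(np.corrcoef(resid[:-1], resid[1:])[0, 1])
    se_adj = se_naive * np.sqrt((1 + rho) / (1 - rho))
    return float(beta[1]), se_naive, se_adj
```

    With lag-1 autocorrelation around 0.7, the adjusted standard error is roughly 2.4 times the naive one, so an ostensibly "assumption-free" OLS fit would substantially overstate the precision of the estimated trend.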

  14. Virtual learning object and environment: a concept analysis.

    PubMed

    Salvador, Pétala Tuani Candido de Oliveira; Bezerril, Manacés Dos Santos; Mariz, Camila Maria Santos; Fernandes, Maria Isabel Domingues; Martins, José Carlos Amado; Santos, Viviane Euzébia Pereira

    2017-01-01

    To analyze the concept of virtual learning object and environment according to Rodgers' evolutionary perspective. Descriptive study with a mixed approach, based on the stages proposed by Rodgers in his concept analysis method. Data collection occurred in August 2015 with the search of dissertations and theses in the Bank of Theses of the Coordination for the Improvement of Higher Education Personnel. Quantitative data were analyzed based on simple descriptive statistics and the concepts through lexicographic analysis with support of the IRAMUTEQ software. The sample was made up of 161 studies. The concept of "virtual learning environment" was presented in 99 (61.5%) studies, whereas the concept of "virtual learning object" was presented in only 15 (9.3%) studies. A virtual learning environment includes several and different types of virtual learning objects in a common pedagogical context.

  15. Prediction of drug transport processes using simple parameters and PLS statistics. The use of ACD/logP and ACD/ChemSketch descriptors.

    PubMed

    Osterberg, T; Norinder, U

    2001-01-01

    A method of modelling and predicting biopharmaceutical properties using simple theoretically computed molecular descriptors and multivariate statistics has been investigated for several data sets related to solubility, IAM chromatography, permeability across Caco-2 cell monolayers, human intestinal perfusion, brain-blood partitioning, and P-glycoprotein ATPase activity. The molecular descriptors (e.g. molar refractivity, molar volume, index of refraction, surface tension and density) and logP were computed with ACD/ChemSketch and ACD/logP, respectively. Good statistical models were derived that permit simple computational prediction of biopharmaceutical properties. All final models derived had R(2) values ranging from 0.73 to 0.95 and Q(2) values ranging from 0.69 to 0.86. The RMSEP values for the external test sets ranged from 0.24 to 0.85 (log scale).

  16. Classical Mathematical Models for Description and Prediction of Experimental Tumor Growth

    PubMed Central

    Benzekry, Sébastien; Lamont, Clare; Beheshti, Afshin; Tracz, Amanda; Ebos, John M. L.; Hlatky, Lynn; Hahnfeldt, Philip

    2014-01-01

    Despite internal complexity, tumor growth kinetics follow relatively simple laws that can be expressed as mathematical models. To explore this further, a quantitative analysis of the most classical of these was performed. The models were assessed against data from two in vivo experimental systems: an ectopic syngeneic tumor (Lewis lung carcinoma) and an orthotopically xenografted human breast carcinoma. The goals were threefold: 1) to determine a statistical model for description of the measurement error, 2) to establish the descriptive power of each model, using several goodness-of-fit metrics and a study of parametric identifiability, and 3) to assess the models' ability to forecast future tumor growth. The models included in the study comprised the exponential, exponential-linear, power law, Gompertz, logistic, generalized logistic, von Bertalanffy and a model with dynamic carrying capacity. For the breast data, the dynamics were best captured by the Gompertz and exponential-linear models. The latter also exhibited the highest predictive power, with excellent prediction scores (≥80%) extending out as far as 12 days in the future. For the lung data, the Gompertz and power law models provided the most parsimonious and parametrically identifiable description. However, not one of the models was able to achieve a substantial prediction rate (≥70%) beyond the next day data point. In this context, adjunction of a priori information on the parameter distribution led to considerable improvement. For instance, forecast success rates went from 14.9% to 62.7% when using the power law model to predict the full future tumor growth curves, using just three data points. These results not only have important implications for biological theories of tumor growth and the use of mathematical modeling in preclinical anti-cancer drug investigations, but also may assist in defining how mathematical models could serve as potential prognostic tools in the clinic. PMID:25167199

  17. Classical mathematical models for description and prediction of experimental tumor growth.

    PubMed

    Benzekry, Sébastien; Lamont, Clare; Beheshti, Afshin; Tracz, Amanda; Ebos, John M L; Hlatky, Lynn; Hahnfeldt, Philip

    2014-08-01

    Despite internal complexity, tumor growth kinetics follow relatively simple laws that can be expressed as mathematical models. To explore this further, a quantitative analysis of the most classical of these was performed. The models were assessed against data from two in vivo experimental systems: an ectopic syngeneic tumor (Lewis lung carcinoma) and an orthotopically xenografted human breast carcinoma. The goals were threefold: 1) to determine a statistical model for description of the measurement error, 2) to establish the descriptive power of each model, using several goodness-of-fit metrics and a study of parametric identifiability, and 3) to assess the models' ability to forecast future tumor growth. The models included in the study comprised the exponential, exponential-linear, power law, Gompertz, logistic, generalized logistic, von Bertalanffy and a model with dynamic carrying capacity. For the breast data, the dynamics were best captured by the Gompertz and exponential-linear models. The latter also exhibited the highest predictive power, with excellent prediction scores (≥80%) extending out as far as 12 days in the future. For the lung data, the Gompertz and power law models provided the most parsimonious and parametrically identifiable description. However, not one of the models was able to achieve a substantial prediction rate (≥70%) beyond the next day data point. In this context, adjunction of a priori information on the parameter distribution led to considerable improvement. For instance, forecast success rates went from 14.9% to 62.7% when using the power law model to predict the full future tumor growth curves, using just three data points. These results not only have important implications for biological theories of tumor growth and the use of mathematical modeling in preclinical anti-cancer drug investigations, but also may assist in defining how mathematical models could serve as potential prognostic tools in the clinic.
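The kind of model comparison described above can be sketched in a few lines of stdlib Python. This is only an illustration under invented parameters and noise-free synthetic data, using a crude grid search rather than the likelihood-based fitting of the paper; it shows why an exponential model cannot capture Gompertz-type growth deceleration:

```python
import math

def gompertz(t, v0, a, b):
    # One common Gompertz parametrization: V(t) = V0 * exp((a/b)(1 - e^(-bt)))
    return v0 * math.exp((a / b) * (1.0 - math.exp(-b * t)))

# Synthetic tumor-volume data from a known Gompertz curve (illustrative values).
v0, a, b = 1.0, 0.8, 0.15
ts = list(range(0, 21, 2))
data = [gompertz(t, v0, a, b) for t in ts]

def sse(pred):
    # Goodness of fit on the log scale, as is common for growth data.
    return sum((math.log(p) - math.log(d)) ** 2 for p, d in zip(pred, data))

# Exponential model V0 * exp(r t), fitted by log-linear least squares.
logs = [math.log(d) for d in data]
tbar = sum(ts) / len(ts)
lbar = sum(logs) / len(logs)
r = sum((t - tbar) * (l - lbar) for t, l in zip(ts, logs)) / \
    sum((t - tbar) ** 2 for t in ts)
c = lbar - r * tbar
sse_exp = sse([math.exp(c + r * t) for t in ts])

# Gompertz model fitted by a coarse grid search over (a, b).
best = None
for ai in [0.2 + 0.05 * i for i in range(25)]:
    for bi in [0.05 + 0.01 * j for j in range(26)]:
        s = sse([gompertz(t, v0, ai, bi) for t in ts])
        if best is None or s < best[0]:
            best = (s, ai, bi)
sse_gom = best[0]
print(sse_exp, sse_gom)
```

Because the data decelerate, the log-volume curve is concave and the straight-line (exponential) fit leaves large residuals, while the Gompertz family fits essentially exactly.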

  18. Linear models for calculating digestible energy for sheep diets.

    PubMed

    Fonnesbeck, P V; Christiansen, M L; Harris, L E

    1981-05-01

    Equations for estimating the digestible energy (DE) content of sheep diets were generated from the chemical contents and a factorial description of diets fed to lambs in digestion trials. The diet factors were two forages (alfalfa and grass hay), harvested at three stages of maturity (late vegetative, early bloom and full bloom), fed in two ingredient combinations (all hay or a 50:50 hay and corn grain mixture) and prepared by two forage texture processes (coarsely chopped or finely chopped and pelleted). The 2 x 3 x 2 x 2 factorial arrangement produced 24 diet treatments. These were replicated twice, for a total of 48 lamb digestion trials. In model 1 regression equations, DE was calculated directly from chemical composition of the diet. In model 2, regression equations predicted the percentage of digested nutrient from the chemical contents of the diet and then DE of the diet was calculated as the sum of the gross energy of the digested organic components. Expanded forms of model 1 and model 2 were also developed that included diet factors as qualitative indicator variables to adjust the regression constant and regression coefficients for the diet description. The expanded forms of the equations accounted for significantly more variation in DE than did the simple models and more accurately estimated DE of the diet. Information provided by the diet description proved as useful as chemical analyses for the prediction of digestibility of nutrients. The statistics indicate that, with model 1, neutral detergent fiber and plant cell wall analyses provided as much information for the estimation of DE as did model 2 with the combined information from crude protein, available carbohydrate, total lipid, cellulose and hemicellulose. Regression equations are presented for estimating DE with the most currently analyzed organic components, including linear and curvilinear variables and diet factors that significantly reduce the standard error of the estimate. 
To estimate the DE of a diet, the user selects the equation that makes the most effective use of the available chemical analysis and diet description information.

  19. 2011 aerospace medical certification statistical handbook.

    DOT National Transportation Integrated Search

    2013-01-01

    The annual Aerospace Medical Certification Statistical Handbook reports descriptive characteristics of all active U.S. civil aviation airmen and the aviation medical examiners (AMEs) that perform the required medical examinations. The 2011 annual han...

  20. Large truck crash facts 2005

    DOT National Transportation Integrated Search

    2007-02-01

    This annual edition of Large Truck Crash Facts contains descriptive statistics about fatal, injury, and property damage only crashes involving large trucks in 2005. Selected crash statistics on passenger vehicles are also presented for comparison pur...

  1. Methods in pharmacoepidemiology: a review of statistical analyses and data reporting in pediatric drug utilization studies.

    PubMed

    Sequi, Marco; Campi, Rita; Clavenna, Antonio; Bonati, Maurizio

    2013-03-01

    To evaluate the quality of data reporting and statistical methods performed in drug utilization studies in the pediatric population. Drug utilization studies evaluating all drug prescriptions to children and adolescents published between January 1994 and December 2011 were retrieved and analyzed. For each study, information on measures of exposure/consumption, the covariates considered, descriptive and inferential analyses, statistical tests, and methods of data reporting was extracted. An overall quality score was created for each study using a 12-item checklist that took into account the presence of outcome measures, covariates of measures, descriptive measures, statistical tests, and graphical representation. A total of 22 studies were reviewed and analyzed. Of these, 20 studies reported at least one descriptive measure. The mean was the most commonly used measure (18 studies), but only five of these also reported the standard deviation. Statistical analyses were performed in 12 studies, with the chi-square test being the most commonly performed test. Graphs were presented in 14 papers. Sixteen papers reported the number of drug prescriptions and/or packages, and ten reported the prevalence of the drug prescription. The mean quality score was 8 (median 9). Only seven of the 22 studies received a score of ≥10, while four studies received a score of <6. Our findings document that only a few of the studies reviewed applied statistical methods and reported data in a satisfactory manner. We therefore conclude that the methodology of drug utilization studies needs to be improved.
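The review's point that a mean is of limited use without its dispersion is easy to illustrate. A minimal stdlib-Python sketch with hypothetical monthly prescription counts (not data from any reviewed study):

```python
import statistics

# Hypothetical monthly prescription counts for one drug (illustrative only).
counts = [12, 15, 9, 22, 18, 14, 30, 11, 16, 13]

mean = statistics.mean(counts)
sd = statistics.stdev(counts)      # sample standard deviation
median = statistics.median(counts)
print(f"mean={mean:.1f}, sd={sd:.1f}, median={median}")
```

Here the standard deviation (about 6.1) is a large fraction of the mean (16.0), and the median (14.5) sits below the mean because one month is an outlier; reporting the mean alone would hide both facts.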

  2. Antecedents to Organizational Performance: Theoretical and Practical Implications for Aircraft Maintenance Officer Force Development

    DTIC Science & Technology

    2015-03-26

    to my reader, Lieutenant Colonel Robert Overstreet, for helping solidify my research, coaching me through the statistical analysis, and positive... Descriptive Statistics... common-method bias requires careful assessment of potential sources of bias and implementing procedural and statistical control methods. Podsakoff

  3. Using Facebook Data to Turn Introductory Statistics Students into Consultants

    ERIC Educational Resources Information Center

    Childers, Adam F.

    2017-01-01

    Facebook provides businesses and organizations with copious data that describe how users are interacting with their page. This data affords an excellent opportunity to turn introductory statistics students into consultants to analyze the Facebook data using descriptive and inferential statistics. This paper details a semester-long project that…

  4. ALISE Library and Information Science Education Statistical Report, 1999.

    ERIC Educational Resources Information Center

    Daniel, Evelyn H., Ed.; Saye, Jerry D., Ed.

    This volume is the twentieth annual statistical report on library and information science (LIS) education published by the Association for Library and Information Science Education (ALISE). Its purpose is to compile, analyze, interpret, and report statistical (and other descriptive) information about library/information science programs offered by…

  5. Anxiety (Low Ego Strength) and Intelligence among Students of High School Mathematics

    NASA Astrophysics Data System (ADS)

    Naderi, Habibollah

    2008-01-01

    The aim of this study was to investigate the relationship between anxiety (low ego strength) and intelligence among mathematics students. The effects of anxiety were studied in a sample of 112 boys: 56 regular students (RS) and 56 intelligent students (IS) from high schools. For the RS, drawn from 3 regular high school mathematics classes, the mean age was 17.1 years (SD = .454, range 16-18 years). For the IS, drawn from 4 classes for exceptionally talented high school mathematics students, the mean age was 16.75 years (SD = .436, range 16-17 years). Simple randomization was used for sampling. Both descriptive and inferential analyses were performed: means, variance, and analysis of covariance for description, and t-tests between the two groups of students for inference. The Cattell Anxiety Test (1958) (CTAT), which has been used in a number of studies in Iran, was used to measure trait anxiety. In general, no statistically significant difference was found between the RS and the IS on the low ego strength factor (C-). Further research is needed to investigate whether the current findings hold for student populations using other anxiety tests.

  6. Injection safety practices in a main referral hospital in Northeastern Nigeria.

    PubMed

    Gadzama, G B; Bawa, S B; Ajinoma, Z; Saidu, M M; Umar, A S

    2014-01-01

    Non-adherence to safe injection policies remains a major challenge; worldwide, it leads annually to 21 million new hepatitis B cases and 260,000 HIV infection cases. This descriptive observational survey was conducted to determine the level of adherence to universal precautions for safe injection practices in the hospital. The study units were selected using simple random sampling of injection service providers/phlebotomists in 27 units/wards of the hospital. The study instruments were an observation checklist and interviewer-administered questionnaires. EPI Info (version 3.5.2) software was used for data entry, and descriptive statistics were generated with units/wards as the units of analysis, covering injection safety practices of health workers, availability of logistics and supplies, and disposal methods. Only 33.3% of the units (95% CI, 16-54) had non-sharps infectious healthcare waste of any type inside containers specific for non-sharps infectious waste, and 17 (77.3%) of the observed therapeutic injections were prepared on a clean, dedicated table or tray, where contamination of the equipment with blood, body fluids, or dirty swabs was unlikely. Absence of recapping of needles was observed in 11 (50.0%) units giving therapeutic injections. Only 7.4% of units surveyed had separate waste containers for infectious non-sharps. This study depicts poor knowledge and practice of injection safety, inadequate injection safety supplies, and non-compliance with injection safety policy and guidelines.

  7. On the role of fluctuations in the modeling of complex systems.

    NASA Astrophysics Data System (ADS)

    Droz, Michel; Pekalski, Andrzej

    2016-09-01

    The study of models is ubiquitous in sciences like physics, chemistry, ecology, biology or sociology. Models are used to explain experimental facts or to make new predictions. For any system, one can distinguish several levels of description. In the simplest mean-field-like description the dynamics is described in terms of spatially averaged quantities, while in a microscopic approach local properties are taken into account and local fluctuations of the relevant variables are present. The properties predicted by these two different approaches may be drastically different. In a large body of research literature concerning complex systems this problem is often overlooked, and simple mean-field-like approximations are used without asking whether the corresponding predictions are robust. The goal of this paper is twofold: first, to illustrate the importance of fluctuations in a self-contained and pedagogical way, by revisiting two different classes of problems where thorough investigations have been conducted (equilibrium and non-equilibrium statistical physics); second, to present our original research on the dynamics of populations of annual plants which compete among themselves for a single resource (water) through a stochastic dynamics. Depending on the observable considered, the mean-field-like and microscopic approaches agree or totally disagree. There is no general criterion allowing one to decide a priori when the two approaches will agree.

  8. Developing stochastic model of thrust and flight dynamics for small UAVs

    NASA Astrophysics Data System (ADS)

    Tjhai, Chandra

    This thesis presents a stochastic thrust model and aerodynamic model for small propeller-driven UAVs whose power plant is a small electric motor. First, a model is developed which relates the thrust generated by a small propeller-driven electric motor to throttle setting and commanded engine RPM. A perturbation of this model is then used to relate the uncertainty in commanded throttle and engine RPM to the error in the predicted thrust. Such a stochastic model is indispensable in the design of state estimation and control systems for UAVs, where the performance requirements of the systems are specified in stochastic terms. It is shown that thrust prediction models for small UAVs are not simple, explicit functions relating throttle input and RPM command to thrust generated. Rather, they are non-linear, iterative procedures which depend on a geometric description of the propeller and a mathematical model of the motor. A detailed derivation of the iterative procedure is presented, and the impact of errors which arise from inaccurate propeller and motor descriptions is discussed. Validation results from a series of wind tunnel tests are presented. The results show a favorable statistical agreement between the thrust uncertainty predicted by the model and the errors measured in the wind tunnel. The uncertainty model of aircraft aerodynamic coefficients, developed on the basis of wind tunnel experiments, is discussed at the end of this thesis.

  9. 50 CFR Figure 1 to Part 679 - Bering Sea and Aleutian Islands Statistical and Reporting Areas

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Statistical and Reporting Areas 1 Figure 1 to Part 679 Wildlife and Fisheries FISHERY CONSERVATION AND... Islands Statistical and Reporting Areas ER15NO99.000 b. Coordinates Code Description 300 Russian waters... statistical area is the part of a reporting area contained in the EEZ. [64 FR 61983, Nov. 15, 1999; 65 FR...

  10. 50 CFR Figure 1 to Part 679 - Bering Sea and Aleutian Islands Statistical and Reporting Areas

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Statistical and Reporting Areas 1 Figure 1 to Part 679 Wildlife and Fisheries FISHERY CONSERVATION AND... Islands Statistical and Reporting Areas ER15NO99.000 b. Coordinates Code Description 300 Russian waters... statistical area is the part of a reporting area contained in the EEZ. [64 FR 61983, Nov. 15, 1999; 65 FR...

  11. Simple Benchmark Specifications for Space Radiation Protection

    NASA Technical Reports Server (NTRS)

    Singleterry, Robert C. Jr.; Aghara, Sukesh K.

    2013-01-01

    This report defines space radiation benchmark specifications. The specification starts with simple, monoenergetic, mono-directional particles on slabs and progresses to human models in spacecraft. This report specifies the models and sources needed, as well as what the team performing the benchmark needs to produce in a report. Also included are brief descriptions of how OLTARIS, the NASA Langley website for space radiation analysis, performs its analysis.

  12. Designing and maintaining an effective chargemaster.

    PubMed

    Abbey, D C

    2001-03-01

    The chargemaster is the central repository of charges and associated coding information used to develop claims. But this simple description belies the chargemaster's true complexity. The chargemaster's role in the coding process differs from department to department, and not all codes provided on a claim form are necessarily included in the chargemaster, as codes for complex services may need to be developed and reviewed by coding staff. In addition, with the rise of managed care, the chargemaster increasingly is being used to track utilization of supplies and services. To ensure that the chargemaster performs all of its functions effectively, hospitals should appoint a chargemaster coordinator, supported by a chargemaster review team, to oversee the design and maintenance of the chargemaster. Important design issues that should be considered include the principle of "form follows function," static versus dynamic coding, how modifiers should be treated, how charges should be developed, how to incorporate physician fee schedules into the chargemaster, the interface between the chargemaster and cost reports, and how to include statistical information for tracking utilization.

  13. Exploiting Fast-Variables to Understand Population Dynamics and Evolution

    NASA Astrophysics Data System (ADS)

    Constable, George W. A.; McKane, Alan J.

    2018-07-01

    We describe a continuous-time modelling framework for biological population dynamics that accounts for demographic noise. In the spirit of the methodology used by statistical physicists, transitions between the states of the system are caused by individual events while the dynamics are described in terms of the time-evolution of a probability density function. In general, the application of the diffusion approximation still leaves a description that is quite complex. However, in many biological applications one or more of the processes happen slowly relative to the system's other processes, and the dynamics can be approximated as occurring within a slow low-dimensional subspace. We review these time-scale separation arguments and analyse the more simple stochastic dynamics that result in a number of cases. We stress that it is important to retain the demographic noise derived in this way, and emphasise this point by showing that it can alter the direction of selection compared to the prediction made from an analysis of the corresponding deterministic model.

  14. Sources of drinking water in a pediatric population.

    PubMed

    Jadav, Urvi G; Acharya, Bhavini S; Velasquez, Gisela M; Vance, Bradley J; Tate, Robert H; Quock, Ryan L

    2014-01-01

    The purpose of this study was to determine the primary sources of water used for consumption and cooking by the patients of a university-based pediatric dental practice. A simple, prewritten questionnaire-consisting of seven questions and available in English and Spanish-was conducted verbally with the caregivers of 123 pediatric patients during a designated timeframe. Analysis of responses included descriptive statistics and a chi-square test for a single proportion. Nonfiltered tap water accounted for the primary drinking water source in only 10 percent of the respondents. Fifty-two percent of the respondents selected bottled water as the primary source of drinking water, and 24 percent selected vended water stations as a primary drinking water source. Nonfiltered tap water was much more likely to be utilized in cooking (58 percent). The majority of the patients in this study's pediatric dental practice do not consume fluoridated tap water. With the vast majority of the patients primarily consuming bottled or vended water, these patients are likely missing out on the caries-protective effects of water fluoridation.
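A chi-square test for a single proportion, as used in the study, reduces to a two-cell goodness-of-fit statistic with one degree of freedom. The stdlib-Python sketch below uses hypothetical counts (the abstract reports only percentages):

```python
def chi_square_one_proportion(k, n, p0):
    """Chi-square goodness-of-fit statistic for a single proportion (df = 1)."""
    exp_yes, exp_no = n * p0, n * (1 - p0)
    return ((k - exp_yes) ** 2 / exp_yes
            + ((n - k) - exp_no) ** 2 / exp_no)

# Hypothetical: 12 of 123 caregivers report nonfiltered tap water as the
# primary drinking source; test against a null proportion of 0.5.
chi2 = chi_square_one_proportion(12, 123, 0.5)
print(round(chi2, 1))  # → 79.7, far above the 3.841 critical value (α = 0.05, df = 1)
```

Since 79.7 ≫ 3.841, the hypothetical proportion would differ significantly from one half.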

  15. An Orbital Meteoroid Stream Survey Using the Southern Argentina Agile Meteor Radar (SAAMER) Based on a Wavelet Approach

    NASA Technical Reports Server (NTRS)

    Pokorny, P.; Janches, D.; Brown, P. G.; Hormaechea, J. L.

    2017-01-01

    Over a million individually measured meteoroid orbits were collected with the Southern Argentina Agile MEteor Radar (SAAMER) between 2012-2015. This provides a robust statistical database to perform an initial orbital survey of meteor showers in the Southern Hemisphere via the application of a 3D wavelet transform. The method results in a composite year from all 4 years of data, enabling us to obtain an undisturbed year of meteor activity with more than one thousand meteors per day. Our automated meteor shower search methodology identified 58 showers. Of these showers, 24 were associated with previously reported showers from the IAU catalogue while 34 showers are new and not listed in the catalogue. Our searching method combined with our large data sample provides unprecedented accuracy in measuring meteor shower activity and description of shower characteristics in the Southern Hemisphere. Using simple modeling and clustering methods we also propose potential parent bodies for the newly discovered showers.

  16. An orbital meteoroid stream survey using the Southern Argentina Agile MEteor Radar (SAAMER) based on a wavelet approach

    NASA Astrophysics Data System (ADS)

    Pokorný, P.; Janches, D.; Brown, P. G.; Hormaechea, J. L.

    2017-07-01

    Over a million individually measured meteoroid orbits were collected with the Southern Argentina Agile MEteor Radar (SAAMER) between 2012-2015. This provides a robust statistical database to perform an initial orbital survey of meteor showers in the Southern Hemisphere via the application of a 3D wavelet transform. The method results in a composite year from all 4 years of data, enabling us to obtain an undisturbed year of meteor activity with more than one thousand meteors per day. Our automated meteor shower search methodology identified 58 showers. Of these showers, 24 were associated with previously reported showers from the IAU catalogue while 34 showers are new and not listed in the catalogue. Our searching method combined with our large data sample provides unprecedented accuracy in measuring meteor shower activity and description of shower characteristics in the Southern Hemisphere. Using simple modeling and clustering methods we also propose potential parent bodies for the newly discovered showers.

  17. Estimating Cosmic-Ray Spectral Parameters from Simulated Detector Responses with Detector Design Implications

    NASA Technical Reports Server (NTRS)

    Howell, L. W.

    2001-01-01

    A simple power law model consisting of a single spectral index α1 is believed to be an adequate description of the galactic cosmic-ray (GCR) proton flux at energies below 10^13 eV, with a transition at the knee energy E_k to a steeper spectral index α2 > α1 above E_k. The maximum likelihood procedure is developed for estimating these three spectral parameters of the broken power law energy spectrum from simulated detector responses. These estimates and their surrounding statistical uncertainty are being used to derive the requirements in energy resolution, calorimeter size, and energy response of a proposed sampling calorimeter for the Advanced Cosmic-ray Composition Experiment for the Space Station (ACCESS). This study thereby permits instrument developers to make important trade studies in design parameters as a function of the science objectives, which is particularly important for space-based detectors where physical parameters, such as dimension and weight, impose rigorous practical limits to the design envelope.
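For the single-index portion of such a spectrum, the maximum likelihood estimator has a well-known closed form, α̂ = 1 + n / Σ ln(E_i/E_min); the full broken power law requires a numerical procedure like the one the report develops. A stdlib-Python sketch with invented parameters:

```python
import math
import random

random.seed(1)

alpha_true, e_min, n = 2.7, 1.0, 20000

# Inverse-transform sampling from f(E) ∝ E^(-alpha) for E >= e_min:
# E = e_min * (1 - u)^(-1/(alpha - 1)) with u uniform on [0, 1).
sample = [e_min * (1.0 - random.random()) ** (-1.0 / (alpha_true - 1.0))
          for _ in range(n)]

# Closed-form maximum likelihood estimate of the spectral index.
alpha_hat = 1.0 + n / sum(math.log(e / e_min) for e in sample)
print(alpha_hat)
```

The estimator's standard error is approximately (α − 1)/√n, about 0.012 here, so the estimate lands very close to the true index 2.7.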

  18. Competing risk models in reliability systems, an exponential distribution model with Bayesian analysis approach

    NASA Astrophysics Data System (ADS)

    Iskandar, I.

    2018-03-01

    The exponential distribution is the most widely used reliability analysis. This distribution is very suitable for representing the lengths of life of many cases and is available in a simple statistical form. The characteristic of this distribution is a constant hazard rate. The exponential distribution is the lower rank of the Weibull distributions. In this paper our effort is to introduce the basic notions that constitute an exponential competing risks model in reliability analysis using Bayesian analysis approach and presenting their analytic methods. The cases are limited to the models with independent causes of failure. A non-informative prior distribution is used in our analysis. This model describes the likelihood function and follows with the description of the posterior function and the estimations of the point, interval, hazard function, and reliability. The net probability of failure if only one specific risk is present, crude probability of failure due to a specific risk in the presence of other causes, and partial crude probabilities are also included.
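For constant (exponential) hazards with independent causes, the net and crude probabilities mentioned above have simple closed forms, sketched here in stdlib Python with illustrative hazard values (not the Bayesian machinery of the paper):

```python
import math

# Constant cause-specific hazards and a time horizon (illustrative values).
lam1, lam2, t = 0.02, 0.05, 10.0
lam = lam1 + lam2

# Net probability: failure from cause 1 if it were the only risk present.
net1 = 1.0 - math.exp(-lam1 * t)

# Crude probabilities: failure from each cause in the presence of the other.
crude1 = (lam1 / lam) * (1.0 - math.exp(-lam * t))
crude2 = (lam2 / lam) * (1.0 - math.exp(-lam * t))

# The all-cause failure probability splits exactly into the crude parts.
total = 1.0 - math.exp(-lam * t)
print(net1, crude1, crude2, total)
```

Note that the crude probability for cause 1 is smaller than its net probability: the competing risk removes some subjects before cause 1 can act.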

  19. The Use of Correctional “NO!” Approach to Reduce Destructive Behavior on Autism Student of CANDA Educational Institution in Surakarta

    NASA Astrophysics Data System (ADS)

    Anggraini, N.

    2017-02-01

    This research aims to reduce destructive behavior, such as throwing learning materials, in an autism student at the CANDA educational institution in Surakarta by using the correctional “NO!” approach. This research uses the Single Subject Research (SSR) method with an A-B design, namely baseline and intervention phases. The subject of this research is one autism student of the CANDA educational institution, named G.A.P. Data were collected by recording events through direct observation during the baseline and intervention phases. Data were analyzed by simple descriptive statistical analysis and are displayed in graphical form. Based on the results of the data analysis, it can be concluded that destructive behavior such as throwing learning materials was significantly reduced after the intervention. Based on these results, the correctional “NO!” approach can be used by teachers or therapists to reduce destructive behavior in autism students.

  20. Exploiting Fast-Variables to Understand Population Dynamics and Evolution

    NASA Astrophysics Data System (ADS)

    Constable, George W. A.; McKane, Alan J.

    2017-11-01

    We describe a continuous-time modelling framework for biological population dynamics that accounts for demographic noise. In the spirit of the methodology used by statistical physicists, transitions between the states of the system are caused by individual events while the dynamics are described in terms of the time-evolution of a probability density function. In general, the application of the diffusion approximation still leaves a description that is quite complex. However, in many biological applications one or more of the processes happen slowly relative to the system's other processes, and the dynamics can be approximated as occurring within a slow low-dimensional subspace. We review these time-scale separation arguments and analyse the simpler stochastic dynamics that result in a number of cases. We stress that it is important to retain the demographic noise derived in this way, and emphasise this point by showing that it can alter the direction of selection compared to the prediction made from an analysis of the corresponding deterministic model.

  1. Energy Weighted Angular Correlations Between Hadrons Produced in Electron-Positron Annihilation.

    NASA Astrophysics Data System (ADS)

    Strharsky, Roger Joseph

    Electron-positron annihilation at large center of mass energy produces many hadronic particles. Experimentalists then measure the energies of these particles in calorimeters. This study investigated correlations between the angular locations of one or two such calorimeters and the angular orientation of the electron beam in the laboratory frame of reference. The calculation of these correlations includes weighting by the fraction of the total center of mass energy which the calorimeter measures. Starting with the assumption that the reaction proceeds through the intermediate production of a single quark/anti-quark pair, a simple statistical model was developed to provide a phenomenological description of the distribution of final state hadrons. The model distributions were then used to calculate the one- and two-calorimeter correlation functions. Results of these calculations were compared with available data and several predictions were made for those quantities which had not yet been measured. Failure of the model to reproduce all of the data was discussed in terms of quantum chromodynamics, a fundamental theory which includes quark interactions.

  2. Extension of the BMCSL equation of state for hard spheres to the metastable disordered region: Application to the SAFT approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Paricaud, P.

    2015-07-28

    A simple modification of the Boublík-Mansoori-Carnahan-Starling-Leland equation of state is proposed for an application to the metastable disordered region. The new model has a positive pole at the jamming limit and can accurately describe the molecular simulation data of pure hard spheres in the stable fluid region and along the metastable branch. The new model has also been applied to binary mixtures of hard spheres, and an excellent description of the fluid and metastable branches can be obtained by adjusting the jamming packing fraction. The new model for hard sphere mixtures can be used as the repulsive term of equations of state for real fluids. In this case, the modified equations of state give very similar predictions of thermodynamic properties as the original models, and one can remove the multiple liquid density roots observed for some versions of the Statistical Associating Fluid Theory (SAFT) at low temperature without any modification of the dispersion term.
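
    As a reference point for the modification discussed, the original Carnahan-Starling compressibility factor for pure hard spheres can be sketched as follows; the paper's metastable-branch correction is not reproduced here:

```python
def carnahan_starling_Z(eta):
    """Compressibility factor Z = PV/(NkT) of the pure hard-sphere
    fluid as a function of packing fraction eta, in the original
    Carnahan-Starling form (not the modified BMCSL-type model of
    the paper)."""
    return (1.0 + eta + eta**2 - eta**3) / (1.0 - eta)**3

# Ideal-gas limit: Z -> 1 as the packing fraction eta -> 0
print(carnahan_starling_Z(0.0))            # 1.0
print(round(carnahan_starling_Z(0.4), 3))  # 6.926
```

    The original form has its pole at eta = 1; the modification described above instead places a positive pole at the jamming limit, which is what allows the metastable branch to be captured.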

  3. Insufficient Knowledge of Breast Cancer Risk Factors Among Malaysian Female University Students

    PubMed Central

    Samah, Asnarulkhadi Abu; Ahmadian, Maryam; Latiff, Latiffah A.

    2016-01-01

    Background: Despite continuing argument about the efficacy of breast self-examination, it could still be a life-saving technique by inspiring and empowering women to take better control over their body/breast and health. This study investigated Malaysian female university students’ knowledge about breast cancer risk factors, signs, and symptoms, and assessed the frequency of breast self-examination among students. Method: A cross-sectional survey was conducted in 2013 in nine public and private universities in the Klang Valley and Selangor. A total of 842 female students responded to the self-administered survey. Simple descriptive and inferential statistics were employed for data analysis. Results: The uptake of breast self-examination (BSE) was less than 50% among the students. Most of the students had insufficient knowledge of several breast cancer risk factors. Conclusion: Actions and efforts should be taken to increase knowledge of breast cancer through the development of ethnically and traditionally sensitive educational training on BSE and breast cancer literacy. PMID:26234996

  4. A pairwise maximum entropy model accurately describes resting-state human brain networks

    PubMed Central

    Watanabe, Takamitsu; Hirose, Satoshi; Wada, Hiroyuki; Imai, Yoshio; Machida, Toru; Shirouzu, Ichiro; Konishi, Seiki; Miyashita, Yasushi; Masuda, Naoki

    2013-01-01

    The resting-state human brain networks underlie fundamental cognitive functions and consist of complex interactions among brain regions. However, the level of complexity of the resting-state networks has not been quantified, which has prevented comprehensive descriptions of the brain activity as an integrative system. Here, we address this issue by demonstrating that a pairwise maximum entropy model, which takes into account region-specific activity rates and pairwise interactions, can be robustly and accurately fitted to resting-state human brain activities obtained by functional magnetic resonance imaging. Furthermore, to validate the approximation of the resting-state networks by the pairwise maximum entropy model, we show that the functional interactions estimated by the pairwise maximum entropy model reflect anatomical connexions more accurately than the conventional functional connectivity method. These findings indicate that a relatively simple statistical model not only captures the structure of the resting-state networks but also provides a possible method to derive physiological information about various large-scale brain networks. PMID:23340410
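
    The pairwise maximum entropy model described above can be sketched for a toy network; the fields `h` and couplings `J` below are illustrative values, not fitted resting-state parameters:

```python
import itertools
import math

def pairwise_mem_probs(h, J):
    """Probability of each binary activity pattern s (s_i in {0, 1})
    under a pairwise maximum entropy (Ising-like) model:
    P(s) ∝ exp(sum_i h_i s_i + sum_{i<j} J_ij s_i s_j)."""
    n = len(h)
    states = list(itertools.product([0, 1], repeat=n))
    weights = []
    for s in states:
        e = sum(h[i] * s[i] for i in range(n))
        e += sum(J[i][j] * s[i] * s[j]
                 for i in range(n) for j in range(i + 1, n))
        weights.append(math.exp(e))
    z = sum(weights)  # partition function
    return {s: w / z for s, w in zip(states, weights)}

# Three "regions" with hypothetical activity rates and interactions
probs = pairwise_mem_probs([0.2, -0.1, 0.0],
                           [[0, 0.5, 0], [0, 0, -0.3], [0, 0, 0]])
print(sum(probs.values()))  # ~1.0
```

    Fitting the model to data means choosing `h` and `J` so that the model's first and second moments match the empirical activation rates and pairwise co-activations; the enumeration above is only feasible for small networks.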

  5. A maximum entropy thermodynamics of small systems.

    PubMed

    Dixit, Purushottam D

    2013-05-14

    We present a maximum entropy approach to analyze the state space of a small system in contact with a large bath, e.g., a solvated macromolecular system. For the solute, the fluctuations around the mean values of observables are not negligible and the probability distribution P(r) of the state space depends on the intricate details of the interaction of the solute with the solvent. Here, we employ a superstatistical approach: P(r) is expressed as a marginal distribution summed over the variation in β, the inverse temperature of the solute. The joint distribution P(β, r) is estimated by maximizing its entropy. We also calculate the first order system-size corrections to the canonical ensemble description of the state space. We test the development on a simple harmonic oscillator interacting with two baths with very different chemical identities, viz., (a) Lennard-Jones particles and (b) water molecules. In both cases, our method captures the state space of the oscillator sufficiently well. Future directions and connections with traditional statistical mechanics are discussed.

  6. Association of Physical Exercise on Anxiety and Depression Amongst Adults.

    PubMed

    Khanzada, Faizan Jameel; Soomro, Nabila; Khan, Shahidda Zakir

    2015-07-01

    This study was done to determine the frequency of anxiety and depression among those who exercise regularly and those who do not. A cross-sectional study was conducted at different gymnasiums of Karachi in July-August 2013. A total of 269 individuals aged 18-45 years completed a self-administered questionnaire, and the data were assessed using simple descriptive statistics. One hundred and thirty-four individuals did not perform exercise; among them, females (55.0%) were more frequently anxious than males (46.4%), and females (39.9%) were more frequently depressed than males (26.4%). A chi-square test showed that the association between anxiety and exercise was significant, with anxiety more frequent in non-exercisers than in regular exercisers (p=0.015). Individuals who performed regular exercise had a lower frequency of depression (28.9%) than non-exercisers (41.8%). Physical exercise was significantly associated with lower anxiety and depression frequency amongst the studied adult population.
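
    The chi-square test of independence used in studies like this one can be sketched for a 2x2 table; the counts below are hypothetical, not the study's data:

```python
def chi_square_2x2(table):
    """Pearson chi-squared statistic for a 2x2 contingency table
    [[a, b], [c, d]] (e.g. exercise status vs. anxiety status).
    The p-value is then read from the chi-squared distribution
    with 1 degree of freedom."""
    row = [sum(r) for r in table]
    col = [sum(c) for c in zip(*table)]
    n = sum(row)
    chi2 = 0.0
    for i in range(2):
        for j in range(2):
            expected = row[i] * col[j] / n
            chi2 += (table[i][j] - expected) ** 2 / expected
    return chi2

# Hypothetical counts: rows = non-exercisers / exercisers,
# columns = anxious / not anxious
print(round(chi_square_2x2([[68, 66], [63, 72]]), 3))  # 0.448
```

    A large statistic relative to the chi-squared(1) distribution indicates that anxiety frequency differs between exercisers and non-exercisers more than chance would predict.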

  7. Active Brownian Particles. From Individual to Collective Stochastic Dynamics

    NASA Astrophysics Data System (ADS)

    Romanczuk, P.; Bär, M.; Ebeling, W.; Lindner, B.; Schimansky-Geier, L.

    2012-03-01

    We review theoretical models of individual motility as well as collective dynamics and pattern formation of active particles. We focus on simple models of active dynamics with a particular emphasis on nonlinear and stochastic dynamics of such self-propelled entities in the framework of statistical mechanics. Examples of such active units in complex physico-chemical and biological systems are chemically powered nano-rods, localized patterns in reaction-diffusion systems, motile cells or macroscopic animals. Based on the description of individual motion of point-like active particles by stochastic differential equations, we discuss different velocity-dependent friction functions, the impact of various types of fluctuations and calculate characteristic observables such as stationary velocity distributions or diffusion coefficients. Finally, we consider not only the free and confined individual active dynamics but also different types of interaction between active particles. The resulting collective dynamical behavior of large assemblies and aggregates of active units is discussed and an overview over some recent results on spatiotemporal pattern formation in such systems is given.

  8. Unemployment, Parental Distress and Youth Emotional Well-Being: The Moderation Roles of Parent-Youth Relationship and Financial Deprivation.

    PubMed

    Frasquilho, Diana; de Matos, Margarida Gaspar; Marques, Adilson; Neville, Fergus G; Gaspar, Tânia; Caldas-de-Almeida, J M

    2016-10-01

    We investigated, in a sample of 112 unemployed parents of adolescents aged 10-19 years, the links between parental distress and change in youth emotional problems related to parental unemployment, and the moderation roles of parent-youth relationship and financial deprivation. Data were analyzed using descriptive statistics and correlations. Further, simple moderation, additive moderation, and moderated moderation models of regression were performed to analyze the effects of parental distress, parent-youth relationship and financial deprivation in predicting change in youth emotional problems related to parental unemployment. Results show that parental distress moderated by parent-youth relationship predicted levels of change in youth emotional problems related to parental unemployment. This study provides evidence that during job loss, parental distress is linked to youth emotional well-being and that parent-youth relationships play an important moderation role. This raises the importance of further researching parental distress impacts on youth well-being, especially during periods of high unemployment rates.

  9. The Performance of Preparatory School Candidates at the United States Naval Academy

    DTIC Science & Technology

    2001-09-01

    (Table-of-contents excerpt) 1. Differences in Characteristics; 2. Differences in ... Coefficients; Table 3.3, Applicant/Midshipman Background Characteristics; Table 3.4, Descriptive Characteristics for Midshipmen by Accession Source; Table 3.5, Descriptive Statistics for

  10. A Role for Chunk Formation in Statistical Learning of Second Language Syntax

    ERIC Educational Resources Information Center

    Hamrick, Phillip

    2014-01-01

    Humans are remarkably sensitive to the statistical structure of language. However, different mechanisms have been proposed to account for such statistical sensitivities. The present study compared adult learning of syntax and the ability of two models of statistical learning to simulate human performance: Simple Recurrent Networks, which learn by…

  11. SIMPL Systems, or: Can We Design Cryptographic Hardware without Secret Key Information?

    NASA Astrophysics Data System (ADS)

    Rührmair, Ulrich

    This paper discusses a new cryptographic primitive termed SIMPL system. Roughly speaking, a SIMPL system is a special type of Physical Unclonable Function (PUF) which possesses a binary description that allows its (slow) public simulation and prediction. Besides this public-key-like functionality, SIMPL systems have another advantage: No secret information is, or needs to be, contained in SIMPL systems in order to enable cryptographic protocols - neither in the form of a standard binary key, nor as secret information hidden in random, analog features, as is the case for PUFs. The cryptographic security of SIMPLs instead rests on (i) a physical assumption on their unclonability, and (ii) a computational assumption regarding the complexity of simulating their output. This novel property makes SIMPL systems potentially immune against many known hardware and software attacks, including malware, side channel, invasive, or modeling attacks.

  12. Trends in motor vehicle traffic collision statistics, 1988-1997

    DOT National Transportation Integrated Search

    2001-02-01

    This report presents descriptive statistics about Canadian traffic collisions during the ten-year period : from 1988 to 1997, focusing specifically on casualty collisions. Casualty collisions are defined as all : reportable motor vehicle crashes resu...

  13. 77 FR 10695 - Federal Housing Administration (FHA) Risk Management Initiatives: Revised Seller Concessions

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-23

    ... in Tables A and B. Table D--Borrower Closing Costs and Seller Concessions Descriptive Statistics by... accuracy of the statistical data illustrating the correlation between higher seller concessions and an...

  14. 42 CFR 402.7 - Notice of proposed determination.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... and a brief description of the statistical sampling technique CMS or OIG used. (3) The reason why the... is relying upon statistical sampling to project the number and types of claims or requests for...

  15. Learning investment indicators through data extension

    NASA Astrophysics Data System (ADS)

    Dvořák, Marek

    2017-07-01

    Stock prices in the form of time series were analysed using univariate and multivariate statistical methods. After simple data preprocessing in the form of logarithmic differences, we augmented this univariate time series to a multivariate representation. This method uses sliding windows to calculate several dozen new variables using simple statistical tools, such as first and second moments, as well as more complicated statistics, such as autoregression coefficients and residual analysis, followed by an optional quadratic transformation used for further data extension. These were used as explanatory variables in a regularized logistic LASSO regression that estimated the Buy-Sell Index (BSI) from real stock market data.
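
    A simplified sketch of the preprocessing and window-feature steps (log differences, then first and second moments per sliding window); the prices are illustrative, and the autoregression and residual features mentioned above are omitted:

```python
import math

def log_differences(prices):
    """Log returns r_t = ln(p_t) - ln(p_{t-1})."""
    return [math.log(b) - math.log(a) for a, b in zip(prices, prices[1:])]

def window_features(returns, width):
    """First and second moments over each sliding window, a
    simplified version of the data-extension step."""
    feats = []
    for i in range(len(returns) - width + 1):
        w = returns[i:i + width]
        mean = sum(w) / width
        var = sum((x - mean) ** 2 for x in w) / width
        feats.append((mean, var))
    return feats

prices = [100, 101, 103, 102, 105, 104]  # hypothetical closing prices
feats = window_features(log_differences(prices), 3)
print(len(feats))  # 3 windows over 5 returns
```

    Each window's feature tuple would become one row of the explanatory-variable matrix fed to the regularized logistic regression.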

  16. The ambient dose equivalent at flight altitudes: a fit to a large set of data using a Bayesian approach.

    PubMed

    Wissmann, F; Reginatto, M; Möller, T

    2010-09-01

    The problem of finding a simple, generally applicable description of worldwide measured ambient dose equivalent rates at aviation altitudes between 8 and 12 km is difficult to solve due to the large variety of functional forms and parametrisations that are possible. We present an approach that uses Bayesian statistics and Monte Carlo methods to fit mathematical models to a large set of data and to compare the different models. About 2500 data points measured in the periods 1997-1999 and 2003-2006 were used. Since the data cover wide ranges of barometric altitude, vertical cut-off rigidity and phases in the solar cycle 23, we developed functions which depend on these three variables. Whereas the dependence on the vertical cut-off rigidity is described by an exponential, the dependences on barometric altitude and solar activity may be approximated by linear functions in the ranges under consideration. Therefore, a simple Taylor expansion was used to define different models and to investigate the relevance of the different expansion coefficients. With the method presented here, it is possible to obtain probability distributions for each expansion coefficient and thus to extract reliable uncertainties even for the dose rate evaluated. The resulting function agrees well with new measurements made at fixed geographic positions and during long haul flights covering a wide range of latitudes.

  17. A simple and effective solution to the constrained QM/MM simulations

    NASA Astrophysics Data System (ADS)

    Takahashi, Hideaki; Kambe, Hiroyuki; Morita, Akihiro

    2018-04-01

    It is a promising extension of the quantum mechanical/molecular mechanical (QM/MM) approach to incorporate the solvent molecules surrounding the QM solute into the QM region to ensure the adequate description of the electronic polarization of the solute. However, the solvent molecules in the QM region inevitably diffuse into the MM bulk during the QM/MM simulation. In this article, we developed a simple and efficient method, referred to as the "boundary constraint with correction (BCC)," to prevent the diffusion of the solvent water molecules by means of a constraint potential. The point of the BCC method is to compensate for the error in a statistical property due to the bias potential by adding a correction term obtained through a set of QM/MM simulations. The BCC method is designed so that the effect of the bias potential completely vanishes when the QM solvent is identical with the MM solvent. Furthermore, the desirable conditions, that is, the continuities of energy and force and the conservations of energy and momentum, are fulfilled in principle. We applied the QM/MM-BCC method to a hydronium ion (H3O+) in aqueous solution to construct the radial distribution function (RDF) of the solvent around the solute. It was demonstrated that the correction term fairly compensated for the error and brought the RDF into good agreement with the result given by an ab initio molecular dynamics simulation.

  18. The Value of Data and Metadata Standardization for Interoperability in Giovanni

    NASA Astrophysics Data System (ADS)

    Smit, C.; Hegde, M.; Strub, R. F.; Bryant, K.; Li, A.; Petrenko, M.

    2017-12-01

    Giovanni (https://giovanni.gsfc.nasa.gov/giovanni/) is a data exploration and visualization tool at the NASA Goddard Earth Sciences Data Information Services Center (GES DISC). It has been around in one form or another for more than 15 years. Giovanni calculates simple statistics and produces 22 different visualizations for more than 1600 geophysical parameters from more than 90 satellite and model products. Giovanni relies on external data format standards to ensure interoperability, including the NetCDF CF Metadata Conventions. Unfortunately, these standards were insufficient to make Giovanni's internal data representation truly simple to use. Finding and working with dimensions can be convoluted with the CF Conventions. Furthermore, the CF Conventions are silent on machine-friendly descriptive metadata such as the parameter's source product and product version. In order to simplify analyzing disparate earth science data parameters in a unified way, we developed Giovanni's internal standard. First, the format standardizes parameter dimensions and variables so they can be easily found. Second, the format adds all the machine-friendly metadata Giovanni needs to present our parameters to users in a consistent and clear manner. At a glance, users can grasp all the pertinent information about parameters both during parameter selection and after visualization. This poster gives examples of how our metadata and data standards, both external and internal, have both simplified our code base and improved our users' experiences.

  19. Coupled Particle Transport and Pattern Formation in a Nonlinear Leaky-Box Model

    NASA Technical Reports Server (NTRS)

    Barghouty, A. F.; El-Nemr, K. W.; Baird, J. K.

    2009-01-01

    Effects of particle-particle coupling on particle characteristics in nonlinear leaky-box type descriptions of the acceleration and transport of energetic particles in space plasmas are examined in the framework of a simple two-particle model based on the Fokker-Planck equation in momentum space. In this model, the two particles are assumed coupled via a common nonlinear source term. In analogy with a prototypical mathematical system of diffusion-driven instability, this work demonstrates that steady-state patterns with strong dependence on the magnetic turbulence but a rather weak one on the coupled particles attributes can emerge in solutions of a nonlinearly coupled leaky-box model. The insight gained from this simple model may be of wider use and significance to nonlinearly coupled leaky-box type descriptions in general.

  20. [In defence of the diagnosis of simple schizophrenia: reflections on a case presentation].

    PubMed

    Martínez Serrano, José; Medina Garrido, María L; Consuegra Sánchez, Rosario; Del Cerro Oñate, Matias; López-Mesa, José L; González Matás, Juana

    2012-01-01

    Since the first case descriptions of dementia praecox (Diem, 1903), the diagnosis of simple schizophrenia has remained controversial. The questioning of its descriptive validity and its reliability, as well as its infrequent use, led to its elimination as a sub-type of schizophrenia in the DSM-III. Criteria for the diagnosis of «simple deteriorative disorder» are currently included in the DSM-IV-TR as a disorder requiring further study for its possible inclusion. Using a clinical case, we perform a historical review of the concept of simple schizophrenia and at the same time reflect on the possible reasons for the controversy and a potential route to resolve it. The case, which meets ICD-10 clinical criteria for simple schizophrenia (and those of the DSM-IV-TR for simple deteriorative disorder), prompts reflection on the symptoms and diagnostic difficulties. A literature review and update on the subject was also performed. Our patient, through the absence of the most obvious positive psychotic symptoms from the clinical picture, highlights the tendency of psychiatrists to identify the diagnosis of schizophrenia with the presence of such symptoms, at least at some time during its evolution. The use of neuroimaging tests was useful to assess the level of deterioration and the prognosis of the patient. Considering simple schizophrenia in the differential diagnosis of other chronic deteriorative disorders could increase its recognition in the initial phases. The use of neuropsychological function tests, looking for deteriorative patterns typical of the schizophrenia spectrum, could help to increase the reliability of the diagnosis. Copyright © 2011 SEP y SEPB. Published by Elsevier Espana. All rights reserved.

  1. Organizational Commitment DEOCS 4.1 Construct Validity Summary

    DTIC Science & Technology

    2017-08-01

    ...commitment construct that targets more specifically the workgroup frame of reference. Included is a review of the 4.0 description and items ... followed by the proposed modifications to the factor. The DEOCS 4.0 description provided for organizational commitment is “members’ dedication to the ... (5) examining variance and descriptive statistics, and (6) selecting items that demonstrate the strongest scale properties. Table 1. DEOCS 4.0

  2. Description of the Role of Shot Noise in Spectroscopic Absorption and Emission Measurements with Photodiode and Photomultiplier Tube Detectors: Information for an Instrumental Analysis Course

    ERIC Educational Resources Information Center

    McClain, Robert L.; Wright, John C.

    2014-01-01

    A description of shot noise and the role it plays in absorption and emission measurements using photodiode and photomultiplier tube detection systems is presented. This description includes derivations of useful forms of the shot noise equation based on Poisson counting statistics. This approach can deepen student understanding of a fundamental…

  3. Robust Combining of Disparate Classifiers Through Order Statistics

    NASA Technical Reports Server (NTRS)

    Tumer, Kagan; Ghosh, Joydeep

    2001-01-01

    Integrating the outputs of multiple classifiers via combiners or meta-learners has led to substantial improvements in several difficult pattern recognition problems. In this article we investigate a family of combiners based on order statistics, for robust handling of situations where there are large discrepancies in the performance of individual classifiers. Based on a mathematical modeling of how the decision boundaries are affected by order statistic combiners, we derive expressions for the reductions in error expected when simple output combination methods based on the median, the maximum, and in general the ith order statistic are used. Furthermore, we analyze the trim and spread combiners, both based on linear combinations of the ordered classifier outputs, and show that in the presence of uneven classifier performance they often provide substantial gains over both linear and simple order statistics combiners. Experimental results on both real-world data and standard public domain data sets corroborate these findings.
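
    The simple order-statistic combiners analysed here (median, maximum, and a trimmed mean of the ordered outputs) can be sketched as follows; the scores are hypothetical classifier outputs for one class, not the paper's experimental data:

```python
def order_statistic_combiners(outputs, trim=1):
    """Combine per-classifier scores for one class via order
    statistics: the median, the maximum, and a trimmed mean that
    drops the `trim` smallest and largest scores (a linear
    combination of the ordered outputs)."""
    s = sorted(outputs)
    n = len(s)
    median = s[n // 2] if n % 2 else 0.5 * (s[n // 2 - 1] + s[n // 2])
    maximum = s[-1]
    trimmed = s[trim:n - trim]
    trimmed_mean = sum(trimmed) / len(trimmed)
    return median, maximum, trimmed_mean

# Five classifiers, one of them performing much worse than the rest
scores = [0.92, 0.88, 0.15, 0.90, 0.85]
print(order_statistic_combiners(scores))
```

    Note how the median and trimmed mean are unaffected by the single outlying score of 0.15, while a plain average would be pulled down by it; this robustness to uneven classifier performance is the motivation for these combiners.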

  4. Counting statistics for genetic switches based on effective interaction approximation

    NASA Astrophysics Data System (ADS)

    Ohkubo, Jun

    2012-09-01

    Applicability of counting statistics to a system with an infinite number of states is investigated. Counting statistics has been studied extensively for systems with a finite number of states. While the scheme can in principle be used to count specific transitions in a system with an infinite number of states, the resulting equations are in general not closed. A simple genetic switch can be described by a master equation with an infinite number of states, and we use counting statistics to count the number of transitions from inactive to active states in the gene. To avoid the non-closed equations, an effective interaction approximation is employed. As a result, it is shown that the switching problem can be treated approximately as a simple two-state model, which immediately indicates that the switching obeys non-Poisson statistics.
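
    Counting inactive-to-active transitions in an effective two-state switch can be sketched with a minimal Gillespie-style simulation; the rates and seed are illustrative, and this ignores the full infinite-state master equation reduced by the approximation above:

```python
import random

def count_activations(k_on, k_off, t_end, seed=1):
    """Simulate a two-state (inactive/active) switch with
    exponential waiting times and count the number of
    inactive -> active transitions up to time t_end."""
    rng = random.Random(seed)
    t, state, count = 0.0, 0, 0
    while True:
        rate = k_on if state == 0 else k_off
        t += rng.expovariate(rate)  # waiting time until next switch
        if t >= t_end:
            return count
        if state == 0:
            count += 1  # inactive -> active transition observed
        state = 1 - state

print(count_activations(1.0, 1.0, 100.0))
```

    Collecting this count over many independent runs gives the transition-number distribution whose deviation from a Poisson distribution is the signature discussed in the abstract.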

  5. Asymptotic Linear Spectral Statistics for Spiked Hermitian Random Matrices

    NASA Astrophysics Data System (ADS)

    Passemier, Damien; McKay, Matthew R.; Chen, Yang

    2015-07-01

    Using the Coulomb Fluid method, this paper derives central limit theorems (CLTs) for linear spectral statistics of three "spiked" Hermitian random matrix ensembles. These include Johnstone's spiked model (i.e., central Wishart with spiked correlation), non-central Wishart with rank-one non-centrality, and a related class of non-central matrices. For a generic linear statistic, we derive simple and explicit CLT expressions as the matrix dimensions grow large. For all three ensembles under consideration, we find that the primary effect of the spike is to introduce a correction term to the asymptotic mean of the linear spectral statistic, which we characterize with simple formulas. The utility of our proposed framework is demonstrated through application to three different linear statistics problems: the classical likelihood ratio test for a population covariance, the capacity analysis of multi-antenna wireless communication systems with a line-of-sight transmission path, and a classical multiple sample significance testing problem.

  6. Distinguishing Positive Selection From Neutral Evolution: Boosting the Performance of Summary Statistics

    PubMed Central

    Lin, Kao; Li, Haipeng; Schlötterer, Christian; Futschik, Andreas

    2011-01-01

    Summary statistics are widely used in population genetics, but they suffer from the drawback that no simple sufficient summary statistic exists which captures all information required to distinguish different evolutionary hypotheses. Here, we apply boosting, a recent statistical method that combines simple classification rules to maximize their joint predictive performance. We show that our implementation of boosting has a high power to detect selective sweeps. Demographic events, such as bottlenecks, do not result in a large excess of false positives. A comparison shows that our boosting implementation performs well relative to other neutrality tests. Furthermore, we evaluated the relative contribution of different summary statistics to the identification of selection and found that for recent sweeps integrated haplotype homozygosity is very informative, whereas older sweeps are better detected by Tajima's π. Overall, Watterson's θ was found to contribute the most information for distinguishing between bottlenecks and selection. PMID:21041556

  7. Orthodontics for the dog. Treatment methods.

    PubMed

    Ross, D L

    1986-09-01

    This article considers the prevention of orthodontic problems, occlusal adjustments, simple tooth movements, rotational techniques, tipping problems, adjustment of crown height, descriptions of common orthodontic appliances, and problems associated with therapy.

  8. Long-term strategy for the statistical design of a forest health monitoring system

    Treesearch

    Hans T. Schreuder; Raymond L. Czaplewski

    1993-01-01

    A conceptual framework is given for a broad-scale survey of forest health that accomplishes three objectives: generate descriptive statistics; detect changes in such statistics; and simplify analytical inferences that identify, and possibly establish cause-effect relationships. Our paper discusses the development of sampling schemes to satisfy these three objectives,...

  9. Design-based Sample and Probability Law-Assumed Sample: Their Role in Scientific Investigation.

    ERIC Educational Resources Information Center

    Ojeda, Mario Miguel; Sahai, Hardeo

    2002-01-01

    Discusses some key statistical concepts in probabilistic and non-probabilistic sampling to provide an overview for understanding the inference process. Suggests a statistical model constituting the basis of statistical inference and provides a brief review of the finite population descriptive inference and a quota sampling inferential theory.…

  10. Computers as an Instrument for Data Analysis. Technical Report No. 11.

    ERIC Educational Resources Information Center

    Muller, Mervin E.

    A review of statistical data analysis involving computers as a multi-dimensional problem provides the perspective for consideration of the use of computers in statistical analysis and the problems associated with large data files. An overall description of STATJOB, a particular system for doing statistical data analysis on a digital computer,…

  11. A Descriptive Study of Individual and Cross-Cultural Differences in Statistics Anxiety

    ERIC Educational Resources Information Center

    Baloglu, Mustafa; Deniz, M. Engin; Kesici, Sahin

    2011-01-01

    The present study investigated individual and cross-cultural differences in statistics anxiety among 223 Turkish and 237 American college students. A 2 x 2 between-subjects factorial multivariate analysis of covariance (MANCOVA) was performed on the six dependent variables which are the six subscales of the Statistical Anxiety Rating Scale.…

  12. Children in the UK: Signposts to Statistics.

    ERIC Educational Resources Information Center

    Grey, Eleanor

    This guide indicates statistical sources in the United Kingdom dealing with children and young people. Regular and occasional sources are listed in a three-column format including the name of the source, a brief description, and the geographic area to which statistics refer. Information is classified under 25 topic headings: abortions; accidents;…

  13. An analysis of the relationship of flight hours and naval rotary wing aviation mishaps

    DTIC Science & Technology

    2017-03-01

    Estimates found enough evidence to support the indicators used: sequestration, high flight hours, night flight, and overwater flight had statistically significant effects on...

  14. Practicing Statistics by Creating Exercises for Fellow Students

    ERIC Educational Resources Information Center

    Bebermeier, Sarah; Reiss, Katharina

    2016-01-01

    This article outlines the execution of a workshop in which students were encouraged to actively review the course contents on descriptive statistics by creating exercises for their fellow students. In a first-year statistics course in psychology, 39 out of 155 students participated in the workshop. In a subsequent evaluation, the workshop was…

  15. Research design and statistical methods in Pakistan Journal of Medical Sciences (PJMS)

    PubMed Central

    Akhtar, Sohail; Shah, Syed Wadood Ali; Rafiq, M.; Khan, Ajmal

    2016-01-01

    Objective: This article compares the study designs and statistical methods used in 2005, 2010 and 2015 in the Pakistan Journal of Medical Sciences (PJMS). Methods: Only original articles of PJMS were considered for the analysis. The articles were carefully reviewed for statistical methods and designs, and then recorded accordingly. The frequency of each statistical method and research design was estimated and compared with previous years. Results: A total of 429 articles were evaluated (n=74 in 2005, n=179 in 2010, n=176 in 2015), of which 171 (40%) were cross-sectional and 116 (27%) were prospective study designs. A variety of statistical methods were found in the analysis. The most frequent methods included descriptive statistics (n=315, 73.4%), chi-square/Fisher’s exact tests (n=205, 47.8%) and Student’s t-test (n=186, 43.4%). There was a significant increase in the use of statistical methods over the time period: t-test, chi-square/Fisher’s exact test, logistic regression, epidemiological statistics, and non-parametric tests. Conclusion: This study shows that a diverse variety of statistical methods have been used in the research articles of PJMS and that their frequency increased from 2005 to 2015. However, descriptive statistics were the most frequent method of statistical analysis in the published articles, while the cross-sectional design was the most common study design. PMID:27022365

  16. Algorithm for computing descriptive statistics for very large data sets and the exa-scale era

    NASA Astrophysics Data System (ADS)

    Beekman, Izaak

    2017-11-01

    An algorithm for Single-point, Parallel, Online, Converging Statistics (SPOCS) is presented. It is suited for in situ analysis that traditionally would be relegated to post-processing, and can be used to monitor statistical convergence and estimate the error/residual in the quantity, which is useful for uncertainty quantification as well. Today, data may be generated at an overwhelming rate by numerical simulations and by proliferating sensing apparatuses in experiments and engineering applications. Monitoring descriptive statistics in real time lets costly computations and experiments be gracefully aborted if an error has occurred, and monitoring the level of statistical convergence allows them to be run for the shortest amount of time required to obtain good results. This algorithm extends work by Pébay (Sandia Report SAND2008-6212). Pébay's algorithms are recast into a converging-delta formulation with provably favorable properties. The mean, variance, covariances and arbitrary higher-order statistical moments are computed in one pass. The algorithm is tested using Sillero, Jiménez, & Moser's (2013, 2014) publicly available UPM high-Reynolds-number turbulent boundary layer data set, demonstrating numerical robustness, efficiency and other favorable properties.
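The single-pass, converging-delta moment recurrences described above can be illustrated with a Welford-style update. This is a minimal sketch of the general idea under our own naming, not the SPOCS implementation or Pébay's exact formulation:

```python
class OnlineStats:
    """One-pass (online) running mean and variance.

    Illustrative sketch of a converging-delta update in the spirit of the
    algorithm described above; class and attribute names are ours.
    """

    def __init__(self):
        self.n = 0        # samples seen so far
        self.mean = 0.0   # running mean
        self.m2 = 0.0     # running sum of squared deviations

    def update(self, x):
        self.n += 1
        delta = x - self.mean               # deviation from the old mean
        self.mean += delta / self.n         # converging-delta mean update
        self.m2 += delta * (x - self.mean)  # mixes old and new mean

    @property
    def variance(self):
        # Unbiased sample variance; undefined for fewer than two samples.
        return self.m2 / (self.n - 1) if self.n > 1 else float("nan")
```

Because each update touches only the running totals, the statistics can be monitored after every new sample, which is what makes graceful early termination of a long computation possible.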

  17. Are Statisticians Cold-Blooded Bosses? A New Perspective on the "Old" Concept of Statistical Population

    ERIC Educational Resources Information Center

    Lu, Yonggang; Henning, Kevin S. S.

    2013-01-01

    Spurred by recent writings regarding statistical pragmatism, we propose a simple, practical approach to introducing students to a new style of statistical thinking that models nature through the lens of data-generating processes, not populations. (Contains 5 figures.)

  18. Describing contrast across scales

    NASA Astrophysics Data System (ADS)

    Syed, Sohaib Ali; Iqbal, Muhammad Zafar; Riaz, Muhammad Mohsin

    2017-06-01

    Because contrast is sensitive to illumination and noise distributions, it is not widely used for image description. However, the human perception of contrast along different spatial frequency bandwidths provides a powerful discriminator that can be modeled robustly against local illumination. Based upon this observation, a dense local contrast descriptor is proposed and its potential in different applications of computer vision is discussed. Extensive experiments reveal that this simple yet effective description performs well in comparison with state-of-the-art image descriptors. We also show the importance of this description in a multiresolution pansharpening framework.

  19. Large truck and bus crash facts, 2010.

    DOT National Transportation Integrated Search

    2012-09-01

    This annual edition of Large Truck and Bus Crash Facts contains descriptive statistics about fatal, injury, and property damage only crashes involving large trucks and buses in 2010. Selected crash statistics on passenger vehicles are also presen...

  20. Large truck and bus crash facts, 2007.

    DOT National Transportation Integrated Search

    2009-03-01

    This annual edition of Large Truck and Bus Crash Facts contains descriptive statistics about fatal, injury, and property damage only crashes involving large trucks and buses in 2007. Selected crash statistics on passenger vehicles are also presen...

  1. Large truck and bus crash facts, 2008. 

    DOT National Transportation Integrated Search

    2010-03-01

    This annual edition of Large Truck and Bus Crash Facts contains descriptive statistics about fatal, injury, and property damage only crashes involving large trucks and buses in 2008. Selected crash statistics on passenger vehicles are also presen...

  2. Large truck and bus crash facts, 2011.

    DOT National Transportation Integrated Search

    2013-10-01

    This annual edition of Large Truck and Bus Crash Facts contains descriptive statistics about fatal, injury, and property damage only crashes involving large trucks and buses in 2011. Selected crash statistics on passenger vehicles are also presen...

  3. Large truck and bus crash facts, 2013.

    DOT National Transportation Integrated Search

    2015-04-01

    This annual edition of Large Truck and Bus Crash Facts contains descriptive statistics about fatal, injury, and property damage only crashes involving large trucks and buses in 2013. Selected crash statistics on passenger vehicles are also presented ...

  4. Large truck and bus crash facts, 2009.

    DOT National Transportation Integrated Search

    2011-10-01

    This annual edition of Large Truck and Bus Crash Facts contains descriptive statistics about fatal, injury, and property damage only crashes involving large trucks and buses in 2009. Selected crash statistics on passenger vehicles are also presen...

  5. Large truck and bus crash facts, 2012.

    DOT National Transportation Integrated Search

    2014-06-01

    This annual edition of Large Truck and Bus Crash Facts contains descriptive statistics about fatal, injury, and property damage only crashes involving large trucks and buses in 2012. Selected crash statistics on passenger vehicles are also presented ...

  6. Realistic finite temperature simulations of magnetic systems using quantum statistics

    NASA Astrophysics Data System (ADS)

    Bergqvist, Lars; Bergman, Anders

    2018-01-01

    We have performed realistic atomistic simulations at finite temperatures using Monte Carlo and atomistic spin dynamics simulations incorporating quantum (Bose-Einstein) statistics. The description is much improved at low temperatures compared to the classical (Boltzmann) statistics normally used in these kinds of simulations, while at higher temperatures the classical statistics are recovered. This corrected low-temperature description is reflected in both the magnetization and the magnetic specific heat, the latter allowing for improved modeling of the magnetic contribution to free energies. A central property in the method is the magnon density of states at finite temperatures, and we have compared several different implementations for obtaining it. The method has no restrictions regarding the chemical and magnetic order of the considered materials. This is demonstrated by applying the method to elemental ferromagnetic systems, including Fe and Ni, as well as to Fe-Co random alloys and the ferrimagnetic system GdFe3.

  7. Wave cybernetics: A simple model of wave-controlled nonlinear and nonlocal cooperative phenomena

    NASA Astrophysics Data System (ADS)

    Yasue, Kunio

    1988-09-01

    A simple theoretical description of nonlinear and nonlocal cooperative phenomena is presented in which the global control mechanism of the whole system is given by the tuned-wave propagation. It provides us with an interesting universal scheme of systematization in physical and biological systems called wave cybernetics, and may be understood as a model realizing Bohm's idea of implicate order in natural philosophy.

  8. Using a Five-Step Procedure for Inferential Statistical Analyses

    ERIC Educational Resources Information Center

    Kamin, Lawrence F.

    2010-01-01

    Many statistics texts pose inferential statistical problems in a disjointed way. By using a simple five-step procedure as a template for statistical inference problems, the student can solve problems in an organized fashion. The problem and its solution will thus be a stand-by-itself organic whole and a single unit of thought and effort. The…

  9. [Statistical analysis of German radiologic periodicals: developmental trends in the last 10 years].

    PubMed

    Golder, W

    1999-09-01

    To identify which statistical tests are applied in German radiological publications, to what extent their use has changed during the last decade, and which factors might be responsible for this development. The major articles published in "ROFO" and "DER RADIOLOGE" during 1988, 1993 and 1998 were reviewed for statistical content. The contributions were classified by principal focus and radiological subspecialty. The methods used were assigned to descriptive, basal and advanced statistics. Sample size, significance level and power were established. The use of experts' assistance was monitored. Finally, we calculated the so-called cumulative accessibility of the publications. 525 contributions were found to be eligible. In 1988, 87% used descriptive statistics only, 12.5% basal, and 0.5% advanced statistics. The corresponding figures for 1993 and 1998 are 62 and 49%, 32 and 41%, and 6 and 10%, respectively. Statistical techniques were most likely to be used in research on musculoskeletal imaging and in articles dedicated to MRI. Six basic categories of statistical methods account for the complete statistical analysis appearing in 90% of the articles. ROC analysis is the single most common advanced technique. Authors make increasing use of statistical experts' advice and software. During the last decade, the use of statistical methods in German radiological journals has fundamentally improved, both quantitatively and qualitatively. Presently, advanced techniques account for 20% of the pertinent statistical tests. This development seems to be promoted by the increasing availability of statistical analysis software.

  10. 47 CFR 1.363 - Introduction of statistical data.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... case of sample surveys, there shall be a clear description of the survey design, including the... evidence in common carrier hearing proceedings, including but not limited to sample surveys, econometric... description of the experimental design shall be set forth, including a specification of the controlled...

  11. 47 CFR 1.363 - Introduction of statistical data.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... case of sample surveys, there shall be a clear description of the survey design, including the... evidence in common carrier hearing proceedings, including but not limited to sample surveys, econometric... description of the experimental design shall be set forth, including a specification of the controlled...

  12. 47 CFR 1.363 - Introduction of statistical data.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... case of sample surveys, there shall be a clear description of the survey design, including the... evidence in common carrier hearing proceedings, including but not limited to sample surveys, econometric... description of the experimental design shall be set forth, including a specification of the controlled...

  13. 47 CFR 1.363 - Introduction of statistical data.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... case of sample surveys, there shall be a clear description of the survey design, including the... evidence in common carrier hearing proceedings, including but not limited to sample surveys, econometric... description of the experimental design shall be set forth, including a specification of the controlled...

  14. 47 CFR 1.363 - Introduction of statistical data.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... case of sample surveys, there shall be a clear description of the survey design, including the... evidence in common carrier hearing proceedings, including but not limited to sample surveys, econometric... description of the experimental design shall be set forth, including a specification of the controlled...

  15. Trends in statistical methods in articles published in Archives of Plastic Surgery between 2012 and 2017.

    PubMed

    Han, Kyunghwa; Jung, Inkyung

    2018-05-01

    This review article presents an assessment of trends in statistical methods and an evaluation of their appropriateness in articles published in the Archives of Plastic Surgery (APS) from 2012 to 2017. We reviewed 388 original articles published in APS between 2012 and 2017. We categorized the articles that used statistical methods according to the type of statistical method, the number of statistical methods, and the type of statistical software used. We checked whether there were errors in the description of statistical methods and results. A total of 230 articles (59.3%) published in APS between 2012 and 2017 used one or more statistical methods. Within these articles, there were 261 applications of statistical methods with continuous or ordinal outcomes, and 139 applications of statistical methods with categorical outcomes. The Pearson chi-square test (17.4%) and the Mann-Whitney U test (14.4%) were the most frequently used methods. Errors in describing statistical methods and results were found in 133 of the 230 articles (57.8%). Inadequate description of P-values was the most common error (39.1%). Among the 230 articles that used statistical methods, 71.7% provided details about the statistical software programs used for the analyses. SPSS was predominantly used in the articles that presented statistical analyses. We found that the use of statistical methods in APS has increased over the last 6 years. It seems that researchers have been paying more attention to the proper use of statistics in recent years. It is expected that these positive trends will continue in APS.

  16. Charcoal anatomy of Brazilian species. I. Anacardiaceae.

    PubMed

    Gonçalves, Thaís A P; Scheel-Ybert, Rita

    2016-01-01

    Anthracological studies have been advancing steadily in the tropics during the last decades, and the theoretical and methodological bases of the discipline are well established. Yet there is a strong demand for comparative reference material, seeking an improvement in the precision of taxonomic determination, both in palaeoecological and palaeoethnobotanical studies and to help prevent illegal charcoal production. This work presents descriptions of the charcoal anatomy of eleven Anacardiaceae species from six genera native to Brazil (Anacardium occidentale, Anacardium parvifolium, Astronium graveolens, Astronium lecointei, Lithrea molleoides, Schinus terebenthifolius, Spondias mombin, Spondias purpurea, Spondias tuberosa, Tapirira guianensis, and Tapirira obtusa). They are characterized by diffuse-porous wood; vessels solitary and in multiples; tyloses and spiral thickenings sometimes present; simple perforation plates; alternate intervessel pits; rounded vessel-ray pits with much-reduced borders to apparently simple; parenchyma paratracheal, scanty to vasicentric; heterocellular rays, some with radial canals and crystals; and septate fibres with simple pits. These results are quite similar to previous wood anatomical descriptions of the same species or genera. Yet charcoal identification is more effective when unknown samples are compared to charred extant equivalents rather than to wood slides.

  17. Statistics Using Just One Formula

    ERIC Educational Resources Information Center

    Rosenthal, Jeffrey S.

    2018-01-01

    This article advocates that introductory statistics be taught by basing all calculations on a single simple margin-of-error formula and deriving all of the standard introductory statistical concepts (confidence intervals, significance tests, comparisons of means and proportions, etc) from that one formula. It is argued that this approach will…
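As a rough illustration of the article's premise, the sketch below derives a confidence interval and a two-sided significance test from the one margin-of-error formula ME = z·s/√n. The function names and the z = 1.96 normal approximation are our assumptions, not material from the article:

```python
import math

def margin_of_error(s, n, z=1.96):
    """Margin of error for a sample mean: z * s / sqrt(n).

    z = 1.96 gives roughly 95% confidence under a normal approximation.
    """
    return z * s / math.sqrt(n)

def confidence_interval(xbar, s, n, z=1.96):
    # CI for the mean: point estimate plus/minus the margin of error.
    me = margin_of_error(s, n, z)
    return (xbar - me, xbar + me)

def reject_null(xbar, mu0, s, n, z=1.96):
    # Two-sided test: reject H0: mu = mu0 when mu0 lies outside the CI.
    lo, hi = confidence_interval(xbar, s, n, z)
    return not (lo <= mu0 <= hi)
```

Comparisons of means and proportions then reduce to applying the same formula to the appropriate standard error, which is the unification the article argues for.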

  18. Three lessons for genetic toxicology from baseball analytics.

    PubMed

    Dertinger, Stephen D

    2017-07-01

    In many respects the evolution of baseball statistics mirrors advances made in the field of genetic toxicology. From its inception, baseball and statistics have been inextricably linked. Generations of players and fans have used a number of relatively simple measurements to describe team and individual player's current performance, as well as for historical record-keeping purposes. Over the years, baseball analytics has progressed in several important ways. Early advances were based on deriving more meaningful metrics from simpler forerunners. Now, technological innovations are delivering much deeper insights. Videography, radar, and other advances that include automatic player recognition capabilities provide the means to measure more complex and useful factors. Fielders' reaction times, efficiency of the route taken to reach a batted ball, and pitch-framing effectiveness come to mind. With the current availability of complex measurements from multiple data streams, multifactorial analyses occurring via machine learning algorithms have become necessary to make sense of the terabytes of data that are now being captured in every Major League Baseball game. Collectively, these advances have transformed baseball statistics from being largely descriptive in nature to serving data-driven, predictive roles. Whereas genetic toxicology has charted a somewhat parallel course, a case can be made that greater utilization of baseball's mindset and strategies would serve our scientific field well. This paper describes three useful lessons for genetic toxicology, courtesy of the field of baseball analytics: seek objective knowledge; incorporate multiple data streams; and embrace machine learning. Environ. Mol. Mutagen. 58:390-397, 2017. © 2017 Wiley Periodicals, Inc.

  19. Decisional conflict in asthma patients: a cross sectional study.

    PubMed

    Des Cormiers, Annick; Légaré, France; Simard, Serge; Boulet, Louis-Philippe

    2015-01-01

    This study aimed at determining the level of decisional conflict in asthmatic individuals facing recommendation-based decisions intended to improve asthma control. This was a cross-sectional study performed on a convenience sample of 50 adults aged between 18 and 65 years with a diagnosis of asthma. They completed a decisional conflict scale (possible range 0-100%), asthma knowledge and control questionnaires (both 0-100%), and a general questionnaire on socio-demographic characteristics. Decisional conflict was considered clinically significant at a score greater than 37.5%. Simple descriptive statistics were used to investigate associations with decisional conflict. Participants were mainly women (76%) and diagnosed with mild asthma (72%). The median age (1st and 3rd quartile) was 25 years (22 and 42). The median score (1st and 3rd quartile) of decisional conflict was 33% (24 and 44). A clinically significant score (>37.5%) was obtained in 36% of subjects. A statistically significant negative correlation between the knowledge score and the decisional conflict score (r(p) = -0.38; p = 0.006) was observed. The level of knowledge was the only statistically independent variable associated with the decisional conflict score (p = 0.0043). A considerable proportion of patients with asthma have a clinically significant level of decisional conflict when facing decisions aimed at improving asthma control. Patients with poor knowledge of asthma are more at risk of a clinically significant level of decisional conflict. These findings support the relevance of providing asthmatic patients with relevant information in decision aids.
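The knowledge-conflict association above is a Pearson product-moment correlation; a minimal pure-Python sketch (function name and any data used with it are illustrative, not the study's) is:

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient.

    Minimal illustrative sketch; not code from the study.
    """
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))  # covariance sum
    sxx = sum((a - mx) ** 2 for a in x)                   # variance sums
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)
```

A negative r, as reported above (r = -0.38), means higher knowledge scores tend to go with lower decisional-conflict scores.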

  20. The effect of group bibliotherapy on the self-esteem of female students living in dormitory.

    PubMed

    Salimi, Sepideh; Zare-Farashbandi, Firoozeh; Papi, Ahmad; Samouei, Rahele; Hassanzadeh, Akbar

    2014-01-01

    Bibliotherapy is a supplementary, simple, inexpensive and readily available treatment method performed with the cooperation of librarians and psychologists or physicians. The aim of this study was to investigate the effect of group bibliotherapy on the self-esteem of female students of Isfahan University of Medical Sciences living in dormitories in 2012. The present study is an interventional semi-experimental study with a pre-test, a post-test and a control group. The statistical population consisted of 32 female students residing in Isfahan University of Medical Sciences dormitories, who were divided randomly into case and control groups. Data were collected with the Coopersmith Self-Esteem questionnaire (Cronbach's alpha: 0.85). Both groups completed the questionnaire in the pre-test. The case group received group bibliotherapy for 2 months (8 sessions of 2 hours), while the control group received no training at all. Both groups were then assessed in a post-test after 1 month. Descriptive statistics (means and frequency distributions) and inferential statistics (independent t-test, paired t-test and Mann-Whitney test) were used, and data were analyzed with SPSS 20 software. The findings showed that group bibliotherapy had a positive and significant effect on the general, family, professional and total self-esteem of female students living in dormitories, but had no effect on their social self-esteem. Group bibliotherapy can increase female students' self-esteem levels. Moreover, conducting such studies can not only improve people's mental health but also improve their reading habits.

  1. Outcomes of photorefractive keratectomy in patients with atypical topography.

    PubMed

    Movahedan, Hossein; Namvar, Ehsan; Farvardin, Mohsen

    2017-11-01

    Photorefractive keratectomy (PRK) carries a risk of serious complications such as corneal ectasia, which can reduce corrected distance visual acuity, and the rate of complications is higher in patients with atypical topography. To determine the outcomes of photorefractive keratectomy in patients with atypical topography, this cross-sectional study was done in 2015 in Shiraz, Iran. We included 85 eyes, selected using a simple random sampling method. All patients were evaluated for uncorrected distance visual acuity, corrected distance visual acuity, manifest refraction, corneal topography, central corneal thickness using Pentacam, slit-lamp microscopy, and detailed fundus evaluation. The postoperative examination was done 1-7 years after surgery. Data were analyzed using IBM SPSS version 21.0. To analyze the data, descriptive statistics (frequency, percentage, mean, and standard deviation), chi-square, and independent samples t-tests were used. Among the patients, 23 (27.1%) were male and 62 (72.9%) were female. Mean age of the participants was 28.25±5.55 years. Mean postoperative refraction was -0.37±0.55 diopters. Keratoconus or corneal ectasia was not reported in any patient in this study. There was no statistically significant difference between the SI index before and after the operation (p=0.736). Mean preoperative refraction was -3.84±1.46 diopters in males and -4.20±1.96 diopters in females; the difference was not statistically significant (p=0.435). PRK is a safe and efficient photorefractive surgery and is associated with a low complication rate in patients with atypical topography.

  2. Weak lensing shear and aperture mass from linear to non-linear scales

    NASA Astrophysics Data System (ADS)

    Munshi, Dipak; Valageas, Patrick; Barber, Andrew J.

    2004-05-01

    We describe the predictions for the smoothed weak lensing shear, γs, and aperture mass, Map, of two simple analytical models of the density field: the minimal tree model and the stellar model. Both models give identical results for the statistics of the three-dimensional density contrast smoothed over spherical cells and only differ in the detailed angular dependence of the many-body density correlations. We have shown in previous work that they also yield almost identical results for the probability distribution function (PDF) of the smoothed convergence, κs. We find that the two models give rather close results for both the shear and the positive tail of the aperture mass. However, we note that at small angular scales (θs <~ 2 arcmin) the tail of the PDF for negative Map shows a strong variation between the two models, and the stellar model actually breaks down for θs <~ 0.4 arcmin and Map < 0. This shows that the statistics of the aperture mass provide a very precise probe of the detailed structure of the density field, as they are sensitive to both the amplitude and the detailed angular behaviour of the many-body correlations. On the other hand, the minimal tree model shows good agreement with numerical simulations over all the scales and redshifts of interest, while both models provide a good description of the PDF of the smoothed shear components. Therefore, the shear and the aperture mass provide robust and complementary tools to measure the cosmological parameters as well as the detailed statistical properties of the density field.

  3. Automated detection of hospital outbreaks: A systematic review of methods

    PubMed Central

    Buckeridge, David L.; Lepelletier, Didier

    2017-01-01

    Objectives Several automated algorithms for epidemiological surveillance in hospitals have been proposed. However, the usefulness of these methods to detect nosocomial outbreaks remains unclear. The goal of this review was to describe outbreak detection algorithms that have been tested within hospitals, consider how they were evaluated, and synthesize their results. Methods We developed a search query using keywords associated with hospital outbreak detection and searched the MEDLINE database. To ensure the highest sensitivity, no limitations were initially imposed on publication languages and dates, although we subsequently excluded studies published before 2000. Every study that described a method to detect outbreaks within hospitals was included, without any exclusion based on study design. Additional studies were identified through citations in retrieved studies. Results Twenty-nine studies were included. The detection algorithms were grouped into 5 categories: simple thresholds (n = 6), statistical process control (n = 12), scan statistics (n = 6), traditional statistical models (n = 6), and data mining methods (n = 4). The evaluation of the algorithms was often solely descriptive (n = 15), but more complex epidemiological criteria were also investigated (n = 10). The performance measures varied widely between studies: e.g., the sensitivity of an algorithm in a real world setting could vary between 17 and 100%. Conclusion Even if outbreak detection algorithms are useful complementary tools for traditional surveillance, the heterogeneity in results among published studies does not support quantitative synthesis of their performance. A standardized framework should be followed when evaluating outbreak detection methods to allow comparison of algorithms across studies and synthesis of results. PMID:28441422

  4. Simple taper: Taper equations for the field forester

    Treesearch

    David R. Larsen

    2017-01-01

    "Simple taper" is a set of linear equations based on stem taper rates; the intent is to provide taper-equation functionality to field foresters. The equation parameters are two taper rates based on differences in diameter outside bark at two points on a tree. The simple taper equations are statistically equivalent to more complex equations. The linear...
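The abstract's notion of a taper rate derived from outside-bark diameters at two stem heights can be sketched as a generic linear taper. This is an illustration under our own assumptions and naming, not Larsen's published parameterization:

```python
def taper_rate(d1, h1, d2, h2):
    """Diameter change per unit height between two measurement points.

    d1, d2: diameters outside bark at heights h1 < h2 (hypothetical inputs).
    """
    return (d1 - d2) / (h2 - h1)

def diameter_at(h, d1, h1, rate):
    # Linear taper: diameter shrinks at `rate` per unit height above h1.
    return d1 - rate * (h - h1)
```

With two field measurements, a forester could interpolate a diameter anywhere between them; the published equations presumably handle the details this sketch ignores.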

  5. Using Simple Linear Regression to Assess the Success of the Montreal Protocol in Reducing Atmospheric Chlorofluorocarbons

    ERIC Educational Resources Information Center

    Nelson, Dean

    2009-01-01

    Following the Guidelines for Assessment and Instruction in Statistics Education (GAISE) recommendation to use real data, an example is presented in which simple linear regression is used to evaluate the effect of the Montreal Protocol on atmospheric concentration of chlorofluorocarbons. This simple set of data, obtained from a public archive, can…
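The kind of fit the exercise calls for can be sketched with a plain ordinary-least-squares helper; `fit_line` is our own name and a generic sketch, and real use would substitute the public CFC archive data the article describes:

```python
def fit_line(x, y):
    """Ordinary least squares for y = a + b * x.

    Generic illustrative sketch, not code from the article.
    """
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    # Slope: covariance of x and y over variance of x.
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    a = my - b * mx   # intercept through the mean point
    return a, b
```

The fitted slope b estimates the annual change in atmospheric concentration, which is the quantity a Montreal Protocol assessment would turn on.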

  6. Defining Nitrogen Kinetics for Air Break in Prebreath

    NASA Technical Reports Server (NTRS)

    Conkin, Johnny

    2010-01-01

    Actual tissue nitrogen (N2) kinetics are complex; uptake and elimination are often approximated with a single half-time compartment in statistical descriptions of denitrogenation [prebreathe (PB)] protocols. Air breaks during PB complicate N2 kinetics. A comparison of symmetrical versus asymmetrical N2 kinetics was performed using the time to onset of hypobaric decompression sickness (DCS) as a surrogate for actual venous N2 tension. METHODS: Published results of 12 tests involving 179 hypobaric exposures in altitude chambers after PB, with and without air breaks, provide the complex protocols from which to model N2 kinetics. DCS survival time for combined control and air-break exposures was described with an accelerated log-logistic model in which N2 uptake and elimination before, during, and after the air break was computed with a simple exponential function or a function that changed half-time depending on ambient N2 partial pressure. P1N2 - P2 = ΔP defined the decompression dose for each altitude exposure, where P2 was the test altitude and P1N2 was the computed N2 pressure at the beginning of the altitude exposure. RESULTS: The log likelihood (LL) without decompression dose (null model) was -155.6, and improved (best fit) to -97.2 when dose was defined with a 240-min half-time for both N2 elimination and uptake during the PB. The description of DCS survival time was less precise with asymmetrical N2 kinetics; for example, LL was -98.9 with 240-min half-time elimination and 120-min half-time uptake. CONCLUSION: The statistical regression described survival time mechanistically linked to symmetrical N2 kinetics during PBs that also included air breaks. The results are data-specific, and additional data may change the conclusion. The regression is useful to compute additional PB time to compensate for an air break in PB within the narrow range of tested conditions.

  7. Investigating the Influence of Teachers' Characteristics on the Teacher-Student Relations from Students' Perspective at Ilam University of Medical Sciences.

    PubMed

    Maleki, Farajolah; Talaei, Mehri Hosein; Moghadam, Seyed Rahmatollah Mousavi; Shadigo, Shahryar; Taghinejad, Hamid; Mirzaei, Alireza

    2017-06-01

    Establishing an effective teacher-student relationship may affect the quality of learning. Such a complex human relationship may be influenced by various factors beyond the teacher and student themselves. The present study aimed at investigating the influence of teacher characteristics on the teacher-student relationship from the students' perspective. In this descriptive survey research, the statistical population comprised 1500 students at Ilam University of Medical Sciences, Ilam, Iran, of whom 281 were selected by simple random sampling; they received and completed a series of questionnaires. Data were collected using a researcher-made questionnaire containing 37 Likert-type items, of which five measured the demographic profile of participants and 32 measured the impact of teacher characteristics on the teacher-student relationship. Data were analysed with SPSS version 16 using descriptive statistics, t-tests and one-way ANOVA. The study included 281 students (117 (41.6%) male, 164 (58.4%) female) at Ilam University of Medical Sciences. The effect of teachers' characteristics on the teacher-student relationship from the students' perspective in three areas (personal, professional and scientific) scored 4.37±0.54, 4.05±0.27, and 3.91±0.44, respectively. The highest score related to "respect for students" (Mean=4.74, SD=0.55) and the lowest to 'gender' (Mean=2.40, SD=0.64). The effect of the other studied parameters was also higher than the average level. The findings indicated that the teacher-student relationship, and consequently the quality of education, was shaped by the overall characteristics of the teacher (personal, professional and scientific). Notably, coupled with the professional and scientific qualities of the teacher, his/her communication skills may also help to provide favourable learning conditions for the students. Therefore, attention to the education of scientific as well as professional skills of teachers in interaction with students, through appropriate workshops and training courses, is of great necessity.

  8. Defining Nitrogen Kinetics for Air Break in Prebreathe

    NASA Technical Reports Server (NTRS)

    Conkin, Johnny

    2009-01-01

    Actual tissue nitrogen (N2) kinetics are complex; uptake and elimination are often approximated with a single half-time compartment in statistical descriptions of denitrogenation [prebreathe (PB)] protocols. Air breaks during PB complicate N2 kinetics. A comparison of symmetrical versus asymmetrical N2 kinetics was performed using the time to onset of hypobaric decompression sickness (DCS) as a surrogate for actual venous N2 tension. Published results of 12 tests involving 179 hypobaric exposures in altitude chambers after PB, with and without air breaks, provide the complex protocols from which to model N2 kinetics. DCS survival time for combined control and air-break exposures was described with an accelerated log-logistic model in which N2 uptake and elimination before, during, and after the air break was computed with a simple exponential function or a function whose half-time changed with ambient N2 partial pressure. P1N2 - P2 = ΔP defined DCS dose for each altitude exposure, where P2 was the test altitude and P1N2 was the computed N2 pressure at the beginning of the altitude exposure. The log likelihood (LL) without DCS dose (null model) was -155.6, and improved (best fit) to -97.2 when dose was defined with a 240-min half-time for both N2 elimination and uptake during the PB. The description of DCS survival time was less precise with asymmetrical N2 kinetics; for example, LL was -98.9 with a 240-min elimination half-time and a 120-min uptake half-time. The statistical regression described survival time mechanistically linked to symmetrical N2 kinetics during PBs that also included air breaks. The results are data-specific, and additional data may change the conclusion. The regression is useful for computing additional PB time to compensate for an air break in PB within the narrow range of tested conditions.

  9. Toward unsupervised outbreak detection through visual perception of new patterns

    PubMed Central

    Lévy, Pierre P; Valleron, Alain-Jacques

    2009-01-01

    Background Statistical algorithms are routinely used to detect outbreaks of well-defined syndromes, such as influenza-like illness. These methods cannot be applied to the detection of emerging diseases for which no preexisting information is available. This paper presents a method aimed at facilitating the detection of outbreaks, when there is no a priori knowledge of the clinical presentation of cases. Methods The method uses a visual representation of the symptoms and diseases coded during a patient consultation according to the International Classification of Primary Care 2nd version (ICPC-2). The surveillance data are transformed into color-coded cells, ranging from white to red, reflecting the increasing frequency of observed signs. They are placed in a graphic reference frame mimicking body anatomy. Simple visual observation of color-change patterns over time, concerning a single code or a combination of codes, enables detection in the setting of interest. Results The method is demonstrated through retrospective analyses of two data sets: description of the patients referred to the hospital by their general practitioners (GPs) participating in the French Sentinel Network and description of patients directly consulting at a hospital emergency department (HED). Informative image color-change alert patterns emerged in both cases: the health consequences of the August 2003 heat wave were visualized with GPs' data (but passed unnoticed with conventional surveillance systems), and the flu epidemics, which are routinely detected by standard statistical techniques, were recognized visually with HED data. Conclusion Using human visual pattern-recognition capacities to detect the onset of unexpected health events implies a convenient image representation of epidemiological surveillance and well-trained "epidemiology watchers". 
Once these two conditions are met, one could imagine that the epidemiology watchers could signal epidemiological alerts, based on "image walls" presenting the local, regional and/or national surveillance patterns, with specialized field epidemiologists assigned to validate the signals detected. PMID:19515246
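The white-to-red coding of surveillance cells described above can be sketched as a simple linear fade (an illustrative reconstruction; `frequency_to_rgb` is an invented helper, not the authors' implementation):

```python
def frequency_to_rgb(freq, max_freq):
    """Map an observed sign frequency to a white-to-red RGB cell color.

    Zero maps to white (255, 255, 255) and max_freq to pure red
    (255, 0, 0); in between, the green and blue channels fade linearly.
    Illustrative reconstruction only, not the authors' code.
    """
    f = min(max(freq / max_freq, 0.0), 1.0) if max_freq else 0.0
    fade = round(255 * (1.0 - f))
    return (255, fade, fade)
```

Each ICPC-2 code (or combination of codes) would get one such cell per time period, placed at its anatomical position in the reference frame.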

  10. Asymptotically Optimal and Private Statistical Estimation

    NASA Astrophysics Data System (ADS)

    Smith, Adam

    Differential privacy is a definition of "privacy" for statistical databases. The definition is simple, yet it implies strong semantics even in the presence of an adversary with arbitrary auxiliary information about the database.
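The definition can be made concrete with the standard Laplace mechanism for releasing a noisy count, the textbook way to satisfy ε-differential privacy (a generic illustration, not taken from the talk; the function name and parameters are assumptions):

```python
import math
import random

def laplace_mechanism(true_value, sensitivity, epsilon, rng=random):
    """Release `true_value` plus Laplace noise of scale sensitivity/epsilon.

    For a counting query (sensitivity 1), Laplace(1/epsilon) noise
    satisfies epsilon-differential privacy. The noise is drawn by
    inverting the Laplace CDF at a uniform sample.
    """
    scale = sensitivity / epsilon
    u = rng.random() - 0.5                      # uniform on [-0.5, 0.5)
    sign = 1.0 if u >= 0 else -1.0
    return true_value - scale * sign * math.log(1.0 - 2.0 * abs(u))

# A seeded generator makes the (normally random) release reproducible.
noisy_count = laplace_mechanism(100, 1.0, 0.5, random.Random(42))
```

Smaller ε means stronger privacy and larger noise; the guarantee holds regardless of the adversary's auxiliary information, which is the point the abstract emphasizes.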

  11. An Analysis of Research Methods and Statistical Techniques Used by Doctoral Dissertation at the Education Sciences in Turkey

    ERIC Educational Resources Information Center

    Karadag, Engin

    2010-01-01

    To assess the research methods and statistical analysis techniques employed by educational researchers, this study surveyed unpublished doctoral dissertations from 2003 to 2007. Frequently used research methods included experimental research, surveys, correlational studies, and case studies. Descriptive statistics, t-test, ANOVA, factor…

  12. Risk Factors for Sexual Violence in the Military: An Analysis of Sexual Assault and Sexual Harassment Incidents and Reporting

    DTIC Science & Technology

    2017-03-01

    Table 1. Descriptive Statistics for Control Variables by… Statistics for Control Variables by Gender (Random Subsample with Complete Survey)… empirical analysis. Chapter IV describes the summary statistics and results. Finally, Chapter V offers concluding thoughts, study limitations, and

  13. What We Know about Community College Low-Income and Minority Student Outcomes: Descriptive Statistics from National Surveys

    ERIC Educational Resources Information Center

    Bailey, Thomas; Jenkins, Davis; Leinbach, Timothy

    2005-01-01

    This report summarizes the latest available national statistics on access and attainment by low-income and minority community college students. The data come from the National Center for Education Statistics' (NCES) Integrated Postsecondary Education Data System (IPEDS) annual surveys of all postsecondary educational institutions and the NCES…

  14. A First Assignment to Create Student Buy-In in an Introductory Business Statistics Course

    ERIC Educational Resources Information Center

    Newfeld, Daria

    2016-01-01

    This paper presents a sample assignment to be administered after the first two weeks of an introductory business focused statistics course in order to promote student buy-in. This assignment integrates graphical displays of data, descriptive statistics and cross-tabulation analysis through the lens of a marketing analysis study. A marketing sample…

  15. Using Statistical Process Control to Make Data-Based Clinical Decisions.

    ERIC Educational Resources Information Center

    Pfadt, Al; Wheeler, Donald J.

    1995-01-01

    Statistical process control (SPC), which employs simple statistical tools and problem-solving techniques such as histograms, control charts, flow charts, and Pareto charts to implement continual product improvement procedures, can be incorporated into human service organizations. Examples illustrate use of SPC procedures to analyze behavioral data…
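An individuals (XmR) chart, one of the simple SPC tools Wheeler advocates, can be sketched as follows (a minimal sketch assuming the conventional 2.66 screening constant for the average moving range, not the article's own procedure):

```python
def xmr_limits(values):
    """Natural process limits for an individuals (XmR) control chart.

    Applies the conventional screening constant 2.66 to the average
    moving range; points outside [lcl, ucl] signal special causes
    worth investigating. Assumption-level sketch only.
    """
    mean = sum(values) / len(values)
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    return mean - 2.66 * mr_bar, mean + 2.66 * mr_bar

# Example: weekly counts of a target behavior in a clinical record.
lcl, ucl = xmr_limits([10, 12, 11, 13, 12])
```

A behavioral data point falling outside these limits would prompt a clinical review, while points inside them reflect routine variation.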

  16. Tested Demonstrations.

    ERIC Educational Resources Information Center

    Gilbert, George L., Ed.

    1988-01-01

    Details three demonstrations for use in chemistry classrooms. Includes: "A Demonstration of Corrosion by Differential Aeration"; "A Simple Demonstration of the Activation Energy Concept"; and "A Boiling Demonstration at Room Temperature." Each description includes equipment, materials, and methods. (CW)

  17. Portable design rules for bulk CMOS

    NASA Technical Reports Server (NTRS)

    Griswold, T. W.

    1982-01-01

    It is pointed out that for the past several years, one school of IC designers has used a simplified set of nMOS geometric design rules (GDR) which is 'portable', in that it can be used by many different nMOS manufacturers. The present investigation is concerned with a preliminary set of design rules for bulk CMOS which has been verified for simple test structures. The GDR are defined in terms of Caltech Intermediate Form (CIF), which is a geometry-description language that defines simple geometrical objects in layers. The layers are abstractions of physical mask layers. The design rules do not presume the existence of any particular design methodology. Attention is given to p-well and n-well CMOS processes, bulk CMOS and CMOS-SOS, CMOS geometric rules, and a description of the advantages of CMOS technology.

  18. Self-organization of cosmic radiation pressure instability. II - One-dimensional simulations

    NASA Technical Reports Server (NTRS)

    Hogan, Craig J.; Woods, Jorden

    1992-01-01

    The clustering of statistically uniform discrete absorbing particles moving solely under the influence of radiation pressure from uniformly distributed emitters is studied in a simple one-dimensional model. Radiation pressure tends to amplify statistical clustering in the absorbers; the absorbing material is swept into empty bubbles, the biggest bubbles grow bigger almost as they would in a uniform medium, and the smaller ones get crushed and disappear. Numerical simulations of a one-dimensional system are used to support the conjecture that the system is self-organizing. Simple statistics indicate that a wide range of initial conditions produce structure approaching the same self-similar statistical distribution, whose scaling properties follow those of the attractor solution for an isolated bubble. The importance of the process for large-scale structuring of the interstellar medium is briefly discussed.

  19. Ab initio joint density-functional theory of solvated electrodes, with model and explicit solvation

    NASA Astrophysics Data System (ADS)

    Arias, Tomas

    2015-03-01

    First-principles guided design of improved electrochemical systems has the potential for great societal impact by making non-fossil-fuel systems economically viable. Potential applications include improvements in fuel cells, solar-fuel systems ("artificial photosynthesis"), supercapacitors and batteries. Economical fuel-cell systems would enable zero-carbon-footprint transportation, solar-fuel systems would directly convert sunlight and water into hydrogen fuel for such fuel-cell vehicles, supercapacitors would enable nearly full recovery of energy lost during vehicle braking, thus extending electric vehicle range and acceptance, and economical high-capacity batteries would be central to mitigating the indeterminacy of renewable resources such as wind and solar. Central to the operation of all of the above electrochemical systems is the electrode-electrolyte interface, whose underlying physics is quite rich, yet remains remarkably poorly understood. The essential technical challenge for the first-principles studies that could explore this physics is the need to properly represent simultaneously both the interaction between electron-transfer events at the electrode, which demand a quantum mechanical description, and multiscale phenomena in the liquid environment such as the electrochemical double layer (ECDL) and its associated shielding, which demand a statistical description. A direct ab initio approach to this challenge would, in principle, require statistical sampling and thousands of repetitions of already computationally demanding quantum mechanical calculations. This talk will begin with a brief review of a recent advance, joint density-functional theory (JDFT), which allows for a fully rigorous and, in principle, exact representation of the thermodynamic equilibrium between a system described at the quantum-mechanical level and a liquid environment, but without the need for costly sampling.
We shall then demonstrate how this approach applies in the electrochemical context and how it is needed for a realistic description of solvated electrode systems, and how simple "implicit" polarized continuum methods fail radically in this context. Finally, we shall present a series of results relevant to battery, supercapacitor, and solar-fuel systems, one of which has led to a recent invention disclosure for improving battery cycle lifetimes. Supported as a part of the Energy Materials Center at Cornell, an Energy Frontier Research Center funded by DOE/BES (award DE-SC0001086) and by the New York State Division of Science, Technology and Innovation (NYSTAR, award 60923).

  20. Interdisciplinary evaluation of dysphagia: clinical swallowing evaluation and videoendoscopy of swallowing.

    PubMed

    Sordi, Marina de; Mourão, Lucia Figueiredo; Silva, Ariovaldo Armando da; Flosi, Luciana Claudia Leite

    2009-01-01

    Patients with dysphagia have impairments in many aspects, and an interdisciplinary approach is fundamental to define diagnosis and treatment. A joint approach combining clinical and videoendoscopic evaluation is paramount. AIM: to study the correlation between the clinical assessment (ACD) and the videoendoscopic (VED) assessment of swallowing by classifying the degree of severity and through qualitative/descriptive analyses of the procedures. STUDY DESIGN: cross-sectional, descriptive and comparative, conducted from March to December 2006 at the Otolaryngology/Dysphagia ward of a hospital in the countryside of São Paulo. Thirty dysphagic patients with different disorders were assessed by ACD and VED. The data were classified by means of severity scales and qualitative/descriptive analysis. RESULTS: the correlation between the ACD and VED severity scales pointed to a statistically significant low agreement (Kappa = 0.4; p = 0.006). The correlation between the qualitative/descriptive analyses pointed to an excellent and statistically significant agreement (Kappa = 0.962; p < 0.001) for the entire sample. CONCLUSION: the low agreement between the severity scales points to a need to perform both procedures, reinforcing VED as a feasible procedure. The qualitative/descriptive analysis showed excellent agreement, reinforcing the need to understand swallowing as a process.

  1. R and Spatial Data

    EPA Science Inventory

    R is an open source language and environment for statistical computing and graphics that can also be used for both spatial analysis (i.e. geoprocessing and mapping of different types of spatial data) and spatial data analysis (i.e. the application of statistical descriptions and ...

  2. WASP (Write a Scientific Paper) using Excel - 2: Pivot tables.

    PubMed

    Grech, Victor

    2018-02-01

    Data analysis at the descriptive stage and the eventual presentation of results requires the tabulation and summarisation of data. This exercise should always precede inferential statistics. Pivot tables and pivot charts are one of Excel's most powerful and underutilised features, with tabulation functions that immensely facilitate descriptive statistics. Pivot tables permit users to dynamically summarise and cross-tabulate data, create tables in several dimensions, offer a range of summary statistics and can be modified interactively with instant outputs. Large and detailed datasets are thereby easily manipulated making pivot tables arguably the best way to explore, summarise and present data from many different angles. This second paper in the WASP series in Early Human Development provides pointers for pivot table manipulation in Excel™. Copyright © 2018 Elsevier B.V. All rights reserved.
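The cross-tabulation at the heart of a pivot table can be emulated in a few lines of plain Python (a hedged sketch that mimics the counting behaviour rather than using Excel; the field names and records are invented for illustration):

```python
from collections import Counter

def pivot_count(rows, row_key, col_key):
    """Cross-tabulate records: counts per (row_key, col_key) pair."""
    counts = Counter((r[row_key], r[col_key]) for r in rows)
    row_labels = sorted({pair[0] for pair in counts})
    col_labels = sorted({pair[1] for pair in counts})
    return {r: {c: counts.get((r, c), 0) for c in col_labels}
            for r in row_labels}

# Invented example records; a pivot table would summarise these the
# same way, with "sex" on the rows and "ward" on the columns.
patients = [
    {"sex": "F", "ward": "A"},
    {"sex": "F", "ward": "B"},
    {"sex": "M", "ward": "A"},
    {"sex": "F", "ward": "A"},
]
table = pivot_count(patients, "sex", "ward")
```

What the pivot-table interface adds over this sketch is interactivity: the same summary can be regrouped, filtered, or switched to other summary statistics without rewriting anything.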

  3. A NEW LOOK AT THE EFFECTS OF ANXIETY AND STRESS ON THE PERFORMANCE OF COMPLEX INTELLECTUAL TASKS, STUDY II. SCHOOL ANXIETY AND COGNITIVE FUNCTIONING--EXPLORATORY STUDIES.

    ERIC Educational Resources Information Center

    DUNN, JAMES A.

    THE EFFECTS OF TEST ANXIETY AND TEST STRESS ON THE PERFORMANCE OF TWO DIFFERENT INTELLECTUAL TASKS WERE STUDIED. IT WAS HYPOTHESIZED THAT THE DESCRIPTIVE EFFECTS OF ANXIETY WOULD BE GREATER FOR DIFFICULT BUT SIMPLE TASKS THAN FOR COMPLEX BUT EASY TASKS, AND THAT SITUATIONAL STRESS WOULD BE MORE DISRUPTIVE FOR COMPLEX TASKS THAN FOR SIMPLE TASKS. A…

  4. SOLAR OBLIQUITY INDUCED BY PLANET NINE: SIMPLE CALCULATION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lai, Dong

    2016-12-01

    Bailey et al. and Gomes et al. recently suggested that the 6° misalignment between the Sun’s rotational equator and the orbital plane of the major planets may be produced by forcing from the hypothetical Planet Nine on an inclined orbit. Here, we present a simple yet accurate calculation of the effect, which provides a clear description of how the Sun’s spin orientation depends on the property of Planet Nine in this scenario.

  5. S-SPatt: simple statistics for patterns on Markov chains.

    PubMed

    Nuel, Grégory

    2005-07-01

    S-SPatt allows the counting of pattern occurrences in text files and, assuming these texts are generated from a random Markovian source, the computation of the P-value of a given observation using a simple binomial approximation.
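The binomial approximation can be sketched as follows: treat the n possible text positions as independent Bernoulli trials with the pattern's per-position occurrence probability under the Markov model, and take the upper tail at the observed count (an illustration of the idea only, not S-SPatt's implementation):

```python
import math

def binomial_pvalue(n_positions, p_occurrence, observed):
    """Upper-tail P(X >= observed) for X ~ Binomial(n_positions, p).

    Illustrates the binomial approximation of a pattern-count P-value;
    p_occurrence would come from the fitted Markov model of the text.
    """
    return sum(
        math.comb(n_positions, k)
        * p_occurrence ** k
        * (1 - p_occurrence) ** (n_positions - k)
        for k in range(observed, n_positions + 1)
    )
```

A small P-value then flags a pattern observed more often than the Markovian source would predict.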

  6. Predictors of Errors of Novice Java Programmers

    ERIC Educational Resources Information Center

    Bringula, Rex P.; Manabat, Geecee Maybelline A.; Tolentino, Miguel Angelo A.; Torres, Edmon L.

    2012-01-01

    This descriptive study determined which of the sources of errors would predict the errors committed by novice Java programmers. Descriptive statistics revealed that the respondents perceived that they committed the identified eighteen errors infrequently. Thought error was perceived to be the main source of error during the laboratory programming…

  7. The Status of Child Nutrition Programs in Colorado.

    ERIC Educational Resources Information Center

    McMillan, Daniel C.; Vigil, Herminia J.

    This report provides descriptive and statistical data on the status of child nutrition programs in Colorado. The report contains descriptions of the National School Lunch Program, school breakfast programs, the Special Milk Program, the Summer Food Service Program, the Nutrition Education and Training Program, state dietary guidelines, Colorado…

  8. Statistical correlation of structural mode shapes from test measurements and NASTRAN analytical values

    NASA Technical Reports Server (NTRS)

    Purves, L.; Strang, R. F.; Dube, M. P.; Alea, P.; Ferragut, N.; Hershfeld, D.

    1983-01-01

    The software and procedures of a system of programs used to generate a report of the statistical correlation between NASTRAN modal analysis results and physical tests results from modal surveys are described. Topics discussed include: a mathematical description of statistical correlation, a user's guide for generating a statistical correlation report, a programmer's guide describing the organization and functions of individual programs leading to a statistical correlation report, and a set of examples including complete listings of programs, and input and output data.

  9. Mathematics of Sensing, Exploitation, and Execution (MSEE) Hierarchical Representations for the Evaluation of Sensed Data

    DTIC Science & Technology

    2016-06-01

    theories of the mammalian visual system, and exploiting descriptive text that may accompany a still image for improved inference. The focus of the Brown team was on single images. Keywords: computer vision, semantic description, street scenes, belief propagation, generative models, nonlinear filtering, sufficient statistics

  10. A Study of relationship between frailty and physical performance in elderly women.

    PubMed

    Jeoung, Bog Ja; Lee, Yang Chool

    2015-08-01

    Frailty is a disorder of multiple inter-related physiological systems. It is unclear whether levels of physical performance factors can serve as markers and signs of frailty. The purpose of this study was to examine the relationship between frailty and physical performance in elderly women. One hundred fourteen elderly women aged 65 to 80 years participated in this study. We measured the 6-min walk test, grip strength, 30-sec arm curl test, 30-sec chair stand test, 8-foot up-and-go, back scratch, chair sit-and-reach, unipedal stance, BMI, and frailty with a questionnaire. The collected data were analyzed by descriptive statistics, frequencies, correlation analysis, ANOVA, and simple linear regression using IBM SPSS version 21. Statistical tests showed significant differences between frailty and the 6-min walk test, 30-sec arm curl test, 30-sec chair stand test, grip strength, back scratch, and BMI. However, we did not find significant differences between frailty and the 8-foot up-and-go or unipedal stance. When the subjects were divided into five groups according to physical performance level, subjects with high 6-min walk, 30-sec arm curl, chair sit-and-reach, and grip strength scores had low frailty scores. Physical performance factors were strongly associated with decreased frailty, suggesting that physical performance improvements play an important role in preventing or reducing frailty.
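Simple linear regression of the kind used in the analysis above can be computed from first principles (a generic least-squares sketch, not the study's SPSS output; the example data are invented):

```python
def simple_linear_regression(xs, ys):
    """Least-squares intercept and slope for the model y = a + b*x.

    The slope is the ratio of the x-y cross-deviation sum to the
    x deviation sum of squares; the intercept follows from the means.
    """
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sxx = sum((x - mean_x) ** 2 for x in xs)
    slope = sxy / sxx
    intercept = mean_y - slope * mean_x
    return intercept, slope
```

Regressing a frailty score on, say, 6-min walk distance in this way yields the direction and magnitude of the association that the study reports.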

  11. Like Beauty, Complexity is Hard to Define

    NASA Astrophysics Data System (ADS)

    Tsallis, Constantino

    Like beauty, complexity is hard to define and rather easy to identify: nonlinear dynamics, strongly interconnected simple elements, some sort of divisoria aquarum between order and disorder. Before focusing on complexity, let us remember that the theoretical pillars of contemporary physics are mechanics (Newtonian, relativistic, quantum), Maxwell electromagnetism, and (Boltzmann-Gibbs, BG) statistical mechanics - obligatory basic disciplines in any advanced course in physics. The first-principle statistical-mechanical approach starts from (microscopic) electro-mechanics and the theory of probabilities, and, through a variety of possible mesoscopic descriptions, arrives at (macroscopic) thermodynamics. In the middle of this trip, we cross energy and entropy. Energy is related to the possible microscopic configurations of the system, whereas entropy is related to the corresponding probabilities. Therefore, in some sense, entropy represents a concept which, epistemologically speaking, is one step further than energy. The fact that energy is not parameter-independent is very familiar: the kinetic energy of a truck is very different from that of a fly, the relativistic energy of a fast electron is very different from its classical value, and so on. What about entropy? One hundred and forty years of tradition, and hundreds - we may even say thousands - of impressive theoretical successes of the parameter-free BG entropy have sedimented, in the minds of many scientists, the conviction that it is unique. However, it can be straightforwardly argued that, in general, this is not the case...

  12. Role of gynecologists in reproductive education of adolescent girls in Hungary.

    PubMed

    Varga-Tóth, Andrea; Paulik, Edit

    2015-05-01

    The aim of this study was to assess whether the socioeconomic characteristics of adolescent girls, their knowledge about cervical cancer screening, and their sexual activity are associated with whether or not they have already visited a gynecologist. A self-administered questionnaire-based study was performed among secondary school girls (n = 589) who participated in professional education provided by a pediatric and adolescent gynecologist. The questionnaire covered sociodemographic characteristics, sexual activity, and knowledge of contraceptive methods, cervical screening and the sources of that knowledge. Simple descriptive statistics, χ² and one-way ANOVA tests, multivariate logistic regression analysis and Pearson correlation were applied. All statistical analyses were carried out using SPSS 17.0 for Windows. A total of 50.3% of the adolescent girls had already had sexual contact. Half of the sexually active participants had already visited a gynecologist, and most of them did so because of some kind of complaint. Overall knowledge about cervical screening was quite low; higher knowledge was found among those who had visited a gynecologist. Adolescent girls' knowledge of cervical screening was improved by previous visits to a gynecologist. The participation of an expert, a gynecologist, in a comprehensive sexual education program for teenage girls is of high importance in Hungary. © 2014 The Authors. Journal of Obstetrics and Gynaecology Research © 2014 Japan Society of Obstetrics and Gynecology.

  13. Sexual violence against female university students in Ethiopia.

    PubMed

    Adinew, Yohannes Mehretie; Hagos, Mihiret Abreham

    2017-07-24

    Though many women suffer the consequences of sexual violence, only a few victims speak out, as the subject is sensitive and prone to stigma. This lack of data has made it difficult to get a full picture of the problem and to design proper interventions. Thus, the aim of this study was to assess the prevalence of, and factors associated with, sexual violence among female students of Wolaita Sodo University, south Ethiopia. An institution-based cross-sectional study was conducted among 462 regular female Wolaita Sodo University students on April 7, 2015. Participants were selected by simple random sampling. Data were collected by self-administered questionnaire. Data entry and analysis were done with the Epi Info and SPSS statistical packages, respectively. Descriptive statistics were computed; moreover, bivariate and multivariate analyses were carried out to identify predictors of sexual violence. The ages of respondents ranged from 18 to 26 years. Lifetime sexual violence was found to be 45.4%, while 36.1% and 24.4% of respondents reported experiencing sexual violence since entering university and in the current academic year, respectively. Lifetime sexual violence was positively associated with witnessing inter-parental violence as a child, rural childhood residence, having a regular boyfriend, alcohol consumption and having friends who drink regularly, while it was negatively associated with discussing sexual issues with parents. Sexual violence is a common phenomenon among the students. More detailed research should be conducted to develop prevention and intervention strategies.

  14. Effect of acupressure vs reflexology on pre-menstrual syndrome among adolescent girls--a pilot study.

    PubMed

    Padmavathi, P

    2014-01-01

    Premenstrual syndrome is the most common of gynaecologic complaints. It affects half of all female adolescents today and represents the leading cause of college/school absenteeism in that population. This study sought to assess the effectiveness of acupressure vs reflexology on premenstrual syndrome among adolescents. A two-group pre-test and post-test true experimental design was adopted. Forty adolescent girls from Government Girls Secondary School, Erode, with premenstrual syndrome and fulfilling the inclusion criteria were selected by simple random sampling. A pre-test was conducted using a premenstrual symptoms assessment scale. Immediately after the pre-test, acupressure or reflexology was given once a week for 6 weeks, and a post-test was then conducted to assess the effectiveness of treatment. Collected data were analysed using descriptive and inferential statistics. In the post-test, the mean score of experimental group I was 97.3 (SD = 2.5) and the group II mean score was 70.8 (SD = 10.71), with paired 't' values of 19.2 and 31.9. This showed that reflexology was more effective than acupressure in managing premenstrual syndrome. Statistically, no significant association was found between the post-test scores of the sample and their demographic variables. The findings imply the need for educating adolescent girls on effective management of premenstrual syndrome.

  15. Appraisal of within- and between-laboratory reproducibility of non-radioisotopic local lymph node assay using flow cytometry, LLNA:BrdU-FCM: comparison of OECD TG429 performance standard and statistical evaluation.

    PubMed

    Yang, Hyeri; Na, Jihye; Jang, Won-Hee; Jung, Mi-Sook; Jeon, Jun-Young; Heo, Yong; Yeo, Kyung-Wook; Jo, Ji-Hoon; Lim, Kyung-Min; Bae, SeungJin

    2015-05-05

    The mouse local lymph node assay (LLNA, OECD TG429) is an alternative test replacing conventional guinea pig tests (OECD TG406) for skin sensitization, but its use of a radioisotopic agent, (3)H-thymidine, deters its active dissemination. A new non-radioisotopic LLNA, LLNA:BrdU-FCM, employs a non-radioisotopic analog, 5-bromo-2'-deoxyuridine (BrdU), and flow cytometry. For an analogous method, the OECD TG429 performance standard (PS) advises that two reference compounds be tested repeatedly and that the ECt (threshold) values obtained fall within acceptable ranges to prove within- and between-laboratory reproducibility. However, these criteria are somewhat arbitrary, and the sample size for ECt is less than 5, raising concerns about insufficient reliability. Here, we explored various statistical methods to evaluate the reproducibility of LLNA:BrdU-FCM with the stimulation index (SI), the raw data for ECt calculation, produced by 3 laboratories. Descriptive statistics along with graphical representation of SI were presented. For inferential statistics, parametric and non-parametric methods were applied to test the reproducibility of the SI of a concurrent positive control, and the robustness of the results was investigated. Descriptive statistics and graphical representation of SI alone could illustrate within- and between-laboratory reproducibility. Inferential statistics employing parametric and nonparametric methods drew similar conclusions. While all labs passed the within- and between-laboratory reproducibility criteria given by the OECD TG429 PS based on ECt values, statistical evaluation based on SI values showed that only two labs achieved within-laboratory reproducibility. For the two labs that satisfied within-lab reproducibility, between-laboratory reproducibility could also be attained based on inferential as well as descriptive statistics. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  16. Statistical Analysis of Research Data | Center for Cancer Research

    Cancer.gov

    Recent advances in cancer biology have resulted in the need for increased statistical analysis of research data. The Statistical Analysis of Research Data (SARD) course will be held on April 5-6, 2018 from 9 a.m.-5 p.m. at the National Institutes of Health's Natcher Conference Center, Balcony C on the Bethesda Campus. SARD is designed to provide an overview on the general principles of statistical analysis of research data.  The first day will feature univariate data analysis, including descriptive statistics, probability distributions, one- and two-sample inferential statistics.
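    The univariate techniques listed above (descriptive statistics and a two-sample test of a mean difference) can be sketched in a few lines; the two samples are invented for illustration:

```python
import statistics
from scipy import stats

# Two hypothetical samples, e.g. a measurement in control vs. treated groups.
control = [4.2, 3.9, 4.5, 4.1, 4.3, 4.0]
treated = [5.1, 4.8, 5.4, 5.0, 5.3, 4.9]

# Descriptive statistics: mean and standard deviation of each group.
print(statistics.mean(control), statistics.stdev(control))
print(statistics.mean(treated), statistics.stdev(treated))

# Two-sample t-test of the mean difference (Welch's variant, which does
# not assume equal variances).
t_stat, p_value = stats.ttest_ind(treated, control, equal_var=False)
print(p_value)
```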

  17. [Application of statistics on chronic-diseases-relating observational research papers].

    PubMed

    Hong, Zhi-heng; Wang, Ping; Cao, Wei-hua

    2012-09-01

    To study the application of statistics in observational research papers on chronic diseases recently published in Chinese Medical Association magazines with an influence index above 0.5. Using a self-developed criterion, two investigators independently assessed the application of statistics in these magazines; differing opinions were resolved through discussion. A total of 352 papers from 6 magazines, including the Chinese Journal of Epidemiology, Chinese Journal of Oncology, Chinese Journal of Preventive Medicine, Chinese Journal of Cardiology, Chinese Journal of Internal Medicine and Chinese Journal of Endocrinology and Metabolism, were reviewed. The rates of clear statement of the research objectives, target audience, sample issues, inclusion criteria and variable definitions were 99.43%, 98.57%, 95.43%, 92.86% and 96.87%, respectively. The rates of correct description of quantitative and qualitative data were 90.94% and 91.46%, respectively. The rates of correctly expressing the results of statistical inference methods related to quantitative data, qualitative data and modeling were 100%, 95.32% and 87.19%, respectively, and 89.49% of the conclusions directly responded to the research objectives. However, 69.60% of the papers did not state the exact name of the statistical study design used, and 11.14% lacked a statement of the exclusion criteria. Only 5.16% of the papers clearly explained the sample-size estimation, and only 24.21% clearly described the variable value assignment. The rate of introducing the statistical procedures and database methods used was only 24.15%, 18.75% of the papers did not express the statistical inference methods sufficiently, and a quarter of the papers did not use 'standardization' appropriately. As for statistical inference, the rate of describing the prerequisites of statistical testing was only 24.12%, while 9.94% of the papers did not employ the statistical inference method that should have been used. The main deficiencies in the application of statistics in these chronic-disease-related observational research papers were as follows: lack of sample-size determination, insufficient description of variable value assignment, statistical methods not introduced clearly or properly, and lack of consideration of the prerequisites for statistical inference.

  18. Polymers at interfaces and in colloidal dispersions.

    PubMed

    Fleer, Gerard J

    2010-09-15

    This review is an extended version of the Overbeek lecture 2009, given at the occasion of the 23rd Conference of ECIS (European Colloid and Interface Society) in Antalya, where I received the fifth Overbeek Gold Medal awarded by ECIS. I first summarize the basics of numerical SF-SCF: the Scheutjens-Fleer version of Self-Consistent-Field theory for inhomogeneous systems, including polymer adsorption and depletion. The conformational statistics are taken from the (non-SCF) DiMarzio-Rubin lattice model for homopolymer adsorption, which enumerates the conformational details exactly by a discrete propagator for the endpoint distribution but does not account for polymer-solvent interaction and for the volume-filling constraint. SF-SCF corrects for this by adjusting the field such that it becomes self-consistent. The model can be generalized to more complex systems: polydispersity, brushes, random and block copolymers, polyelectrolytes, branching, surfactants, micelles, membranes, vesicles, wetting, etc. On a mean-field level the results are exact; the disadvantage is that only numerical data are obtained. Extensions to excluded-volume polymers are in progress. Analytical approximations for simple systems are based upon solving the Edwards diffusion equation. This equation is the continuum variant of the lattice propagator, but ignores the finite segment size (analogous to the Poisson-Boltzmann equation without a Stern layer). By using the discrete propagator for segments next to the surface as the boundary condition in the continuum model, the finite segment size can be introduced into the continuum description, like the ion size in the Stern-Poisson-Boltzmann model. In most cases a ground-state approximation is needed to find analytical solutions. In this way realistic analytical approximations for simple cases can be found, including depletion effects that occur in mixtures of colloids plus non-adsorbing polymers. 
In the final part of this review I discuss a generalization of the free-volume theory (FVT) for the phase behavior of colloids and non-adsorbing polymer. In FVT the polymer is considered to be ideal: the osmotic pressure Pi follows the Van 't Hoff law, the depletion thickness delta equals the radius of gyration. This restricts the validity of FVT to the so-called colloid limit (polymer much smaller than the colloids). We have been able to find simple analytical approximations for Pi and delta which account for non-ideality and include established results for the semidilute limit. So we could generalize FVT to GFVT, and can now also describe the so-called protein limit (polymer larger than the 'protein-like' colloids), where the binodal polymer concentrations scale in a simple way with the polymer/colloid size ratio. For an intermediate case (polymer size approximately colloid size) we could give a quantitative description of careful experimental data. Copyright 2010 Elsevier B.V. All rights reserved.
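    The discrete endpoint propagator underlying the lattice description above can be sketched in a heavily simplified, non-self-consistent form for a homopolymer near an adsorbing wall. The step weights (1/6, 4/6, 1/6 for moves normal to the surface of a simple cubic lattice) and the surface Boltzmann factor exp(chi_s) are the standard DiMarzio-Rubin ingredients; everything else (layer count, chain length, boundary handling at the far edge) is an arbitrary choice for illustration:

```python
import math

def endpoint_distribution(n_layers, n_segments, chi_s):
    """Discrete endpoint propagator G_s(z) for a homopolymer chain near an
    adsorbing wall, in the spirit of the DiMarzio-Rubin lattice model.
    Illustrative sketch only: polymer-solvent interactions and the
    volume-filling constraint (the SF-SCF corrections) are ignored."""
    w = math.exp(chi_s)            # Boltzmann weight of the surface layer
    G = [1.0] * n_layers           # a single free segment in any layer
    G[0] *= w
    for _ in range(n_segments - 1):
        G_new = [0.0] * n_layers
        for z in range(n_layers):
            down = G[z - 1] if z > 0 else 0.0              # impenetrable wall
            up = G[z + 1] if z < n_layers - 1 else 0.0     # finite box
            G_new[z] = (down + 4.0 * G[z] + up) / 6.0
        G_new[0] *= w              # surface energy acts on the new segment
        G = G_new
    return G

# Depletion near a neutral wall vs. enrichment near an adsorbing one.
print(endpoint_distribution(50, 20, 0.0)[0], endpoint_distribution(50, 20, 1.0)[0])
```

    With chi_s = 0 the endpoint density is depleted near the wall relative to the bulk; a sufficiently attractive chi_s reverses this, which is the adsorption behavior the review discusses.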

  19. Environmental statistics with S-Plus

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Millard, S.P.; Neerchal, N.K.

    1999-12-01

    The combination of easy-to-use software with easy access to a description of the statistical methods (definitions, concepts, etc.) makes this book an excellent resource. One of the major features of this book is the inclusion of general information on environmental statistical methods and examples of how to implement these methods using the statistical software package S-Plus and the add-in modules Environmental-Stats for S-Plus, S+SpatialStats, and S-Plus for ArcView.

  20. Writing for the Tube.

    ERIC Educational Resources Information Center

    Lin, Sam Chu

    1989-01-01

    Addresses the differences between reporting for print and reporting for television news. Suggests that television journalists must use a simple, conversational style, while print journalists must be more descriptive. Offers suggestions for taping interviews and writing news scripts. (LS)

  1. An analytical treatment for three neutrino oscillations in the Earth

    NASA Astrophysics Data System (ADS)

    Aguilar-Arevalo, A. A.; D'Olivo, J. C.; Supanitsky, A. D.

    2012-08-01

    A simple, and at the same time accurate, description of the Earth matter effects on the oscillations between three neutrino flavors is given in terms of the Magnus expansion for the evolution operator.
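    For reference, the Magnus expansion writes the evolution operator as a true exponential of a series, whose first two terms are:

```latex
U(t, t_0) = \exp\!\big(\Omega_1 + \Omega_2 + \cdots\big),
\qquad
\Omega_1 = -i \int_{t_0}^{t} H(t_1)\, dt_1,
\qquad
\Omega_2 = -\frac{1}{2} \int_{t_0}^{t} dt_1 \int_{t_0}^{t_1} dt_2\, \big[H(t_1), H(t_2)\big].
```

    Truncating this series preserves unitarity at every order, which is what makes it attractive for matter-effect oscillation calculations.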

  2. Notes on Experiments.

    ERIC Educational Resources Information Center

    Physics Education, 1984

    1984-01-01

    Describes: (1) experiments using a simple phonocardiograph; (2) radioactivity experiments involving a VELA used as a ratemeter; (3) a 25cm continuously operating Foucault pendulum; and (4) camera control of experiments. Descriptions of equipment needed are provided when applicable. (JN)

  3. Electron heating in a Monte Carlo model of a high Mach number, supercritical, collisionless shock

    NASA Technical Reports Server (NTRS)

    Ellison, Donald C.; Jones, Frank C.

    1987-01-01

    Preliminary work in the investigation of electron injection and acceleration at parallel shocks is presented. A simple model of electron heating that is derived from a unified shock model which includes the effects of an electrostatic potential jump is described. The unified shock model provides a kinetic description of the injection and acceleration of ions and a fluid description of electron heating at high Mach number, supercritical, and parallel shocks.

  4. Statistical Package User’s Guide.

    DTIC Science & Technology

    1980-08-01

    [Fragmentary OCR of the guide's table of contents and program notes.] The package includes programs such as STACH (nonparametric descriptive statistics) and CHIRA (coefficient of concordance); CHIRA operates on ranked data, and the programs were tested using data from John Neter and William Wasserman, Applied Linear Statistical Models.

  5. Necessary and sufficient conditions for the complete controllability and observability of systems in series using the coprime factorization of a rational matrix

    NASA Technical Reports Server (NTRS)

    Callier, F. M.; Nahum, C. D.

    1975-01-01

    The series connection of two linear time-invariant systems that have minimal state space system descriptions is considered. From these descriptions, strict-system-equivalent polynomial matrix system descriptions in the manner of Rosenbrock are derived. They are based on the factorization of the transfer matrix of the subsystems as a ratio of two right or left coprime polynomial matrices. They give rise to a simple polynomial matrix system description of the tandem connection. Theorem 1 states that for the complete controllability and observability of the state space system description of the series connection, it is necessary and sufficient that certain 'denominator' and 'numerator' groups are coprime. Consequences for feedback systems are drawn in Corollary 1. The role of pole-zero cancellations is explained by Lemma 3 and Corollaries 2 and 3.
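    The coprimeness condition of Theorem 1 can be illustrated in the scalar case, where the series connection loses minimality exactly when a pole of one subsystem cancels a zero of the other; a polynomial gcd detects the cancellation. The transfer functions below are invented for illustration:

```python
import sympy as sp

s = sp.symbols('s')

# Subsystem 1: G1(s) = n1/d1; subsystem 2: G2(s) = n2/d2 (illustrative).
n1, d1 = s + 1, s + 2
n2, d2 = s + 3, s + 1   # G2 has a pole at s = -1, cancelling G1's zero

# In the scalar case, the series connection G2*G1 is completely
# controllable and observable iff n1 is coprime with d2 and n2 is
# coprime with d1.
cancel_12 = sp.gcd(n1, d2)   # nontrivial gcd -> pole-zero cancellation
cancel_21 = sp.gcd(n2, d1)   # constant gcd -> coprime, no cancellation

print(cancel_12, cancel_21)
```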

  6. Anger and depression levels of mothers with premature infants in the neonatal intensive care unit.

    PubMed

    Kardaşözdemir, Funda; Akgün Şahin, Zümrüt

    2016-02-04

    The aim of this study was to examine the anger and depression levels of mothers who had a premature infant in the NICU, and the factors affecting them. This descriptive study was performed in the level I and II units of the NICUs at three state hospitals in Turkey. The data were collected with a demographic questionnaire, the "Beck Depression Inventory" and the "Anger Expression Scale". Descriptive statistics, parametric and nonparametric statistical tests and Pearson correlation were used in the data analysis. Mothers whose infants were under care in the NICU had moderate depression. Mothers' educational level, income level and the gender of the infants were also statistically significant factors (p < 0.05). A positive relationship between depression and trait-anger scores was statistically significant, and a negative relationship between depression and anger-control scores was also statistically significant (p < 0.05). Based on these results, it is recommended that mothers at risk of depression and anger in the NICU be evaluated by nurses and that nurses develop their counseling roles.
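    The correlational analysis reported above can be sketched with Pearson's r; the paired scores below are invented, chosen only so that a positive depression-anger association like the one reported is visible:

```python
from scipy import stats

# Hypothetical paired scores: Beck Depression Inventory and trait-anger
# scale for eight mothers (illustrative values only).
depression = [12, 18, 9, 22, 15, 25, 11, 19]
trait_anger = [20, 26, 17, 30, 24, 33, 19, 27]

# A positive, significant r mirrors the reported depression-anger link.
r, p_value = stats.pearsonr(depression, trait_anger)
print(r, p_value)
```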

  7. Using R-Project for Free Statistical Analysis in Extension Research

    ERIC Educational Resources Information Center

    Mangiafico, Salvatore S.

    2013-01-01

    One option for Extension professionals wishing to use free statistical software is to use online calculators, which are useful for common, simple analyses. A second option is to use a free computing environment capable of performing statistical analyses, like R-project. R-project is free, cross-platform, powerful, and respected, but may be…

  8. Using Data from Climate Science to Teach Introductory Statistics

    ERIC Educational Resources Information Center

    Witt, Gary

    2013-01-01

    This paper shows how the application of simple statistical methods can reveal to students important insights from climate data. While the popular press is filled with contradictory opinions about climate science, teachers can encourage students to use introductory-level statistics to analyze data for themselves on this important issue in public…

  9. Using R in Introductory Statistics Courses with the pmg Graphical User Interface

    ERIC Educational Resources Information Center

    Verzani, John

    2008-01-01

    The pmg add-on package for the open source statistics software R is described. This package provides a simple to use graphical user interface (GUI) that allows introductory statistics students, without advanced computing skills, to quickly create the graphical and numeric summaries expected of them. (Contains 9 figures.)

  10. Statistics of high-level scene context.

    PubMed

    Greene, Michelle R

    2013-01-01

    Context is critical for recognizing environments and for searching for objects within them: contextual associations have been shown to modulate reaction time and object recognition accuracy, as well as to influence the distribution of eye movements and patterns of brain activations. However, we have not yet systematically quantified the relationships between objects and their scene environments. Here I seek to fill this gap by providing descriptive statistics of object-scene relationships. A total of 48,167 objects were hand-labeled in 3,499 scenes using the LabelMe tool (Russell et al., 2008). From these data, I computed a variety of descriptive statistics at three different levels of analysis: the ensemble statistics that describe the density and spatial distribution of unnamed "things" in the scene; the bag-of-words level, where scenes are described by the list of objects contained within them; and the structural level, where the spatial distributions of and relationships between the objects are measured. The utility of each level of description for scene categorization was assessed through the use of linear classifiers, and the plausibility of each level for modeling human scene categorization is discussed. Of the three levels, ensemble statistics were found to be the most informative (per feature) and also best explained human patterns of categorization errors. Although a bag-of-words classifier had performance similar to that of human observers, it had a markedly different pattern of errors. Certain objects are more useful than others, however, and ceiling classification performance could be achieved using only the 64 most informative objects. As object location tends not to vary as a function of category, structural information provided little additional information. Additionally, these data provide valuable information on natural scene redundancy that can be exploited for machine vision, and they can help the visual cognition community design experiments guided by statistics rather than intuition.
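    The bag-of-words level of description can be illustrated with a minimal linear (nearest-centroid) classifier over object-count vectors. The four-object vocabulary and all counts below are hypothetical:

```python
import numpy as np

# Hypothetical object-count ("bag of words") vectors over a tiny object
# vocabulary [car, tree, sofa, sink] for labeled example scenes.
train = {
    "street":  np.array([[4, 2, 0, 0], [3, 1, 0, 0], [5, 3, 0, 0]], float),
    "kitchen": np.array([[0, 0, 0, 2], [0, 0, 1, 3], [0, 0, 0, 1]], float),
}

# Nearest-centroid classification: one centroid per category.
centroids = {cat: X.mean(axis=0) for cat, X in train.items()}

def classify(counts):
    counts = np.asarray(counts, float)
    return min(centroids, key=lambda c: np.linalg.norm(counts - centroids[c]))

print(classify([4, 1, 0, 0]))   # object counts resembling the street scenes
```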

  11. Evaluating Abstract Art: Relation between Term Usage, Subjective Ratings, Image Properties and Personality Traits.

    PubMed

    Lyssenko, Nathalie; Redies, Christoph; Hayn-Leichsenring, Gregor U

    2016-01-01

    One of the major challenges in experimental aesthetics is the uncertainty of the terminology used in experiments. In this study, we recorded terms that are spontaneously used by participants to describe abstract artworks and studied their relation to the second-order statistical image properties of the same artworks (Experiment 1). We found that the usage frequency of some structure-describing terms correlates with statistical image properties, such as PHOG Self-Similarity, Anisotropy and Complexity. Additionally, emotion-associated terms correlate with measured color values. Next, based on the most frequently used terms, we created five different rating scales (Experiment 2) and obtained ratings of participants for the abstract paintings on these scales. We found significant correlations between descriptive score ratings (e.g., between structure and subjective complexity), between evaluative and descriptive score ratings (e.g., between preference and subjective complexity/interest) and between descriptive score ratings and statistical image properties (e.g., between interest and PHOG Self-Similarity, Complexity and Anisotropy). Additionally, we determined the participants' personality traits as described in the 'Big Five Inventory' (Goldberg, 1990; Rammstedt and John, 2005) and correlated them with the ratings and preferences of individual participants. Participants with higher scores for Neuroticism showed preferences for objectively more complex images, as well as a different notion of the term complex when compared with participants with lower scores for Neuroticism. In conclusion, this study demonstrates an association between objectively measured image properties and the subjective terms that participants use to describe or evaluate abstract artworks. Moreover, our results suggest that the description of abstract artworks, their evaluation and the preference of participants for their low-level statistical properties are linked to personality traits.

  12. Evaluating Abstract Art: Relation between Term Usage, Subjective Ratings, Image Properties and Personality Traits

    PubMed Central

    Lyssenko, Nathalie; Redies, Christoph; Hayn-Leichsenring, Gregor U.

    2016-01-01

    One of the major challenges in experimental aesthetics is the uncertainty of the terminology used in experiments. In this study, we recorded terms that are spontaneously used by participants to describe abstract artworks and studied their relation to the second-order statistical image properties of the same artworks (Experiment 1). We found that the usage frequency of some structure-describing terms correlates with statistical image properties, such as PHOG Self-Similarity, Anisotropy and Complexity. Additionally, emotion-associated terms correlate with measured color values. Next, based on the most frequently used terms, we created five different rating scales (Experiment 2) and obtained ratings of participants for the abstract paintings on these scales. We found significant correlations between descriptive score ratings (e.g., between structure and subjective complexity), between evaluative and descriptive score ratings (e.g., between preference and subjective complexity/interest) and between descriptive score ratings and statistical image properties (e.g., between interest and PHOG Self-Similarity, Complexity and Anisotropy). Additionally, we determined the participants’ personality traits as described in the ‘Big Five Inventory’ (Goldberg, 1990; Rammstedt and John, 2005) and correlated them with the ratings and preferences of individual participants. Participants with higher scores for Neuroticism showed preferences for objectively more complex images, as well as a different notion of the term complex when compared with participants with lower scores for Neuroticism. In conclusion, this study demonstrates an association between objectively measured image properties and the subjective terms that participants use to describe or evaluate abstract artworks. Moreover, our results suggest that the description of abstract artworks, their evaluation and the preference of participants for their low-level statistical properties are linked to personality traits. 
PMID:27445933

  13. Evaluation of the Nurses' Job Satisfaction, and Its Association with Their Moral Sensitivities and Well-being.

    PubMed

    Jaafarpour, Molouk; Khani, Ali

    2012-12-01

    Several researchers have described nurses' work as stressful and the incidence of occupational stress-related burnout in the profession as high. The aim of this study was to establish the relationship between nurses' satisfaction, their psychosocial work environment, their reported levels of moral sensitivity and their well-being in Iran. This descriptive correlational study was performed at the Ilam general hospitals, Iran, during 2011. The research instruments used were the Psychosocial Work Environment (PWE), the Moral Sensitivity (MS) and the well-being profiles of the nurses. A sample of 120 Registered Nurses (RN) was enrolled in the study using a simple random sampling method. Descriptive statistics and Pearson's correlation test were performed using SPSS. The nurses' satisfaction with their psychosocial work environment was moderate (M = 106.5, SD = 7.2), and the nurses' moral sensitivity was moderate (M = 112.3, SD = 11.2). This study found significant correlations between the PWE factor scores and the MS subscales (p < 0.05, p < 0.01). In addition, significant correlations were found between the nurses' well-being and the PWE factors (p < 0.05, p < 0.01). These findings indicated that the nurses perceived their PWE as stressful. Supporting nurses may have a positive effect on their perceptions of well-being; the attending nurses reported fewer physical symptoms, reduced anxiety and fewer feelings of not being in control.

  14. A kinetic theory description of the viscosity of dense fluids consisting of chain molecules.

    PubMed

    de Wijn, Astrid S; Vesovic, Velisa; Jackson, George; Trusler, J P Martin

    2008-05-28

    An expression for the viscosity of a dense fluid is presented that includes the effect of molecular shape. The molecules of the fluid are approximated by chains of equal-sized, tangentially jointed, rigid spheres. It is assumed that the collision dynamics in such a fluid can be approximated by instantaneous collisions between two rigid spheres belonging to different chains. The approach is thus analogous to that of Enskog for a fluid consisting of rigid spheres. The description is developed in terms of two molecular parameters, the diameter sigma of the spherical segment and the chain length (number of segments) m. It is demonstrated that an analysis of viscosity data of a particular pure fluid alone cannot be used to obtain independently effective values of both sigma and m. Nevertheless, the chain lengths of n-alkanes are determined by assuming that the diameter of each rigid sphere making up the chain can be represented by the diameter of a methane molecule. The effective chain lengths of n-alkanes are found to increase linearly with the number C of carbon atoms present. The dependence can be approximated by a simple relationship m = 1 + (C - 1)/3. The same relationship was reported within the context of a statistical associating fluid theory equation of state treatment of the fluid, indicating that both the equilibrium thermodynamic properties and viscosity yield the same value for the chain lengths of n-alkanes.
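    The reported linear chain-length relationship can be stated directly as a small helper (the function name is ours; the relationship m = 1 + (C - 1)/3 is the one given in the abstract, with the segment diameter fixed to that of methane):

```python
def chain_length(C):
    """Effective rigid-sphere chain length m of an n-alkane with C carbon
    atoms, using the linear relationship m = 1 + (C - 1)/3 reported in
    the text."""
    return 1 + (C - 1) / 3

# Methane (C = 1) maps to a single sphere; n-decane (C = 10) to four.
print(chain_length(1), chain_length(10))
```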

  15. Factors affecting dental service quality.

    PubMed

    Bahadori, Mohammadkarim; Raadabadi, Mehdi; Ravangard, Ramin; Baldacchino, Donia

    2015-01-01

    Measuring dental clinic service quality is the first and most important factor in improving care. The quality provided plays an important role in patient satisfaction. The purpose of this paper is to identify factors affecting dental service quality from the patients' viewpoint. This cross-sectional, descriptive-analytical study was conducted in a dental clinic in Tehran between January and June 2014. A sample of 385 patients was selected from two work shifts using stratified sampling proportional to size and simple random sampling methods. The data were collected, a self-administered questionnaire designed for the purpose of the study, based on the Parasuraman and Zeithaml's model of service quality which consisted of two parts: the patients' demographic characteristics and a 30-item questionnaire to measure the five dimensions of the service quality. The collected data were analysed using SPSS 21.0 and Amos 18.0 through some descriptive statistics such as mean, standard deviation, as well as analytical methods, including confirmatory factor. Results showed that the correlation coefficients for all dimensions were higher than 0.5. In this model, assurance (regression weight=0.99) and tangibility (regression weight=0.86) had, respectively, the highest and lowest effects on dental service quality. The Parasuraman and Zeithaml's model is suitable to measure quality in dental services. The variables related to dental services quality have been made according to the model. This is a pioneering study that uses Parasuraman and Zeithaml's model and CFA in a dental setting. This study provides useful insights and guidance for dental service quality assurance.

  16. Student rating as an effective tool for teacher evaluation.

    PubMed

    Aslam, Muhammad Nadeem

    2013-01-01

    To determine the effectiveness of students' rating as a teacher evaluation tool. Concurrent mixed-methods design. King Edward Medical University, Lahore, from January to June 2010. An anonymous 5-point Likert-scale survey was conducted involving a single class of 310 students, and 12 students were selected for structured interviews by non-probability purposive sampling. Informed consent was procured. Participants were required to rate 6 teachers and to discuss the teachers' performance in detail. Quantitative data collected through the survey were analyzed using SPSS 15, and qualitative data were analyzed with the help of content analysis, by identifying themes and patterns from thick descriptions. This student feedback was examined for its feasibility and as an indicator of teaching attributes. Descriptive statistics of the quantitative survey data were used to calculate the mean and standard deviation for each teacher individually, showing the average direction of the student ratings. In terms of overall teaching effectiveness, the response percentages were 85.96% for teacher A, 65.53% for teacher B, 65.20% for teacher C, 69.62% for teacher D, 65.32% for teacher E and 64.24% for teacher F. Structured interviews generated qualitative data that validated the students' views about the strengths and weaknesses of the teachers and helped to determine the effectiveness of their ratings and feedback. This simple rating system clearly showed its importance and can be used in institutions as a regular method of evaluating teaching faculty.

  17. An automated approach to the design of decision tree classifiers

    NASA Technical Reports Server (NTRS)

    Argentiero, P.; Chin, R.; Beaudet, P.

    1982-01-01

    An automated technique is presented for designing effective decision tree classifiers predicated only on a priori class statistics. The procedure relies on linear feature extractions and Bayes table look-up decision rules. Associated error matrices are computed and utilized to provide an optimal design of the decision tree at each so-called 'node'. A by-product of this procedure is a simple algorithm for computing the global probability of correct classification assuming the statistical independence of the decision rules. Attention is given to a more precise definition of decision tree classification, the mathematical details on the technique for automated decision tree design, and an example of a simple application of the procedure using class statistics acquired from an actual Landsat scene.
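    The by-product algorithm mentioned above, computing the global probability of correct classification under the assumed statistical independence of the decision rules, reduces to a product of per-node probabilities along the path to each class; the per-node accuracies below are hypothetical:

```python
import math

def global_correct_probability(path_accuracies):
    """Probability that a pattern is classified correctly, given the
    probability of a correct decision at each node along its path through
    the decision tree, assuming the decision rules are statistically
    independent (illustrative sketch of the by-product algorithm)."""
    return math.prod(path_accuracies)

# A class reached through three nodes with hypothetical accuracies:
print(global_correct_probability([0.95, 0.90, 0.92]))
```

    Averaging this quantity over classes, weighted by their priors, gives the tree's overall probability of correct classification under the same independence assumption.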

  18. Evaluation of Two Statistical Methods Provides Insights into the Complex Patterns of Alternative Polyadenylation Site Switching

    PubMed Central

    Li, Jie; Li, Rui; You, Leiming; Xu, Anlong; Fu, Yonggui; Huang, Shengfeng

    2015-01-01

    Switching between different alternative polyadenylation (APA) sites plays an important role in the fine tuning of gene expression. New technologies for the execution of 3’-end enriched RNA-seq allow genome-wide detection of the genes that exhibit significant APA site switching between different samples. Here, we show that the independence test gives better results than the linear trend test in detecting APA site-switching events. Further examination suggests that the discrepancy between these two statistical methods arises from complex APA site-switching events that cannot be represented by a simple change of average 3’-UTR length. In theory, the linear trend test is only effective in detecting these simple changes. We classify the switching events into four switching patterns: two simple patterns (3’-UTR shortening and lengthening) and two complex patterns. By comparing the results of the two statistical methods, we show that complex patterns account for 1/4 of all observed switching events that happen between normal and cancerous human breast cell lines. Because simple and complex switching patterns may convey different biological meanings, they merit separate study. We therefore propose to combine both the independence test and the linear trend test in practice. First, the independence test should be used to detect APA site switching; second, the linear trend test should be invoked to identify simple switching events; and third, those complex switching events that pass independence testing but fail linear trend testing can be identified. PMID:25875641
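    The contrast between the two tests can be reproduced on a hypothetical 2 x 3 table of read counts, where a complex switch (both flanking sites gain reads at the expense of the middle site, leaving the average 3'-UTR length unchanged) is caught by the independence test but not by a linear-trend statistic. The Mantel-Haenszel form M^2 = (N - 1) r^2 is used here as a stand-in for the paper's linear trend test:

```python
import numpy as np
from scipy import stats

# Hypothetical read counts at three APA sites (proximal, middle, distal)
# in two samples; a "complex" switching pattern.
table = np.array([[100, 300, 100],   # sample A
                  [200, 100, 200]])  # sample B

# Independence test: chi-square on the contingency table.
chi2, p_indep, dof, expected = stats.chi2_contingency(table)

# Linear-trend test: Mantel-Haenszel M^2 = (N - 1) * r^2, where r
# correlates the site index with the sample label over individual reads.
rows, cols = table.shape
sample = np.concatenate([np.full(table[i, j], i)
                         for i in range(rows) for j in range(cols)])
site = np.concatenate([np.full(table[i, j], j)
                       for i in range(rows) for j in range(cols)])
r = np.corrcoef(sample, site)[0, 1]
m2 = (table.sum() - 1) * r ** 2
p_trend = stats.chi2.sf(m2, df=1)

print(p_indep, p_trend)  # independence rejects; the trend test does not
```

    Both samples have the same mean site index, so the trend statistic is near zero even though the distributions differ strongly, which is exactly the class of event the abstract says the linear trend test misses.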

  19. Techniques for estimating selected streamflow characteristics of rural unregulated streams in Ohio

    USGS Publications Warehouse

    Koltun, G.F.; Whitehead, Matthew T.

    2002-01-01

    This report provides equations for estimating mean annual streamflow, mean monthly streamflows, harmonic mean streamflow, and streamflow quartiles (the 25th-, 50th-, and 75th-percentile streamflows) as a function of selected basin characteristics for rural, unregulated streams in Ohio. The equations were developed from streamflow statistics and basin-characteristics data for as many as 219 active or discontinued streamflow-gaging stations on rural, unregulated streams in Ohio with 10 or more years of homogenous daily streamflow record. Streamflow statistics and basin-characteristics data for the 219 stations are presented in this report. Simple equations (based on drainage area only) and best-fit equations (based on drainage area and at least two other basin characteristics) were developed by means of ordinary least-squares regression techniques. Application of the best-fit equations generally involves quantification of basin characteristics that require or are facilitated by use of a geographic information system. In contrast, the simple equations can be used with information that can be obtained without use of a geographic information system; however, the simple equations have larger prediction errors than the best-fit equations and exhibit geographic biases for most streamflow statistics. The best-fit equations should be used instead of the simple equations whenever possible.
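    A drainage-area-only "simple equation" of the kind described is typically a power law fitted by ordinary least squares in log space; the station data below are invented for illustration:

```python
import numpy as np

# Hypothetical gaging-station data: drainage area (square miles) and
# mean annual streamflow (cubic feet per second).
area = np.array([12.0, 45.0, 88.0, 150.0, 310.0, 620.0])
flow = np.array([14.0, 52.0, 95.0, 170.0, 330.0, 700.0])

# Ordinary least squares in log space: log Q = b*log A + log a,
# i.e. Q = a * A**b.
b, log_a = np.polyfit(np.log10(area), np.log10(flow), 1)
a = 10 ** log_a

def predict_flow(drainage_area):
    return a * drainage_area ** b

print(round(b, 2))  # exponent near 1 for this illustrative data set
```

    The report's best-fit equations add further basin characteristics as regressors in the same least-squares framework, which is what reduces their prediction errors relative to this drainage-area-only form.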

  20. Gaussian statistics for palaeomagnetic vectors

    USGS Publications Warehouse

    Love, J.J.; Constable, C.G.

    2003-01-01

    With the aim of treating the statistics of palaeomagnetic directions and intensities jointly and consistently, we represent the mean and the variance of palaeomagnetic vectors, at a particular site and of a particular polarity, by a probability density function in a Cartesian three-space of orthogonal magnetic-field components consisting of a single (unimodal) non-zero mean, spherically-symmetrical (isotropic) Gaussian function. For palaeomagnetic data of mixed polarities, we consider a bimodal distribution consisting of a pair of such symmetrical Gaussian functions, with equal, but opposite, means and equal variances. For both the Gaussian and bi-Gaussian distributions, and in the spherical three-space of intensity, inclination, and declination, we obtain analytical expressions for the marginal density functions, the cumulative distributions, and the expected values and variances for each spherical coordinate (including the angle with respect to the axis of symmetry of the distributions). The mathematical expressions for the intensity and off-axis angle are closed-form and especially manageable, with the intensity distribution being Rayleigh-Rician. In the limit of small relative vectorial dispersion, the Gaussian (bi-Gaussian) directional distribution approaches a Fisher (Bingham) distribution and the intensity distribution approaches a normal distribution. In the opposite limit of large relative vectorial dispersion, the directional distributions approach a spherically-uniform distribution and the intensity distribution approaches a Maxwell distribution. We quantify biases in estimating the properties of the vector field resulting from the use of simple arithmetic averages, such as estimates of the intensity or the inclination of the mean vector, or the variances of these quantities. 
With the statistical framework developed here and using the maximum-likelihood method, which gives unbiased estimates in the limit of large data numbers, we demonstrate how to formulate the inverse problem, and how to estimate the mean and variance of the magnetic vector field, even when the data consist of mixed combinations of directions and intensities. We examine palaeomagnetic secular-variation data from Hawaii and Réunion, and although these two sites are on almost opposite latitudes, we find significant differences in the mean vector and differences in the local vectorial variances, with the Hawaiian data being particularly anisotropic. These observations are inconsistent with a description of the mean field as being a simple geocentric axial dipole and with secular variation being statistically symmetrical with respect to reflection through the equatorial plane. Finally, our analysis of palaeomagnetic acquisition data from the 1960 Kilauea flow in Hawaii and the Holocene Xitle flow in Mexico is consistent with the widely held suspicion that directional data are more accurate than intensity data.
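    In symbols, the abstract's unimodal density and its mixed-polarity bimodal counterpart can be written as follows (a transcription of the stated model, with the mean vector and a common variance as the only parameters):

```latex
% Isotropic Gaussian for a single polarity:
p(\mathbf{x}) = (2\pi\sigma^2)^{-3/2}
  \exp\!\left(-\frac{\lVert \mathbf{x}-\boldsymbol{\mu}\rVert^2}{2\sigma^2}\right)

% Equal-weight bi-Gaussian mixture for mixed polarities
% (equal, but opposite, means and equal variances):
p_{\pm}(\mathbf{x}) = \tfrac{1}{2}(2\pi\sigma^2)^{-3/2}
  \left[\exp\!\left(-\frac{\lVert \mathbf{x}-\boldsymbol{\mu}\rVert^2}{2\sigma^2}\right)
      + \exp\!\left(-\frac{\lVert \mathbf{x}+\boldsymbol{\mu}\rVert^2}{2\sigma^2}\right)\right]
```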

  1. Gaussian statistics for palaeomagnetic vectors

    NASA Astrophysics Data System (ADS)

    Love, J. J.; Constable, C. G.

    2003-03-01

    With the aim of treating the statistics of palaeomagnetic directions and intensities jointly and consistently, we represent the mean and the variance of palaeomagnetic vectors, at a particular site and of a particular polarity, by a probability density function in a Cartesian three-space of orthogonal magnetic-field components consisting of a single (unimodal) non-zero mean, spherically-symmetrical (isotropic) Gaussian function. For palaeomagnetic data of mixed polarities, we consider a bimodal distribution consisting of a pair of such symmetrical Gaussian functions, with equal, but opposite, means and equal variances. For both the Gaussian and bi-Gaussian distributions, and in the spherical three-space of intensity, inclination, and declination, we obtain analytical expressions for the marginal density functions, the cumulative distributions, and the expected values and variances for each spherical coordinate (including the angle with respect to the axis of symmetry of the distributions). The mathematical expressions for the intensity and off-axis angle are closed-form and especially manageable, with the intensity distribution being Rayleigh-Rician. In the limit of small relative vectorial dispersion, the Gaussian (bi-Gaussian) directional distribution approaches a Fisher (Bingham) distribution and the intensity distribution approaches a normal distribution. In the opposite limit of large relative vectorial dispersion, the directional distributions approach a spherically-uniform distribution and the intensity distribution approaches a Maxwell distribution. We quantify biases in estimating the properties of the vector field resulting from the use of simple arithmetic averages, such as estimates of the intensity or the inclination of the mean vector, or the variances of these quantities. 
With the statistical framework developed here and using the maximum-likelihood method, which gives unbiased estimates in the limit of large data numbers, we demonstrate how to formulate the inverse problem, and how to estimate the mean and variance of the magnetic vector field, even when the data consist of mixed combinations of directions and intensities. We examine palaeomagnetic secular-variation data from Hawaii and Réunion, and although these two sites are on almost opposite latitudes, we find significant differences in the mean vector and differences in the local vectorial variances, with the Hawaiian data being particularly anisotropic. These observations are inconsistent with a description of the mean field as being a simple geocentric axial dipole and with secular variation being statistically symmetrical with respect to reflection through the equatorial plane. Finally, our analysis of palaeomagnetic acquisition data from the 1960 Kilauea flow in Hawaii and the Holocene Xitle flow in Mexico is consistent with the widely held suspicion that directional data are more accurate than intensity data.

  2. Statistical analysis of vehicle crashes in Mississippi based on crash data from 2010 to 2014.

    DOT National Transportation Integrated Search

    2017-08-15

    Traffic crash data from 2010 to 2014 were collected by Mississippi Department of Transportation (MDOT) and extracted for the study. Three tasks were conducted in this study: (1) geographic distribution of crashes; (2) descriptive statistics of crash ...

  3. Using Carbon Emissions Data to "Heat Up" Descriptive Statistics

    ERIC Educational Resources Information Center

    Brooks, Robert

    2012-01-01

    This article illustrates using carbon emissions data in an introductory statistics assignment. The carbon emissions data have desirable characteristics, including choice of measure, skewness, and outliers. These complexities allow research and public policy debate to be introduced. (Contains 4 figures and 2 tables.)

  4. Statistical mechanics of economics I

    NASA Astrophysics Data System (ADS)

    Kusmartsev, F. V.

    2011-02-01

    We show that statistical mechanics is useful in the description of financial crises and economics. Taking a large number of instant snapshots of a market over an interval of time, we construct their ensembles and study their statistical inference. This results in a probabilistic description of the market and gives capital, money, income, wealth and debt distributions, which in most cases take the form of the Bose-Einstein distribution. In addition, statistical mechanics provides the main market equations and laws which govern the correlations between the amount of money, debt, product, prices and number of retailers. We applied the relations found to a study of the evolution of the economy in the USA between 1996 and 2008 and observe that over that time the income of the majority of the population is well described by the Bose-Einstein distribution, whose parameters differ from year to year. Each financial crisis corresponds to a peak in the absolute activity coefficient. The analysis correctly indicates the past crises and predicts the next one.
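    The Bose-Einstein form the abstract refers to can be evaluated directly. The economic reading of the parameters (an income level in place of energy, with mu and T refit each year) is the paper's analogy; the parameter values below are illustrative only.

```python
import math

def bose_einstein(energy, mu, temperature):
    """Bose-Einstein occupation number 1/(exp((e - mu)/T) - 1).

    In the economic analogy sketched here, 'energy' stands in for
    an income level; mu and temperature are fitted per year.
    Requires energy > mu for a finite, positive occupation.
    """
    if energy <= mu:
        raise ValueError("distribution requires energy > mu")
    return 1.0 / (math.exp((energy - mu) / temperature) - 1.0)

# Occupation falls off monotonically with increasing 'income' level
levels = [1.0, 2.0, 4.0, 8.0]
occupancies = [bose_einstein(e, mu=0.5, temperature=2.0) for e in levels]
```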

  5. How Good Are Statistical Models at Approximating Complex Fitness Landscapes?

    PubMed Central

    du Plessis, Louis; Leventhal, Gabriel E.; Bonhoeffer, Sebastian

    2016-01-01

    Fitness landscapes determine the course of adaptation by constraining and shaping evolutionary trajectories. Knowledge of the structure of a fitness landscape can thus predict evolutionary outcomes. Empirical fitness landscapes, however, have so far only offered limited insight into real-world questions, as the high dimensionality of sequence spaces makes it impossible to exhaustively measure the fitness of all variants of biologically meaningful sequences. We must therefore revert to statistical descriptions of fitness landscapes that are based on a sparse sample of fitness measurements. It remains unclear, however, how much data are required for such statistical descriptions to be useful. Here, we assess the ability of regression models accounting for single and pairwise mutations to correctly approximate a complex quasi-empirical fitness landscape. We compare approximations based on various sampling regimes of an RNA landscape and find that the sampling regime strongly influences the quality of the regression. On the one hand it is generally impossible to generate sufficient samples to achieve a good approximation of the complete fitness landscape, and on the other hand systematic sampling schemes can only provide a good description of the immediate neighborhood of a sequence of interest. Nevertheless, we obtain a remarkably good and unbiased fit to the local landscape when using sequences from a population that has evolved under strong selection. Thus, current statistical methods can provide a good approximation to the landscape of naturally evolving populations. PMID:27189564
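    A toy version of a regression with single-site and pairwise-mutation terms can be sketched on binary sequences. The landscape, effect sizes, and sample size below are invented for illustration; they are not the paper's RNA landscape or sampling regimes.

```python
import numpy as np
from itertools import combinations

def design_matrix(sequences):
    """Design matrix with intercept, single-site, and pairwise terms
    for length-L binary sequences (a toy stand-in for regression on
    sampled genotypes)."""
    S = np.asarray(sequences, dtype=float)
    n, L = S.shape
    cols = [np.ones(n)] + [S[:, i] for i in range(L)]
    cols += [S[:, i] * S[:, j] for i, j in combinations(range(L), 2)]
    return np.column_stack(cols)

# Hypothetical landscape: additive effects plus one epistatic pair
rng = np.random.default_rng(1)
seqs = rng.integers(0, 2, size=(200, 4))
fitness = seqs @ [0.5, -0.2, 0.3, 0.1] + 0.8 * seqs[:, 0] * seqs[:, 1]
X = design_matrix(seqs)
coef, *_ = np.linalg.lstsq(X, fitness, rcond=None)
# coef[1] recovers the first additive effect; coef[5] the (0,1) pair
```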

  6. Semantic Annotation of Computational Components

    NASA Technical Reports Server (NTRS)

    Vanderbilt, Peter; Mehrotra, Piyush

    2004-01-01

    This paper describes a methodology to specify machine-processable semantic descriptions of computational components to enable them to be shared and reused. A particular focus of this scheme is to enable automatic composition of such components into simple workflows.

  7. A Simple MO Treatment of Metal Clusters.

    ERIC Educational Resources Information Center

    Sahyun, M. R. V.

    1980-01-01

    Illustrates how a qualitative description of the geometry and electronic characteristics of homogeneous metal clusters can be obtained using semiempirical MO (molecular orbital theory) methods. Computer applications of MO methods to inorganic systems are also described. (CS)

  8. Analysis of pre-service physics teacher skills designing simple physics experiments based technology

    NASA Astrophysics Data System (ADS)

    Susilawati; Huda, C.; Kurniawan, W.; Masturi; Khoiri, N.

    2018-03-01

    Pre-service physics teachers' skill in designing simple experiment sets is very important for deepening students' conceptual understanding and for practicing scientific skills in the laboratory. This study describes the skills of physics students in designing simple technology-based experiments. The experimental design stages include simple tool design and sensor modification. The research method used is a descriptive method with a sample of 25 students and 5 variations of simple physics experimental design. Based on the results of interviews and observations, pre-service physics teachers' skill in designing simple technology-based physics experiments is good. Observations show that their skill in designing simple experiments is good, while sensor modification and application are still lacking. This suggests that pre-service physics teachers still need considerable practice in designing physics experiments that use sensor modifications. The interviews indicate that students are highly motivated to take an active part in laboratory activities and are very curious to become skilled at making simple practicum tools for physics experiments.

  9. More on quantum groups from the quantization point of view

    NASA Astrophysics Data System (ADS)

    Jurčo, Branislav

    1994-12-01

    Star products on the classical double group of a simple Lie group and on corresponding symplectic groupoids are given so that the quantum double and the “quantized tangent bundle” are obtained in the deformation description. “Complex” quantum groups and bicovariant quantum Lie algebras are discussed from this point of view. Further we discuss the quantization of the Poisson structure on the symmetric algebra S(g) leading to the quantized enveloping algebra U_h(g) as an example of biquantization in the sense of Turaev. Description of U_h(g) in terms of the generators of the bicovariant differential calculus on F(G_q) is very convenient for this purpose. Finally we interpret in the deformation framework some well-known properties of compact quantum groups as simple consequences of corresponding properties of classical compact Lie groups. An analogue of Kirillov's classical universal character formula is given for the unitary irreducible representations in the compact case.

  10. Self-organized cell motility

    NASA Astrophysics Data System (ADS)

    Du, Xinxin; Doubrovinski, Konstantin

    2011-03-01

    Cell migration plays a key role in a wide range of biological phenomena, such as morphogenesis, chemotaxis, and wound healing. Cell locomotion relies on the cytoskeleton, a meshwork of filamentous proteins, intrinsically out of thermodynamic equilibrium and cross-linked by molecular motors, proteins that turn chemical energy into mechanical work. In the course of locomotion, cells remain polarized, i.e. they retain a single direction of motion in the absence of external cues. Traditionally, polarization has been attributed to intracellular signaling. However, recent experiments show that polarization may be a consequence of self-organized cytoskeletal dynamics. Our aim is to elucidate the mechanisms by which persistent unidirectional locomotion may arise through simple mechanical interactions of the cytoskeletal proteins. To this end, we develop a simple physical description of cytoskeletal dynamics. We find that the proposed description accounts for a range of phenomena associated with cell motility, including spontaneous polarization, persistent unidirectional motion, and the co-existence of motile and non-motile states.

  11. Characteristic Sizes of Life in the Oceans, from Bacteria to Whales.

    PubMed

    Andersen, K H; Berge, T; Gonçalves, R J; Hartvig, M; Heuschele, J; Hylander, S; Jacobsen, N S; Lindemann, C; Martens, E A; Neuheimer, A B; Olsson, K; Palacz, A; Prowe, A E F; Sainmont, J; Traving, S J; Visser, A W; Wadhwa, N; Kiørboe, T

    2016-01-01

    The size of an individual organism is a key trait to characterize its physiology and feeding ecology. Size-based scaling laws may have a limited size range of validity or undergo a transition from one scaling exponent to another at some characteristic size. We collate and review data on size-based scaling laws for resource acquisition, mobility, sensory range, and progeny size for all pelagic marine life, from bacteria to whales. Further, we review and develop simple theoretical arguments for observed scaling laws and the characteristic sizes of a change or breakdown of power laws. We divide life in the ocean into seven major realms based on trophic strategy, physiology, and life history strategy. Such a categorization represents a move away from a taxonomically oriented description toward a trait-based description of life in the oceans. Finally, we discuss life forms that transgress the simple size-based rules and identify unanswered questions.

  12. Efficient micromagnetic modelling of spin-transfer torque and spin-orbit torque

    NASA Astrophysics Data System (ADS)

    Abert, Claas; Bruckner, Florian; Vogler, Christoph; Suess, Dieter

    2018-05-01

    While the spin-diffusion model is considered one of the most complete and accurate tools for the description of spin transport and spin torque, its solution in the context of dynamical micromagnetic simulations is numerically expensive. We propose a procedure to retrieve the free parameters of a simple macro-spin-like spin-torque model through the spin-diffusion model. In the case of spin-transfer torque, the simplified model complies with the model of Slonczewski. A similar model can be established for the description of spin-orbit torque. In both cases the spin-diffusion model enables the retrieval of free model parameters from the geometry and the material parameters of the system. Since these parameters usually have to be determined phenomenologically through experiments, the proposed method combines the strength of the diffusion model to resolve material parameters and geometry with the high performance of simple torque models.

  13. Attitude towards Pre-Marital Genetic Screening among Students of Osun State Polytechnics in Nigeria

    ERIC Educational Resources Information Center

    Odelola, J. O.; Adisa, O.; Akintaro, O. A.

    2013-01-01

    This study investigated the attitude towards pre-marital genetic screening among students of Osun State Polytechnics. Descriptive survey design was used for the study. The instrument for data collection was a self-developed, structured questionnaire in a four-point Likert-scale format. Descriptive statistics of frequency count and percentages were…

  14. Basic School Teachers' Perceptions about Curriculum Design in Ghana

    ERIC Educational Resources Information Center

    Abudu, Amadu Musah; Mensah, Mary Afi

    2016-01-01

    This study focused on teachers' perceptions about curriculum design and barriers to their participation. The sample size was 130 teachers who responded to a questionnaire. The analyses made use of descriptive statistics and descriptions. The study found that the level of teachers' participation in curriculum design is low. The results further…

  15. Descriptive and dynamic psychiatry: a perspective on DSM-III.

    PubMed

    Frances, A; Cooper, A M

    1981-09-01

    The APA Task Force on Nomenclature and Statistics attempted to make DSM-III a descriptive nosology that is atheoretical in regard to etiology. The authors believe that a sharp polarity between morphological classification and explanatory formulation is artificial and misleading, and they critically review DSM-III from a psychodynamic perspective. They compare and contrast the descriptive orientation in psychiatry with the psychodynamic orientation and conclude that the two approaches overlap, that they are complementary and necessary to each other, and that there is a descriptive data base underlying dynamic psychiatry which may be usefully included in future nomenclatures.

  16. Quantum-like model for the adaptive dynamics of the genetic regulation of E. coli's metabolism of glucose/lactose.

    PubMed

    Asano, Masanari; Basieva, Irina; Khrennikov, Andrei; Ohya, Masanori; Tanaka, Yoshiharu; Yamato, Ichiro

    2012-06-01

    We developed a quantum-like model describing the gene regulation of glucose/lactose metabolism in a bacterium, Escherichia coli. Our quantum-like model can be considered a kind of operational formalism for microbiology and genetics. Instead of trying to describe processes in a cell in great detail, we propose a formal operator description. Such a description may be very useful in situations in which the detailed description of processes is impossible or extremely complicated. We analyze statistical data obtained from experiments, and we compute the degree of E. coli's preference within adaptive dynamics. It is known that there are several types of E. coli characterized by the metabolic system. We demonstrate that the same type of E. coli can be described by well-determined operators; we find invariant operator quantities characterizing each type. Such invariant quantities can be calculated from the obtained statistical data.

  17. Interactive application of quadratic expansion of chi-square statistic to nonlinear curve fitting

    NASA Technical Reports Server (NTRS)

    Badavi, F. F.; Everhart, Joel L.

    1987-01-01

    This report contains a detailed theoretical description of an all-purpose, interactive curve-fitting routine that is based on P. R. Bevington's description of the quadratic expansion of the Chi-Square statistic. The method is implemented in the associated interactive, graphics-based computer program. Taylor's expansion of Chi-Square is first introduced, and justifications for retaining only the first term are presented. From the expansion, a set of n simultaneous linear equations is derived, then solved by matrix algebra. A brief description of the code is presented along with a limited number of changes that are required to customize the program for a particular task. To evaluate the performance of the method and the goodness of nonlinear curve fitting, two typical engineering problems are examined and the graphical and tabular output of each is discussed. A complete listing of the entire package is included as an appendix.
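    The quadratic-expansion idea can be sketched as an iterative solve of the linearized normal equations (a Gauss-Newton step of the kind Bevington describes). This is a minimal sketch with unit measurement errors and synthetic data; the report's interactive program does considerably more.

```python
import numpy as np

def fit_quadratic_expansion(x, y, model, grad, p0, iters=20):
    """Minimize chi-square by expanding it to second order about the
    current parameters and solving the resulting simultaneous linear
    (normal) equations by matrix algebra, iterating to convergence.
    Unit measurement errors are assumed for simplicity.
    """
    p = np.asarray(p0, dtype=float)
    for _ in range(iters):
        r = y - model(x, p)   # residuals at the current parameters
        J = grad(x, p)        # Jacobian, shape (npoints, nparams)
        # Normal equations: (J^T J) dp = J^T r
        dp = np.linalg.solve(J.T @ J, J.T @ r)
        p = p + dp
    return p

# Example: fit y = a * exp(b * x) to noiseless synthetic data
model = lambda x, p: p[0] * np.exp(p[1] * x)
grad = lambda x, p: np.column_stack([np.exp(p[1] * x),
                                     p[0] * x * np.exp(p[1] * x)])
x = np.linspace(0.0, 1.0, 20)
y = 2.0 * np.exp(-1.5 * x)
a, b = fit_quadratic_expansion(x, y, model, grad, p0=[1.0, -1.0])
```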

  18. Longitudinal Patterns of Electronic Teen Dating Violence Among Middle School Students.

    PubMed

    Cutbush, Stacey; Williams, Jason; Miller, Shari; Gibbs, Deborah; Clinton-Sherrod, Monique

    2018-03-01

    We investigated rates and developmental trends of electronic teen dating violence (TDV) perpetration and victimization overall and by gender. Data were collected from a single cohort of seventh-grade students from four schools using paper-and-pencil surveys administered at 6-month intervals (N = 795). Data were analyzed with descriptive statistics and longitudinal growth models to estimate change over time in TDV. Overall, 32% of youth reported electronic TDV perpetration, and 51% reported electronic TDV victimization. Victimization was more prevalent for boys (42%) than for girls (31%) at baseline only (t = 2.55, p < .05). Perpetration did not differ at any wave. Perpetration and victimization each decreased significantly from the beginning of seventh grade to the end of eighth grade, β = -.129 (.058), p < .05, for perpetration, and β = -.138 (.048), p < .01, for victimization. Gender moderated the decrease in reported victimization, with simple slopes indicating girls showed almost no change in victimization, β = .006 (.066), ns, whereas boys decreased significantly over the 2 years, β = -.292 (.069), p < .001. Although moderation by gender of change in perpetration was not conventionally significant, the simple slopes revealed that girls again showed a nonsignificant change in TDV across seventh and eighth grades, β = -.067 (.078), ns, whereas boys showed a significant decline in reported electronic TDV perpetration, β = -.197 (.083), p < .05. The high prevalence of electronic TDV underscores the need for addressing these behaviors within TDV prevention interventions.

  19. Entanglement renormalization and topological order.

    PubMed

    Aguado, Miguel; Vidal, Guifré

    2008-02-22

    The multiscale entanglement renormalization ansatz (MERA) is argued to provide a natural description for topological states of matter. The case of Kitaev's toric code is analyzed in detail and shown to possess a remarkably simple MERA description leading to distillation of the topological degrees of freedom at the top of the tensor network. Kitaev states on an infinite lattice are also shown to be a fixed point of the renormalization group flow associated with entanglement renormalization. All of these results generalize to arbitrary quantum double models.

  20. Catalog Descriptions Using VOTable Files

    NASA Astrophysics Data System (ADS)

    Thompson, R.; Levay, K.; Kimball, T.; White, R.

    2008-08-01

    Additional information is frequently required to describe database table contents and make it understandable to users. For this reason, the Multimission Archive at Space Telescope (MAST) creates “description files” for each table/catalog. After trying various XML and CSV formats, we finally chose VOTable. These files are easy to update via an HTML form, are easily read using an XML parser such as (in our case) the PHP5 SimpleXML extension, and have found multiple uses in our data access/retrieval process.
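    The paper reads these files with PHP5 SimpleXML; as an illustration, the same kind of column-description lookup can be sketched with Python's standard ElementTree. The VOTable-style fragment below is a hypothetical, simplified example, not MAST's actual schema.

```python
import xml.etree.ElementTree as ET

# Hypothetical, simplified VOTable-style description file:
# each FIELD carries column metadata plus a human-readable DESCRIPTION.
doc = """<VOTABLE><RESOURCE><TABLE>
  <FIELD name="ra" datatype="double" unit="deg">
    <DESCRIPTION>Right ascension (J2000)</DESCRIPTION>
  </FIELD>
  <FIELD name="dec" datatype="double" unit="deg">
    <DESCRIPTION>Declination (J2000)</DESCRIPTION>
  </FIELD>
</TABLE></RESOURCE></VOTABLE>"""

root = ET.fromstring(doc)
# Map each column name to its description text
fields = {f.get("name"): f.findtext("DESCRIPTION").strip()
          for f in root.iter("FIELD")}
```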

  1. Quasi-Monochromatic Visual Environments and the Resting Point of Accommodation

    DTIC Science & Technology

    1988-01-01

    accommodation. No statistically significant differences were revealed to support the possibility of color mediated differential regression to resting...discussed with respect to the general findings of the total sample as well as the specific behavior of individual participants. The summarized statistics ...remaining ten varied considerably with respect to the averaged trends reported in the above descriptive statistics as well as with respect to precision

  2. I'll take that to go: Big data bags and minimal identifiers for exchange of large, complex datasets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chard, Kyle; D'Arcy, Mike; Heavner, Benjamin D.

    Big data workflows often require the assembly and exchange of complex, multi-element datasets. For example, in biomedical applications, the input to an analytic pipeline can be a dataset consisting of thousands of images and genome sequences assembled from diverse repositories, requiring a description of the contents of the dataset in a concise and unambiguous form. Typical approaches to creating datasets for big data workflows assume that all data reside in a single location, requiring costly data marshaling and permitting errors of omission and commission because dataset members are not explicitly specified. We address these issues by proposing simple methods and tools for assembling, sharing, and analyzing large and complex datasets that scientists can easily integrate into their daily workflows. These tools combine a simple and robust method for describing data collections (BDBags), data descriptions (Research Objects), and simple persistent identifiers (Minids) to create a powerful ecosystem of tools and services for big data analysis and sharing. We present these tools and use biomedical case studies to illustrate their use for the rapid assembly, sharing, and analysis of large datasets.

  3. Physics-based statistical learning approach to mesoscopic model selection.

    PubMed

    Taverniers, Søren; Haut, Terry S; Barros, Kipton; Alexander, Francis J; Lookman, Turab

    2015-11-01

    In materials science and many other research areas, models are frequently inferred without considering their generalization to unseen data. We apply statistical learning using cross-validation to obtain an optimally predictive coarse-grained description of a two-dimensional kinetic nearest-neighbor Ising model with Glauber dynamics (GD) based on the stochastic Ginzburg-Landau equation (sGLE). The latter is learned from GD "training" data using a log-likelihood analysis, and its predictive ability for various complexities of the model is tested on GD "test" data independent of the training data. Using two different error metrics, we perform a detailed analysis of the error between magnetization time trajectories simulated using the learned sGLE coarse-grained description and those obtained using the GD model. We show that both for equilibrium and out-of-equilibrium GD training trajectories, the standard phenomenological description using a quartic free energy does not always yield the most predictive coarse-grained model. Moreover, increasing the amount of training data can shift the optimal model complexity to higher values. Our results are promising in that they pave the way for the use of statistical learning as a general tool for materials modeling and discovery.
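    The cross-validation logic can be illustrated on a toy regression problem, selecting model complexity by held-out error. Here polynomial degree stands in for the complexity of the coarse-grained description, and the data are synthetic; this is a sketch of the general technique, not the paper's sGLE learning procedure.

```python
import numpy as np

def kfold_mse(x, y, degree, k=5):
    """Mean k-fold cross-validated squared error of a degree-`degree`
    polynomial fit: train on k-1 folds, score on the held-out fold."""
    idx = np.arange(len(x))
    folds = np.array_split(idx, k)
    errs = []
    for f in folds:
        train = np.setdiff1d(idx, f)
        coeffs = np.polyfit(x[train], y[train], degree)
        pred = np.polyval(coeffs, x[f])
        errs.append(np.mean((y[f] - pred) ** 2))
    return float(np.mean(errs))

# Synthetic data with a cubic ground truth plus noise
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(-1, 1, 60))
y = x**3 - x + rng.normal(0, 0.05, 60)

# Compare an underfitting, a matched, and an overfitting complexity
scores = {d: kfold_mse(x, y, d) for d in (1, 3, 9)}
best = min(scores, key=scores.get)  # complexity with lowest CV error
```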

  4. Humans make efficient use of natural image statistics when performing spatial interpolation.

    PubMed

    D'Antona, Anthony D; Perry, Jeffrey S; Geisler, Wilson S

    2013-12-16

    Visual systems learn through evolution and experience over the lifespan to exploit the statistical structure of natural images when performing visual tasks. Understanding which aspects of this statistical structure are incorporated into the human nervous system is a fundamental goal in vision science. To address this goal, we measured human ability to estimate the intensity of missing image pixels in natural images. Human estimation accuracy is compared with various simple heuristics (e.g., local mean) and with optimal observers that have nearly complete knowledge of the local statistical structure of natural images. Human estimates are more accurate than those of simple heuristics, and they match the performance of an optimal observer that knows the local statistical structure of relative intensities (contrasts). This optimal observer predicts the detailed pattern of human estimation errors and hence the results place strong constraints on the underlying neural mechanisms. However, humans do not reach the performance of an optimal observer that knows the local statistical structure of the absolute intensities, which reflect both local relative intensities and local mean intensity. As predicted from a statistical analysis of natural images, human estimation accuracy is negligibly improved by expanding the context from a local patch to the whole image. Our results demonstrate that the human visual system exploits efficiently the statistical structure of natural images.
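    The "local mean" heuristic used as a baseline can be sketched directly: estimate the missing pixel as the average of its neighbors in a small patch. The patch values below are hypothetical; the study's actual comparisons use natural-image patches and calibrated optimal observers.

```python
import numpy as np

def local_mean_estimate(patch):
    """Estimate the center pixel of a square patch as the mean of the
    surrounding pixels -- the simple 'local mean' heuristic compared
    against human and optimal observers."""
    patch = np.asarray(patch, dtype=float)
    h, w = patch.shape
    mask = np.ones_like(patch, dtype=bool)
    mask[h // 2, w // 2] = False  # exclude the missing pixel itself
    return float(patch[mask].mean())

# 3x3 patch with an unknown center; estimate it from the 8 neighbors
patch = [[10, 12, 11],
         [ 9,  0, 13],
         [10, 11, 12]]
estimate = local_mean_estimate(patch)
```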

  5. Morse Code, Scrabble, and the Alphabet

    ERIC Educational Resources Information Center

    Richardson, Mary; Gabrosek, John; Reischman, Diann; Curtiss, Phyliss

    2004-01-01

    In this paper we describe an interactive activity that illustrates simple linear regression. Students collect data and analyze it using simple linear regression techniques taught in an introductory applied statistics course. The activity is extended to illustrate checks for regression assumptions and regression diagnostics taught in an…

  6. Performing Inferential Statistics Prior to Data Collection

    ERIC Educational Resources Information Center

    Trafimow, David; MacDonald, Justin A.

    2017-01-01

    Typically, in education and psychology research, the investigator collects data and subsequently performs descriptive and inferential statistics. For example, a researcher might compute group means and use the null hypothesis significance testing procedure to draw conclusions about the populations from which the groups were drawn. We propose an…

  7. Inside Rural Pennsylvania: A Statistical Profile.

    ERIC Educational Resources Information Center

    Center for Rural Pennsylvania, Harrisburg.

    Graphs, data tables, maps, and written descriptions give a statistical overview of rural Pennsylvania. A section on rural demographics covers population changes, racial and ethnic makeup, age cohorts, and families and income. Pennsylvania's rural population, the nation's largest, has increased more than its urban population since 1950, with the…

  8. Education Statistics Quarterly, Summer 2002.

    ERIC Educational Resources Information Center

    Dillow, Sally, Ed.

    2002-01-01

    This publication provides a comprehensive overview of work done across all parts of the National Center for Education Statistics (NCES). Each issue contains short publications, summaries, and descriptions that cover all NCES publications, data products, and funding opportunities developed over a 3-month period. Each issue also contains a message…

  9. Education Statistics Quarterly, Spring 2002.

    ERIC Educational Resources Information Center

    Dillow, Sally, Ed.

    2002-01-01

    This publication provides a comprehensive overview of work done across all parts of the National Center for Education Statistics (NCES). Each issue contains short publications, summaries, and descriptions that cover all NCES publications, data products, and funding opportunities developed over a 3-month period. Each issue also contains a message…

  10. Computational Modeling of Statistical Learning: Effects of Transitional Probability versus Frequency and Links to Word Learning

    ERIC Educational Resources Information Center

    Mirman, Daniel; Estes, Katharine Graf; Magnuson, James S.

    2010-01-01

    Statistical learning mechanisms play an important role in theories of language acquisition and processing. Recurrent neural network models have provided important insights into how these mechanisms might operate. We examined whether such networks capture two key findings in human statistical learning. In Simulation 1, a simple recurrent network…

  11. A Laboratory Experiment, Based on the Maillard Reaction, Conducted as a Project in Introductory Statistics

    ERIC Educational Resources Information Center

    Kravchuk, Olena; Elliott, Antony; Bhandari, Bhesh

    2005-01-01

    A simple laboratory experiment, based on the Maillard reaction, served as a project in Introductory Statistics for undergraduates in Food Science and Technology. By using the principles of randomization and replication and reflecting on the sources of variation in the experimental data, students reinforced the statistical concepts and techniques…

  12. From creation and annihilation operators to statistics

    NASA Astrophysics Data System (ADS)

    Hoyuelos, M.

    2018-01-01

    A procedure to derive the partition function of non-interacting particles with exotic or intermediate statistics is presented. The partition function is directly related to the associated creation and annihilation operators that obey some specific commutation or anti-commutation relations. The cases of Gentile statistics, quons, Polychronakos statistics, and ewkons are considered. Ewkons statistics was recently derived from the assumption of free diffusion in energy space (Hoyuelos and Sisterna, 2016); an ideal gas of ewkons has negative pressure, a feature that makes them suitable for the description of dark energy.
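    For Gentile statistics, in which at most n particles may occupy a single state, the single-state grand partition function follows from summing a truncated geometric series, and the mean occupation interpolates between the familiar limits (standard results, stated here for orientation; the paper's derivations cover the other statistics as well):

```latex
% Single-state grand partition function, at most n particles per state:
Z_n(\varepsilon) = \sum_{k=0}^{n} e^{-k\beta(\varepsilon-\mu)}
  = \frac{1 - e^{-(n+1)\beta(\varepsilon-\mu)}}{1 - e^{-\beta(\varepsilon-\mu)}}

% Mean occupation number:
\langle k \rangle = \frac{1}{e^{\beta(\varepsilon-\mu)} - 1}
  - \frac{n+1}{e^{(n+1)\beta(\varepsilon-\mu)} - 1}
```

    Setting n = 1 recovers the Fermi-Dirac occupation, and the limit n → ∞ recovers Bose-Einstein.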

  13. A simple device for teaching direct ophthalmoscopy to primary care practitioners.

    PubMed

    Chung, Kelly D; Watzke, Robert C

    2004-09-01

    Ophthalmoscopy, a valuable skill for primary care practitioners, can be challenging to learn. A simple and inexpensive device for teaching direct ophthalmoscopy is described. Cylindrical plastic canisters were altered to have an artificial pupil at one end and a replaceable fundus photograph at the other to simulate the mechanics of performing direct ophthalmoscopy on a real eye. The devices were tested for ease of use by primary care students, proved simple and inexpensive to construct, and allowed students to practice direct ophthalmoscopy technique and the identification of funduscopic abnormalities. This simple device is therefore an inexpensive and valuable aid for teaching direct ophthalmoscopy to primary care practitioners.

  14. On a Family of Circles

    ERIC Educational Resources Information Center

    Feeman, Timothy G.

    2011-01-01

    We generalize a standard example from precalculus and calculus texts to give a simple description in polar coordinates of any circle that passes through the origin. We discuss an occurrence of this formula in the context of medical imaging. (Contains 1 figure.)
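    The polar description being generalized can be checked numerically: r = 2a·cosθ + 2b·sinθ is the circle of radius √(a² + b²) centered at (a, b), which passes through the origin (a minimal sketch with an arbitrarily chosen center):

```python
import math

a, b = 1.5, -0.75   # arbitrary center; the radius is sqrt(a^2 + b^2)

for k in range(12):
    theta = 2 * math.pi * k / 12
    r = 2 * a * math.cos(theta) + 2 * b * math.sin(theta)
    x, y = r * math.cos(theta), r * math.sin(theta)
    # every sampled point satisfies (x - a)^2 + (y - b)^2 = a^2 + b^2
    assert math.isclose((x - a) ** 2 + (y - b) ** 2, a ** 2 + b ** 2)
```

Multiplying r = 2a·cosθ + 2b·sinθ by r gives x² + y² = 2ax + 2by, which is exactly the circle through the origin in Cartesian form.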

  15. Easy Games for the German Classroom

    ERIC Educational Resources Information Center

    Kaiser, Linda

    1977-01-01

    The following easy games for the German classroom are described: around Germany, map identifications, fast response, bingo, memory training, geography college bowl, find the object, descriptions, pronunciation perfection, acrostics, simple board games, art lesson, treasure hunt, and miscellaneous games. (SW)

  16. Parallel auto-correlative statistics with VTK.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pebay, Philippe Pierre; Bennett, Janine Camille

    2013-08-01

    This report summarizes existing statistical engines in VTK and presents both the serial and parallel auto-correlative statistics engines. It is a sequel to [PT08, BPRT09b, PT09, BPT09, PT10], which studied the parallel descriptive, correlative, multi-correlative, principal component analysis, contingency, k-means, and order statistics engines. The ease of use of the new parallel auto-correlative statistics engine is illustrated by means of C++ code snippets, and algorithm verification is provided. This report justifies the design of the statistics engines with parallel scalability in mind, and provides scalability and speed-up analysis results for the auto-correlative statistics engine.
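    Outside VTK, the lag-k auto-correlation such engines compute reduces to a normalized covariance of a series with a shifted copy of itself; a plain-Python sketch (not the VTK API) of the serial computation:

```python
def autocorrelation(x, lag):
    """Sample autocorrelation r_k = sum((x_t - m)(x_{t+k} - m)) / sum((x_t - m)^2)."""
    n = len(x)
    m = sum(x) / n
    denom = sum((v - m) ** 2 for v in x)
    num = sum((x[t] - m) * (x[t + lag] - m) for t in range(n - lag))
    return num / denom

signal = [1, -1] * 5                 # strongly anti-correlated at lag 1
print(autocorrelation(signal, 0))    # -> 1.0
print(autocorrelation(signal, 1))    # -> -0.9
```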

  17. The quality assessment of family physician service in rural regions, Northeast of Iran in 2012

    PubMed Central

    Vafaee-Najar, Ali; Nejatzadegan, Zohreh; Pourtaleb, Arefeh; Kaffashi, Shahnaz; Vejdani, Marjan; Molavi-Taleghani, Yasamin; Ebrahimipour, Hosein

    2014-01-01

    Background: Following the implementation of the family physician plan in rural areas, the quantity of services provided has increased, but the expected quality of those services must improve as well. The present study aims at determining the gap between patients' expectations and perceptions of the quality of services provided by family physicians during the spring and summer of 2012. Methods: This was a cross-sectional study in which 480 patients who visited family physician centers were selected by cluster and simple random sampling. Data were collected through the standard SERVQUAL questionnaire and were analyzed with descriptive statistics, t-tests, Kruskal-Wallis tests, and Wilcoxon signed-rank tests in SPSS 16 at a significance level of 0.05. Results: The difference between the mean scores of expectation and perception was about -0.93, a statistically significant difference (P ≤ 0.05). The gaps in the five dimensions of quality were: tangibles -1.10, reliability -0.87, responsiveness -1.06, assurance -0.83, and empathy -0.82. Findings showed a significant difference between expectation and perception in all five dimensions of the provided services (P ≤ 0.05). Conclusion: There was a gap between the ideal and the current situation of family physician service quality. We suggest maintaining a strong focus on patients, creating a medical practice that exceeds patients' expectations, providing high-quality healthcare services, and continuously improving all processes. The gap was greatest in the tangibles and responsiveness dimensions, so more attention should be paid to the physical appearance of the health center environment and to the availability of staff. PMID:24757691
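    The SERVQUAL gap analysis used here is simply perception-minus-expectation scores averaged over respondents for each dimension; a minimal sketch with made-up Likert ratings (the numbers below are illustrative, not the study's data):

```python
from statistics import mean

# Hypothetical paired SERVQUAL ratings (1-5 Likert) for one dimension;
# each tuple is (expectation, perception) for one respondent.
tangibles = [(5, 4), (4, 3), (5, 3), (4, 4), (5, 4)]

expect = mean(e for e, _ in tangibles)
perceive = mean(p for _, p in tangibles)
gap = perceive - expect     # negative gap: expectations exceed perceptions
print(round(gap, 2))        # -> -1.0
```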

  18. Automated clustering-based workload characterization

    NASA Technical Reports Server (NTRS)

    Pentakalos, Odysseas I.; Menasce, Daniel A.; Yesha, Yelena

    1996-01-01

    The demands placed on the mass storage systems at various federal agencies and national laboratories are continuously increasing in intensity. This forces system managers to constantly monitor the system, evaluate the demand placed on it, and tune it appropriately using either heuristics based on experience or analytic models. Performance models require an accurate workload characterization, which can be a laborious and time-consuming process. It became evident from our experience that a tool is necessary to automate the workload characterization process. This paper presents the design and discusses the implementation of a tool for workload characterization of mass storage systems. The main features of the tool are: (1) Automatic support for peak-period determination: histograms of system activity are generated and presented to the user for peak-period determination. (2) Automatic clustering analysis: the data collected from the mass storage system logs are clustered using clustering algorithms and tightness measures to limit the number of generated clusters. (3) Reporting of varied file statistics: the tool computes several statistics on file sizes, such as average, standard deviation, minimum, maximum, and frequency, as well as average transfer time; these statistics are given on a per-cluster basis. (4) Portability: the tool can easily be used to characterize the workload in mass storage systems of different vendors; the user specifies through a simple log description language how a specific log should be interpreted. The rest of this paper is organized as follows. Section two presents basic concepts in workload characterization as they apply to mass storage systems. Section three describes clustering algorithms and tightness measures. The following section presents the architecture of the tool. Section five presents some results of workload characterization using the tool. Finally, section six presents some concluding remarks.
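    Feature (3), per-cluster file statistics, is straightforward once cluster labels exist; a minimal sketch with hypothetical labels and file sizes (not the tool's actual log format):

```python
from collections import defaultdict
from statistics import mean, stdev

# Hypothetical (cluster_label, file_size_MB) pairs as a clustering step
# might produce them from mass-storage logs.
records = [(0, 12.0), (0, 15.0), (0, 9.0),
           (1, 850.0), (1, 990.0), (1, 1200.0)]

by_cluster = defaultdict(list)
for label, size in records:
    by_cluster[label].append(size)

# per-cluster count, min, max, mean, and standard deviation
for label, sizes in sorted(by_cluster.items()):
    print(label, len(sizes), min(sizes), max(sizes),
          round(mean(sizes), 1), round(stdev(sizes), 1))
```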

  19. The effect of group bibliotherapy on the self-esteem of female students living in dormitory

    PubMed Central

    Salimi, Sepideh; Zare-Farashbandi, Firoozeh; Papi, Ahmad; Samouei, Rahele; Hassanzadeh, Akbar

    2014-01-01

    Introduction: Bibliotherapy is a supplementary, simple, inexpensive, and readily available method of treatment performed with the cooperation of librarians and psychologists or doctors. The aim of this study is to investigate the effect of group bibliotherapy on the self-esteem of female students of Isfahan University of Medical Sciences living in dormitories in 2012. Materials and Methods: The present study is an interventional, semi-experimental study with a pretest, a posttest, and a control group. The statistical population consisted of 32 female students residing in Isfahan University of Medical Sciences dormitories, divided randomly into case and control groups. Data were collected with the Coopersmith Self-Esteem Inventory (Cronbach's alpha: 0.85). Both groups completed the questionnaire as a pretest. The case group received group bibliotherapy for 2 months (8 sessions of 2 hours each), while the control group received no training at all. Both groups were then assessed with a posttest after 1 month. Descriptive statistics (means and frequency distributions) and inferential statistics (independent t-test, paired t-test, and Mann-Whitney test) were used, and data were analyzed with SPSS 20. Results: The findings showed that group bibliotherapy had a positive and significant effect on the general, family, professional, and total self-esteem of female students living in dormitories, but no effect on their social self-esteem. Conclusion: Group bibliotherapy can increase female students' self-esteem. Moreover, such interventions can improve not only mental health but also reading habits. PMID:25250355

  20. Longitudinal Analysis of Superficial Midfacial Fat Volumes Over a 10-Year Period.

    PubMed

    Tower, Jacob; Seifert, Kimberly; Paskhover, Boris

    2018-04-11

    Volumetric changes to facial fat that occur with aging remain poorly understood. The aim of this study was to evaluate longitudinal changes in midfacial fat volumes in a group of individuals. We conducted a retrospective longitudinal study of adult subjects who underwent multiple facial computed tomographic (CT) scans timed at least 8 years apart. Subjects who underwent facial surgery or suffered facial trauma were excluded. Facial CT scans were analyzed, and superficial cheek fat volumes were measured and compared to track changes that occurred with aging. Fourteen subjects were included in our analysis of facial aging (5 male, 9 female; mean initial age 50.9 years; mean final age 60.4 years). In the right superficial cheek there was an increase in mean (SD) superficial fat volume from 10.33 (2.01) to 10.50 (1.80) cc, which was not statistically significant (P = 0.75). Similar results were observed in the left cheek. There were no statistically significant longitudinal changes to the caudal, middle, or cephalad subdivisions of bilateral superficial cheek fat. A simple linear regression performed to predict superficial cheek fat pad volume based on age did not reach statistical significance (P = 0.31), with an R² of 0.039. This study is the first to quantitatively assess longitudinal changes to midfacial fat in a group of individuals. Superficial cheek fat remained stable as subjects aged from approximately 50 to 60 years old, with no change in total volume or redistribution within a radiographically defined compartment. This journal requires that authors assign a level of evidence to each article. For a full description of these Evidence-Based Medicine ratings, please refer to the Table of Contents or the online Instructions to Authors www.springer.com/00266.
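    The simple linear regression and R² reported here follow the usual least-squares formulas; a self-contained sketch on made-up (age, volume) pairs, not the study's data:

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = slope * x + intercept, with R^2."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((y - (slope * x + intercept)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return slope, intercept, 1 - ss_res / ss_tot

ages = [50, 52, 55, 58, 60]
volumes = [10.3, 10.5, 10.2, 10.6, 10.4]  # hypothetical cheek-fat volumes (cc)
print(fit_line(ages, volumes))            # near-zero slope, low R^2
```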

  1. Analysis of Flow and Transport in non-Gaussian Heterogeneous Formations Using a Generalized Sub-Gaussian Model

    NASA Astrophysics Data System (ADS)

    Guadagnini, A.; Riva, M.; Neuman, S. P.

    2016-12-01

    Environmental quantities such as log hydraulic conductivity (or transmissivity), Y(x) = ln K(x), and their spatial (or temporal) increments, ΔY, are known to be generally non-Gaussian. Documented evidence of such behavior includes increment distributions that are symmetric at all separation scales (or lags), with sharp peaks and heavy tails that decay as lag increases. This statistical scaling occurs in porous as well as fractured media characterized by either one or a hierarchy of spatial correlation scales. In hierarchical media one observes a range of additional statistical ΔY scaling phenomena, all of which are captured comprehensively by a novel generalized sub-Gaussian (GSG) model. In this model Y forms a mixture Y(x) = U(x) G(x) of single- or multi-scale Gaussian processes G having random variances, U being a non-negative subordinator independent of G. Elsewhere we developed ways to generate unconditional and conditional random realizations of isotropic or anisotropic GSG fields which can be embedded in numerical Monte Carlo flow and transport simulations. Here we present and discuss expressions for probability distribution functions of Y and ΔY as well as their leading statistical moments. We then focus on a simple flow setting of mean uniform steady state flow in an unbounded, two-dimensional domain, exploring ways in which non-Gaussian heterogeneity affects stochastic flow and transport descriptions. Our expressions represent (a) leading-order autocovariance and cross-covariance functions of hydraulic head, velocity and advective particle displacement as well as (b) analogues of preasymptotic and asymptotic Fickian dispersion coefficients. We compare them with corresponding expressions developed in the literature for Gaussian Y.
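    The qualitative point, that a Gaussian process multiplied by a random subordinator yields symmetric, sharply peaked, heavy-tailed values, is easy to reproduce; a sketch assuming, purely for illustration, a lognormal subordinator U (the paper does not prescribe this choice):

```python
import random
import statistics

random.seed(1)

# Y = U * G: G standard Gaussian, U an (illustrative) lognormal subordinator.
samples = [random.lognormvariate(0.0, 0.5) * random.gauss(0.0, 1.0)
           for _ in range(100_000)]

m = statistics.fmean(samples)
s = statistics.pstdev(samples)
kurtosis = statistics.fmean(((y - m) / s) ** 4 for y in samples)
print(round(kurtosis, 2))   # well above the Gaussian value of 3: heavy tails
```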

  2. Integrated Reconfigurable Intelligent Systems (IRIS) for Complex Naval Systems

    DTIC Science & Technology

    2011-02-23

    (Excerpted table of contents) 2.2 General Model Setup: 2.2.1 Co-Simulation Principles; 2.2.2 Double Pendulum: A Simple Example; 2.2.3 Description of Numerical Pendulum Sample Problem; 2.3 Discussion of Approach with Respect to Proposed Subtasks; 2.4 Results Discussion and Future Work. In order to evaluate co-simulation principles, a double pendulum is used as a simple example.

  3. Cooling tower plume - model and experiment

    NASA Astrophysics Data System (ADS)

    Cizek, Jan; Gemperle, Jiri; Strob, Miroslav; Nozicka, Jiri

    The paper presents a simple model of the so-called steam plume, which in many cases forms during the operation of the evaporative cooling systems of power plants or large technological units. The model is based on semi-empirical equations that describe the behaviour of a mixture of two gases in the case of a free jet stream. The paper concludes with a simple experiment by which the results of the proposed model will be validated in subsequent work.

  4. The visual display of regulatory information and networks.

    PubMed

    Pirson, I; Fortemaison, N; Jacobs, C; Dremier, S; Dumont, J E; Maenhaut, C

    2000-10-01

    Cell regulation and signal transduction are becoming increasingly complex, with reports of new cross-signalling, feedback, and feedforward regulations between pathways and between the multiple isozymes discovered at each step of these pathways. However, this information, which requires pages of text for its description, can be summarized in very simple schemes, although there is no consensus on how to draw such schemes. This article presents a simple set of rules that allows a large amount of information to be conveyed in easily understandable displays.

  5. Mathematical Description of the Uptake of Hydrocarbons in Jet Fuel into the Stratum Corneum of Human Volunteers

    PubMed Central

    Kim, David; Farthing, Matthew W.; Miller, Cass T.; Nylander-French, Leena A.

    2008-01-01

    The objective of this research was to develop a mathematical description of the uptake of aromatic and aliphatic hydrocarbons into the stratum corneum of human skin in vivo. A simple description based on Fick's laws of diffusion was used to predict the spatiotemporal variation of naphthalene, 1- and 2-methylnaphthalene, undecane, and dodecane in the stratum corneum of human volunteers. The estimated values of the diffusion coefficients for each chemical were comparable to values predicted using in vitro skin systems and biomonitoring studies. These results demonstrate the value of measuring dermal exposure using the tape-strip technique and the importance of quantifying dermal uptake. PMID:18423910
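    Fick's second law on a finite slab can be sketched with an explicit finite-difference scheme; this is a generic illustration with arbitrary parameters, not a reproduction of the paper's model fitting:

```python
# Explicit (FTCS) solution of dC/dt = D * d2C/dx2 on a slab, with the
# surface held at C = 1 (constant exposure) and the far side at C = 0.
nx = 21
dx = 1.0 / (nx - 1)
D = 1.0e-3                  # illustrative diffusion coefficient
dt = 0.4 * dx * dx / D      # keeps r = D*dt/dx^2 = 0.4 <= 0.5 (stable)
r = D * dt / (dx * dx)

C = [0.0] * nx
C[0] = 1.0                  # surface boundary condition

for _ in range(500):
    C = ([1.0] +
         [C[i] + r * (C[i - 1] - 2 * C[i] + C[i + 1])
          for i in range(1, nx - 1)] +
         [0.0])

# the concentration profile decays monotonically with depth
assert all(C[i] >= C[i + 1] for i in range(nx - 1))
print(round(C[1], 3), round(C[10], 3))
```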

  6. Influence of simulation parameters on the speed and accuracy of Monte Carlo calculations using PENEPMA

    NASA Astrophysics Data System (ADS)

    Llovet, X.; Salvat, F.

    2018-01-01

    The accuracy of Monte Carlo simulations of EPMA measurements is primarily determined by that of the adopted interaction models and atomic relaxation data. The code PENEPMA implements the most reliable general models available, and it is known to provide a realistic description of electron transport and X-ray emission. Nonetheless, efficiency (i.e., the simulation speed) of the code is determined by a number of simulation parameters that define the details of the electron tracking algorithm, which may also have an effect on the accuracy of the results. In addition, to reduce the computer time needed to obtain X-ray spectra with a given statistical accuracy, PENEPMA allows the use of several variance-reduction techniques, defined by a set of specific parameters. In this communication we analyse and discuss the effect of using different values of the simulation and variance-reduction parameters on the speed and accuracy of EPMA simulations. We also discuss the effectiveness of using multi-core computers along with a simple practical strategy implemented in PENEPMA.

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Willis-Richards, J.; Watanable, K.; Yamaguchi, T.

    A set of models of HDR systems is presented which attempts to explain the formation and operation of HDR systems using only the in-situ properties of the fractured rock mass, the earth stress field, the engineering intervention applied by way of stimulation, and the relative positions and pressures of the well(s). A statistical and rock mechanics description of fractures in low permeability rocks provides the basis for modeling of stimulation, circulation, and water loss in HDR systems. The model uses a large number of parameters, chiefly simple directly measurable quantities, describing the rock mass and fracture system. The effect of stimulation (raised fluid pressure allowing slip) on fracture apertures is calculated, and the volume of rock affected per volume of fluid pumped is estimated. The total rock volume affected by stimulation is equated with the rock volume containing the associated AE (microseismicity). The aperture and compliance properties of the stimulated fractures are used to estimate impedance and flow within the reservoir. Fluid loss from the boundary of the stimulated volume is treated using radial leak-off with pressure-dependent permeability.

  8. Extracting information from S-curves of language change

    PubMed Central

    Ghanbarnejad, Fakhteh; Gerlach, Martin; Miotto, José M.; Altmann, Eduardo G.

    2014-01-01

    It is well accepted that the adoption of innovations is described by S-curves (slow start, accelerating period, slow end). In this paper, we analyse how much information on the dynamics of innovation spreading can be obtained from a quantitative description of S-curves. We focus on the adoption of linguistic innovations, for which detailed databases of written texts from the last 200 years allow for an unprecedented statistical precision. Combining data analysis with simulations of simple models (e.g. the Bass dynamics on complex networks), we identify signatures of endogenous and exogenous factors in the S-curves of adoption. We propose a measure to quantify the strength of these factors and three different methods to estimate it from S-curves. We obtain cases in which the exogenous factors are dominant (in the adoption of German orthographic reforms and of one irregular verb) and cases in which endogenous factors are dominant (in the adoption of conventions for romanization of Russian names and in the regularization of most studied verbs). These results show that the shape of the S-curve is not universal and contains information on the adoption mechanism. PMID:25339692
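    A minimal discrete-time Bass model of the kind invoked here makes the S-curve and the two factors explicit (parameter values are illustrative only, not estimates from the paper):

```python
def bass_curve(p, q, steps):
    """Cumulative adoption F(t) under exogenous (innovation, p) and
    endogenous (imitation, q) pressure: dF = (p + q*F) * (1 - F)."""
    F = [0.0]
    for _ in range(steps):
        F.append(F[-1] + (p + q * F[-1]) * (1 - F[-1]))
    return F

curve = bass_curve(p=0.03, q=0.38, steps=40)
# slow start, acceleration, slow saturation: an S-curve
print(round(curve[5], 3), round(curve[15], 3), round(curve[40], 3))
```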

  9. Molecular dynamics of conformational substates for a simplified protein model

    NASA Astrophysics Data System (ADS)

    Grubmüller, Helmut; Tavan, Paul

    1994-09-01

    Extended molecular dynamics simulations covering a total of 0.232 μs have been carried out on a simplified protein model. Despite its simplified structure, the model exhibits properties similar to those of more realistic protein models. In particular, the model was found to undergo transitions between conformational substates on a time scale of several hundred picoseconds. The computed trajectories turned out to be sufficiently long to permit a statistical analysis of that conformational dynamics. To check whether effective descriptions neglecting memory effects can reproduce the observed conformational dynamics, two stochastic models were studied. A one-dimensional Langevin effective potential model derived by elimination of subpicosecond dynamical processes could not describe the observed conformational transition rates. In contrast, a simple Markov model describing the transitions between, but neglecting dynamical processes within, conformational substates reproduced the observed distribution of first passage times. These findings suggest that protein dynamics generally does not exhibit memory effects at time scales above a few hundred picoseconds, but confirm the existence of memory effects at a picosecond time scale.

  10. SPSS and SAS procedures for estimating indirect effects in simple mediation models.

    PubMed

    Preacher, Kristopher J; Hayes, Andrew F

    2004-11-01

    Researchers often conduct mediation analysis in order to indirectly assess the effect of a proposed cause on some outcome through a proposed mediator. The utility of mediation analysis stems from its ability to go beyond the merely descriptive to a more functional understanding of the relationships among variables. A necessary component of mediation is a statistically and practically significant indirect effect. Although mediation hypotheses are frequently explored in psychological research, formal significance tests of indirect effects are rarely conducted. After a brief overview of mediation, we argue the importance of directly testing the significance of indirect effects and provide SPSS and SAS macros that facilitate estimation of the indirect effect with a normal theory approach and a bootstrap approach to obtaining confidence intervals, as well as the traditional approach advocated by Baron and Kenny (1986). We hope that this discussion and the macros will enhance the frequency of formal mediation tests in the psychology literature. Electronic copies of these macros may be downloaded from the Psychonomic Society's Web archive at www.psychonomic.org/archive/.
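    The indirect effect and its bootstrap confidence interval can be sketched without the macros; a minimal illustration on synthetic data (my own, not the authors' SPSS/SAS code), using the fact that the partial slope of Y on M controlling for X equals the slope between the X-residuals of Y and of M:

```python
import random

def slope(xs, ys):
    """Simple-regression slope of ys on xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    return num / sum((x - mx) ** 2 for x in xs)

def indirect_effect(X, M, Y):
    """a*b: a = slope of M on X; b = slope of Y on M controlling for X."""
    a = slope(X, M)
    c = slope(X, Y)
    rM = [m - a * x for m, x in zip(M, X)]   # X-residuals of M
    rY = [y - c * x for y, x in zip(Y, X)]   # X-residuals of Y
    return a * slope(rM, rY)

random.seed(42)
n = 200
X = [random.gauss(0, 1) for _ in range(n)]
M = [0.5 * x + random.gauss(0, 0.5) for x in X]                       # a = 0.5
Y = [0.7 * m + 0.2 * x + random.gauss(0, 0.5) for m, x in zip(M, X)]  # b = 0.7

point = indirect_effect(X, M, Y)

# percentile bootstrap for the indirect effect
boot = []
for _ in range(1000):
    idx = [random.randrange(n) for _ in range(n)]
    boot.append(indirect_effect([X[i] for i in idx],
                                [M[i] for i in idx],
                                [Y[i] for i in idx]))
boot.sort()
print(round(point, 3), round(boot[25], 3), round(boot[974], 3))  # 95% CI
```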

  11. Validation of a high-performance size-exclusion chromatography method to determine and characterize β-glucans in beer wort using a triple-detector array.

    PubMed

    Tomasi, Ivan; Marconi, Ombretta; Sileoni, Valeria; Perretti, Giuseppe

    2017-01-01

    Beer wort β-glucans are high-molecular-weight non-starch polysaccharides that are of great interest to the brewing industry. Because glucans can increase the viscosity of solutions and form gels, hazes, and precipitates, they are often related to poor lautering performance and beer filtration problems. In this work, a simple and suitable method was developed to determine and characterize β-glucans in beer wort using size-exclusion chromatography coupled with a triple-detector array composed of a light scatterer, a viscometer, and a refractive-index detector. Statistical validation shows that the method's performance is comparable to the commercial reference method, and the method yields informative parameters of β-glucan in beer wort, such as molecular weight averages, fraction description, hydrodynamic radius, intrinsic viscosity, polydispersity, and Mark-Houwink parameters. This characterization can be useful in brewing science for understanding filtration problems that are not always explained by conventional analysis. Copyright © 2016 Elsevier Ltd. All rights reserved.

  12. Dalitz plot analysis of the D+→K-π+π+ decay in the FOCUS experiment

    NASA Astrophysics Data System (ADS)

    Link, J. M.; Yager, P. M.; Anjos, J. C.; Bediaga, I.; Castromonte, C.; Machado, A. A.; Magnin, J.; Massafferri, A.; de Miranda, J. M.; Pepe, I. M.; Polycarpo, E.; Dos Reis, A. C.; Carrillo, S.; Casimiro, E.; Cuautle, E.; Sánchez-Hernández, A.; Uribe, C.; Vázquez, F.; Agostino, L.; Cinquini, L.; Cumalat, J. P.; Frisullo, V.; O'Reilly, B.; Segoni, I.; Stenson, K.; Butler, J. N.; Cheung, H. W. K.; Chiodini, G.; Gaines, I.; Garbincius, P. H.; Garren, L. A.; Gottschalk, E.; Kasper, P. H.; Kreymer, A. E.; Kutschke, R.; Wang, M.; Benussi, L.; Bianco, S.; Fabbri, F. L.; Zallo, A.; Reyes, M.; Cawlfield, C.; Kim, D. Y.; Rahimi, A.; Wiss, J.; Gardner, R.; Kryemadhi, A.; Chung, Y. S.; Kang, J. S.; Ko, B. R.; Kwak, J. W.; Lee, K. B.; Cho, K.; Park, H.; Alimonti, G.; Barberis, S.; Boschini, M.; Cerutti, A.; D'Angelo, P.; Dicorato, M.; Dini, P.; Edera, L.; Erba, S.; Inzani, P.; Leveraro, F.; Malvezzi, S.; Menasce, D.; Mezzadri, M.; Moroni, L.; Pedrini, D.; Pontoglio, C.; Prelz, F.; Rovere, M.; Sala, S.; Davenport, T. F.; Arena, V.; Boca, G.; Bonomi, G.; Gianini, G.; Liguori, G.; Lopes Pegna, D.; Merlo, M. M.; Pantea, D.; Ratti, S. P.; Riccardi, C.; Vitulo, P.; Göbel, C.; Otalora, J.; Hernandez, H.; Lopez, A. M.; Mendez, H.; Paris, A.; Quinones, J.; Ramirez, J. E.; Zhang, Y.; Wilson, J. R.; Handler, T.; Mitchell, R.; Engh, D.; Hosack, M.; Johns, W. E.; Luiggi, E.; Nehring, M.; Sheldon, P. D.; Vaandering, E. W.; Webster, M.; Sheaff, M.; Pennington, M. R.; Focus Collaboration

    2007-09-01

    Using data collected by the high-energy photoproduction experiment FOCUS at Fermilab, we performed a Dalitz plot analysis of the Cabibbo-favored decay D+ → K-π+π+. This study uses 53,653 Dalitz-plot events with a signal fraction of ∼97%, and represents the highest-statistics, most complete Dalitz plot analysis for this channel. Results are presented and discussed using two different formalisms. The first is a simple sum of Breit-Wigner functions with freely fitted masses and widths; it is the model traditionally adopted and serves as a comparison with previously published analyses. The second uses a K-matrix approach for the dominant S-wave, in which the parameters are fixed by first fitting Kπ scattering data and continued to threshold by Chiral Perturbation Theory. We show that the Dalitz plot distribution for this decay is consistent with the assumption of two-body dominance of the final-state interactions, and that the description of these interactions is in agreement with other data on the Kπ final state.

  13. Stochastic dynamics of intermittent pore-scale particle motion in three-dimensional porous media

    NASA Astrophysics Data System (ADS)

    Morales, V. L.; Dentz, M.; Willmann, M.; Holzner, M.

    2017-12-01

    A proper understanding of velocity dynamics is key for making transport predictions through porous media at any scale. We study the velocity evolution process from particle dynamics at the pore-scale with particular interest in preasymptotic (non-Fickian) behavior. Experimental measurements from 3-dimensional particle tracking velocimetry are used to obtain Lagrangian velocity statistics for three different types of media heterogeneity. Particle velocities are found to be intermittent in nature, log-normally distributed and non-stationary. We show that these velocity characteristics can be captured with a correlated Ornstein-Uhlenbeck process for a random walk in space that is parameterized from velocity distributions. Our simple model is rigorously tested for accurate reproduction of velocity variability in magnitude and frequency. We further show that it captures exceptionally well the preasymptotic mean and mean squared displacement in the ballistic and superdiffusive regimes, and can be extended to determine if and when Fickian behavior will be reached. Our approach reproduces both preasymptotic and asymptotic transport behavior with a single transport model, demonstrating correct description of the fundamental controls of anomalous transport.
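    The correlated random walk described here can be sketched as a discretized Ornstein-Uhlenbeck process for log-velocity; exponentiating yields the log-normal velocity magnitudes reported (parameters below are illustrative, not fitted values from the experiments):

```python
import math
import random
import statistics

random.seed(7)

mu, sigma, tau = 0.0, 1.0, 1.0   # stationary mean/std of log-velocity; correlation time
dt = 0.1
n = 20_000

# Euler-Maruyama discretization of du = -(u - mu)/tau dt + sqrt(2 sigma^2/tau) dW
u = [mu]
for _ in range(n - 1):
    du = (-(u[-1] - mu) * dt / tau
          + sigma * math.sqrt(2 * dt / tau) * random.gauss(0, 1))
    u.append(u[-1] + du)

velocities = [math.exp(v) for v in u]   # log-normal velocity magnitudes
print(round(statistics.fmean(u), 2), round(statistics.pstdev(u), 2))  # ~ (0.0, 1.0)
```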

  14. Bose-Einstein condensation of paraxial light

    NASA Astrophysics Data System (ADS)

    Klaers, J.; Schmitt, J.; Damm, T.; Vewinger, F.; Weitz, M.

    2011-10-01

    Photons, due to the virtually vanishing photon-photon interaction, constitute to very good approximation an ideal Bose gas, but owing to the vanishing chemical potential a (free) photon gas does not show Bose-Einstein condensation. However, this is not necessarily true for a lower-dimensional photon gas. By means of a fluorescence-induced thermalization process in an optical microcavity one can achieve a thermal photon gas with freely adjustable chemical potential. Experimentally, we have observed thermalization and subsequently Bose-Einstein condensation of the photon gas at room temperature. In this paper, we give a detailed description of the experiment, which is based on a dye-filled optical microcavity, acting as a white-wall box for photons. Thermalization is achieved in a photon-number-conserving way by photon scattering off the dye molecules, and the cavity mirrors both provide an effective photon mass and a confining potential, two key prerequisites for the Bose-Einstein condensation of photons. The experimental results are in good agreement with both a statistical and a simple rate equation model, describing the properties of the thermalized photon gas.

  15. Fluctuations in the DNA double helix

    NASA Astrophysics Data System (ADS)

    Peyrard, M.; López, S. C.; Angelov, D.

    2007-08-01

    DNA is not the static entity suggested by the famous double helix structure. It shows large fluctuational openings, in which the bases, which contain the genetic code, are temporarily open. Therefore it is an interesting system to study the effect of nonlinearity on the physical properties of a system. A simple model for DNA, at a mesoscopic scale, can be investigated by computer simulation, in the same spirit as the original work of Fermi, Pasta and Ulam. These calculations raise fundamental questions in statistical physics because they show a temporary breaking of equipartition of energy, regions with large amplitude fluctuations being able to coexist with regions where the fluctuations are very small, even when the model is studied in the canonical ensemble. This phenomenon can be related to nonlinear excitations in the model. The ability of the model to describe the actual properties of DNA is discussed by comparing theoretical and experimental results for the probability that base pairs open at a given temperature in specific DNA sequences. These studies give us indications on the proper description of the effect of the sequence in the mesoscopic model.

  16. Research methodology and applied statistics. Part 2: the literature search.

    PubMed

    Prince, B; Makrides, L; Richman, J

    1980-01-01

    This paper presents a basic methodology for an effective and efficient retrieval and recording of written materials in a subject area. The purpose of the literature review is examined and the criteria for selection of materials for inclusion are outlined. The methodology then describes the role of the librarian, various types of information resources, how to choose appropriate indexing and abstracting services, and a simple efficient method of recording the items found. The importance and use of Medical Subject Headings for research in physiotherapy is emphasized. A survey of types of book materials and how to locate them is followed by a detailed description of the most useful indexing and abstracting services available, in particular, the publications of the National Library of Medicine, notably Index Medicus, as well as Excerpta Medica and the Science Citation Index. A discussion of on-line search services, their coverage and availability in Canada, concludes the review of information sources. Finally, guidelines for selecting and summarizing the materials located and comments on the literary style for a review are supplied.

  17. Awareness of academic use of smartphones and medical apps among medical students in a private medical college?

    PubMed

    Shah, Jehanzaib; Haq, Usman; Bashir, Ali; Shah, Syed Aslam

    2016-02-01

    To assess the awareness of medical apps and academic use of smartphones among medical students. The questionnaire-based descriptive cross-sectional study was conducted in January 2015 and comprised medical students of the Rawal Institute of Health Sciences, Islamabad, Pakistan. The self-designed questionnaire was reviewed by a panel of experts for content reliability and validity. Questionnaires were distributed in the classrooms and were filled in by the students anonymously. SPSS 16 was used for statistical analysis. Of the 569 medical students in the study, 545 (95.8%) had smartphones and 24 (4.2%) were using simple cell phones. Overall, 226 (41.46%) of the smartphone users were using some medical apps. Besides, 137 (24.08%) were aware of medical apps but were not using them. Also, 391 (71.7%) students were not using any type of medical text eBooks on their phones, and only 154 (28.3%) had relevant text eBooks on their phones. Medical college students were using smartphones mostly as a means of telecommunication rather than as a gadget for improving medical knowledge.

  18. The impact of fiscal austerity on suicide: on the empirics of a modern Greek tragedy.

    PubMed

    Antonakakis, Nikolaos; Collins, Alan

    2014-07-01

    Suicide rates in Greece (and other European countries) have been on a remarkable upward trend following the global recession of 2008 and the European sovereign debt crisis of 2009. However, recent investigations of the impact on Greek suicide rates from the 2008 financial crisis have restricted themselves to simple descriptive or correlation analyses. Controlling for various socio-economic effects, this study presents a statistically robust model to explain the influence on realised suicidality of the application of fiscal austerity measures and variations in macroeconomic performance over the period 1968-2011. The responsiveness of suicide to levels of fiscal austerity is established as a means of providing policy guidance on the extent of suicide behaviour associated with different fiscal austerity measures. The results suggest (i) significant age and gender specificity in these effects on suicide rates and that (ii) remittances have suicide-reducing effects on the youth and female population. These empirical regularities potentially offer some guidance on the demographic targeting of suicide prevention measures and the case for 'economic' migration. Copyright © 2014 Elsevier Ltd. All rights reserved.

  19. Statistical bias correction method applied on CMIP5 datasets over the Indian region during the summer monsoon season for climate change applications

    NASA Astrophysics Data System (ADS)

    Prasanna, V.

    2018-01-01

    This study makes use of temperature and precipitation from CMIP5 climate model output for climate change application studies over the Indian region during the summer monsoon season (JJAS). Bias correction of temperature and precipitation from CMIP5 GCM simulation results with respect to observation is discussed in detail. The non-linear statistical bias correction is a suitable bias correction method for climate change data because it is simple and does not add up artificial uncertainties to the impact assessment of climate change scenarios for climate change application studies (agricultural production changes) in the future. The simple statistical bias correction uses observational constraints on the GCM baseline, and the projected results are scaled with respect to the changing magnitude in future scenarios, varying from one model to the other. Two types of bias correction techniques are shown here: (1) a simple bias correction using a percentile-based quantile-mapping algorithm and (2) a simple but improved bias correction method, a cumulative distribution function (CDF; Weibull distribution function)-based quantile-mapping algorithm. This study shows that the percentile-based quantile mapping method gives results similar to the CDF (Weibull)-based quantile mapping method, and both the methods are comparable. The bias correction is applied on temperature and precipitation variables for present climate and future projected data to make use of it in a simple statistical model to understand the future changes in crop production over the Indian region during the summer monsoon season. In total, 12 CMIP5 models are used for Historical (1901-2005), RCP4.5 (2005-2100), and RCP8.5 (2005-2100) scenarios. The climate index from each CMIP5 model and the observed agricultural yield index over the Indian region are used in a regression model to project the changes in the agricultural yield over India from RCP4.5 and RCP8.5 scenarios. 
The results revealed a better convergence of model projections in the bias-corrected data compared to the uncorrected data. The study can be extended to localized regional domains aimed at understanding future changes in agricultural productivity with an agro-economic or a simple statistical model. The statistical model indicated that the total food grain yield is going to increase over the Indian region in the future: the increase is approximately 50 kg/ha for the RCP4.5 scenario from 2001 until the end of 2100, and approximately 90 kg/ha for the RCP8.5 scenario over the same period. There are many studies using bias correction techniques, but this study applies the bias correction technique to future climate scenario data from CMIP5 models and then applies it to crop statistics to estimate future crop yield changes over the Indian region.
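    The percentile-based quantile mapping used in the study can be sketched in a few lines. This is an illustrative reconstruction, not the paper's code: the toy Gaussian temperature data and the assumed 2 °C model bias are hypothetical.

```python
import numpy as np

def quantile_map(model_hist, obs, model_fut, n_q=100):
    """Percentile-based quantile mapping: map each future model value
    through the model's historical quantiles onto the observed quantiles."""
    quantiles = np.linspace(0, 100, n_q + 1)
    model_q = np.percentile(model_hist, quantiles)  # model baseline quantiles
    obs_q = np.percentile(obs, quantiles)           # observed quantiles
    # Find each future value's percentile in the model climatology,
    # then read off the observed value at that same percentile.
    pct = np.interp(model_fut, model_q, quantiles)
    return np.interp(pct, quantiles, obs_q)

# Toy example: the model runs ~2 degrees too warm; mapping removes the bias
# while keeping the projected warming signal.
rng = np.random.default_rng(0)
obs = rng.normal(25.0, 2.0, 5000)         # "observed" baseline temperatures
model_hist = rng.normal(27.0, 2.0, 5000)  # biased model baseline
model_fut = rng.normal(28.0, 2.0, 5000)   # future projection, same bias
corrected = quantile_map(model_hist, obs, model_fut)
# The corrected change signal stays close to the model's projected +1 degree.
print(round(float(corrected.mean() - obs.mean()), 2))
```

Because the mapping is built from the full distributions rather than the means alone, it also corrects variance biases, which matters for threshold-based impact studies.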

  20. Detection of outliers in the response and explanatory variables of the simple circular regression model

    NASA Astrophysics Data System (ADS)

    Mahmood, Ehab A.; Rana, Sohel; Hussin, Abdul Ghapor; Midi, Habshah

    2016-06-01

    The circular regression model may contain one or more data points that appear peculiar or inconsistent with the main part of the model. This may occur due to recording errors, sudden short events, sampling under abnormal conditions, etc. The existence of these data points, "outliers", in the data set causes many problems in research results and conclusions. Therefore, we should identify them before applying statistical analysis. In this article, we propose a statistic to identify outliers in both the response and explanatory variables of the simple circular regression model. Our proposed statistic is the robust circular distance RCDxy, and it is justified by three robust measures: the proportion of detected outliers and the masking and swamping rates.

  1. Use of iPhone technology in improving acetabular component position in total hip arthroplasty.

    PubMed

    Tay, Xiau Wei; Zhang, Benny Xu; Gayagay, George

    2017-09-01

    Improper acetabular cup positioning is associated with a high risk of complications after total hip arthroplasty. The aim of our study was to objectively compare 3 methods, namely (1) free hand, (2) alignment jig (Sputnik), and (3) iPhone application, to identify an easy, reproducible, and accurate method of improving acetabular cup placement. We designed a simple setup and carried out a simple experiment (see Method section). Using statistical analysis, the difference in inclination angles between the iPhone application and the freehand method was found to be statistically significant (F[2,51] = 4.17, P = .02) in the "untrained group". No statistical significance was detected for the other groups. This suggests a potential role for iPhone applications in helping junior surgeons overcome the steep learning curve.
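    The comparison reported here is a one-way ANOVA (F with 2 and 51 degrees of freedom). A minimal sketch of how that F statistic is assembled from sums of squares follows; the three groups of inclination-angle errors are invented for illustration and do not reproduce the study's data (the study's df of 51 implies 54 observations).

```python
import numpy as np

def one_way_anova(*groups):
    """One-way ANOVA: ratio of between-group to within-group mean squares."""
    all_data = np.concatenate(groups)
    k, n = len(groups), len(all_data)
    grand_mean = all_data.mean()
    ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)
    ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
    df_between, df_within = k - 1, n - k
    f_stat = (ss_between / df_between) / (ss_within / df_within)
    return f_stat, df_between, df_within

# Hypothetical inclination-angle errors (degrees) for three placement methods.
freehand = np.array([8.0, 10.5, 7.2, 12.1, 9.4, 11.0])
jig      = np.array([5.1, 6.3, 4.8, 7.0, 5.9, 6.5])
iphone   = np.array([4.2, 5.0, 3.9, 5.6, 4.7, 5.3])
F, df1, df2 = one_way_anova(freehand, jig, iphone)
print(df1, df2)  # 2 15
```

The F value is then compared against the F distribution with (df1, df2) degrees of freedom to obtain the P value reported in such studies.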

  2. WASP (Write a Scientific Paper) using Excel - 6: Standard error and confidence interval.

    PubMed

    Grech, Victor

    2018-03-01

    The calculation of descriptive statistics includes the calculation of standard error and confidence interval, an inevitable component of data analysis in inferential statistics. This paper provides pointers as to how to do this in Microsoft Excel™. Copyright © 2018 Elsevier B.V. All rights reserved.
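    The quantities discussed in this record follow directly from their textbook formulas (SE = s/√n; 95% CI = mean ± t·SE). As a cross-check of what the paper implements with Excel's AVERAGE and STDEV.S, a minimal Python sketch with a hypothetical sample:

```python
import math

data = [4.5, 5.1, 3.9, 6.2, 5.0, 4.8, 5.5, 4.2, 5.9, 4.7]  # hypothetical sample
n = len(data)
mean = sum(data) / n
# Sample standard deviation (n - 1 denominator), as Excel's STDEV.S computes.
sd = math.sqrt(sum((x - mean) ** 2 for x in data) / (n - 1))
se = sd / math.sqrt(n)        # standard error of the mean
t_crit = 2.262                # two-tailed 95% t critical value for df = 9
ci = (mean - t_crit * se, mean + t_crit * se)
print(round(mean, 2), round(se, 2))
```

The t critical value is hard-coded for df = 9 here; in practice it would be looked up (or computed, e.g. with Excel's T.INV.2T) for the actual sample size.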

  3. Education Statistics Quarterly, Fall 2002.

    ERIC Educational Resources Information Center

    Dillow, Sally, Ed.

    2003-01-01

    This publication provides a comprehensive overview of work done across all parts of the National Center for Education Statistics (NCES). Each issue contains short publications, summaries, and descriptions that cover all NCES publications and data products released in a 3-month period. Each issue also contains a message from the NCES on a timely…

  4. BLS Machine-Readable Data and Tabulating Routines.

    ERIC Educational Resources Information Center

    DiFillipo, Tony

    This report describes the machine-readable data and tabulating routines that the Bureau of Labor Statistics (BLS) is prepared to distribute. An introduction discusses the LABSTAT (Labor Statistics) database and the BLS policy on release of unpublished data. Descriptions summarizing data stored in 25 files follow this format: overview, data…

  5. Education Statistics Quarterly, Fall 2001.

    ERIC Educational Resources Information Center

    Dillow, Sally, Ed.

    2001-01-01

    The publication gives a comprehensive overview of work done across all parts of the National Center for Education Statistics (NCES). Each issue contains short publications, summaries, and descriptions that cover all NCES publications, data products, and funding opportunities developed over a 3-month period. Each issue also contains a message from…

  6. Evaluating Independent Proportions for Statistical Difference, Equivalence, Indeterminacy, and Trivial Difference Using Inferential Confidence Intervals

    ERIC Educational Resources Information Center

    Tryon, Warren W.; Lewis, Charles

    2009-01-01

    Tryon presented a graphic inferential confidence interval (ICI) approach to analyzing two independent and dependent means for statistical difference, equivalence, replication, indeterminacy, and trivial difference. Tryon and Lewis corrected the reduction factor used to adjust descriptive confidence intervals (DCIs) to create ICIs and introduced…

  7. Examples of Data Analysis with SPSS-X.

    ERIC Educational Resources Information Center

    MacFarland, Thomas W.

    Intended for classroom use only, these unpublished notes contain computer lessons on descriptive statistics using SPSS-X Release 3.0 for VAX/UNIX. Statistical measures covered include Chi-square analysis; Spearman's rank correlation coefficient; Student's t-test with two independent samples; Student's t-test with a paired sample; One-way analysis…

  8. Statistics as Unbiased Estimators: Exploring the Teaching of Standard Deviation

    ERIC Educational Resources Information Center

    Wasserman, Nicholas H.; Casey, Stephanie; Champion, Joe; Huey, Maryann

    2017-01-01

    This manuscript presents findings from a study about the knowledge for and planned teaching of standard deviation. We investigate how understanding variance as an unbiased (inferential) estimator--not just a descriptive statistic for the variation (spread) in data--is related to teachers' instruction regarding standard deviation, particularly…

  9. Education Statistics Quarterly. Volume 5, Issue 1.

    ERIC Educational Resources Information Center

    Dillow, Sally, Ed.

    2003-01-01

    This publication provides a comprehensive overview of work done across all parts of the National Center for Education Statistics (NCES). Each issue contains short publications, summaries, and descriptions that cover all NCES publications, data product, and funding opportunities developed over a 3-month period. Each issue also contains a message…

  10. Education Statistics Quarterly, Winter 2001.

    ERIC Educational Resources Information Center

    Dillow, Sally, Ed.

    2002-01-01

    This publication provides a comprehensive overview of work done across all parts of the National Center for Education Statistics (NCES). Each issue contains short publications, summaries, and descriptions that cover all NCES publications and data products released in a 3-month period. Each issue also contains a message from the NCES on a timely…

  11. 76 FR 60817 - Notice of Proposed Information Collection Requests

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-30

    ... Statistics (NCES) is seeking a three-year clearance for a new survey data collection for the College... most recent data are available. The clearance being requested is to survey the institutions on this... and sector specific findings from the CATE using descriptive statistics. The main cost areas showing...

  12. Basic Statistical Concepts and Methods for Earth Scientists

    USGS Publications Warehouse

    Olea, Ricardo A.

    2008-01-01

    INTRODUCTION Statistics is the science of collecting, analyzing, interpreting, modeling, and displaying masses of numerical data primarily for the characterization and understanding of incompletely known systems. Over the years, these objectives have led to a fair amount of analytical work to achieve, substantiate, and guide descriptions and inferences.

  13. A weighted generalized score statistic for comparison of predictive values of diagnostic tests.

    PubMed

    Kosinski, Andrzej S

    2013-03-15

    Positive and negative predictive values are important measures of a medical diagnostic test performance. We consider testing equality of two positive or two negative predictive values within a paired design in which all patients receive two diagnostic tests. The existing statistical tests for testing equality of predictive values are either Wald tests based on the multinomial distribution or the empirical Wald and generalized score tests within the generalized estimating equations (GEE) framework. As presented in the literature, these test statistics have considerably complex formulas without clear intuitive insight. We propose re-formulations that are mathematically equivalent but algebraically simple and intuitive. As is clearly seen with the new re-formulation we present, the generalized score statistic does not always reduce to the commonly used score statistic in the independent samples case. To alleviate this, we introduce a weighted generalized score (WGS) test statistic that incorporates the empirical covariance matrix with newly proposed weights. This statistic is simple to compute, always reduces to the score statistic in the independent samples situation, and preserves type I error better than the other statistics, as demonstrated by simulations. Thus, we believe that the proposed WGS statistic is the preferred statistic for testing equality of two predictive values and for corresponding sample size computations. The new formulas for the Wald statistics may be useful for easy computation of confidence intervals for the difference of predictive values. The introduced concepts have the potential to lead to development of the WGS test statistic in a general GEE setting. Copyright © 2012 John Wiley & Sons, Ltd.

  14. A weighted generalized score statistic for comparison of predictive values of diagnostic tests

    PubMed Central

    Kosinski, Andrzej S.

    2013-01-01

    Positive and negative predictive values are important measures of a medical diagnostic test performance. We consider testing equality of two positive or two negative predictive values within a paired design in which all patients receive two diagnostic tests. The existing statistical tests for testing equality of predictive values are either Wald tests based on the multinomial distribution or the empirical Wald and generalized score tests within the generalized estimating equations (GEE) framework. As presented in the literature, these test statistics have considerably complex formulas without clear intuitive insight. We propose re-formulations which are mathematically equivalent but algebraically simple and intuitive. As is clearly seen with the new re-formulation we present, the generalized score statistic does not always reduce to the commonly used score statistic in the independent samples case. To alleviate this, we introduce a weighted generalized score (WGS) test statistic which incorporates the empirical covariance matrix with newly proposed weights. This statistic is simple to compute, it always reduces to the score statistic in the independent samples situation, and it preserves type I error better than the other statistics as demonstrated by simulations. Thus, we believe the proposed WGS statistic is the preferred statistic for testing equality of two predictive values and for corresponding sample size computations. The new formulas for the Wald statistics may be useful for easy computation of confidence intervals for the difference of predictive values. The introduced concepts have the potential to lead to development of the weighted generalized score test statistic in a general GEE setting. PMID:22912343

  15. North Carolina Migrant Education Program. 1971 Project Evaluation Reports, Vol. I.

    ERIC Educational Resources Information Center

    North Carolina State Dept. of Public Instruction, Raleigh.

    Evaluation reports for 10 of the 23 1971 Summer Migrant Projects in North Carolina are presented in Volume I of this compilation. Each report contains the following information: (1) descriptive statistics and results of student achievement; (2) description of the project as obtained from site team reports and other available information; and (3)…

  16. The Social Profile of Students in Basic General Education in Ecuador: A Data Analysis

    ERIC Educational Resources Information Center

    Buri, Olga Elizabeth Minchala; Stefos, Efstathios

    2017-01-01

    The objective of this study is to examine the social profile of students who are enrolled in Basic General Education in Ecuador. Both a descriptive and multidimensional statistical analysis was carried out based on the data provided by the National Survey of Employment, Unemployment and Underemployment in 2015. The descriptive analysis shows the…

  17. Policymakers Dependence on Evidence in Education Decision Making in Oyo State Ministry of Education

    ERIC Educational Resources Information Center

    Babalola, Joel B.; Gbolahan, Sowunmi

    2016-01-01

    This study investigated policymaker dependence on evidence in education decision making in Oyo State Ministry of Education. The study was conducted under a descriptive survey design, 44 out of the 290 policymakers of the Ministry and Board of Education across the State were purposively selected for the study. Descriptive statistics of frequency…

  18. Correlation and simple linear regression.

    PubMed

    Zou, Kelly H; Tuncali, Kemal; Silverman, Stuart G

    2003-06-01

    In this tutorial article, the concepts of correlation and regression are reviewed and demonstrated. The authors review and compare two correlation coefficients, the Pearson correlation coefficient and the Spearman rho, for measuring linear and nonlinear relationships between two continuous variables. In the case of measuring the linear relationship between a predictor and an outcome variable, simple linear regression analysis is conducted. These statistical concepts are illustrated by using a data set from published literature to assess a computed tomography-guided interventional technique. These statistical methods are important for exploring the relationships between variables and can be applied to many radiologic studies.
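    The two correlation coefficients and the least-squares line compared in this tutorial can each be written in a few lines. A minimal sketch with a hypothetical, exactly linear data set (so all four quantities are known in advance):

```python
import numpy as np

def pearson(x, y):
    """Pearson correlation: normalized covariance of x and y."""
    xd, yd = x - x.mean(), y - y.mean()
    return (xd * yd).sum() / np.sqrt((xd ** 2).sum() * (yd ** 2).sum())

def spearman(x, y):
    """Spearman's rho is the Pearson correlation of the ranks (no ties here)."""
    rank = lambda a: a.argsort().argsort().astype(float)
    return pearson(rank(x), rank(y))

def simple_ols(x, y):
    """Least-squares slope and intercept for simple linear regression."""
    slope = ((x - x.mean()) * (y - y.mean())).sum() / ((x - x.mean()) ** 2).sum()
    return slope, y.mean() - slope * x.mean()

# Hypothetical predictor/outcome pairs with an exact linear relationship.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = 2.0 * x + 1.0
slope, intercept = simple_ols(x, y)
print(pearson(x, y), spearman(x, y), slope, intercept)  # 1.0 1.0 2.0 1.0
```

With real data the two coefficients diverge: Spearman's rho stays at 1.0 for any monotonic relationship, while Pearson's r drops below 1.0 once the relationship departs from a straight line.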

  19. Gingival enlargement in different age groups during fixed Orthodontic treatment

    PubMed Central

    Eid, Hossam A; Assiri, Hassan Ahmed M; Kandyala, Reena; Togoo, Rafi A; Turakhia, Viral S

    2014-01-01

    Background: During fixed orthodontic therapy, adolescents tend to have higher chances of gingivitis and gingival enlargement (GE) compared to adults. A cross-sectional study was undertaken to evaluate this hypothesis by assessing GE in patients of different age groups receiving fixed orthodontic therapy. Materials & Methods: Patients undergoing orthodontic treatment were selected by simple random sampling from the outpatient clinic of the preventive dental sciences division, King Khalid University College of Dentistry, to form the study group. Participants were divided into three age groups, and GE was graded as 0, 1, or 2 as per the classification of the American Academy of Periodontology. Data were analyzed using IBM SPSS version 16.0 (Statistical Package for the Social Sciences, Chicago, IL, USA), and descriptive statistics were obtained. Differences in proportions were compared using the Chi-square test, and the significance level was set at p ≤ 0.05. Results: 62.3% (n=33) were males and 37.7% (n=20) were females. Group 1 had 21 patients (39.7%), Group 2 had 24 patients (45.3%), and Group 3 had 8 patients (15.1%). The highest frequency (48%) of GE was observed in the Group 1 age group (10-19 years). Differences in the frequency of GE according to age group were statistically significant (p=0.046). Differences in GE according to the frequency of practicing oral hygiene measures were statistically significant (p<0.001). Conclusion: The highest frequency of GE was observed among the adolescents. The patients who practiced oral hygiene measures more than three times daily did not have any GE. On the other hand, those who brushed and flossed only once daily had the highest percentage of grade 2 GE. How to cite the article: Eid HA, Assiri HA, Kandyala R, Togoo RA, Turakhia VS. Gingival enlargement in different age groups during fixed Orthodontic treatment. J Int Oral Health 2014;6(1):1-4. PMID:24653595
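    The Chi-square comparison of proportions used in this study can be sketched as the standard Pearson chi-square of observed versus expected counts. The contingency table below is hypothetical (the paper's raw counts are not reproduced here); the study itself used SPSS for this computation.

```python
import numpy as np

def chi_square_independence(table):
    """Pearson chi-square statistic and df for an r x c contingency table."""
    table = np.asarray(table, dtype=float)
    # Expected counts under independence: (row total * column total) / grand total.
    expected = np.outer(table.sum(axis=1), table.sum(axis=0)) / table.sum()
    stat = ((table - expected) ** 2 / expected).sum()
    dof = (table.shape[0] - 1) * (table.shape[1] - 1)
    return stat, dof

# Hypothetical counts: GE grade (rows 0/1/2) by age group (columns 1/2/3).
table = [[5, 10, 6],
         [10, 9, 2],
         [6, 5, 0]]
stat, dof = chi_square_independence(table)
print(dof)  # 4
```

The statistic is then compared against the chi-square distribution with the computed degrees of freedom to obtain p values such as the p=0.046 reported above.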

  20. Analysis of statistical misconception in terms of statistical reasoning

    NASA Astrophysics Data System (ADS)

    Maryati, I.; Priatna, N.

    2018-05-01

    Reasoning skill is needed by everyone in the globalization era, because every person has to be able to manage and use information from all over the world, which can be obtained easily. Statistical reasoning skill is the ability to collect, group, process, and interpret information and to draw conclusions from it. Developing this skill can be done through various levels of education. However, the skill is low because many people, students included, assume that statistics is just the ability to count and use formulas. Students still have a negative attitude toward courses related to research. The purpose of this research is to analyze students' misconceptions in a descriptive statistics course in relation to statistical reasoning skill. The observation was done by analyzing the results of a misconception test and a statistical reasoning skill test, and by observing the effect of students' misconceptions on statistical reasoning skill. The sample of this research was 32 students of the mathematics education department who had taken the descriptive statistics course. The mean value of the misconception test was 49.7 with a standard deviation of 10.6, whereas the mean value of the statistical reasoning skill test was 51.8 with a standard deviation of 8.5. If 65 is the minimum value for standard achievement of course competence, the students' mean values are lower than the standard. The results of the misconception study emphasize which sub-topics should be considered. Based on the assessment results, it was found that students' misconceptions occur in: 1) writing mathematical sentences and symbols correctly, 2) understanding basic definitions, and 3) determining which concept to use in solving a problem. In statistical reasoning skill, the assessment measured reasoning about: 1) data, 2) representation, 3) statistical format, 4) probability, 5) samples, and 6) association.

  1. Application of pedagogy reflective in statistical methods course and practicum statistical methods

    NASA Astrophysics Data System (ADS)

    Julie, Hongki

    2017-08-01

    The subjects Elementary Statistics, Statistical Methods, and Statistical Methods Practicum aim to equip students of Mathematics Education with descriptive and inferential statistics. Students' understanding of descriptive and inferential statistics is important in the Mathematics Education Department, especially for those whose final project involves quantitative research. In quantitative research, students are required to present and describe quantitative data in an appropriate manner, to draw conclusions from their quantitative data, and to establish relationships between the independent and dependent variables defined in their research. In fact, when students carried out final projects involving quantitative research, it was not rare to find them making mistakes in the steps of drawing conclusions and errors in choosing the hypothesis-testing procedure. As a result, they reached incorrect conclusions, which is a fatal mistake for those doing quantitative research. Several things were gained from the implementation of reflective pedagogy in the teaching-learning process of the Statistical Methods and Statistical Methods Practicum courses, namely: 1. Twenty-two students passed the course and one student did not. 2. The highest grade achieved was A, attained by 18 students. 3. According to all students, they could develop their critical stance and build a caring for each other through the learning process in this course. 4. All students agreed that, through the learning process they underwent in the course, they could build a caring for each other.

  2. Designing an Error Resolution Checklist for a Shared Manned-Unmanned Environment

    DTIC Science & Technology

    2010-06-01


  3. Reframing Serial Murder Within Empirical Research.

    PubMed

    Gurian, Elizabeth A

    2017-04-01

    Empirical research on serial murder is limited due to the lack of consensus on a definition, the continued use of primarily descriptive statistics, and linkage to popular culture depictions. These limitations also inhibit our understanding of these offenders and affect credibility in the field of research. Therefore, this comprehensive overview of a sample of 508 cases (738 total offenders, including partnered groups of two or more offenders) provides analyses of solo male, solo female, and partnered serial killers to elucidate statistical differences and similarities in offending and adjudication patterns among the three groups. This analysis of serial homicide offenders not only supports previous research on offending patterns present in the serial homicide literature but also reveals that empirically based analyses can enhance our understanding beyond traditional case studies and descriptive statistics. Further research based on these empirical analyses can aid in the development of more accurate classifications and definitions of serial murderers.

  4. Experimental observations of Lagrangian sand grain kinematics under bedload transport: statistical description of the step and rest regimes

    NASA Astrophysics Data System (ADS)

    Guala, M.; Liu, M.

    2017-12-01

    The kinematics of sediment particles is investigated by non-intrusive imaging methods to provide a statistical description of bedload transport in conditions near the threshold of motion. In particular, we focus on the cyclic transition between motion and rest regimes to quantify the waiting time statistics inferred to be responsible for anomalous diffusion, and so far elusive. Despite obvious limitations in the spatio-temporal domain of the observations, we are able to identify the probability distributions of the particle step time and length, velocity, acceleration, waiting time, and thus distinguish which quantities exhibit well converged mean values, based on the thickness of their respective tails. The experimental results shown here for four different transport conditions highlight the importance of the waiting time distribution and represent a benchmark dataset for the stochastic modeling of bedload transport.

  5. Transition from Poissonian to Gaussian-orthogonal-ensemble level statistics in a modified Artin's billiard

    NASA Astrophysics Data System (ADS)

    Csordás, A.; Graham, R.; Szépfalusy, P.; Vattay, G.

    1994-01-01

    One wall of an Artin's billiard on the Poincaré half-plane is replaced by a one-parameter (cp) family of nongeodetic walls. A brief description of the classical phase space of this system is given. In the quantum domain, the continuous and gradual transition from the Poisson-like to Gaussian-orthogonal-ensemble (GOE) level statistics due to the small perturbations breaking the symmetry responsible for the "arithmetic chaos" at cp=1 is studied. Another GOE-to-Poisson transition due to the mixed phase space for large perturbations is also investigated. A satisfactory description of the intermediate level statistics by the Brody distribution was found in both cases. The study supports the existence of a scaling region around cp=1. A finite-size scaling relation for the Brody parameter as a function of 1-cp and the number of levels considered can be established.

  6. Experiences in Space Science.

    ERIC Educational Resources Information Center

    National Aeronautics and Space Administration, Washington, DC. Educational Programs Div.

    This publication contains descriptions of space science activities that can be conducted with simple equipment. There are activities suitable for both elementary and secondary school children. Activities are placed under the headings: Astronomy, Atmosphere, Universal Gravitation, Aerodynamics, Guidance and Propulsion, Tracking and Communications,…

  7. On electro-hydrodynamic effects over liquids under influence of corona discharge

    NASA Astrophysics Data System (ADS)

    Bychkov, V. L.; Abakumov, V. I.; Bikmukhametova, A. R.; Chernikov, V. A.; Safronenkov, D. A.

    2018-03-01

    Electrohydrodynamic effects over liquids under a high-voltage electrode are considered in experiments with a corona discharge. A simple theory describing the appearance of a funnel over the liquid is presented. New types of electrohydrodynamic instabilities are revealed.

  8. The A-B-C of Desalting.

    ERIC Educational Resources Information Center

    Department of the Interior, Washington, DC. Office of Water Research and Technology.

    This publication provides a simple explanation of how various processes convert sea or brackish water to fresh water. Included are descriptions of the membrane processes (reverse osmosis, electrodialysis, transport depletion, and piezodialysis); the distillation processes (multistage flash distillation, vertical tube distillation, multieffect…

  9. Entropy Is Simple, Qualitatively.

    ERIC Educational Resources Information Center

    Lambert, Frank L.

    2002-01-01

    Suggests that qualitatively, entropy is simple. Entropy increase from a macro viewpoint is a measure of the dispersal of energy from localized to spread out at a temperature T. Fundamentally based on statistical and quantum mechanics, this approach is superior to the non-fundamental "disorder" as a descriptor of entropy change. (MM)

  10. α-induced reactions on 115In: Cross section measurements and statistical model analysis

    NASA Astrophysics Data System (ADS)

    Kiss, G. G.; Szücs, T.; Mohr, P.; Török, Zs.; Huszánk, R.; Gyürky, Gy.; Fülöp, Zs.

    2018-05-01

    Background: α-nucleus optical potentials are basic ingredients of statistical model calculations used in nucleosynthesis simulations. While the nucleon+nucleus optical potential is fairly well known, for the α+nucleus optical potential several different parameter sets exist, and large deviations, sometimes reaching even an order of magnitude, are found between the cross section predictions calculated using different parameter sets. Purpose: A measurement of the radiative α-capture and the α-induced reaction cross sections on the nucleus 115In at low energies allows a stringent test of statistical model predictions. Since experimental data are scarce in this mass region, this measurement can be an important input to test the global applicability of α+nucleus optical model potentials and further ingredients of the statistical model. Methods: The reaction cross sections were measured by means of the activation method. The produced activities were determined by off-line detection of the γ rays and characteristic x rays emitted during the electron-capture decay of the produced Sb isotopes. The 115In(α,γ)119Sb and 115In(α,n)118mSb reaction cross sections were measured between E_c.m. = 8.83 and 15.58 MeV, and the 115In(α,n)118gSb reaction was studied between E_c.m. = 11.10 and 15.58 MeV. The theoretical analysis was performed within the statistical model. Results: The simultaneous measurement of the (α,γ) and (α,n) cross sections allowed us to determine a best-fit combination of all parameters for the statistical model. The α+nucleus optical potential is identified as the most important input for the statistical model. The best fit is obtained for the new Atomki-V1 potential, and good reproduction of the experimental data is also achieved for the first version of the Demetriou potentials and the simple McFadden-Satchler potential. The nucleon optical potential, the γ-ray strength function, and the level-density parametrization are also constrained by the data, although there is no unique best-fit combination. Conclusions: The best-fit calculations allow us to extrapolate the low-energy (α,γ) cross section of 115In to the astrophysical Gamow window with reasonable uncertainties. However, still further improvements of the α-nucleus potential are required for a global description of elastic (α,α) scattering and α-induced reactions in a wide range of masses and energies.

  11. Preliminary study of the effect of the turbulent flow field around complex surfaces on their acoustic characteristics

    NASA Technical Reports Server (NTRS)

    Olsen, W. A.; Boldman, D.

    1978-01-01

    Fairly extensive measurements have been conducted of the turbulent flow around various surfaces as a basis for a study of the acoustic characteristics involved. In the experiments the flow from a nozzle was directed onto various two-dimensional surface configurations, such as the three-flap model. A description of the turbulent flow field is given and an estimate of the acoustic characteristics is provided. The developed equations are based upon fundamental theories for simple configurations having simple flows. Qualitative estimates are obtained regarding the radiation pattern and the velocity power law. The effects of geometry and turbulent flow distribution on the acoustic emission from simple configurations are discussed.

  12. Four simple ocean carbon models

    NASA Technical Reports Server (NTRS)

    Moore, Berrien, III

    1992-01-01

    This paper briefly reviews the key processes that determine oceanic CO2 uptake and sets this description within the context of four simple ocean carbon models. These models capture, in varying degrees, these key processes and establish a clear foundation for more realistic models that incorporate more directly the underlying physics and biology of the ocean rather than relying on simple parametric schemes. The purpose of this paper is more pedagogical than purely scientific. The problems encountered by current attempts to understand the global carbon cycle not only require our efforts but set a demand for a new generation of scientist, and it is hoped that this paper and the text in which it appears will help in this development.
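The "simple parametric schemes" this abstract refers to can be illustrated with a deliberately minimal two-box (surface/deep) carbon sketch. This is a generic illustration of that model class, not any of the paper's four models, and every parameter value below is made up:

```python
# Minimal two-box ocean carbon sketch: a surface box takes up carbon from
# the atmosphere and exchanges with a deep box via a linear mixing flux.
# All parameter values are illustrative, not from the paper.

def step(surface, deep, dt=0.1, k_ex=0.05, uptake=0.02):
    """Advance the two carbon inventories (arbitrary units) by one Euler step."""
    flux_down = k_ex * (surface - deep)   # mixing flux, surface -> deep
    surface += dt * (uptake - flux_down)  # atmospheric uptake minus export
    deep += dt * flux_down                # deep box gains what the surface exports
    return surface, deep

surface, deep = 1.0, 10.0
total0 = surface + deep
for _ in range(1000):
    surface, deep = step(surface, deep)

# The mixing flux conserves carbon; only the uptake term changes the total,
# by 1000 * dt * uptake = 2.0 units over the run.
```

The internal flux cancels between the two boxes, so total carbon grows only through the uptake term, which is the basic bookkeeping any box model of oceanic CO2 uptake must respect.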

  13. Stick or Switch: A Selection Heuristic Predicts when People Take the Perspective of Others or Communicate Egocentrically.

    PubMed

    Rogers, Shane L; Fay, Nicolas

    2016-01-01

    This paper examines a cognitive mechanism that drives perspective-taking and egocentrism in interpersonal communication. Using a conceptual referential communication task, in which participants describe a range of abstract geometric shapes, Experiment 1 shows that perspective-taking and egocentric communication are frequent communication strategies. Experiment 2 tests a selection heuristic account of perspective-taking and egocentric communication. It uses participants' shape description ratings to predict their communication strategy. Participants' communication strategy was predicted by how informative they perceived the different shape descriptions to be. When participants' personal shape description was perceived to be more informative than their addressee's shape description, there was a strong bias to communicate egocentrically. By contrast, when their addressee's shape description was perceived to be more informative, there was a strong bias to take their addressee's perspective. When the shape descriptions were perceived to be equally informative, there was a moderate bias to communicate egocentrically. This simple, but powerful, selection heuristic may be critical to the cumulative cultural evolution of human communication systems, and cumulative cultural evolution more generally.

  14. Iterative channel decoding of FEC-based multiple-description codes.

    PubMed

    Chang, Seok-Ho; Cosman, Pamela C; Milstein, Laurence B

    2012-03-01

    Multiple description coding has been receiving attention as a robust transmission framework for multimedia services. This paper studies the iterative decoding of FEC-based multiple description codes. The proposed decoding algorithms take advantage of the error detection capability of Reed-Solomon (RS) erasure codes. The information of correctly decoded RS codewords is exploited to enhance the error correction capability of the Viterbi algorithm at the next iteration of decoding. In the proposed algorithm, an intradescription interleaver is synergistically combined with the iterative decoder. The interleaver does not affect the performance of noniterative decoding but greatly enhances the performance when the system is iteratively decoded. We also address the optimal allocation of RS parity symbols for unequal error protection. For the optimal allocation in iterative decoding, we derive mathematical equations from which the probability distributions of description erasures can be generated in a simple way. The performance of the algorithm is evaluated over an orthogonal frequency-division multiplexing system. The results show that the performance of the multiple description codes is significantly enhanced.
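The erasure-recovery idea behind FEC-based multiple description coding can be shown with a deliberately minimal sketch. This uses a single XOR parity description rather than the Reed-Solomon codes and iterative Viterbi decoding of the paper; all names and data are illustrative:

```python
def make_descriptions(chunks):
    """Append one XOR parity chunk so any single erased chunk is recoverable."""
    parity = bytes(len(chunks[0]))
    for c in chunks:
        parity = bytes(a ^ b for a, b in zip(parity, c))
    return chunks + [parity]

def recover(descriptions):
    """Rebuild the single description marked as None by XOR-ing the rest."""
    missing = descriptions.index(None)
    acc = bytes(len(next(d for d in descriptions if d is not None)))
    for i, d in enumerate(descriptions):
        if i != missing:
            acc = bytes(a ^ b for a, b in zip(acc, d))
    descriptions[missing] = acc
    return descriptions[:-1]  # drop the parity chunk, keep the data chunks

data = [b"mult", b"iple", b"desc"]
descriptions = make_descriptions(data)
descriptions[1] = None          # simulate one description lost in transit
restored = recover(descriptions)
assert restored == data
```

Reed-Solomon erasure codes generalize this: k data symbols plus n-k parity symbols survive any n-k erasures, which is what makes the unequal-protection allocation studied in the paper possible.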

  15. [Methodology of the description of atmospheric air pollution by nitrogen dioxide by land use regression method in Ekaterinburg].

    PubMed

    Antropov, K M; Varaksin, A N

    2013-01-01

    This paper describes Land Use Regression (LUR) modeling and the results of its application in a study of nitrogen dioxide air pollution in Ekaterinburg. The paper describes the difficulties of modeling air pollution caused by motor vehicle exhaust, and the ways to address these challenges. To create a LUR model of NO2 air pollution in Ekaterinburg, concentrations of NO2 were measured, data on factors affecting air pollution were collected, and a statistical analysis of the data was performed. A statistical model of NO2 air pollution (coefficient of determination R2 = 0.70) and a map of pollution were created.
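At its core, a land use regression model is ordinary least squares on geographic predictors, with R2 reported as the goodness of fit. A minimal sketch with entirely made-up predictors and NO2 values (the paper's actual predictor set and data are not reproduced here):

```python
import numpy as np

# Hypothetical predictors for five monitoring sites: traffic intensity,
# road density, and distance to the city centre (all values invented).
X = np.array([
    [5200, 0.8, 1.2],
    [1200, 0.3, 5.0],
    [8700, 1.1, 0.4],
    [3000, 0.5, 3.1],
    [6400, 0.9, 0.9],
])
y = np.array([48.0, 21.0, 62.0, 30.0, 51.0])  # measured NO2, micrograms/m^3

A = np.column_stack([np.ones(len(y)), X])      # add an intercept column
coef, *_ = np.linalg.lstsq(A, y, rcond=None)   # ordinary least squares
pred = A @ coef
r2 = 1 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)
```

The fitted coefficients can then be applied to predictor values at unmonitored locations, which is how a LUR model turns a handful of measurement sites into a city-wide pollution map.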

  16. An overview of data acquisition, signal coding and data analysis techniques for MST radars

    NASA Technical Reports Server (NTRS)

    Rastogi, P. K.

    1986-01-01

    An overview is given of the data acquisition, signal processing, and data analysis techniques that are currently in use with high power MST/ST (mesosphere stratosphere troposphere/stratosphere troposphere) radars. This review supplements the works of Rastogi (1983) and Farley (1984) presented at previous MAP workshops. A general description is given of data acquisition and signal processing operations, and they are characterized on the basis of their disparate time scales. Signal coding, a brief description of frequently used codes, and their limitations are then discussed. Finally, several aspects of statistical data processing, such as signal statistics, power spectrum and autocovariance analysis, and outlier removal techniques, are discussed.
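Two of the statistical operations listed, power-spectrum and autocovariance analysis, can be sketched on a synthetic Doppler-like time series. The signal parameters below are illustrative, not radar data:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 256
t = np.arange(n)
# Synthetic series: a single Doppler-like tone at 0.1 cycles/sample plus noise.
x = np.cos(2 * np.pi * 0.1 * t) + 0.5 * rng.standard_normal(n)

x0 = x - x.mean()
power = np.abs(np.fft.rfft(x0)) ** 2 / n              # periodogram estimate
acov = np.correlate(x0, x0, mode="full")[n - 1:] / n  # biased autocovariance

peak_bin = int(np.argmax(power))
peak_freq = peak_bin / n  # in cycles per sample; should land near 0.1
```

The periodogram locates the dominant Doppler frequency, while the autocovariance at lag zero gives the signal power; these are the quantities an MST radar processor estimates before any geophysical interpretation.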

  17. The influence of socio cultural dynamics on convergence communication of aquaculture agribusiness actors

    NASA Astrophysics Data System (ADS)

    Oktavia, Y.

    2018-03-01

    This research aims to: (1) analyze the level of socio-cultural dynamics of aquaculture agribusiness actors; and (2) analyze the influence of socio-cultural dynamics on the convergence communication of capacity development among aquaculture agribusiness actors. Data were collected by questionnaire and by interviews with group members in agribusiness. Data were analyzed with descriptive and inferential statistics using the SEM method. The results of descriptive statistics on 284 agribusiness members showed that the socio-cultural dynamics of aquaculture agribusiness actors were in the low category, as shown by the weak role of customary institutions and the quality of local leadership. Communication convergence is significantly and positively influenced by the communication behavior of agribusiness actors in accessing information.

  18. Simple Statistics: - Summarized!

    ERIC Educational Resources Information Center

    Blai, Boris, Jr.

    Statistics is an essential tool for making sound decisions. It is concerned with probability distribution models, the testing of hypotheses, significance tests, and other means of determining the correctness of deductions and the most likely outcome of decisions. Measures of central tendency include the mean, median, and mode. A second…
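The measures of central tendency the summary lists can be computed directly with Python's standard library; the sample values are illustrative:

```python
from statistics import mean, median, mode

scores = [70, 85, 85, 90, 100]  # a small illustrative sample

print(mean(scores))    # arithmetic mean: 86
print(median(scores))  # middle value of the sorted sample: 85
print(mode(scores))    # most frequent value: 85
```

Mean, median, and mode agree for symmetric data but diverge for skewed samples, which is why all three are reported when summarizing a distribution.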

  19. Contrast Analysis: A Tutorial

    ERIC Educational Resources Information Center

    Haans, Antal

    2018-01-01

    Contrast analysis is a relatively simple but effective statistical method for testing theoretical predictions about differences between group means against the empirical data. Despite its advantages, contrast analysis is hardly used to date, perhaps because it is not implemented in a convenient manner in many statistical software packages. This…
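Contrast analysis reduces to a weighted combination of group means, with weights summing to zero, tested against the pooled error variance from a one-way ANOVA. A minimal sketch with invented data, where weights (1, -0.5, -0.5) test whether group A's mean exceeds the average of B and C:

```python
import math

# Hypothetical data for three groups (values invented for illustration).
groups = {
    "A": [8.0, 9.0, 7.5, 8.5],
    "B": [6.0, 6.5, 7.0, 5.5],
    "C": [6.2, 5.8, 6.6, 6.4],
}
weights = {"A": 1.0, "B": -0.5, "C": -0.5}
assert abs(sum(weights.values())) < 1e-12  # contrast weights must sum to zero

means = {g: sum(v) / len(v) for g, v in groups.items()}
L = sum(weights[g] * means[g] for g in groups)  # contrast estimate

# Pooled error variance (MSE), as in a one-way ANOVA.
df_error = sum(len(v) - 1 for v in groups.values())
sse = sum(sum((x - means[g]) ** 2 for x in v) for g, v in groups.items())
mse = sse / df_error

se = math.sqrt(mse * sum(weights[g] ** 2 / len(groups[g]) for g in groups))
t = L / se  # compare against a t distribution with df_error degrees of freedom
```

Because the weights encode the theoretical prediction directly, a single one-degree-of-freedom test replaces the omnibus F test, which is the advantage the tutorial argues for.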

  20. Superordinate Shape Classification Using Natural Shape Statistics

    ERIC Educational Resources Information Center

    Wilder, John; Feldman, Jacob; Singh, Manish

    2011-01-01

    This paper investigates the classification of shapes into broad natural categories such as "animal" or "leaf". We asked whether such coarse classifications can be achieved by a simple statistical classification of the shape skeleton. We surveyed databases of natural shapes, extracting shape skeletons and tabulating their…
