Science.gov

Sample records for easily interpretable statistics

  1. Using an algorithm to easily interpret basic cardiac rhythms.

    PubMed

    Atwood, Denise

    2005-11-01

    Many nurses struggle with identifying electrocardiogram (ECG) rhythms, but rapidly interpreting primary ECG rhythms is an essential skill that every nurse should master. This article provides an algorithm that nurses can use to easily interpret basic ECG rhythms.

  2. Equivalent statistics and data interpretation.

    PubMed

    Francis, Gregory

    2016-10-14

    Recent reform efforts in psychological science have led to a plethora of choices for scientists to analyze their data. A scientist making an inference about their data must now decide whether to report a p value, summarize the data with a standardized effect size and its confidence interval, report a Bayes Factor, or use other model comparison methods. To make good choices among these options, it is necessary for researchers to understand the characteristics of the various statistics used by the different analysis frameworks. Toward that end, this paper makes two contributions. First, it shows that for the case of a two-sample t test with known sample sizes, many different summary statistics are mathematically equivalent in the sense that they are based on the very same information in the data set. When the sample sizes are known, the p value provides as much information about a data set as the confidence interval of Cohen's d or a JZS Bayes factor. Second, this equivalence means that different analysis methods differ only in their interpretation of the empirical data. At first glance, it might seem that mathematical equivalence of the statistics suggests that it does not matter much which statistic is reported, but the opposite is true because the appropriateness of a reported statistic is relative to the inference it promotes. Accordingly, scientists should choose an analysis method appropriate for their scientific investigation. A direct comparison of the different inferential frameworks provides some guidance for scientists to make good choices and improve scientific practice.
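
    A minimal sketch of the equivalence described above, using standard textbook formulas rather than anything from the paper: once n1 and n2 are fixed, the two-sample t statistic alone determines the p value and Cohen's d (a JZS Bayes factor likewise follows from t and the sample sizes, via numerical integration).

      import numpy as np
      from scipy import stats

      def equivalent_summaries(t, n1, n2):
          """Recover other summaries from a two-sample t statistic with known n1, n2."""
          df = n1 + n2 - 2
          p = 2 * stats.t.sf(abs(t), df)    # two-sided p value
          d = t * np.sqrt(1 / n1 + 1 / n2)  # Cohen's d implied by t
          return p, d

      p, d = equivalent_summaries(t=2.5, n1=20, n2=20)  # p ~ 0.017, d ~ 0.79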

  3. Statistical interpretation of traveltime fluctuations

    NASA Astrophysics Data System (ADS)

    Roth, Michael

    1997-02-01

    A ray-theoretical relation between the autocorrelation functions of traveltime and slowness fluctuations is established for recording profiles with arbitrary angles to the propagation direction of a plane wave. From this relation it follows that the variance of traveltime fluctuations is independent of the profile orientation and proportional to the variance, ɛ², of slowness fluctuations, to the correlation distance, a, and to the propagation distance L. The halfwidth of the autocorrelation function of traveltime fluctuations is proportional to a and decreases with increasing profile angle. This relationship allows us to estimate the statistical parameters ɛ and a from observed traveltime fluctuations. Numerical experiments for spatially isotropic random media characterized by a Gaussian autocorrelation function show that the statistical parameters can be reproduced successfully if L/a ≤ 10. For larger L/a the correlation distance is overestimated and the standard deviation is underestimated. However, the results of the numerical experiments provide empirical factors to correct for these effects. The theory is applied to observed traveltime fluctuations of the Pg phase on a profile of the BABEL project. For the upper crust east of Øland (Sweden) slowness fluctuations with standard deviation ɛ = 2.2-5% and correlation distance a = 330-600 m are found.
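
    In standard weak-fluctuation (Chernov-type) theory for a Gaussian autocorrelation function, the proportionality stated above takes the explicit form below, with u₀ the mean slowness of the medium. Both the √π prefactor and the u₀² normalization are assumptions drawn from that general theory; the abstract itself states only the proportionality.

      \sigma_{\tau}^{2} = \sqrt{\pi}\, \epsilon^{2}\, u_{0}^{2}\, a\, L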

  4. Local statistical interpretation for water structure

    NASA Astrophysics Data System (ADS)

    Sun, Qiang

    2013-05-01

    In this Letter, Raman spectroscopy is employed to study supercooled water down to a temperature of 248 K at ambient pressure. Based on our interpretation of the Raman OH stretching band, decreasing temperature mainly leads to a structural transition from the single donor-single acceptor (DA) to the double donor-double acceptor (DDAA) hydrogen bonding motif. Additionally, a local statistical interpretation of the water structure is proposed, which reveals that a water molecule interacts with molecules in the first shell through various local hydrogen-bonded networks. From this, a local structure order parameter is proposed to explain the short-range order and long-range disorder.

  5. Interpreting statistics of small lunar craters

    NASA Technical Reports Server (NTRS)

    Schultz, P. H.; Gault, D.; Greeley, R.

    1977-01-01

    Some of the wide variations in the crater-size distributions in lunar photography and in the resulting statistics were interpreted as different degradation rates on different surfaces, different scaling laws in different targets, and a possible population of endogenic craters. These possibilities are reexamined for statistics of 26 different regions. In contrast to most other studies, crater diameters as small as 5 m were measured from enlarged Lunar Orbiter framelets. According to the results of the reported analysis, the different crater distribution types appear to be most consistent with the hypotheses of differential degradation and a superposed crater population. Differential degradation can account for the low level of equilibrium in incompetent materials such as ejecta deposits, mantle deposits, and deep regoliths where scaling law changes and catastrophic processes introduce contradictions with other observations.

  6. Analysis of variance is easily misapplied in the analysis of randomized trials: a critique and discussion of alternative statistical approaches.

    PubMed

    Vickers, Andrew J

    2005-01-01

    Analysis of variance (ANOVA) is a statistical method that is widely used in the psychosomatic literature to analyze the results of randomized trials, yet ANOVA does not provide an estimate for the difference between groups, the key variable of interest in a randomized trial. Although the use of ANOVA is frequently justified on the grounds that a trial incorporates more than two groups, the hypothesis tested by ANOVA for these trials--"Are all groups equivalent?"--is often scientifically uninteresting. Regression methods are not only applicable to trials with many groups, but can be designed to address specific questions arising from the study design. ANOVA is also frequently used for trials with repeated measures, but the consequent reporting of "group effects," "time effects," and "time-by-group interactions" is a distraction from statistics of clinical and scientific value. Given that ANOVA is easily misapplied in the analysis of randomized trials, alternative approaches such as regression methods should be considered in preference.
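
    A small illustration of the contrast drawn above (a sketch with invented data, not the paper's analysis): the regression coefficient for the treatment indicator directly estimates the between-group difference with a confidence interval, which an ANOVA F test alone does not report.

      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(0)
      df = pd.DataFrame({
          "group": np.repeat(["control", "treated"], 50),
          "outcome": np.concatenate([rng.normal(10, 3, 50), rng.normal(12, 3, 50)]),
      })

      fit = smf.ols("outcome ~ C(group)", data=df).fit()
      # The treatment coefficient is the estimated between-group difference,
      # reported with a 95% CI: the quantity of interest in a trial.
      print(fit.params["C(group)[T.treated]"], fit.conf_int().loc["C(group)[T.treated]"])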

  7. The Statistical Interpretation of Entropy: An Activity

    ERIC Educational Resources Information Center

    Timmberlake, Todd

    2010-01-01

    The second law of thermodynamics, which states that the entropy of an isolated macroscopic system can increase but will not decrease, is a cornerstone of modern physics. Ludwig Boltzmann argued that the second law arises from the motion of the atoms that compose the system. Boltzmann's statistical mechanics provides deep insight into the…

  8. Interpretation and use of statistics in nursing research.

    PubMed

    Giuliano, Karen K; Polanowicz, Michelle

    2008-01-01

    A working understanding of the major fundamentals of statistical analysis is required to incorporate the findings of empirical research into nursing practice. The primary focus of this article is to describe common statistical terms, present some common statistical tests, and explain the interpretation of results from inferential statistics in nursing research. An overview of major concepts in statistics, including the distinction between parametric and nonparametric statistics, different types of data, and the interpretation of statistical significance, is reviewed. Examples of some of the most common statistical techniques used in nursing research, such as the Student independent t test, analysis of variance, and regression, are also discussed. Nursing knowledge based on empirical research plays a fundamental role in the development of evidence-based nursing practice. The ability to interpret and use quantitative findings from nursing research is an essential skill for advanced practice nurses to ensure provision of the best care possible for our patients.

  9. The Statistical Interpretation of Entropy: An Activity

    NASA Astrophysics Data System (ADS)

    Timmberlake, Todd

    2010-11-01

    The second law of thermodynamics, which states that the entropy of an isolated macroscopic system can increase but will not decrease, is a cornerstone of modern physics. Ludwig Boltzmann argued that the second law arises from the motion of the atoms that compose the system. Boltzmann's statistical mechanics provides deep insight into the functioning of the second law and also provided evidence for the existence of atoms at a time when many scientists (like Ernst Mach and Wilhelm Ostwald) were skeptical.

  10. The Statistical Interpretation of Classical Thermodynamic Heating and Expansion Processes

    ERIC Educational Resources Information Center

    Cartier, Stephen F.

    2011-01-01

    A statistical model has been developed and applied to interpret thermodynamic processes typically presented from the macroscopic, classical perspective. Through this model, students learn and apply the concepts of statistical mechanics, quantum mechanics, and classical thermodynamics in the analysis of the (i) constant volume heating, (ii)…

  11. Use and interpretation of statistics in wildlife journals

    USGS Publications Warehouse

    Tacha, Thomas C.; Warde, William D.; Burnham, Kenneth P.

    1982-01-01

    Use and interpretation of statistics in wildlife journals are reviewed, and suggestions for improvement are offered. Populations from which inferences are to be drawn should be clearly defined, and conclusions should be limited to the range of the data analyzed. Authors should be careful to avoid improper methods of plotting data and should clearly define the use of estimates of variance, standard deviation, standard error, or confidence intervals. Biological and statistical significance are often confused by authors and readers. Statistical hypothesis testing is a tool, and not every question should be answered by hypothesis testing. Meeting assumptions of hypothesis tests is the responsibility of authors, and assumptions should be reviewed before a test is employed. The use of statistical tools should be considered carefully both before and after gathering data.

  12. Comparing survival curves using an easy to interpret statistic.

    PubMed

    Hess, Kenneth R

    2010-10-15

    Here, I describe a statistic for comparing two survival curves that has a clear and obvious meaning and has a long history in biostatistics. Suppose we are comparing survival times associated with two treatments A and B. The statistic operates in such a way that if it takes on the value 0.95, then the interpretation is that a randomly chosen patient treated with A has a 95% chance of surviving longer than a randomly chosen patient treated with B. This statistic was first described in the 1950s, and was generalized in the 1960s to work with right-censored survival times. It is a useful and convenient measure for assessing differences between survival curves. Software for computing the statistic is readily available on the Internet.
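
    For uncensored data the statistic has a one-line estimator: the fraction of all (A, B) patient pairs in which the A patient survives longer, with ties counted half. The sketch below assumes no censoring and is not Hess's software, which handles right-censored times.

      import numpy as np

      def p_a_outlives_b(times_a, times_b):
          """Estimate P(random A patient survives longer than random B patient)."""
          a = np.asarray(times_a, float)[:, None]
          b = np.asarray(times_b, float)[None, :]
          return float(((a > b) + 0.5 * (a == b)).mean())

      p_a_outlives_b([9, 12, 15, 20], [5, 8, 10, 11])  # -> 0.875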

  13. Statistical Interpretation of Natural and Technological Hazards in China

    NASA Astrophysics Data System (ADS)

    Borthwick, Alistair, Prof.; Ni, Jinren, Prof.

    2010-05-01

    China is prone to catastrophic natural hazards from floods, droughts, earthquakes, storms, cyclones, landslides, epidemics, extreme temperatures, forest fires, avalanches, and even tsunami. This paper will list statistics related to the six worst natural disasters in China over the past 100 or so years, ranked according to number of fatalities. The corresponding data for the six worst natural disasters in China over the past decade will also be considered. [The data are abstracted from the International Disaster Database, Centre for Research on the Epidemiology of Disasters (CRED), Université Catholique de Louvain, Brussels, Belgium, http://www.cred.be/ where a disaster is defined as occurring if one of the following criteria is fulfilled: 10 or more people reported killed; 100 or more people reported affected; a call for international assistance; or declaration of a state of emergency.] The statistics include the number of occurrences of each type of natural disaster, the number of deaths, the number of people affected, and the cost in billions of US dollars. Over the past hundred years, the largest disasters may be related to the overabundance or scarcity of water, and to earthquake damage. However, there has been a substantial relative reduction in fatalities due to water related disasters over the past decade, even though the overall numbers of people affected remain huge, as does the economic damage. This change is largely due to the efforts put in by China's water authorities to establish effective early warning systems, the construction of engineering countermeasures for flood protection, the implementation of water pricing and other measures for reducing excessive consumption during times of drought. It should be noted that the dreadful death toll due to the Sichuan Earthquake dominates recent data. Joint research has been undertaken between the Department of Environmental Engineering at Peking University and the Department of Engineering Science at Oxford

  14. Workplace Statistical Literacy for Teachers: Interpreting Box Plots

    ERIC Educational Resources Information Center

    Pierce, Robyn; Chick, Helen

    2013-01-01

    As a consequence of the increased use of data in workplace environments, there is a need to understand the demands that are placed on users to make sense of such data. In education, teachers are being increasingly expected to interpret and apply complex data about student and school performance, and, yet it is not clear that they always have the…

  15. Confounded Statistical Analyses Hinder Interpretation of the NELP Report

    ERIC Educational Resources Information Center

    Paris, Scott G.; Luo, Serena Wenshu

    2010-01-01

    The National Early Literacy Panel (2008) report identified early predictors of reading achievement as good targets for instruction, and many of those skills are related to decoding. In this article, the authors suggest that the developmental trajectories of rapidly developing skills pose problems for traditional statistical analyses. Rapidly…

  16. Interpretation of Statistical Significance Testing: A Matter of Perspective.

    ERIC Educational Resources Information Center

    McClure, John; Suen, Hoi K.

    1994-01-01

    This article compares three models that have been the foundation for approaches to the analysis of statistical significance in early childhood research--the Fisherian and the Neyman-Pearson models (both considered "classical" approaches), and the Bayesian model. The article concludes that all three models have a place in the analysis of research…

  17. Statistical characteristics of MST radar echoes and its interpretation

    NASA Technical Reports Server (NTRS)

    Woodman, Ronald F.

    1989-01-01

    Two concepts of fundamental importance are reviewed: the autocorrelation function and the frequency power spectrum. In addition, some turbulence concepts, the relationship between radar signals and atmospheric medium statistics, partial reflection, and the characteristics of noise and clutter interference are discussed.
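
    The two concepts are linked by the Wiener-Khinchin theorem: the frequency power spectrum is the Fourier transform of the autocorrelation function. A generic numerical illustration (not from the review):

      import numpy as np

      rng = np.random.default_rng(1)
      x = rng.normal(size=1024)                  # stand-in for a sampled radar signal

      spec = np.abs(np.fft.fft(x)) ** 2          # frequency power spectrum
      acf = np.real(np.fft.ifft(spec)) / len(x)  # circular autocorrelation (Wiener-Khinchin)
      acf /= acf[0]                              # normalize to unity at zero lag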

  18. Evaluating bifactor models: Calculating and interpreting statistical indices.

    PubMed

    Rodriguez, Anthony; Reise, Steven P; Haviland, Mark G

    2016-06-01

    Bifactor measurement models are increasingly being applied to personality and psychopathology measures (Reise, 2012). In this work, authors generally have emphasized model fit, and their typical conclusion is that a bifactor model provides a superior fit relative to alternative subordinate models. Often unexplored, however, are important statistical indices that can substantially improve the psychometric analysis of a measure. We provide a review of the particularly valuable statistical indices one can derive from bifactor models. They include omega reliability coefficients, factor determinacy, construct reliability, explained common variance, and percentage of uncontaminated correlations. We describe how these indices can be calculated and used to inform: (a) the quality of unit-weighted total and subscale score composites, as well as factor score estimates, and (b) the specification and quality of a measurement model in structural equation modeling.
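
    Two of the indices reviewed have simple closed forms in terms of standardized loadings. The sketch below uses the usual bifactor formulas as assumed from the broader literature, not transcribed from this article: omega hierarchical from summed loadings, and explained common variance (ECV) from squared loadings.

      import numpy as np

      def omega_h_and_ecv(general, specifics):
          """general: general-factor loadings, one per item.
          specifics: list of group-factor loading arrays (zeros where an item
          does not load). Standardized loadings assumed."""
          g = np.asarray(general, float)
          s = [np.asarray(x, float) for x in specifics]
          resid = 1 - g**2 - sum(x**2 for x in s)            # item residual variances
          total = g.sum()**2 + sum(x.sum()**2 for x in s) + resid.sum()
          omega_h = g.sum()**2 / total
          ecv = (g**2).sum() / ((g**2).sum() + sum((x**2).sum() for x in s))
          return omega_h, ecv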

  19. Variation in reaction norms: Statistical considerations and biological interpretation.

    PubMed

    Morrissey, Michael B; Liefting, Maartje

    2016-09-01

    Analysis of reaction norms, the functions by which the phenotype produced by a given genotype depends on the environment, is critical to studying many aspects of phenotypic evolution. Different techniques are available for quantifying different aspects of reaction norm variation. We examine what biological inferences can be drawn from some of the more readily applicable analyses for studying reaction norms. We adopt a strongly biologically motivated view, but draw on statistical theory to highlight strengths and drawbacks of different techniques. In particular, consideration of some formal statistical theory leads to revision of some recently, and forcefully, advocated opinions on reaction norm analysis. We clarify what simple analysis of the slope between mean phenotype in two environments can tell us about reaction norms, explore the conditions under which polynomial regression can provide robust inferences about reaction norm shape, and explore how different existing approaches may be used to draw inferences about variation in reaction norm shape. We show how mixed model-based approaches can provide more robust inferences than more commonly used multistep statistical approaches, and derive new metrics of the relative importance of variation in reaction norm intercepts, slopes, and curvatures.
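
    The mixed-model approach advocated here corresponds, in its simplest random-regression form, to fitting genotype-specific intercepts and slopes across the environment; a sketch with statsmodels (column names invented):

      import statsmodels.formula.api as smf

      # df columns (hypothetical): phenotype, env, genotype
      def random_regression(df):
          """Random intercepts and slopes over environment for each genotype."""
          model = smf.mixedlm("phenotype ~ env", data=df,
                              groups=df["genotype"], re_formula="~env")
          fit = model.fit()
          return fit.cov_re  # variance in reaction norm intercepts and slopes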

  20. Interpreting the flock algorithm from a statistical perspective.

    PubMed

    Anderson, Eric C; Barry, Patrick D

    2015-09-01

    We show that the algorithm in the program flock (Duchesne & Turgeon 2009) can be interpreted as an estimation procedure based on a model essentially identical to the structure (Pritchard et al. 2000) model with no admixture and without correlated allele frequency priors. Rather than using MCMC, the flock algorithm searches for the maximum a posteriori estimate of this structure model via a simulated annealing algorithm with a rapid cooling schedule (namely, the exponent on the objective function →∞). We demonstrate the similarities between the two programs in a two-step approach. First, to enable rapid batch processing of many simulated data sets, we modified the source code of structure to use the flock algorithm, producing the program flockture. With simulated data, we confirmed that results obtained with flock and flockture are very similar (though flockture is some 200 times faster). Second, we simulated multiple large data sets under varying levels of population differentiation for both microsatellite and SNP genotypes. We analysed them with flockture and structure and assessed each program on its ability to cluster individuals to their correct subpopulation. We show that flockture yields results similar to structure albeit with greater variability from run to run. flockture did perform better than structure when genotypes were composed of SNPs and differentiation was moderate (FST= 0.022-0.032). When differentiation was low, structure outperformed flockture for both marker types. On large data sets like those we simulated, it appears that flock's reliance on inference rules regarding its 'plateau record' is not helpful. Interpreting flock's algorithm as a special case of the model in structure should aid in understanding the program's output and behaviour.
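
    The "exponent on the objective function → ∞" can be made concrete in a few lines: sampling labels with probability proportional to the posterior raised to a power β reduces to ordinary posterior sampling at β = 1 and to greedy MAP assignment as β → ∞. A schematic sketch, not flock's or flockture's actual code:

      import numpy as np

      def sample_label(log_post, beta, rng):
          """Sample a cluster label with probability proportional to posterior**beta."""
          w = np.exp(beta * (log_post - log_post.max()))  # stabilized exponentiation
          return rng.choice(len(log_post), p=w / w.sum())

      rng = np.random.default_rng(0)
      log_post = np.log([0.2, 0.5, 0.3])
      # beta = 1: ordinary posterior sampling; large beta: effectively argmax (label 1).
      [sample_label(log_post, b, rng) for b in (1, 10, 100)]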

  21. A statistical model for interpreting computerized dynamic posturography data

    NASA Technical Reports Server (NTRS)

    Feiveson, Alan H.; Metter, E. Jeffrey; Paloski, William H.

    2002-01-01

    Computerized dynamic posturography (CDP) is widely used for assessment of altered balance control. CDP trials are quantified using the equilibrium score (ES), which ranges from zero to 100, as a decreasing function of peak sway angle. The problem of how best to model and analyze ESs from a controlled study is considered. The ES often exhibits a skewed distribution in repeated trials, which can lead to incorrect inference when applying standard regression or analysis of variance models. Furthermore, CDP trials are terminated when a patient loses balance. In these situations, the ES is not observable, but is assigned the lowest possible score--zero. As a result, the response variable has a mixed discrete-continuous distribution, further compromising inference obtained by standard statistical methods. Here, we develop alternative methodology for analyzing ESs under a stochastic model extending the ES to a continuous latent random variable that always exists, but is unobserved in the event of a fall. Loss of balance occurs conditionally, with probability depending on the realized latent ES. After fitting the model by a form of quasi-maximum-likelihood, one may perform statistical inference to assess the effects of explanatory variables. An example is provided, using data from the NIH/NIA Baltimore Longitudinal Study on Aging.
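
    One way to write the likelihood of such a mixed discrete-continuous model (a sketch consistent with the description above, not the authors' exact specification): with latent ES density f(e; θ) and fall probability π(e) given the latent score,

      \mathcal{L}(\theta) = \prod_{i:\ \text{fall}} \int \pi(e)\, f(e;\theta)\, de
                            \times \prod_{i:\ \text{no fall}} \bigl(1 - \pi(e_i)\bigr)\, f(e_i;\theta)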

  22. Report: New analytical and statistical approaches for interpreting the relationships among environmental stressors and biomarkers

    EPA Science Inventory

    The broad topic of biomarker research has an often-overlooked component: the documentation and interpretation of the surrounding chemical environment and other meta-data, especially from visualization, analytical, and statistical perspectives (Pleil et al. 2014; Sobus et al. 2011...

  23. Interpreting Statistical Significance Test Results: A Proposed New "What If" Method.

    ERIC Educational Resources Information Center

    Kieffer, Kevin M.; Thompson, Bruce

    As the 1994 publication manual of the American Psychological Association emphasized, "p" values are affected by sample size. As a result, it can be helpful to interpret the results of statistical significance tests in a sample size context by conducting so-called "what if" analyses. However, these methods can be inaccurate…

  24. New physicochemical interpretations for the adsorption of food dyes on chitosan films using statistical physics treatment.

    PubMed

    Dotto, G L; Pinto, L A A; Hachicha, M A; Knani, S

    2015-03-15

    In this work, statistical physics treatment was employed to study the adsorption of food dyes onto chitosan films, in order to obtain new physicochemical interpretations at the molecular level. Experimental equilibrium curves were obtained for the adsorption of four dyes (FD&C red 2, FD&C yellow 5, FD&C blue 2, Acid Red 51) at different temperatures (298, 313 and 328 K). A statistical physics formula was used to interpret these curves, and parameters such as the number of adsorbed dye molecules per site (n), the anchorage number (n'), the receptor site density (NM), the adsorbed quantity at saturation (Nasat), the steric hindrance (τ), the concentration at half saturation (c1/2) and the molar adsorption energy (ΔEa) were estimated. The relation of these parameters to the chemical structure of the dyes and to temperature was evaluated and interpreted.
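
    In this family of statistical-physics treatments, the monolayer isotherm is typically written in the Hill-type form below. The exact expression used in the paper may differ, so treat this as an assumed illustration of how n, NM, and c1/2 enter.

      def adsorbed_quantity(c, n, NM, c_half):
          """Statistical-physics monolayer isotherm: Na(c) = n*NM / (1 + (c_half/c)**n).
          Saturation value Nasat = n*NM; c_half is the concentration at half saturation."""
          return n * NM / (1 + (c_half / c) ** n)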

  25. On the Interpretation of Running Trends as Summary Statistics for Time Series Analysis

    NASA Astrophysics Data System (ADS)

    Vigo, Isabel M.; Trottini, Mario; Belda, Santiago

    2016-04-01

    In recent years, running trends analysis (RTA) has been widely used in climate applied research as summary statistics for time series analysis. There is no doubt that RTA might be a useful descriptive tool, but despite its general use in applied research, precisely what it reveals about the underlying time series is unclear and, as a result, its interpretation is unclear too. This work contributes to such interpretation in two ways: 1) an explicit formula is obtained for the set of time series with a given series of running trends, making it possible to show that running trends, alone, perform very poorly as summary statistics for time series analysis; and 2) an equivalence is established between RTA and the estimation of a (possibly nonlinear) trend component of the underlying time series using a weighted moving average filter. Such equivalence provides a solid ground for RTA implementation and interpretation/validation.
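
    Both points have a compact computational reading: each running trend is the OLS slope over a sliding window, and that slope is a fixed linear filter (a weighted moving average) applied to the series, which is exactly the equivalence in point 2. Sketch with invented window length and data:

      import numpy as np

      def running_trends(y, w):
          """OLS slope of y over each length-w sliding window."""
          t = np.arange(w)
          weights = (t - t.mean()) / ((t - t.mean()) ** 2).sum()  # slope as a linear filter
          return np.convolve(y, weights[::-1], mode="valid")

      y = np.cumsum(np.random.default_rng(2).normal(size=200))  # toy time series
      trends = running_trends(y, w=30)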

  26. Correlation-based interpretations of paleoclimate data - where statistics meet past climates

    NASA Astrophysics Data System (ADS)

    Hu, Jun; Emile-Geay, Julien; Partin, Judson

    2017-02-01

    Correlation analysis is omnipresent in paleoclimatology, and often serves to support the proposed climatic interpretation of a given proxy record. However, this analysis presents several statistical challenges, each of which is sufficient to nullify the interpretation: the loss of degrees of freedom due to serial correlation, the test multiplicity problem in connection with a climate field, and the presence of age uncertainties. While these issues have long been known to statisticians, they are not widely appreciated by the wider paleoclimate community; yet they can have a first-order impact on scientific conclusions. Here we use three examples from the recent paleoclimate literature to highlight how spurious correlations affect the published interpretations of paleoclimate proxies, and suggest that future studies should address these issues to strengthen their conclusions. In some cases, correlations that were previously claimed to be significant are found insignificant, thereby challenging published interpretations. In other cases, minor adjustments can be made to safeguard against these concerns. Because such problems arise so commonly with paleoclimate data, we provide open-source code to address them. Ultimately, we conclude that statistics alone cannot ground-truth a proxy, and recommend establishing a mechanistic understanding of a proxy signal as a sounder basis for interpretation.
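
    The first of the three issues, degrees of freedom lost to serial correlation, has a standard first-order remedy: deflate the sample size by the lag-1 autocorrelations before testing the correlation. The sketch below uses that textbook adjustment, not the open-source code released with the paper.

      import numpy as np
      from scipy import stats

      def lag1(x):
          x = np.asarray(x, float) - np.mean(x)
          return float(x[:-1] @ x[1:] / (x @ x))

      def corr_test_serial(x, y):
          """Pearson r with an effective sample size for serially correlated series."""
          r = float(np.corrcoef(x, y)[0, 1])
          n_eff = len(x) * (1 - lag1(x) * lag1(y)) / (1 + lag1(x) * lag1(y))
          t = r * np.sqrt((n_eff - 2) / (1 - r**2))
          return r, 2 * stats.t.sf(abs(t), n_eff - 2)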

  27. Alternative interpretations of statistics on health effects of low-level radiation

    SciTech Connect

    Hamilton, L.D.

    1983-11-01

    Four examples of the interpretation of statistics of data on low-level radiation are reviewed: (a) genetic effects of the atomic bombs at Hiroshima and Nagasaki, (b) cancer at Rocky Flats, (c) childhood leukemia and fallout in Utah, and (d) cancer among workers at the Portsmouth Naval Shipyard. Aggregation of data, adjustment for age, and other problems related to the determination of health effects of low-level radiation are discussed. Troublesome issues related to post hoc analysis are considered.

  28. Menzerath-Altmann Law: Statistical Mechanical Interpretation as Applied to a Linguistic Organization

    NASA Astrophysics Data System (ADS)

    Eroglu, Sertac

    2014-10-01

    The distribution behavior described by the empirical Menzerath-Altmann law is frequently encountered during the self-organization of linguistic and non-linguistic natural organizations at various structural levels. This study presents a statistical mechanical derivation of the law based on the analogy between the classical particles of a statistical mechanical organization and the distinct words of a textual organization. The derived model, a transformed (generalized) form of the Menzerath-Altmann model, was termed the statistical mechanical Menzerath-Altmann model. The derived model allows interpreting the model parameters in terms of physical concepts. We also propose that many organizations presenting Menzerath-Altmann law behavior, whether linguistic or not, can be methodically examined by the transformed distribution model through the properly defined structure-dependent parameter and the energy-associated states.
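
    For reference, the empirical Menzerath-Altmann relation between construct size x and mean constituent size y is conventionally written as below (the standard form from the general literature; the paper's transformed model generalizes it):

      y(x) = a\, x^{b}\, e^{-c x}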

  29. Two Easily Made Astronomical Telescopes.

    ERIC Educational Resources Information Center

    Hill, M.; Jacobs, D. J.

    1991-01-01

    The directions and diagrams for making a reflecting telescope and a refracting telescope are presented. These telescopes can be made by students out of plumbing parts and easily obtainable, inexpensive optical components. (KR)

  30. An Easily Constructed Dodecahedron Model.

    ERIC Educational Resources Information Center

    Yamana, Shukichi

    1984-01-01

    A model of a dodecahedron which is necessary for teaching stereochemistry (for example, that of dodecahedrane) can be made easily by using a sealed, empty envelope. The steps necessary for accomplishing this task are presented. (JN)

  31. An Easily Constructed Cube Model.

    ERIC Educational Resources Information Center

    Yamana, Shukichi; Kawaguchi, Makoto

    1984-01-01

    A model of a cube which is necessary for teaching stereochemistry (especially of inorganic compounds) can be made easily by using a sealed, empty envelope. The steps necessary to accomplish this task are presented. (JN)

  32. A t-statistic for objective interpretation of comparative genomic hybridization (CGH) profiles.

    PubMed

    Moore, D H; Pallavicini, M; Cher, M L; Gray, J W

    1997-07-01

    An objective method for interpreting comparative genomic hybridization (CGH) is described and compared with current methods of interpretation. The method is based on a two-sample t-statistic in which composite test:reference and reference:reference CGH profiles are compared at each point along the genome to detect regions of significant differences. Composite profiles are created by combining CGH profiles measured from several metaphase chromosomes for each type of chromosome in the normal human karyotype. Composites for both test:reference and reference:reference CGH analyses are used to generate mean CGH profiles and information about the variance therein. The utility of the method is demonstrated through analysis of aneusomies and partial gain and loss of DNA sequence in a myeloid leukemia specimen. Banding analyses of this specimen indicated inv (3)(q21q26), del (5)(q2?q35), -7, +8 and add (17)(p11.2). The t-statistic analyses of CGH data indicated rev ish enh (8) and rev ish dim (5q31.1q33.1,7q11.23qter). The undetected gain on 17p was small and confined to a single band (17p11.2). Thus, the t-statistic is an objective and effective method for defining significant differences between test and reference CGH profiles.
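
    The core computation, a two-sample t test at each genome position between composite test:reference and reference:reference profiles, maps onto a single vectorized call; an illustration only, without the paper's thresholding of significant regions:

      import numpy as np
      from scipy import stats

      def cgh_t_profile(test_profiles, ref_profiles):
          """Pointwise two-sample t statistics along the genome.
          Inputs are (n_profiles, n_positions) arrays of composite CGH ratio profiles."""
          t, p = stats.ttest_ind(test_profiles, ref_profiles, axis=0)
          return t, p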

  33. Application of machine learning and expert systems to Statistical Process Control (SPC) chart interpretation

    NASA Technical Reports Server (NTRS)

    Shewhart, Mark

    1991-01-01

    Statistical Process Control (SPC) charts are one of several tools used in quality control. Other tools include flow charts, histograms, cause and effect diagrams, check sheets, Pareto diagrams, graphs, and scatter diagrams. A control chart is simply a graph which indicates process variation over time. The purpose of drawing a control chart is to detect any changes in the process signalled by abnormal points or patterns on the graph. The Artificial Intelligence Support Center (AISC) of the Acquisition Logistics Division has developed a hybrid machine learning expert system prototype which automates the process of constructing and interpreting control charts.
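
    As an illustration of the simplest rule such a system automates, flagging points beyond the 3-sigma control limits takes a few lines; the expert-system pattern rules (runs, trends, cycles) build on the same idea. This sketch estimates sigma from the sample standard deviation rather than the moving ranges a production individuals chart would use, and it is not the AISC prototype.

      import numpy as np

      def control_chart_flags(x):
          """Flag points beyond Shewhart-style mean +/- 3 sigma control limits."""
          x = np.asarray(x, float)
          center, sigma = x.mean(), x.std(ddof=1)
          ucl, lcl = center + 3 * sigma, center - 3 * sigma
          return (x > ucl) | (x < lcl)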

  34. Misuse of statistics in the interpretation of data on low-level radiation

    SciTech Connect

    Hamilton, L.D.

    1982-01-01

    Four misuses of statistics in the interpretation of data of low-level radiation are reviewed: (1) post-hoc analysis and aggregation of data leading to faulty conclusions in the reanalysis of genetic effects of the atomic bomb, and premature conclusions on the Portsmouth Naval Shipyard data; (2) inappropriate adjustment for age and ignoring differences between urban and rural areas leading to potentially spurious increase in incidence of cancer at Rocky Flats; (3) hazard of summary statistics based on ill-conditioned individual rates leading to spurious association between childhood leukemia and fallout in Utah; and (4) the danger of prematurely published preliminary work with inadequate consideration of epidemiological problems - censored data - leading to inappropriate conclusions, needless alarm at the Portsmouth Naval Shipyard, and diversion of scarce research funds.

  35. The null hypothesis significance test in health sciences research (1995-2006): statistical analysis and interpretation

    PubMed Central

    2010-01-01

    Background: The null hypothesis significance test (NHST) is the most frequently used statistical method, although its inferential validity has been widely criticized since its introduction. In 1988, the International Committee of Medical Journal Editors (ICMJE) warned against sole reliance on NHST to substantiate study conclusions and suggested supplementary use of confidence intervals (CI). Our objective was to evaluate the extent and quality in the use of NHST and CI, both in English and Spanish language biomedical publications between 1995 and 2006, taking into account the International Committee of Medical Journal Editors recommendations, with particular focus on the accuracy of the interpretation of statistical significance and the validity of conclusions. Methods: Original articles published in three English and three Spanish biomedical journals in three fields (General Medicine, Clinical Specialties and Epidemiology - Public Health) were considered for this study. Papers published in 1995-1996, 2000-2001, and 2005-2006 were selected through a systematic sampling method. After excluding the purely descriptive and theoretical articles, analytic studies were evaluated for their use of NHST with P-values and/or CI for interpretation of statistical "significance" and "relevance" in study conclusions. Results: Among 1,043 original papers, 874 were selected for detailed review. The exclusive use of P-values was less frequent in English language publications as well as in Public Health journals; overall such use decreased from 41% in 1995-1996 to 21% in 2005-2006. While the use of CI increased over time, the "significance fallacy" (to equate statistical and substantive significance) appeared very often, mainly in journals devoted to clinical specialties (81%). In papers originally written in English and Spanish, 15% and 10%, respectively, mentioned statistical significance in their conclusions. Conclusions: Overall, results of our review show some improvements in

  36. Soil VisNIR chemometric performance statistics should be interpreted as random variables

    NASA Astrophysics Data System (ADS)

    Brown, David J.; Gasch, Caley K.; Poggio, Matteo; Morgan, Cristine L. S.

    2015-04-01

    Chemometric models are normally evaluated using performance statistics such as the Standard Error of Prediction (SEP) or the Root Mean Squared Error of Prediction (RMSEP). These statistics are used to evaluate the quality of chemometric models relative to other published work on a specific soil property or to compare the results from different processing and modeling techniques (e.g. Partial Least Squares Regression or PLSR and random forest algorithms). Claims are commonly made about the overall success of an application or the relative performance of different modeling approaches assuming that these performance statistics are fixed population parameters. While most researchers would acknowledge that small differences in performance statistics are not important, rarely are performance statistics treated as random variables. Given that we are usually comparing modeling approaches for general application, and given that the intent of VisNIR soil spectroscopy is to apply chemometric calibrations to larger populations than are included in our soil-spectral datasets, it is more appropriate to think of performance statistics as random variables with variation introduced through the selection of samples for inclusion in a given study and through the division of samples into calibration and validation sets (including spiking approaches). Here we look at the variation in VisNIR performance statistics for the following soil-spectra datasets: (1) a diverse US Soil Survey soil-spectral library with 3768 samples from all 50 states and 36 different countries; (2) 389 surface and subsoil samples taken from US Geological Survey continental transects; (3) the Texas Soil Spectral Library (TSSL) with 3000 samples; (4) intact soil core scans of Texas soils with 700 samples; (5) approximately 400 in situ scans from the Pacific Northwest region; and (6) miscellaneous local datasets. We find the variation in performance statistics to be surprisingly large. This has important
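
    The paper's proposal translates into a simple experiment: re-draw the calibration/validation split many times and report the spread of RMSEP rather than one number. A generic scikit-learn sketch (component count and split fraction invented):

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression
      from sklearn.model_selection import train_test_split

      def rmsep_distribution(X, y, n_splits=200, n_components=10):
          """RMSEP treated as a random variable over repeated random splits."""
          out = []
          for seed in range(n_splits):
              Xc, Xv, yc, yv = train_test_split(X, y, test_size=0.25, random_state=seed)
              model = PLSRegression(n_components=n_components).fit(Xc, yc)
              resid = np.asarray(yv) - model.predict(Xv).ravel()
              out.append(float(np.sqrt(np.mean(resid**2))))
          return np.array(out)  # inspect its spread, not just its mean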

  37. Interpretations

    NASA Astrophysics Data System (ADS)

    Bellac, Michel Le

    2014-11-01

    Although nobody can question the practical efficiency of quantum mechanics, there remains the serious question of its interpretation. As Valerio Scarani puts it, "We do not feel at ease with the indistinguishability principle (that is, the superposition principle) and some of its consequences." Indeed, this principle which pervades the quantum world is in stark contradiction with our everyday experience. From the very beginning of quantum mechanics, a number of physicists--but not the majority of them!--have asked the question of its "interpretation". One may simply deny that there is a problem: according to proponents of the minimalist interpretation, quantum mechanics is self-sufficient and needs no interpretation. The point of view held by a majority of physicists, that of the Copenhagen interpretation, will be examined in Section 10.1. The crux of the problem lies in the status of the state vector introduced in the preceding chapter to describe a quantum system, which is no more than a symbolic representation for the Copenhagen school of thought. Conversely, one may try to attribute some "external reality" to this state vector, that is, a correspondence between the mathematical description and the physical reality. In this latter case, it is the measurement problem which is brought to the fore. In 1932, von Neumann was first to propose a global approach, in an attempt to build a purely quantum theory of measurement examined in Section 10.2. This theory still underlies modern approaches, among them those grounded on decoherence theory, or on the macroscopic character of the measuring apparatus: see Section 10.3. Finally, there are non-standard interpretations such as Everett's many worlds theory or the hidden variables theory of de Broglie and Bohm (Section 10.4). Note, however, that this variety of interpretations has no bearing whatsoever on the practical use of quantum mechanics. There is no controversy on the way we should use quantum mechanics!

  38. On Improving the Quality and Interpretation of Environmental Assessments using Statistical Analysis and Geographic Information Systems

    NASA Astrophysics Data System (ADS)

    Karuppiah, R.; Faldi, A.; Laurenzi, I.; Usadi, A.; Venkatesh, A.

    2014-12-01

    An increasing number of studies are focused on assessing the environmental footprint of different products and processes, especially using life cycle assessment (LCA). This work shows how combining statistical methods and Geographic Information Systems (GIS) with environmental analyses can help improve the quality of results and their interpretation. Most environmental assessments in literature yield single numbers that characterize the environmental impact of a process/product - typically global or country averages, often unchanging in time. In this work, we show how statistical analysis and GIS can help address these limitations. For example, we demonstrate a method to separately quantify uncertainty and variability in the result of LCA models using a power generation case study. This is important for rigorous comparisons between the impacts of different processes. Another challenge is lack of data that can affect the rigor of LCAs. We have developed an approach to estimate environmental impacts of incompletely characterized processes using predictive statistical models. This method is applied to estimate unreported coal power plant emissions in several world regions. There is also a general lack of spatio-temporal characterization of the results in environmental analyses. For instance, studies that focus on water usage do not put in context where and when water is withdrawn. Through the use of hydrological modeling combined with GIS, we quantify water stress on a regional and seasonal basis to understand water supply and demand risks for multiple users. Another example where it is important to consider regional dependency of impacts is when characterizing how agricultural land occupation affects biodiversity in a region. We developed a data-driven methodology used in conjunction with GIS to determine if there is a statistically significant difference between the impacts of growing different crops on different species in various biomes of the world.

  39. Differences in paleomagnetic interpretations due to the choice of statistical, demagnetization and correction techniques: Kapuskasing Structural Zone, northern Ontario, Canada

    NASA Astrophysics Data System (ADS)

    Borradaile, Graham J.; Werner, Tomasz; Lagroix, France

    2003-02-01

    The Kapuskasing Structural Zone (KSZ) reveals a section through the Archean lower crustal granoblastic gneisses. Our new paleomagnetic data largely agree with previous work but we show that interpretations vary according to the choices of statistical, demagnetization and field-correction techniques. First, where the orientation distribution of characteristic remanence directions on the sphere is not symmetrically circular, the commonly used statistical model is invalid [Fisher, R.A., Proc. R. Soc. A217 (1953) 295]. Any tendency to form an elliptical distribution indicates that the sample is drawn from a Bingham-type population [Bingham, C., 1964. Distributions on the sphere and on the projective plane. PhD thesis, Yale University]. Fisher and Bingham statistics produce different confidence estimates from the same data and the traditionally defined mean vector may differ from the maximum eigenvector of an orthorhombic Bingham distribution. It seems prudent to apply both models wherever a non-Fisher population is suspected and that may be appropriate in any tectonized rocks. Non-Fisher populations require larger sample sizes so that focussing on individual sites may not be the most effective policy in tectonized rocks. More dispersed sampling across tectonic structures may be more productive. Second, from the same specimens, mean vectors isolated by thermal and alternating field (AF) demagnetization differ. Which treatment gives more meaningful results is difficult to decipher, especially in metamorphic rocks where the history of the magnetic minerals is not easily related to the ages of tectonic and petrological events. In this study, thermal demagnetization gave lower inclinations for paleomagnetic vectors and thus more distant paleopoles. Third, of more parochial significance, tilt corrections may be unnecessary in the KSZ because magnetic fabrics and thrust ramp are constant in orientation to the depth at which they level off, at approximately 15-km depth. With

  40. Statistical and population genetics issues of two Hungarian datasets from the aspect of DNA evidence interpretation.

    PubMed

    Szabolcsi, Zoltán; Farkas, Zsuzsa; Borbély, Andrea; Bárány, Gusztáv; Varga, Dániel; Heinrich, Attila; Völgyi, Antónia; Pamjav, Horolma

    2015-11-01

    When the DNA profile from a crime-scene matches that of a suspect, the weight of DNA evidence depends on the unbiased estimation of the match probability of the profiles. For this reason, it is required to establish and expand the databases that reflect the actual allele frequencies in the population applied. 21,473 complete DNA profiles from Databank samples were used to establish the allele frequency database to represent the population of Hungarian suspects. We used fifteen STR loci (PowerPlex ESI16) including five, new ESS loci. The aim was to calculate the statistical, forensic efficiency parameters for the Databank samples and compare the newly detected data to the earlier report. The population substructure caused by relatedness may influence the frequency of profiles estimated. As our Databank profiles were considered non-random samples, possible relationships between the suspects can be assumed. Therefore, population inbreeding effect was estimated using the FIS calculation. The overall inbreeding parameter was found to be 0.0106. Furthermore, we tested the impact of the two allele frequency datasets on 101 randomly chosen STR profiles, including full and partial profiles. The 95% confidence interval estimates for the profile frequencies (pM) resulted in a tighter range when we used the new dataset compared to the previously published ones. We found that the FIS had less effect on frequency values in the 21,473 samples than the application of minimum allele frequency. No genetic substructure was detected by STRUCTURE analysis. Due to the low level of inbreeding effect and the high number of samples, the new dataset provides unbiased and precise estimates of LR for statistical interpretation of forensic casework and allows us to use lower allele frequencies.
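
    How the inbreeding parameter and a minimum allele frequency enter a match-probability calculation can be sketched with the standard theta-corrected genotype frequencies (an assumed, NRC-II-style illustration; the paper's exact procedure may differ):

      def genotype_freq(p, q=None, F=0.0106, min_af=0.001):
          """F-corrected genotype frequency at one locus.
          Homozygote (q is None): p**2 + p*(1-p)*F; heterozygote: 2*p*q*(1-F)."""
          p = max(p, min_af)
          if q is None:
              return p**2 + p * (1 - p) * F
          return 2 * p * max(q, min_af) * (1 - F)

      def profile_frequency(loci):
          """Product across loci; loci is a list of (p,) or (p, q) tuples."""
          pm = 1.0
          for alleles in loci:
              pm *= genotype_freq(*alleles)
          return pm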

  41. Interpretation of seasonal water quality variation in the Yeongsan Reservoir, Korea using multivariate statistical analyses.

    PubMed

    Cho, Kyung Hwa; Park, Yongeun; Kang, Joo-Hyon; Ki, Seo Jin; Cha, Sungmin; Lee, Seung Won; Kim, Joon Ha

    2009-01-01

    The Yeongsan (YS) Reservoir is an estuarine reservoir which provides surrounding areas with public goods, such as water supply for agricultural and industrial areas and flood control. Beneficial uses of the YS Reservoir, however, are recently threatened by enriched non-point and point source inputs. A series of multivariate statistical approaches including principal component analysis (PCA) were applied to extract significant characteristics contained in a large suite of water quality data (18 variables recorded monthly for 5 years), thereby providing important phenomenological information for establishing effective water resource management plans for the YS Reservoir. The PCA results identified the most important five principal components (PCs), explaining 71% of the total variance of the original data set. The five PCs were interpreted as hydro-meteorological effect, nitrogen loading, phosphorus loading, primary production of phytoplankton, and fecal indicator bacteria (FIB) loading. Furthermore, hydro-meteorological effect and nitrogen loading could be characterized by a yearly periodicity, whereas FIB loading showed an increasing trend with respect to time. The study results presented here might be useful to establish preliminary strategies for abating water quality degradation in the YS Reservoir.
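
    A generic sketch of the PCA step (scikit-learn, with the data matrix shape taken from the description above): standardize the variables, then read off the variance explained by the leading components, five of which account for about 71% here.

      from sklearn.decomposition import PCA
      from sklearn.preprocessing import StandardScaler

      def leading_components(X, n_components=5):
          """X: (n_months, 18) water quality matrix; returns variance ratios and loadings."""
          Z = StandardScaler().fit_transform(X)
          pca = PCA(n_components=n_components).fit(Z)
          return pca.explained_variance_ratio_, pca.components_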

  42. Statistics in brief: the importance of sample size in the planning and interpretation of medical research.

    PubMed

    Biau, David Jean; Kernéis, Solen; Porcher, Raphaël

    2008-09-01

    The increasing volume of research by the medical community often leads to increasing numbers of contradictory findings and conclusions. Although the differences observed may represent true differences, the results also may differ because of sampling variability as all studies are performed on a limited number of specimens or patients. When planning a study reporting differences among groups of patients or describing some variable in a single group, sample size should be considered because it allows the researcher to control for the risk of reporting a false-negative finding (Type II error) or to estimate the precision his or her experiment will yield. Equally important, readers of medical journals should understand sample size because such understanding is essential to interpret the relevance of a finding with regard to their own patients. At the time of planning, the investigator must establish (1) a justifiable level of statistical significance, (2) the chances of detecting a difference of given magnitude between the groups compared, i.e., the power, (3) this targeted difference (i.e., effect size), and (4) the variability of the data (for quantitative data). We believe correct planning of experiments is an ethical issue of concern to the entire community.
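
    The four planning quantities listed are exactly the inputs of a routine power calculation; fixing any three determines the fourth. A sketch with statsmodels (numbers invented):

      from statsmodels.stats.power import TTestIndPower

      # Significance level, power, and targeted effect size (Cohen's d) fixed;
      # solve for the required sample size per group (~64 here).
      n_per_group = TTestIndPower().solve_power(effect_size=0.5, alpha=0.05, power=0.8)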

  43. Evaluation of uncertainty of the comprehensive interpretation of borehole logs by the multiple re-run statistical method.

    PubMed

    Woźnicka, U; Jarzyna, J; Krynicka, E

    2005-05-01

    Measurements of various physical quantities in a borehole by geophysical well logging tools are designed to determine these quantities for underground geological formations. Then, the raw data (logs) are combined in a comprehensive interpretation to obtain values of geological parameters. Estimating the uncertainty of calculated geological parameters, interpreted in such a way, is difficult, often impossible, when classical statistical methods are used. The method presented here permits an estimate of the uncertainty of a quantity to be obtained. The discussion of the dependence between the uncertainty of nuclear and acoustic tool responses, and the estimated uncertainty of the interpreted geological parameters (among others: porosity, water saturation, clay content) is presented.

  44. Uses and Misuses of Student Evaluations of Teaching: The Interpretation of Differences in Teaching Evaluation Means Irrespective of Statistical Information

    ERIC Educational Resources Information Center

    Boysen, Guy A.

    2015-01-01

    Student evaluations of teaching are among the most accepted and important indicators of college teachers' performance. However, faculty and administrators can overinterpret small variations in mean teaching evaluations. The current research examined the effect of including statistical information on the interpretation of teaching evaluations.…

  45. Interpreting the Results of Diagnostic Testing: Some Statistics for Testing in Real Time. Methodology Project.

    ERIC Educational Resources Information Center

    McArthur, David; Chou, Chih-Ping

    Diagnostic testing confronts several challenges at once, among which are issues of test interpretation and immediate modification of the test itself in response to the interpretation. Several methods are available for administering and evaluating a test in real-time, towards optimizing the examiner's chances of isolating a persistent pattern of…

  46. Dose impact in radiographic lung injury following lung SBRT: Statistical analysis and geometric interpretation

    SciTech Connect

    Yu, Victoria; Kishan, Amar U.; Cao, Minsong; Low, Daniel; Lee, Percy; Ruan, Dan

    2014-03-15

    Purpose: To demonstrate a new method of evaluating dose response of treatment-induced lung radiographic injury post-SBRT (stereotactic body radiotherapy) treatment and the discovery of bimodal dose behavior within clinically identified injury volumes. Methods: Follow-up CT scans at 3, 6, and 12 months were acquired from 24 patients treated with SBRT for stage-1 primary lung cancers or oligometastatic lesions. Injury regions in these scans were propagated to the planning CT coordinates by performing deformable registration of the follow-ups to the planning CTs. A bimodal behavior was repeatedly observed from the probability distribution for dose values within the deformed injury regions. Based on a mixture-Gaussian assumption, an Expectation-Maximization (EM) algorithm was used to obtain characteristic parameters for such distribution. Geometric analysis was performed to interpret such parameters and infer the critical dose level that is potentially inductive of post-SBRT lung injury. Results: The Gaussian mixture obtained from the EM algorithm closely approximates the empirical dose histogram within the injury volume with good consistency. The average Kullback-Leibler divergence values between the empirical differential dose volume histogram and the EM-obtained Gaussian mixture distribution were calculated to be 0.069, 0.063, and 0.092 for the 3, 6, and 12 month follow-up groups, respectively. The lower Gaussian component was located at approximately 70% prescription dose (35 Gy) for all three follow-up time points. The higher Gaussian component, contributed by the dose received by planning target volume, was located at around 107% of the prescription dose. Geometrical analysis suggests the mean of the lower Gaussian component, located at 35 Gy, as a possible indicator for a critical dose that induces lung injury after SBRT. Conclusions: An innovative and improved method for analyzing the correspondence between lung radiographic injury and SBRT treatment dose has
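
    The mixture-fitting step described above corresponds to a standard two-component Gaussian mixture EM fit; a sketch using scikit-learn in place of the authors' implementation:

      import numpy as np
      from sklearn.mixture import GaussianMixture

      def dose_modes(dose_values):
          """Fit a 2-component Gaussian mixture to voxel doses in an injury region."""
          gm = GaussianMixture(n_components=2, random_state=0)
          gm.fit(np.asarray(dose_values, float).reshape(-1, 1))
          return gm.means_.ravel(), np.sqrt(gm.covariances_.ravel()), gm.weights_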

  7. "What If" Analyses: Ways to Interpret Statistical Significance Test Results Using EXCEL or "R"

    ERIC Educational Resources Information Center

    Ozturk, Elif

    2012-01-01

    The present paper aims to review two motivations to conduct "what if" analyses using Excel and "R" to understand the statistical significance tests through the sample size context. "What if" analyses can be used to teach students what statistical significance tests really do and in applied research either prospectively to estimate what sample size…

  48. How to interpret the results of medical time series data analysis: Classical statistical approaches versus dynamic Bayesian network modeling

    PubMed Central

    Onisko, Agnieszka; Druzdzel, Marek J.; Austin, R. Marshall

    2016-01-01

    Background: Classical statistics is a well-established approach in the analysis of medical data. While the medical community seems to be familiar with the concept of a statistical analysis and its interpretation, the Bayesian approach, argued by many of its proponents to be superior to the classical frequentist approach, is still not well-recognized in the analysis of medical data. Aim: The goal of this study is to encourage data analysts to use the Bayesian approach, such as modeling with graphical probabilistic networks, as an insightful alternative to classical statistical analysis of medical data. Materials and Methods: This paper offers a comparison of two approaches to analysis of medical time series data: (1) classical statistical approach, such as the Kaplan–Meier estimator and the Cox proportional hazards regression model, and (2) dynamic Bayesian network modeling. Our comparison is based on time series cervical cancer screening data collected at Magee-Womens Hospital, University of Pittsburgh Medical Center over 10 years. Results: The main outcomes of our comparison are cervical cancer risk assessments produced by the three approaches. However, our analysis discusses also several aspects of the comparison, such as modeling assumptions, model building, dealing with incomplete data, individualized risk assessment, results interpretation, and model validation. Conclusion: Our study shows that the Bayesian approach is (1) much more flexible in terms of modeling effort, and (2) it offers an individualized risk assessment, which is more cumbersome for classical statistical approaches. PMID:28163973
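
    The classical side of the comparison is compactly expressed with the lifelines package; an illustration with invented column names, not the study's screening data:

      from lifelines import KaplanMeierFitter, CoxPHFitter

      # df columns (hypothetical): time, event, age
      def classical_analysis(df):
          km = KaplanMeierFitter().fit(df["time"], event_observed=df["event"])
          cox = CoxPHFitter().fit(df[["time", "event", "age"]],
                                  duration_col="time", event_col="event")
          return km.survival_function_, cox.summary  # hazard ratios drive interpretation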

  49. Statistics for the time-dependent failure of Kevlar-49/epoxy composites: micromechanical modeling and data interpretation

    SciTech Connect

    Phoenix, S.L.; Wu, E.M.

    1983-03-01

    This paper presents some new data on the strength and stress-rupture of Kevlar-49 fibers, fiber/epoxy strands and pressure vessels, and consolidated data obtained at LLNL over the past 10 years. These data are interpreted using recent theoretical results from a micromechanical model of the statistical failure process, thereby clarifying the roles of the epoxy matrix and ultraviolet radiation in long-term lifetime.

  50. Boyle temperature as a point of ideal gas in Gentile statistics and its economic interpretation

    NASA Astrophysics Data System (ADS)

    Maslov, V. P.; Maslova, T. V.

    2014-07-01

    Boyle temperature is interpreted as the temperature at which the formation of dimers becomes impossible. To Irving Fisher's correspondence principle we assign two more quantities: the number of degrees of freedom, and credit. We determine the danger level of the mass of money M when the mutual trust between economic agents begins to fall.

  11. Statistical approaches for enhancing causal interpretation of the M to Y relation in mediation analysis.

    PubMed

    MacKinnon, David P; Pirlott, Angela G

    2015-02-01

    Statistical mediation methods provide valuable information about underlying mediating psychological processes, but the ability to infer that the mediator variable causes the outcome variable is more complex than widely known. Researchers have recently emphasized how violating assumptions about confounder bias severely limits causal inference of the mediator to dependent variable relation. Our article describes and addresses these limitations by drawing on new statistical developments in causal mediation analysis. We first review the assumptions underlying causal inference and discuss three ways to examine the effects of confounder bias when assumptions are violated. We then describe four approaches to address the influence of confounding variables and enhance causal inference, including comprehensive structural equation models, instrumental variable methods, principal stratification, and inverse probability weighting. Our goal is to further the adoption of statistical methods to enhance causal inference in mediation studies.

  12. Interpreting the γ Statistic in Phylogenetic Diversification Rate Studies: A Rate Decrease Does Not Necessarily Indicate an Early Burst

    PubMed Central

    Fordyce, James A.

    2010-01-01

    Background Phylogenetic hypotheses are increasingly being used to elucidate historical patterns of diversification rate-variation. Hypothesis testing is often conducted by comparing the observed vector of branching times to a null, pure-birth expectation. A popular method for inferring a decrease in speciation rate, which might suggest an early burst of diversification followed by a decrease in diversification rate, is the γ statistic. Methodology Using simulations under varying conditions, I examine the sensitivity of γ to the distribution of the most recent branching times. Using an exploratory data analysis tool for lineages-through-time plots, tree deviation, I identified trees with a significant γ statistic that do not appear to have the characteristic early accumulation of lineages consistent with an early, rapid rate of cladogenesis. I further investigated the sensitivity of the γ statistic to recent diversification by examining the consequences of failing to simulate the full time interval following the most recent cladogenic event. The power of γ to detect rate decrease at varying times was assessed for simulated trees with an initial high rate of diversification followed by a relatively low rate. Conclusions The γ statistic is extraordinarily sensitive to recent diversification rates, and does not necessarily detect early bursts of diversification. This was true for trees of various sizes and completeness of taxon sampling. The γ statistic had greater power to detect recent diversification rate decreases compared to early bursts of diversification. Caution should be exercised when interpreting the γ statistic as an indication of early, rapid diversification. PMID:20668707
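
    For readers who want to reproduce the statistic under discussion, a small sketch of the Pybus & Harvey (2000) γ computed from internode intervals follows. The toy intervals are invented, and this is the textbook formula rather than the paper's simulation machinery.

        import numpy as np

        def gamma_statistic(g):
            """g: internode intervals g_2..g_n of an ultrametric tree, where g_k is
            the time during which exactly k lineages exist (length n-1 for n tips)."""
            n = len(g) + 1
            k = np.arange(2, n + 1)
            T_j = np.cumsum(k * g)             # T_j = sum_{k=2}^{j} k * g_k
            T = T_j[-1]                        # total lineage-time of the tree
            mean_T = T_j[:-1].sum() / (n - 2)  # average of T_2 .. T_{n-1}
            return (mean_T - T / 2.0) / (T * np.sqrt(1.0 / (12.0 * (n - 2))))

        # under a pure-birth (Yule) process gamma is approximately N(0,1); strongly
        # negative values are commonly read as an early burst -- the reading the
        # paper cautions against
        print(gamma_statistic(np.array([0.5, 0.4, 0.3, 0.25, 0.2, 0.15, 0.1])))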

  13. US Geological Survey nutrient preservation experiment : experimental design, statistical analysis, and interpretation of analytical results

    USGS Publications Warehouse

    Patton, Charles J.; Gilroy, Edward J.

    1999-01-01

    Data on which this report is based, including nutrient concentrations in synthetic reference samples determined concurrently with those in real samples, are extensive (greater than 20,000 determinations) and have been published separately. In addition to confirming the well-documented instability of nitrite in acidified samples, this study also demonstrates that when biota are removed from samples at collection sites by 0.45-micrometer membrane filtration, subsequent preservation with sulfuric acid or mercury (II) provides no statistically significant improvement in nutrient concentration stability during storage at 4 degrees Celsius for 30 days. Biocide preservation had no statistically significant effect on the 30-day stability of phosphorus concentrations in whole-water splits from any of the 15 stations, but did stabilize Kjeldahl nitrogen concentrations in whole-water splits from three data-collection stations where ammonium accounted for at least half of the measured Kjeldahl nitrogen.

  14. Differences in paleomagnetic interpretations due to the choice of statistical, demagnetization and correction techniques: Kapuskasing Structural Zone, N.Ontario, Canada

    NASA Astrophysics Data System (ADS)

    Borradaile, G. J.; Werner, T.; Lagroix, F.

    2003-04-01

    The Kapuskasing Structural Zone (KSZ) reveals a section through Archean lower-crustal, granoblastic gneisses. Our new paleomagnetic data largely agree with previous work, but we show that interpretations vary according to the choice of statistical, demagnetization and field-correction techniques. First, where the orientation-distribution of characteristic remanence directions on the sphere is not circularly symmetric, the commonly used statistical model is invalid (Fisher, 1953). Any tendency to form an elliptical distribution indicates the sample is drawn from a Bingham-type population (Bingham, 1964). Fisher and Bingham statistics produce different confidence estimates from the same data, and the traditionally defined mean-vector may differ from the maximum eigenvector of an orthorhombic Bingham-distribution. It seems prudent to apply both models wherever a non-Fisher population is suspected, which may be appropriate in any tectonized rocks. Non-Fisher populations require larger sample sizes, so focussing on individual sites may not be the most effective policy in tectonized rocks; more dispersed sampling across tectonic structures may be more productive. Second, from the same specimens, mean-vectors isolated by thermal and by AF demagnetization differ. Which treatment gives more meaningful results is difficult to decipher, especially in metamorphic rocks where the history of the magnetic minerals is not easily related to the ages of tectonic and petrological events. In this study, thermal demagnetization gave lower inclinations for paleomagnetic vectors and thus more distant paleopoles. Third, of more parochial significance, tilt-corrections may be unnecessary in the KSZ because magnetic fabrics and the thrust-ramp are constant in orientation to the depth at which they level off, at approximately 15 km depth. With Archean geothermal gradients, primary remanences were blocked after the foliation was tilted to rise on the thrust ramp. Therefore, the rocks were
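
    To make the first point concrete, here is a sketch of the standard Fisher (1953) resultant-vector statistics for a set of directions. The five directions are invented, and these are the classical formulas rather than the authors' code; a Bingham analysis would instead diagonalize the orientation tensor of the unit vectors.

        import numpy as np

        def fisher_stats(dec_deg, inc_deg, p=0.05):
            d, i = np.radians(dec_deg), np.radians(inc_deg)
            # unit vectors: x north, y east, z down
            xyz = np.column_stack([np.cos(i) * np.cos(d),
                                   np.cos(i) * np.sin(d),
                                   np.sin(i)])
            r = xyz.sum(axis=0)
            R, N = np.linalg.norm(r), len(d)       # resultant length, sample size
            mean = r / R
            mdec = np.degrees(np.arctan2(mean[1], mean[0])) % 360
            minc = np.degrees(np.arcsin(mean[2]))
            kappa = (N - 1) / (N - R)              # Fisher precision parameter
            a95 = np.degrees(np.arccos(
                1 - (N - R) / R * ((1 / p) ** (1 / (N - 1)) - 1)))
            return mdec, minc, kappa, a95

        print(fisher_stats([350, 5, 358, 2, 355], [45, 50, 48, 52, 47]))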

  15. Statistical Approaches to Interpretation of Local, Regional, and National Highway-Runoff and Urban-Stormwater Data

    USGS Publications Warehouse

    Tasker, Gary D.; Granato, Gregory E.

    2000-01-01

    Decision makers need viable methods for the interpretation of local, regional, and national highway-runoff and urban-stormwater data, including flows, concentrations and loads of chemical constituents and sediment, potential effects on receiving waters, and the potential effectiveness of various best management practices (BMPs). Valid (useful for intended purposes), current, and technically defensible stormwater-runoff models are needed to interpret data collected in field studies, to support existing highway and urban-runoff planning processes, to meet National Pollutant Discharge Elimination System (NPDES) requirements, and to provide methods for computation of Total Maximum Daily Loads (TMDLs) systematically and economically. Historically, conceptual, simulation, empirical, and statistical models of varying levels of detail, complexity, and uncertainty have been used to meet various data-quality objectives in the decision-making processes necessary for the planning, design, construction, and maintenance of highways and for other land-use applications. Water-quality simulation models attempt a detailed representation of the physical processes and mechanisms at a given site. Empirical and statistical regional water-quality assessment models provide a more general picture of water quality or changes in water quality over a region. All these modeling techniques share one common aspect: their predictive ability is poor without suitable site-specific data for calibration. To properly apply the correct model, one must understand the classification of variables, the unique characteristics of water-resources data, and the concept of population structure and analysis. Classifying the variables being used to analyze data may determine which statistical methods are appropriate for data analysis. An understanding of the characteristics of water-resources data is necessary to evaluate the applicability of different statistical methods, to interpret the results of these techniques

  16. Statistic-mathematical interpretation of some assessment parameters of the grassland ecosystem according to soil characteristics

    NASA Astrophysics Data System (ADS)

    Samfira, Ionel; Boldea, Marius; Popescu, Cosmin

    2012-09-01

    Significant parameters of permanent grasslands are represented by the pastoral value and the Shannon and Simpson biodiversity indices. The dynamics of these parameters have been studied in several plant associations in the Banat Plain, Romania. From the point of view of their typology, these permanent grasslands belong to the steppe area, series Festuca pseudovina, type Festuca pseudovina-Achillea millefolium, subtype Lolium perenne. The methods used for the purpose of this research included plant cover analysis (double meter method, calculation of the Shannon and Simpson indices) and statistical methods of regression and correlation. The results show that, in the permanent grasslands of the plain region, when the pastoral value is average to low, the level of interspecific biodiversity increases.
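
    A minimal sketch of the two biodiversity indices named above, computed from species abundance counts. The counts are invented, and the index conventions (natural log for Shannon, 1 - D for Simpson) are assumptions, since the abstract does not state them.

        import numpy as np

        def shannon(abundances):
            p = np.asarray(abundances, float)
            p = p[p > 0] / p.sum()
            return -np.sum(p * np.log(p))     # H' = -sum p_i ln p_i

        def simpson(abundances):
            p = np.asarray(abundances, float)
            p = p / p.sum()
            return 1.0 - np.sum(p ** 2)       # 1 - D = 1 - sum p_i^2

        counts = [120, 80, 40, 25, 10, 5]     # hypothetical cover counts per species
        print(shannon(counts), simpson(counts))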

  17. Statistical interpretation of transient current power-law decay in colloidal quantum dot arrays

    NASA Astrophysics Data System (ADS)

    Sibatov, R. T.

    2011-08-01

    A new statistical model of charge transport in colloidal quantum dot arrays is proposed. It takes into account the Coulomb blockade forbidding multiple occupancy of nanocrystals and the influence of the energetic disorder of the interdot space. The model explains power-law current transients and the presence of the memory effect. The fractional differential analogue of Ohm's law is found phenomenologically for nanocrystal arrays. The model combines ideas that were considered as conflicting by other authors: the Scher-Montroll idea about the power-law distribution of waiting times in localized states for disordered semiconductors is applied taking into account the Coulomb blockade; Novikov's condition about the asymptotic power-law distribution of time intervals between successful current pulses in conduction channels is fulfilled; and the carrier injection blocking predicted by Ginger and Greenham (2000 J. Appl. Phys. 87 1361) takes place.

  18. A statistical approach to the interpretation of aliphatic hydrocarbon distributions in marine sediments

    USGS Publications Warehouse

    Rapp, J.B.

    1991-01-01

    Q-mode factor analysis was used to quantitate the distribution of the major aliphatic hydrocarbon (n-alkanes, pristane, phytane) systems in sediments from a variety of marine environments. The compositions of the pure end members of the systems were obtained from factor scores and the distribution of the systems within each sample was obtained from factor loadings. All the data, from the diverse environments sampled (estuarine (San Francisco Bay), fresh-water (San Francisco Peninsula), polar-marine (Antarctica) and geothermal-marine (Gorda Ridge) sediments), were reduced to three major systems: a terrestrial system (mostly high molecular weight aliphatics with odd-numbered-carbon predominance), a mature system (mostly low molecular weight aliphatics without predominance) and a system containing mostly high molecular weight aliphatics with even-numbered-carbon predominance. With this statistical approach, it is possible to assign the percentage contribution from various sources to the observed distribution of aliphatic hydrocarbons in each sediment sample. © 1991.

  19. A COMPREHENSIVE STATISTICALLY-BASED METHOD TO INTERPRET REAL-TIME FLOWING MEASUREMENTS

    SciTech Connect

    Pinan Dawkrajai; Analis A. Romero; Keita Yoshioka; Ding Zhu; A.D. Hill; Larry W. Lake

    2004-10-01

    In this project, we are developing new methods for interpreting measurements in complex wells (horizontal, multilateral and multi-branching wells) to determine the profiles of oil, gas, and water entry. These methods are needed to take full advantage of "smart" well instrumentation, a technology that is rapidly evolving to provide the ability to continuously and permanently monitor downhole temperature, pressure, volumetric flow rate, and perhaps other fluid flow properties at many locations along a wellbore; and hence, to control and optimize well performance. In this first year, we have made considerable progress in the development of the forward model of temperature and pressure behavior in complex wells. In this period, we have progressed on three major parts of the forward problem of predicting the temperature and pressure behavior in complex wells. These three parts are the temperature and pressure behaviors in the reservoir near the wellbore, in the wellbore or laterals in the producing intervals, and in the build sections connecting the laterals, respectively. Many models exist to predict pressure behavior in reservoirs and wells, but these are almost always isothermal models. To predict temperature behavior we derived general mass, momentum, and energy balance equations for these parts of the complex well system. Analytical solutions for the reservoir and wellbore parts for certain special conditions show the magnitude of thermal effects that could occur. Our preliminary sensitivity analyses show that thermal effects caused by near-wellbore reservoir flow can cause temperature changes that are measurable with smart well technology. This is encouraging for the further development of the inverse model.

  20. A Comprehensive Statistically-Based Method to Interpret Real-Time Flowing Measurements

    SciTech Connect

    Keita Yoshioka; Pinan Dawkrajai; Analis A. Romero; Ding Zhu; A. D. Hill; Larry W. Lake

    2007-01-15

    With the recent development of temperature measurement systems, continuous temperature profiles can be obtained with high precision. Small temperature changes can be detected by modern temperature measuring instruments such as fiber optic distributed temperature sensors (DTS) in intelligent completions and will potentially aid the diagnosis of downhole flow conditions. In vertical wells, since elevational geothermal changes make the wellbore temperature sensitive to the amount and the type of fluids produced, temperature logs can be used successfully to diagnose the downhole flow conditions. However, because geothermal temperature changes along the wellbore are small in horizontal wells, interpretation of a temperature log becomes difficult. The primary temperature differences for each phase (oil, water, and gas) are caused by frictional effects. Therefore, in developing a thermal model for a horizontal wellbore, subtle temperature changes must be accounted for. In this project, we have rigorously derived governing equations for a producing horizontal wellbore and developed a prediction model of the temperature and pressure by coupling the wellbore and reservoir equations. Also, we applied Ramey's model (1962) to the build section and used an energy balance to infer the temperature profile at the junction. The multilateral wellbore temperature model was applied to a wide range of cases at varying fluid thermal properties, absolute values of temperature and pressure, geothermal gradients, flow rates from each lateral, and the trajectories of each build section. With the prediction models developed, we present inversion studies of synthetic and field examples. These results are essential to identify water or gas entry, to guide flow control devices in intelligent completions, and to decide if reservoir stimulation is needed in particular horizontal sections. This study will complete and validate these inversion studies.

  1. A Comprehensive Statistically-Based Method to Interpret Real-Time Flowing Measurements

    SciTech Connect

    Pinan Dawkrajai; Keita Yoshioka; Analis A. Romero; Ding Zhu; A.D. Hill; Larry W. Lake

    2005-10-01

    This project is motivated by the increasing use of distributed temperature sensors for real-time monitoring of complex wells (horizontal, multilateral and multi-branching wells) to infer the profiles of oil, gas, and water entry. Measured information can be used to interpret flow profiles along the wellbore, including the junction and build sections. In this second project year, we have completed a forward model to predict temperature and pressure profiles in complex wells. As a comprehensive temperature model, we have developed an analytical reservoir flow model that takes into account Joule-Thomson effects in the near-well vicinity, a multiphase non-isothermal producing-wellbore model, and a coupling of those models that accounts for mass and heat transfer between them. For further inferences such as water coning or gas evaporation, we will need a numerical non-isothermal reservoir simulator; unlike existing (thermal recovery, geothermal) simulators, it should capture the subtle temperature changes occurring during normal production. We will show the results from the analytical coupled model (analytical reservoir solution coupled with a numerical multi-segment well model) used to infer anomalous temperature or pressure profiles under various conditions, and the preliminary results from the numerical coupled reservoir model, which solves the full matrix including wellbore grids. We applied Ramey's model to the build section and used an enthalpy balance to infer the temperature profile at the junction. The multilateral wellbore temperature model was applied to a wide range of cases varying fluid thermal properties, absolute values of temperature and pressure, geothermal gradients, flow rates from each lateral, and the trajectories of each build section.

  2. Adsorption of ethanol onto activated carbon: Modeling and consequent interpretations based on statistical physics treatment

    NASA Astrophysics Data System (ADS)

    Bouzid, Mohamed; Sellaoui, Lotfi; Khalfaoui, Mohamed; Belmabrouk, Hafedh; Lamine, Abdelmottaleb Ben

    2016-02-01

    In this work, we studied the adsorption of ethanol on three types of activated carbon, namely parent Maxsorb III and two chemically modified activated carbons (H2-Maxsorb III and KOH-H2-Maxsorb III). This investigation was conducted on the basis of the grand canonical formalism in statistical physics and on simplified assumptions. This led to three-parameter equations describing the adsorption of ethanol onto the three types of activated carbon. There was a good correlation between the experimental data and the results obtained by the newly proposed equation. The parameters characterizing the adsorption isotherm were the number of adsorbed molecules per site n, the density of receptor sites per unit mass of adsorbent Nm, and the energetic parameter p1/2. They were estimated for the studied systems by a nonlinear least-squares regression. The results show that the ethanol molecules were adsorbed in a perpendicular (or non-parallel) position to the adsorbent surface. The magnitude of the calculated adsorption energies reveals that ethanol is physisorbed onto activated carbon. Both van der Waals and hydrogen interactions were involved in the adsorption process. The calculated values of the specific surface area AS proved that the three types of activated carbon have a highly microporous surface.
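
    A sketch of how such a three-parameter isotherm can be fitted by nonlinear least squares. The functional form Q(P) = n*Nm / (1 + (p_half/P)**n) is an assumed reading of the model family used in this line of work, and the pressures and uptakes below are synthetic.

        import numpy as np
        from scipy.optimize import curve_fit

        def isotherm(P, n, Nm, p_half):
            # n: molecules per site, Nm: site density, p_half: energetic parameter
            return n * Nm / (1.0 + (p_half / P) ** n)

        P = np.linspace(0.05, 1.0, 30)                          # relative pressure
        Q = isotherm(P, 1.8, 4.2, 0.35)
        Q += np.random.default_rng(1).normal(0, 0.05, P.size)   # measurement noise

        (n, Nm, p_half), _ = curve_fit(isotherm, P, Q, p0=[1.0, 1.0, 0.5])
        print(n, Nm, p_half)    # n > 1 would be read as non-parallel anchorage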

  3. Statistical information of ASAR observations over wetland areas: An interaction model interpretation

    NASA Astrophysics Data System (ADS)

    Grings, F.; Salvia, M.; Karszenbaum, H.; Ferrazzoli, P.; Perna, P.; Barber, M.; Jacobo Berlles, J.

    2010-01-01

    This paper presents the results obtained after studying the relation between the statistical parameters that describe the backscattering distribution of junco marshes and their biophysical variables. The results are based on the texture analysis of a time series of Envisat ASAR C-band data (APP mode, VV+HH polarizations) acquired between October 2003 and January 2005 over the Lower Paraná River Delta, Argentina. The image power distributions were analyzed, and we show that the K distribution provides a good fit to SAR data extracted from wetland observations for both polarizations. We also show that the estimated values of the order parameter of the K distribution can be explained using fieldwork and reasonable assumptions. In order to explore these results, we introduce a radiative-transfer-based interaction model to simulate the junco marsh σ0 distribution. After analyzing model simulations, we found evidence that the order parameter is related to the junco plant density distribution inside the junco marsh patch. It is concluded that the order parameter of the K distribution could be a useful parameter to estimate the junco plant density. This result is important for basin hydrodynamic modeling, since marsh plant density is the most important parameter for estimating marsh water conductance.
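
    The abstract does not say how the K-distribution order parameter was estimated; one standard possibility, shown here purely as an assumption, is the method of moments for K-distributed intensity, where <I^2>/<I>^2 = 2(1 + 1/nu). The simulated data model intensity as gamma texture times exponential speckle.

        import numpy as np

        def k_order_parameter(intensity):
            m1, m2 = intensity.mean(), (intensity ** 2).mean()
            ratio = m2 / m1 ** 2
            if ratio <= 2.0:            # ratio -> 2 as nu -> inf (pure speckle)
                return np.inf
            return 2.0 / (ratio - 2.0)  # invert <I^2>/<I>^2 = 2(1 + 1/nu)

        rng = np.random.default_rng(2)
        nu = 4.0
        texture = rng.gamma(nu, 1.0 / nu, 100_000)   # unit-mean gamma texture
        speckle = rng.exponential(1.0, 100_000)      # unit-mean exponential speckle
        print(k_order_parameter(texture * speckle))  # should recover nu ~ 4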

  4. Hysteresis model and statistical interpretation of energy losses in non-oriented steels

    NASA Astrophysics Data System (ADS)

    Mănescu (Păltânea), Veronica; Păltânea, Gheorghe; Gavrilă, Horia

    2016-04-01

    In this paper the hysteresis energy losses in two non-oriented industrial steels (M400-65A and M800-65A) were determined by means of an efficient classical Preisach model, which is based on the Pescetti-Biorci method for the identification of the Preisach density. The excess and the total energy losses were also determined, using a statistical framework based on magnetic object theory. The hysteresis energy losses in a non-oriented steel alloy depend on the peak magnetic polarization, and they can be computed using a Preisach model, because in these materials there is a direct link between the elementary rectangular loops and the discontinuous character of the magnetization process (Barkhausen jumps). To determine the Preisach density it was necessary to measure the normal magnetization curve and the saturation hysteresis cycle. A system of equations was deduced and the Preisach density was calculated for a magnetic polarization of 1.5 T; the hysteresis cycle was then reconstructed. Using the same pattern for the Preisach distribution, the hysteresis cycle for 1 T was computed. The classical losses were calculated using a well-known formula, and the excess energy losses were determined by means of magnetic object theory. The total energy losses were mathematically reconstructed and compared with those measured experimentally.

  5. Salinization of groundwater around underground LPG storage caverns, Korea : statistical interpretation

    NASA Astrophysics Data System (ADS)

    Lee, J.; Chang, H.

    2001-12-01

    In this research, we investigate the reciprocal influence between groundwater flow and the salinization occurring at two underground cavern sites, using major ion chemistry, PCA of the chemical analysis data, and cross-correlation of various hydraulic data. The study areas are two underground LPG storage facilities in coastal Korea: one on the South Sea coast (Yosu) and one on the West Sea coast (Pyeongtaek). Considerably high concentrations of major cations and anions in groundwaters at both sites indicated brackish or saline water types. At the Yosu site, a marked chemical difference between rainy- and dry-season groundwater samples was caused by the temporary intrusion of high-salinity water into the propane and butane cavern zone; this was not observed at the Pyeongtaek site. Cl/Br ratios and the δ18O-δD distribution, used to trace the source of the salinizing water at both sites, revealed that two kinds of saline water (seawater and a halite-dissolved solution) could influence groundwater salinization at the Yosu site, whereas only seawater intrusion could affect the groundwater chemistry of the observation wells at the Pyeongtaek site. PCA performed with 8 and 10 chemical ions as statistical variables at the two sites showed that intensive intrusion of seawater through the butane cavern occurred at the Yosu site, while seawater-groundwater mixing was observed at some observation wells located in the marginal part of the Pyeongtaek site. Cross-correlation results revealed that the positive relationship between hydraulic head and cavern operating pressure was far more conspicuous in the propane cavern zone at both sites (correlation coefficients of 65~90%). According to the cross-correlation results for the Yosu site, a small change of head could provoke a massive influx of halite-dissolved solution from the surface through vertically developed fracture networks. However, at the Pyeongtaek site, the pressure-sensitive observation wells are not completely consistent with the seawater-mixed wells, and the hydraulic change of heads at these wells related to the

  6. QC Metrics from CPTAC Raw LC-MS/MS Data Interpreted through Multivariate Statistics

    PubMed Central

    2015-01-01

    Shotgun proteomics experiments integrate a complex sequence of processes, any of which can introduce variability. Quality metrics computed from LC-MS/MS data have relied upon identifying MS/MS scans, but a new mode for the QuaMeter software produces metrics that are independent of identifications. Rather than evaluating each metric independently, we have created a robust multivariate statistical toolkit that accommodates the correlation structure of these metrics and allows for hierarchical relationships among data sets. The framework enables visualization and structural assessment of variability. Study 1 for the Clinical Proteomics Technology Assessment for Cancer (CPTAC), which analyzed three replicates of two common samples at each of two time points among 23 mass spectrometers in nine laboratories, provided the data to demonstrate this framework, and CPTAC Study 5 provided data from complex lysates under Standard Operating Procedures (SOPs) to complement these findings. Identification-independent quality metrics enabled the differentiation of sites and run-times through robust principal components analysis and subsequent factor analysis. Dissimilarity metrics revealed outliers in performance, and a nested ANOVA model revealed the extent to which all metrics or individual metrics were impacted by mass spectrometer and run time. Study 5 data revealed that even when SOPs have been applied, instrument-dependent variability remains prominent, although it may be reduced, while within-site variability is reduced significantly. Finally, identification-independent quality metrics were shown to be predictive of identification sensitivity in these data sets. QuaMeter and the associated multivariate framework are available from http://fenchurch.mc.vanderbilt.edu and http://homepages.uc.edu/~wang2x7/, respectively. PMID:24494671

  7. Progress report on the guidance for industry for statistical aspects of the design, analysis, and interpretation of chronic rodent carcinogenicity studies of pharmaceuticals.

    PubMed

    Lin, K K

    2000-11-01

    The U.S. Food and Drug Administration (FDA) is in the process of preparing a draft Guidance for Industry document on the statistical aspects of carcinogenicity studies of pharmaceuticals for public comment. The purpose of the document is to provide statistical guidance for the design of carcinogenicity experiments, methods of statistical analysis of study data, interpretation of study results, presentation of data and results in reports, and submission of electronic study data. This article covers the genesis of the guidance document and some statistical methods in study design, data analysis, and interpretation of results included in the draft FDA guidance document.

  8. Flexible magnetic planning boards are easily transported

    NASA Technical Reports Server (NTRS)

    1965-01-01

    Easily transportable preprinted magnetic planning boards are made by coating thin sheet steel with clear plastic. Flexible magnetic boards used with paper charts are constructed from close mesh steel screen.

  9. An Easily Constructed Trigonal Prism Model.

    ERIC Educational Resources Information Center

    Yamana, Shukichi

    1984-01-01

    A model of a trigonal prism which is useful for teaching stereochemistry (especially of the neodymium enneahydrate ion), can be made easily by using a sealed, empty envelope. The steps necessary to accomplish this task are presented. (JN)

  10. Using Statistical Mechanics and Entropy Principles to Interpret Variability in Power Law Models of the Streamflow Recession

    NASA Astrophysics Data System (ADS)

    Dralle, D.; Karst, N.; Thompson, S. E.

    2015-12-01

    Multiple competing theories suggest that power law behavior governs the observed first-order dynamics of streamflow recessions - the important process by which catchments dry out via the stream network, altering the availability of surface water resources and in-stream habitat. Frequently modeled as dq/dt = -aq^b, recessions typically exhibit a high degree of variability, even within a single catchment, as revealed by significant shifts in the values of "a" and "b" across recession events. One potential source of this variability lies in underlying, hard-to-observe fluctuations in how catchment water storage is partitioned amongst distinct storage elements, each having different discharge behaviors. Testing this and competing hypotheses with widely available streamflow time series, however, has been hindered by a power law scaling artifact that obscures meaningful covariation between the recession parameters "a" and "b". Here we briefly outline a technique that removes this artifact, revealing intriguing new patterns in the joint distribution of recession parameters. Using long-term flow data from catchments in Northern California, we explore temporal variations, and find that the "a" parameter varies strongly with catchment wetness. Then we explore how the "b" parameter changes with "a", and find that measures of its variation are maximized at intermediate "a" values. We propose an interpretation of this pattern based on statistical mechanics, meaning "b" can be viewed as an indicator of the catchment "microstate" - i.e. the partitioning of storage - and "a" as a measure of the catchment macrostate (i.e. the total storage). In statistical mechanics, entropy (i.e. microstate variance, that is the variance of "b") is maximized for intermediate values of extensive variables (i.e. wetness, "a"), as observed in the recession data. This interpretation of "a" and "b" was supported by model runs using a multiple-reservoir catchment toy model, and lends support to the
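
    A sketch of the conventional event-scale extraction of "a" and "b" that the abstract refers to: estimate dq/dt by finite differences during a recession and regress log(-dq/dt) on log(q). The synthetic series below integrates dq/dt = -a*q**b with daily steps; real analyses segment many events and must then confront the scaling artifact discussed above.

        import numpy as np

        rng = np.random.default_rng(3)
        a_true, b_true = 0.05, 1.5
        q = [10.0]
        for _ in range(60):                        # Euler steps of dq/dt = -a q^b
            q.append(q[-1] - a_true * q[-1] ** b_true)
        q = np.array(q) * np.exp(rng.normal(0, 0.01, 61))   # mild measurement noise

        dqdt = np.diff(q)                          # finite-difference dq/dt (per day)
        qm = 0.5 * (q[:-1] + q[1:])                # midpoint discharge
        keep = dqdt < 0                            # strictly receding steps only
        b_hat, log_a = np.polyfit(np.log(qm[keep]), np.log(-dqdt[keep]), 1)
        print(b_hat, np.exp(log_a))                # should land near (1.5, 0.05)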

  11. Weighted Feature Significance: A Simple, Interpretable Model of Compound Toxicity Based on the Statistical Enrichment of Structural Features

    PubMed Central

    Huang, Ruili; Southall, Noel; Xia, Menghang; Cho, Ming-Hsuang; Jadhav, Ajit; Nguyen, Dac-Trung; Inglese, James; Tice, Raymond R.; Austin, Christopher P.

    2009-01-01

    In support of the U.S. Tox21 program, we have developed a simple and chemically intuitive model we call weighted feature significance (WFS) to predict the toxicological activity of compounds, based on the statistical enrichment of structural features in toxic compounds. We trained and tested the model on the following: (1) data from quantitative high–throughput screening cytotoxicity and caspase activation assays conducted at the National Institutes of Health Chemical Genomics Center, (2) data from Salmonella typhimurium reverse mutagenicity assays conducted by the U.S. National Toxicology Program, and (3) hepatotoxicity data published in the Registry of Toxic Effects of Chemical Substances. Enrichments of structural features in toxic compounds are evaluated for their statistical significance and compiled into a simple additive model of toxicity and then used to score new compounds for potential toxicity. The predictive power of the model for cytotoxicity was validated using an independent set of compounds from the U.S. Environmental Protection Agency tested also at the National Institutes of Health Chemical Genomics Center. We compared the performance of our WFS approach with classical classification methods such as Naive Bayesian clustering and support vector machines. In most test cases, WFS showed similar or slightly better predictive power, especially in the prediction of hepatotoxic compounds, where WFS appeared to have the best performance among the three methods. The new algorithm has the important advantages of simplicity, power, interpretability, and ease of implementation. PMID:19805409
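
    A schematic of the enrichment-then-additive-scoring idea behind WFS, under stated assumptions: hypergeometric upper-tail p-values measure each feature's overrepresentation among toxic compounds, and -log10(p) serves as the feature weight. The feature names, counts, and weighting below are illustrative, not the published model.

        import numpy as np
        from scipy.stats import hypergeom

        N_total, N_toxic = 10_000, 1_200   # library size, number of toxic compounds
        # feature -> (occurrences among toxic compounds, occurrences in whole library)
        feature_counts = {"nitroaromatic": (90, 200),
                          "epoxide": (45, 150),
                          "phenol": (130, 1_000)}

        weights = {}
        for feat, (k_tox, k_all) in feature_counts.items():
            # P(X >= k_tox) when N_toxic draws come from a library with k_all hits
            p = hypergeom.sf(k_tox - 1, N_total, k_all, N_toxic)
            weights[feat] = -np.log10(p)   # stronger enrichment -> larger weight

        def wfs_score(features):
            return sum(weights.get(f, 0.0) for f in features)

        print(weights)
        print(wfs_score({"nitroaromatic", "phenol"}))  # higher -> flagged as riskier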

  12. Statistics

    Cancer.gov

    Links to sources of cancer-related statistics, including the Surveillance, Epidemiology and End Results (SEER) Program, SEER-Medicare datasets, cancer survivor prevalence data, and the Cancer Trends Progress Report.

  13. ACECARD. Acquire Commodities Easily Card

    SciTech Connect

    Soler, E.E.

    1996-09-01

    Acquire Commodities Easily Card (AceCard) provides an automated end-user method to distribute company credit card charges to internal charge numbers. AceCard will allow cardholders to record card purchases in an on-line order log, enter multiple account distributions per order that can be posted to the General Ledger, track orders, and receipt information, and provide a variety of cardholder and administrative reports. Please note: Customers must contact Ed Soler (423)-576-6151, Lockheed Martin Energy Systems, for help with the installation of the package. The fee for this installation help will be coordinated by the customer and Lockheed Martin and is in addition to cost of the package from ESTSC. Customers should contact Sandy Presley (423)-576-4708 for user help.

  14. Acquire Commodities Easily Card

    SciTech Connect

    Soler, E. E.

    1998-05-29

    Acquire Commodities Easily Card (AceCard) provides an automated end-user method to distribute company credit card charges to internal charge numbers. AceCard will allow cardholders to record card purchases in an on-line order log, enter multiple account distributions per order that can be posted to the General Ledger, track orders, and receipt information, and provide a variety of cardholder and administrative reports. Please note: Customers must contact Ed Soler (423)-576-6151, Lockheed Martin Energy Systems, for help with the installation of the package. The fee for this installation help will be coordinated by the customer and Lockheed Martin and is in addition to cost of the package from ESTSC. Customers should contact Sandy Presley (423)-576-4708 for user help.

  15. A Note on the Calculation and Interpretation of the Delta-p Statistic for Categorical Independent Variables

    ERIC Educational Resources Information Center

    Cruce, Ty M.

    2009-01-01

    This methodological note illustrates how a commonly used calculation of the Delta-p statistic is inappropriate for categorical independent variables, and this note provides users of logistic regression with a revised calculation of the Delta-p statistic that is more meaningful when studying the differences in the predicted probability of an…

  16. Statistical factor analysis technique for characterizing basalt through interpreting nuclear and electrical well logging data (case study from Southern Syria).

    PubMed

    Asfahani, Jamal

    2014-02-01

    A factor analysis technique is proposed in this research for interpreting the combination of nuclear well logs, including natural gamma ray, density and neutron-porosity, with the electrical well logs of long and short normal, in order to characterize the large extended basaltic areas in southern Syria. Kodana well logging data are used for testing and applying the proposed technique. The four resulting score logs enable the lithological score cross-section of the studied well to be established. The established cross-section clearly shows the distribution and identification of four kinds of basalt: hard massive basalt, hard basalt, pyroclastic basalt, and the basalt alteration product, clay. The factor analysis technique is successfully applied to the Kodana well logging data in southern Syria, and can be used efficiently when several wells and large well logging data sets with a high number of variables need to be interpreted.

  17. Combination of statistical methods and Fourier transform ion cyclotron resonance mass spectrometry for more comprehensive, molecular-level interpretations of petroleum samples.

    PubMed

    Hur, Manhoi; Yeo, Injoon; Park, Eunsuk; Kim, Young Hwan; Yoo, Jongshin; Kim, Eunkyoung; No, Myoung-han; Koh, Jaesuk; Kim, Sunghwan

    2010-01-01

    Complex petroleum mass spectra obtained by Fourier-transform ion cyclotron resonance mass spectrometry (FTICR MS) were successfully interpreted at the molecular level by applying principal component analysis (PCA) and hierarchical clustering analysis (HCA). A total of 40 mass spectra were obtained from 20 crude oil samples using both positive and negative atmospheric pressure photoionization (APPI). Approximately 400,000 peaks were identified at the molecular level. Conventional data analyses would have been impractical with so much data. However, PCA grouped samples into score plots based on their molecular composition. In this way, the overall compositional difference between samples could be easily displayed and identified by comparing score and loading plots. HCA was also performed to group and compare samples based on selected peaks that had been grouped by PCA. Subsequent heat map analyses revealed detailed compositional differences among grouped samples. This study demonstrates a promising new approach for studying multiple, complex petroleum samples at the molecular level.
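
    A compact sketch of the PCA-then-HCA workflow on a spectra-by-peaks intensity matrix; the matrix here is random, and autoscaling plus Ward linkage on the PCA scores are assumptions about preprocessing that the abstract leaves unstated.

        import numpy as np
        from sklearn.decomposition import PCA
        from scipy.cluster.hierarchy import linkage, fcluster

        rng = np.random.default_rng(4)
        X = rng.lognormal(0.0, 1.0, size=(40, 500))    # 40 spectra x 500 assigned peaks
        X = (X - X.mean(axis=0)) / X.std(axis=0)       # autoscale each peak

        scores = PCA(n_components=5).fit_transform(X)  # score plot groups the samples
        Z = linkage(scores, method="ward")             # HCA on the PCA scores
        print(fcluster(Z, t=4, criterion="maxclust"))  # cut the dendrogram into 4 groups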

  18. Collegiate Enrollments in the U.S., 1979-80. Statistics, Interpretations, and Trends in 4-Year and Related Institutions.

    ERIC Educational Resources Information Center

    Mickler, J. Ernest

    This 60th annual report on collegiate enrollments in the United States is based on data received from 1,635 four-year institutions in the U.S., Puerto Rico, and the U.S. Territories. General notes, survey methodology notes, and a summary of findings are presented. Detailed statistical charts present institutional data on men and women students and…

  19. Interview with Yves Pomeau, Boltzmann Medallist 2016: The universality of statistical physics interpretation is ever more obvious.

    PubMed

    Pomeau, Yves; Louët, Sabine

    2016-06-01

    During the StatPhys Conference on 20th July 2016 in Lyon, France, Yves Pomeau and Daan Frenkel will be awarded the most important prize in the field of Statistical Mechanics: the 2016 Boltzmann Medal, named after the Austrian physicist and philosopher Ludwig Boltzmann. The award recognises Pomeau's key contributions to the statistical physics of non-equilibrium phenomena in general and, in particular, his development of our modern understanding of fluid mechanics, instabilities, pattern formation and chaos. He is recognised as an outstanding theorist bridging disciplines from applied mathematics to statistical physics, with a profound impact on the neighbouring fields of turbulence and mechanics. In this article, Sabine Louët interviews Pomeau, who is an Editor for the European Physical Journal Special Topics. He shares his views and tells how he experienced the rise of Statistical Mechanics in the past few decades. He also touches upon the need to provide funding to people who have the rare ability to discover new things and ideas, and not just those who are good at filling in grant application forms.

  20. Analysis of the procedures used to evaluate suicide crime scenes in Brazil: a statistical approach to interpret reports.

    PubMed

    Bruni, Aline Thaís; Velho, Jesus Antonio; Ferreira, Arthur Serra Lopes; Tasso, Maria Júlia; Ferrari, Raíssa Santos; Yoshida, Ricardo Luís; Dias, Marcos Salvador; Leite, Vitor Barbanti Pereira

    2014-08-01

    This study uses statistical techniques to evaluate reports on suicide scenes; it utilizes 80 reports from different locations in Brazil, randomly collected from both federal and state jurisdictions. We aimed to assess a heterogeneous group of cases in order to obtain an overall perspective of the problem. We evaluated variables regarding the characteristics of the crime scene, such as the traces detected (blood, instruments and clothes), and we addressed the methodology employed by the experts. A qualitative approach using basic statistics revealed a wide distribution in how the issue was addressed in the documents. We then examined a quantitative approach involving an empirical equation and used multivariate procedures to validate the quantitative methodology proposed for this empirical equation. The methodology successfully identified the main differences in the information presented in the reports, showing that there is no standardized method of analyzing evidence.

  1. Skew-laplace and cell-size distribution in microbial axenic cultures: statistical assessment and biological interpretation.

    PubMed

    Julià, Olga; Vidal-Mas, Jaume; Panikov, Nicolai S; Vives-Rego, Josep

    2010-01-01

    We report a skew-Laplace statistical analysis of both flow cytometry scatters and cell size from microbial strains grown primarily in batch cultures, others in chemostat cultures, and from aquatic bacterial populations. Cytometry scatters best fit the skew-Laplace distribution, while cell size as assessed by an electronic particle analyzer exhibited a moderate fit. Unlike the cultures, the aquatic bacterial communities clearly do not fit a skew-Laplace distribution. Due to its versatile nature, the skew-Laplace distribution approach offers an easy, efficient, and powerful tool for frequency-distribution analysis in tandem with flow cytometric cell sorting.
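
    A sketch of fitting the skew-Laplace density by maximum likelihood, assuming the common three-parameter form f(x) = exp((x-mu)/alpha)/(alpha+beta) for x <= mu and exp(-(x-mu)/beta)/(alpha+beta) otherwise. The data are synthetic and the parameterization is an assumption, as the abstract does not give one.

        import numpy as np
        from scipy.optimize import minimize

        def neg_loglik(theta, x):
            mu = theta[0]
            alpha, beta = np.exp(theta[1]), np.exp(theta[2])  # keep scales positive
            z = np.where(x <= mu, (x - mu) / alpha, -(x - mu) / beta)
            return -(z - np.log(alpha + beta)).sum()

        rng = np.random.default_rng(5)
        left = rng.random(5000) < 0.3           # P(x <= mu) = alpha / (alpha + beta)
        x = np.where(left, 2.0 - rng.exponential(0.6, 5000),
                           2.0 + rng.exponential(1.4, 5000))

        fit = minimize(neg_loglik, x0=[np.median(x), 0.0, 0.0], args=(x,),
                       method="Nelder-Mead")
        print(fit.x[0], np.exp(fit.x[1]), np.exp(fit.x[2]))  # near (2.0, 0.6, 1.4)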

  2. Skew-Laplace and Cell-Size Distribution in Microbial Axenic Cultures: Statistical Assessment and Biological Interpretation

    PubMed Central

    Julià, Olga; Vidal-Mas, Jaume; Panikov, Nicolai S.; Vives-Rego, Josep

    2010-01-01

    We report a skew-Laplace statistical analysis of both flow cytometry scatters and cell size from microbial strains grown primarily in batch cultures, others in chemostat cultures, and from aquatic bacterial populations. Cytometry scatters best fit the skew-Laplace distribution, while cell size as assessed by an electronic particle analyzer exhibited a moderate fit. Unlike the cultures, the aquatic bacterial communities clearly do not fit a skew-Laplace distribution. Due to its versatile nature, the skew-Laplace distribution approach offers an easy, efficient, and powerful tool for frequency-distribution analysis in tandem with flow cytometric cell sorting. PMID:20592754

  3. Chemical data and statistical interpretations for rocks and ores from the Ranger uranium mine, Northern Territory, Australia

    USGS Publications Warehouse

    Nash, J. Thomas; Frishman, David

    1983-01-01

    Analytical results for 61 elements in 370 samples from the Ranger Mine area are reported. Most of the rocks come from drill core in the Ranger No. 1 and Ranger No. 3 deposits, but 20 samples are from unmineralized drill core more than 1 km from ore. Statistical tests show that the elements Mg, Fe, F, Be, Co, Li, Ni, Pb, Sc, Th, Ti, V, Cl, As, Br, Au, Ce, Dy, La, Eu, Tb, and Yb have a positive association with uranium, and Si, Ca, Na, K, Sr, Ba, Ce, and Cs a negative association. For most lithologic subsets Mg, Fe, Li, Cr, Ni, Pb, V, Y, Sm, Sc, Eu, and Yb are significantly enriched in ore-bearing rocks, whereas Ca, Na, K, Sr, Ba, Mn, Ce, and Cs are significantly depleted. These results are consistent with petrographic observations on altered rocks. Lithogeochemistry can aid exploration, but for these rocks requires methods that are expensive and not amenable to routine use.

  4. Hydrochemical and multivariate statistical interpretations of spatial controls of nitrate concentrations in a shallow alluvial aquifer around oxbow lakes (Osong area, central Korea).

    PubMed

    Kim, Kyoung-Ho; Yun, Seong-Taek; Choi, Byoung-Young; Chae, Gi-Tak; Joo, Yongsung; Kim, Kangjoo; Kim, Hyoung-Soo

    2009-07-21

    Hydrochemical and multivariate statistical interpretations of 16 physicochemical parameters of 45 groundwater samples from a riverside alluvial aquifer underneath an agricultural area in Osong, central Korea, were performed in this study to understand the spatial controls of nitrate concentrations in terms of biogeochemical processes occurring near oxbow lakes within a fluvial plain. Nitrate concentrations in groundwater showed a large variability from 0.1 to 190.6 mg/L (mean=35.0 mg/L) with significantly lower values near oxbow lakes. The evaluation of hydrochemical data indicated that the groundwater chemistry (especially, degree of nitrate contamination) is mainly controlled by two competing processes: 1) agricultural contamination and 2) redox processes. In addition, results of factorial kriging, consisting of two steps (i.e., co-regionalization and factor analysis), reliably showed a spatial control of the concentrations of nitrate and other redox-sensitive species; in particular, significant denitrification was observed restrictedly near oxbow lakes. The results of this study indicate that sub-oxic conditions in an alluvial groundwater system are developed geologically and geochemically in and near oxbow lakes, which can effectively enhance the natural attenuation of nitrate before the groundwater discharges to nearby streams. This study also demonstrates the usefulness of multivariate statistical analysis in groundwater study as a supplementary tool for interpretation of complex hydrochemical data sets.

  5. Hydrochemical and multivariate statistical interpretations of spatial controls of nitrate concentrations in a shallow alluvial aquifer around oxbow lakes (Osong area, central Korea)

    NASA Astrophysics Data System (ADS)

    Kim, Kyoung-Ho; Yun, Seong-Taek; Choi, Byoung-Young; Chae, Gi-Tak; Joo, Yongsung; Kim, Kangjoo; Kim, Hyoung-Soo

    2009-07-01

    Hydrochemical and multivariate statistical interpretations of 16 physicochemical parameters of 45 groundwater samples from a riverside alluvial aquifer underneath an agricultural area in Osong, central Korea, were performed in this study to understand the spatial controls of nitrate concentrations in terms of biogeochemical processes occurring near oxbow lakes within a fluvial plain. Nitrate concentrations in groundwater showed a large variability from 0.1 to 190.6 mg/L (mean = 35.0 mg/L) with significantly lower values near oxbow lakes. The evaluation of hydrochemical data indicated that the groundwater chemistry (especially, degree of nitrate contamination) is mainly controlled by two competing processes: 1) agricultural contamination and 2) redox processes. In addition, results of factorial kriging, consisting of two steps (i.e., co-regionalization and factor analysis), reliably showed a spatial control of the concentrations of nitrate and other redox-sensitive species; in particular, significant denitrification was observed restrictedly near oxbow lakes. The results of this study indicate that sub-oxic conditions in an alluvial groundwater system are developed geologically and geochemically in and near oxbow lakes, which can effectively enhance the natural attenuation of nitrate before the groundwater discharges to nearby streams. This study also demonstrates the usefulness of multivariate statistical analysis in groundwater study as a supplementary tool for interpretation of complex hydrochemical data sets.

  6. Proper interpretation of chronic toxicity studies and their statistics: A critique of "Which level of evidence does the US National Toxicology Program provide? Statistical considerations using the Technical Report 578 on Ginkgo biloba as an example".

    PubMed

    Kissling, Grace E; Haseman, Joseph K; Zeiger, Errol

    2015-09-02

    A recent article by Gaus (2014) demonstrates a serious misunderstanding of the NTP's statistical analysis and interpretation of rodent carcinogenicity data as reported in Technical Report 578 (Ginkgo biloba) (NTP, 2013), as well as a failure to acknowledge the abundant literature on false positive rates in rodent carcinogenicity studies. The NTP reported Ginkgo biloba extract to be carcinogenic in mice and rats. Gaus claims that, in this study, 4800 statistical comparisons were possible, and that 209 of them were statistically significant (p<0.05) compared with 240 (4800×0.05) expected by chance alone; thus, the carcinogenicity of Ginkgo biloba extract cannot be definitively established. However, his assumptions and calculations are flawed since he incorrectly assumes that the NTP uses no correction for multiple comparisons, and that significance tests for discrete data operate at exactly the nominal level. He also misrepresents the NTP's decision making process, overstates the number of statistical comparisons made, and ignores the fact that the mouse liver tumor effects were so striking (e.g., p<0.0000000000001) that it is virtually impossible that they could be false positive outcomes. Gaus' conclusion that such obvious responses merely "generate a hypothesis" rather than demonstrate a real carcinogenic effect has no scientific credibility. Moreover, his claims regarding the high frequency of false positive outcomes in carcinogenicity studies are misleading because of his methodological misconceptions and errors.
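
    The multiplicity arithmetic at issue can be checked directly. Treating the 4800 comparisons as independent (a simplification, since tumor endpoints are correlated) shows both why 209 significant results are unremarkable on their own and why a single finding at p < 1e-13 cannot plausibly be a false positive.

        from scipy.stats import binom

        n, alpha = 4800, 0.05
        print(n * alpha)                 # 240 false positives expected by chance
        print(binom.sf(208, n, alpha))   # P(X >= 209) under pure chance -- not small
        # By contrast, a Bonferroni-style bound for one endpoint at p < 1e-13 gives
        # roughly 4800 * 1e-13 ~ 5e-10 expected chance hits across every comparison.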

  7. On the likelihood of post-perovskite near the core-mantle boundary: A statistical interpretation of seismic observations

    NASA Astrophysics Data System (ADS)

    Cobden, Laura; Mosca, Ilaria; Trampert, Jeannot; Ritsema, Jeroen

    2012-11-01

    Recent experimental studies indicate that perovskite, the dominant lower mantle mineral, undergoes a phase change to post-perovskite at high pressures. However, it has been unclear whether this transition occurs within the Earth's mantle, due to uncertainties in both the thermochemical state of the lowermost mantle and the pressure-temperature conditions of the phase boundary. In this study we compare the relative fit to global seismic data of mantle models which do and do not contain post-perovskite, following a statistical approach. Our data comprise more than 10,000 Pdiff and Sdiff travel-times, global in coverage, from which we extract the global distributions of dln VS and dln VP near the core-mantle boundary (CMB). These distributions are sensitive to the underlying lateral variations in mineralogy and temperature even after seismic uncertainties are taken into account, and are ideally suited for investigating the likelihood of the presence of post-perovskite. A post-perovskite-bearing CMB region provides a significantly closer fit to the seismic data than a post-perovskite-free CMB region on both a global and regional scale. These results complement previous local seismic reflection studies, which have shown a consistency between seismic observations and the physical properties of post-perovskite inside the deep Earth.

  8. Multivariate Statistical Analysis as a Supplementary Tool for Interpretation of Variations in Salivary Cortisol Level in Women with Major Depressive Disorder

    PubMed Central

    Dziurkowska, Ewelina; Wesolowski, Marek

    2015-01-01

    Multivariate statistical analysis is widely used in medical studies as a profitable tool facilitating diagnosis of some diseases, for instance, cancer, allergy, pneumonia, or Alzheimer's and psychiatric diseases. Taking this into consideration, the aim of this study was to use two multivariate techniques, hierarchical cluster analysis (HCA) and principal component analysis (PCA), to disclose the relationship between the drugs used in the therapy of major depressive disorder, the salivary cortisol level, and the period of hospitalization. The cortisol contents in saliva of depressed women were quantified by HPLC with UV detection day-to-day during the whole period of hospitalization. A data set with 16 variables (e.g., the patients' age, multiplicity and period of hospitalization, initial and final cortisol level, highest and lowest hormone level, mean contents, and medians) characterizing 97 subjects was used for HCA and PCA calculations. Multivariate statistical analysis reveals that the various groups of antidepressants affect the salivary cortisol level to varying degrees. The SSRIs, SNRIs, and polypragmasy reduce the hormone secretion most effectively. Thus, both unsupervised pattern recognition methods, HCA and PCA, can be used as complementary tools for interpretation of the results obtained by laboratory diagnostic methods. PMID:26380376

  9. Multivariate Statistical Analysis as a Supplementary Tool for Interpretation of Variations in Salivary Cortisol Level in Women with Major Depressive Disorder.

    PubMed

    Dziurkowska, Ewelina; Wesolowski, Marek

    2015-01-01

    Multivariate statistical analysis is widely used in medical studies as a profitable tool facilitating diagnosis of some diseases, for instance, cancer, allergy, pneumonia, or Alzheimer's and psychiatric diseases. Taking this into consideration, the aim of this study was to use two multivariate techniques, hierarchical cluster analysis (HCA) and principal component analysis (PCA), to disclose the relationship between the drugs used in the therapy of major depressive disorder, the salivary cortisol level, and the period of hospitalization. The cortisol contents in saliva of depressed women were quantified by HPLC with UV detection day-to-day during the whole period of hospitalization. A data set with 16 variables (e.g., the patients' age, multiplicity and period of hospitalization, initial and final cortisol level, highest and lowest hormone level, mean contents, and medians) characterizing 97 subjects was used for HCA and PCA calculations. Multivariate statistical analysis reveals that the various groups of antidepressants affect the salivary cortisol level to varying degrees. The SSRIs, SNRIs, and polypragmasy reduce the hormone secretion most effectively. Thus, both unsupervised pattern recognition methods, HCA and PCA, can be used as complementary tools for interpretation of the results obtained by laboratory diagnostic methods.

  10. Pulsar statistics and their interpretations

    NASA Technical Reports Server (NTRS)

    Arnett, W. D.; Lerche, I.

    1981-01-01

    It is shown that a lack of knowledge concerning interstellar electron density, the true spatial distribution of pulsars, the radio luminosity source distribution of pulsars, the real ages and real aging rates of pulsars, the beaming factor (and other unknown factors causing the known sample of about 350 pulsars to be incomplete to an unknown degree) is sufficient to cause a minimum uncertainty of a factor of 20 in any attempt to determine pulsar birth or death rates in the Galaxy. It is suggested that this uncertainty must impact on suggestions that the pulsar rates can be used to constrain possible scenarios for neutron star formation and stellar evolution in general.

  11. Teaching the Assessment of Normality Using Large Easily-Generated Real Data Sets

    ERIC Educational Resources Information Center

    Kulp, Christopher W.; Sprechini, Gene D.

    2016-01-01

    A classroom activity is presented, which can be used in teaching students statistics with an easily generated, large, real-world data set. The activity consists of analyzing a video recording of an object. The colour data of the recorded object can then be used as a data set to explore variation in the data using graphs including histograms,…

  12. An Easily Constructed Model of a Square Antiprism.

    ERIC Educational Resources Information Center

    Yamana, Shukichi

    1984-01-01

    A model of a square antiprism which is necessary for teaching stereochemistry (for example, of the octafluorotantalate ion) can be made easily by using a sealed, empty envelope. The steps necessary to accomplish this task are presented. (JN)

  13. Easily constructed mini-sextant demonstrates optical principles

    NASA Astrophysics Data System (ADS)

    Nenninger, Garet G.

    2000-04-01

    An easily constructed optical instrument for measuring the angle between the Sun and the horizon is described. The miniature sextant relies on multiple reflections to produce multiple images of the sun at fixed angles away from the true Sun.

  14. Description of the Experimental Avionics Systems Integration Laboratory (EASILY)

    NASA Technical Reports Server (NTRS)

    Outlaw, Bruce K. E.

    1994-01-01

    The Experimental Avionics Systems Integration Laboratory (EASILY) is a comprehensive facility used for development, integration, and preflight validation of hardware and software systems for the Terminal Area Productivity (TAP) Program's Transport Systems Research Vehicle (TSRV) experimental transport aircraft. This report describes the history, capabilities, and subsystems of EASILY. A functional description of the many subsystems is provided to give potential users the necessary knowledge of the capabilities of this facility.

  15. How to Make Nothing Out of Something: Analyses of the Impact of Study Sampling and Statistical Interpretation in Misleading Meta-Analytic Conclusions

    PubMed Central

    Cunningham, Michael R.; Baumeister, Roy F.

    2016-01-01

    The limited resource model states that self-control is governed by a relatively finite set of inner resources on which people draw when exerting willpower. Once self-control resources have been used up or depleted, they are less available for other self-control tasks, leading to a decrement in subsequent self-control success. The depletion effect has been studied for over 20 years, tested or extended in more than 600 studies, and supported in an independent meta-analysis (Hagger et al., 2010). Meta-analyses are supposed to reduce bias in literature reviews. Carter et al.’s (2015) meta-analysis, by contrast, included a series of questionable decisions involving sampling, methods, and data analysis. We provide quantitative analyses of key sampling issues: exclusion of many of the best depletion studies based on idiosyncratic criteria and the emphasis on mini meta-analyses with low statistical power as opposed to the overall depletion effect. We discuss two key methodological issues: failure to code for research quality, and the quantitative impact of weak studies by novice researchers. We discuss two key data analysis issues: questionable interpretation of the results of trim and fill and Funnel Plot Asymmetry test procedures, and the use and misinterpretation of the untested Precision Effect Test and Precision Effect Estimate with Standard Error (PEESE) procedures. Despite these serious problems, the Carter et al. (2015) meta-analysis results actually indicate that there is a real depletion effect – contrary to their title. PMID:27826272

  16. How to Make Nothing Out of Something: Analyses of the Impact of Study Sampling and Statistical Interpretation in Misleading Meta-Analytic Conclusions.

    PubMed

    Cunningham, Michael R; Baumeister, Roy F

    2016-01-01

    The limited resource model states that self-control is governed by a relatively finite set of inner resources on which people draw when exerting willpower. Once self-control resources have been used up or depleted, they are less available for other self-control tasks, leading to a decrement in subsequent self-control success. The depletion effect has been studied for over 20 years, tested or extended in more than 600 studies, and supported in an independent meta-analysis (Hagger et al., 2010). Meta-analyses are supposed to reduce bias in literature reviews. Carter et al.'s (2015) meta-analysis, by contrast, included a series of questionable decisions involving sampling, methods, and data analysis. We provide quantitative analyses of key sampling issues: exclusion of many of the best depletion studies based on idiosyncratic criteria and the emphasis on mini meta-analyses with low statistical power as opposed to the overall depletion effect. We discuss two key methodological issues: failure to code for research quality, and the quantitative impact of weak studies by novice researchers. We discuss two key data analysis issues: questionable interpretation of the results of trim and fill and Funnel Plot Asymmetry test procedures, and the use and misinterpretation of the untested Precision Effect Test and Precision Effect Estimate with Standard Error (PEESE) procedures. Despite these serious problems, the Carter et al. (2015) meta-analysis results actually indicate that there is a real depletion effect - contrary to their title.

  17. iCFD: Interpreted Computational Fluid Dynamics - Degeneration of CFD to one-dimensional advection-dispersion models using statistical experimental design - The secondary clarifier.

    PubMed

    Guyonvarch, Estelle; Ramin, Elham; Kulahci, Murat; Plósz, Benedek Gy

    2015-10-15

    The present study aims at using statistically designed computational fluid dynamics (CFD) simulations as numerical experiments for the identification of one-dimensional (1-D) advection-dispersion models - computationally light tools, used e.g., as sub-models in systems analysis. The objective is to develop a new 1-D framework, referred to as interpreted CFD (iCFD) models, in which statistical meta-models are used to calculate the pseudo-dispersion coefficient (D) as a function of design and flow boundary conditions. The method - presented in a straightforward and transparent way - is illustrated using the example of a circular secondary settling tank (SST). First, the significant design and flow factors are screened out by applying the statistical method of two-level fractional factorial design of experiments. Second, based on the number of significant factors identified through the factor screening study and system understanding, 50 different sets of design and flow conditions are selected using Latin Hypercube Sampling (LHS). The boundary condition sets are imposed on a 2-D axi-symmetrical CFD simulation model of the SST. In the framework, to degenerate the 2-D model structure, CFD model outputs are approximated by the 1-D model through the calibration of three different model structures for D. Correlation equations for the D parameter then are identified as a function of the selected design and flow boundary conditions (meta-models), and their accuracy is evaluated against D values estimated in each numerical experiment. The evaluation and validation of the iCFD model structure is carried out using scenario simulation results obtained with parameters sampled from the corners of the LHS experimental region. For the studied SST, additional iCFD model development was carried out in terms of (i) assessing different density current sub-models; (ii) implementation of a combined flocculation, hindered, transient and compression settling velocity function; and (iii
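
    One step of the iCFD workflow, drawing 50 design/flow boundary-condition sets by Latin Hypercube Sampling, can be sketched briefly; the factor names and bounds below are illustrative assumptions, not the paper's.

```python
# Sketch of the LHS step: 50 boundary-condition sets over illustrative bounds.
# Requires scipy >= 1.7 for the scipy.stats.qmc module.
from scipy.stats import qmc

# Hypothetical design/flow factors; names and ranges are invented for illustration.
low  = [0.5, 100.0, 2.0]    # e.g. inlet height (m), inflow rate (m3/h), tank radius (m)
high = [2.0, 800.0, 15.0]

sampler = qmc.LatinHypercube(d=len(low), seed=1)
designs = qmc.scale(sampler.random(n=50), low, high)   # one row per CFD experiment
print(designs.shape)   # (50, 3)
```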

  18. Micromanipulation tool is easily adapted to many uses

    NASA Technical Reports Server (NTRS)

    Shlichta, P. J.

    1967-01-01

    A special micromanipulation tool equipped with a plunger mounted in a small tube can be easily adapted to such work operations as cutting, precision clamping, and spot welding of microscopic filaments or other parts. This tool is valuable where extreme steadiness of high magnification is required.

  19. Epoxy-coated containers easily opened by wire band

    NASA Technical Reports Server (NTRS)

    Mc Coy, J. W.

    1966-01-01

    Epoxy coating reduces punctures, abrasions, and contamination of synthetic cellular containers used for shipping and storing fragile goods and equipment. A wire band is wound around the closure joint, followed by the epoxy coating. The container can then be easily opened by pulling the wire through the epoxy around the joint.

  20. An easily assembled laboratory exercise in computed tomography

    NASA Astrophysics Data System (ADS)

    Mylott, Elliot; Klepetka, Ryan; Dunlap, Justin C.; Widenhorn, Ralf

    2011-09-01

    In this paper, we present a laboratory activity in computed tomography (CT) primarily composed of a photogate and a rotary motion sensor that can be assembled quickly and partially automates data collection and analysis. We use an enclosure made with a light filter that is largely opaque in the visible spectrum but mostly transparent to the near IR light of the photogate (880 nm) to scan objects hidden from the human eye. This experiment effectively conveys how an image is formed during a CT scan and highlights the important physical and imaging concepts behind CT such as electromagnetic radiation, the interaction of light and matter, artefacts and windowing. Like our setup, previous undergraduate level laboratory activities which teach the basics of CT have also utilized light sources rather than x-rays; however, they required a more extensive setup and used devices not always easily found in undergraduate laboratories. Our setup is easily implemented with equipment found in many teaching laboratories.
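
    The imaging principle the lab demonstrates, recovering an object from line measurements taken at many angles, can also be reproduced numerically. The following is a standard textbook reconstruction sketch using scikit-image, not the authors' apparatus.

```python
# Sketch of the CT principle: project a phantom at many angles (the sinogram),
# then reconstruct it by filtered back-projection. Uses scikit-image >= 0.19
# (older versions name the filter_name argument differently).
import numpy as np
from skimage.data import shepp_logan_phantom
from skimage.transform import radon, iradon

image = shepp_logan_phantom()                 # stand-in for the hidden object
angles = np.linspace(0.0, 180.0, 60, endpoint=False)

sinogram = radon(image, theta=angles)         # analogous to the photogate scan data
recon = iradon(sinogram, theta=angles, filter_name="ramp")

rms = np.sqrt(np.mean((recon - image) ** 2))
print(f"RMS reconstruction error: {rms:.3f}")
```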

  1. Combining data visualization and statistical approaches for interpreting measurements and meta-data: Integrating heatmaps, variable clustering, and mixed regression models

    EPA Science Inventory

    The advent of new higher throughput analytical instrumentation has put a strain on interpreting and explaining the results from complex studies. Contemporary human, environmental, and biomonitoring data sets are comprised of tens or hundreds of analytes, multiple repeat measures...

  2. An easily fabricated high performance ionic polymer based sensor network

    NASA Astrophysics Data System (ADS)

    Zhu, Zicai; Wang, Yanjie; Hu, Xiaopin; Sun, Xiaofei; Chang, Longfei; Lu, Pin

    2016-08-01

    Ionic polymer materials can generate an electrical potential from ion migration under an external force. For traditional ionic polymer metal composite sensors, the output voltage is very small (a few millivolts), and the fabrication process is complex and time-consuming. This letter presents an ionic polymer based network of pressure sensors which is easily and quickly constructed, and which can generate high voltage. A 3 × 3 sensor array was prepared by casting Nafion solution directly over copper wires. Under applied pressure, two different levels of voltage response were observed among the nine nodes in the array. For the group producing the higher level, peak voltages reached as high as 25 mV. Computational stress analysis revealed the physical origin of the different responses. High voltages resulting from the stress concentration and asymmetric structure can be further utilized to modify subsequent designs to improve the performance of similar sensors.

  3. Plasmonic Films Can Easily Be Better: Rules and Recipes

    PubMed Central

    2015-01-01

    High-quality materials are critical for advances in plasmonics, especially as researchers now investigate quantum effects at the limit of single surface plasmons or exploit ultraviolet- or CMOS-compatible metals such as aluminum or copper. Unfortunately, due to inexperience with deposition methods, many plasmonics researchers deposit metals under the wrong conditions, severely limiting performance unnecessarily. This is then compounded as others follow their published procedures. In this perspective, we describe simple rules collected from the surface-science literature that allow high-quality plasmonic films of aluminum, copper, gold, and silver to be easily deposited with commonly available equipment (a thermal evaporator). Recipes are also provided so that films with optimal optical properties can be routinely obtained. PMID:25950012

  4. Easily installable behavioral monitoring system with electric field sensor.

    PubMed

    Tsukamoto, Sosuke; Machida, Yuichiro; Kameda, Noriyuki; Hoshino, Hiroshi; Tamura, Toshiyo

    2007-01-01

    This paper describes a wireless behavioral monitoring system equipped with an electric field sensor. The sensor unit was designed to obtain information regarding the usage of home electric appliances such as the television, microwave oven, coffee maker, etc. by measuring the electric field surrounding them. It is assumed that these usage statistics could provide information regarding the indoor behavior of a subject. Since the sensor can be used by simply attaching it to an appliance and does not require any wiring for its installation, this system can be temporarily installed in any ordinary house. A simple interface for selecting the threshold value of appliances' power on/off states was introduced. The experimental results reveal that the proposed system can be installed by individuals in their residences in a short time and the usage statistics of home appliances can be gathered.

  5. A highly versatile and easily configurable system for plant electrophysiology.

    PubMed

    Gunsé, Benet; Poschenrieder, Charlotte; Rankl, Simone; Schröeder, Peter; Rodrigo-Moreno, Ana; Barceló, Juan

    2016-01-01

    In this study we present a highly versatile and easily configurable system for measuring plant electrophysiological parameters and ionic flow rates, connected to a computer-controlled, highly accurate positioning device. The modular software used allows easily customizable configurations for the measurement of electrophysiological parameters. Both the operational tests and the experiments already performed have been fully successful and rendered a low-noise and highly stable signal. Assembly, programming and configuration examples are discussed. The system is a powerful technique that not only gives precise measurement of plant electrophysiological status, but also allows easy development of ad hoc configurations that are not constrained to plant studies.
    • We developed a highly modular system for electrophysiology measurements that can be used either on organs or cells and performs either steady or dynamic intra- and extracellular measurements, taking advantage of the ease of visual object-oriented programming.
    • High-precision data acquisition in electrically noisy environments allows it to run even in a laboratory close to electrical equipment that produces electrical noise.
    • The system improves on currently used systems for monitoring and controlling high-precision measurements and micromanipulation, providing an open and customizable environment for multiple experimental needs.

  6. Representative Ordering and Selection of Variables, Volumes A and B--Statistical Models for the Evaluation and Interpretation of Educational Criteria, Part 3.

    ERIC Educational Resources Information Center

    Bargmann, Rolf E.

    The studies embodied in this report propose some statistical methods of ordering and attaining relevancy to help the educational researcher choose among such variables as tests and behavior ratings. Construction of a model for the analysis of contingency tables, determination of the most appropriate ordering principle in step-down analysis for the…

  7. Statistics Clinic

    NASA Technical Reports Server (NTRS)

    Feiveson, Alan H.; Foy, Millennia; Ploutz-Snyder, Robert; Fiedler, James

    2014-01-01

    Do you have elevated p-values? Is the data analysis process getting you down? Do you experience anxiety when you need to respond to criticism of statistical methods in your manuscript? You may be suffering from Insufficient Statistical Support Syndrome (ISSS). For symptomatic relief of ISSS, come for a free consultation with JSC biostatisticians at our help desk during the poster sessions at the HRP Investigators Workshop. Get answers to common questions about sample size, missing data, multiple testing, when to trust the results of your analyses and more. Side effects may include sudden loss of statistics anxiety, improved interpretation of your data, and increased confidence in your results.

  8. Making large amounts of meteorological plots easily accessible to users

    NASA Astrophysics Data System (ADS)

    Lamy-Thepaut, Sylvie; Siemen, Stephan; Sahin, Cihan; Raoult, Baudouin

    2015-04-01

    …implementation. It presents the user's products in a single interface with fast access to the original product, and possibilities of synchronous animations between them. Its functionalities are being extended to give users the freedom to collect not only ecCharts's 2D maps and graphs, but also other ECMWF web products such as monthly and seasonal products, scores, and observation monitoring. The dashboard will play a key role in helping users interpret the large amount of information that ECMWF provides. This talk will present examples of how the new user interface can organise complex meteorological maps and graphs, and will show the new possibilities users have gained by using the web as a medium.

  9. Technical Basis Document: A Statistical Basis for Interpreting Urinary Excretion of Plutonium Based on Accelerator Mass Spectrometry (AMS) for Selected Atoll Populations in the Marshall Islands

    SciTech Connect

    Bogen, K; Hamilton, T F; Brown, T A; Martinelli, R E; Marchetti, A A; Kehl, S R; Langston, R G

    2007-05-01

    We have developed refined statistical and modeling techniques to assess low-level uptake and urinary excretion of plutonium from different population groups in the northern Marshall Islands. Urinary excretion rates of plutonium from the resident population on Enewetak Atoll and from resettlement workers living on Rongelap Atoll range from <1 to 8 µBq per day and are well below action levels established under the latest Department of Energy regulation 10 CFR 835 in the United States for in vitro bioassay monitoring of ²³⁹Pu. However, our statistical analyses show that urinary excretion of plutonium-239 (²³⁹Pu) from both cohort groups is significantly positively associated with volunteer age, especially for the resident population living on Enewetak Atoll. Urinary excretion of ²³⁹Pu from the Enewetak cohort was also found to be positively associated with estimates of cumulative exposure to worldwide fallout. Consequently, the age-related trends in urinary excretion of plutonium from Marshallese populations can be described by either a long-term component from residual systemic burdens acquired from previous exposures to worldwide fallout, or a prompt (and eventual long-term) component acquired from low-level systemic intakes of plutonium associated with resettlement of the northern Marshall Islands, or some combination of both.

  10. A simple quantum statistical thermodynamics interpretation of an impressive phase diagram pressure shift upon (H/D) isotopic substitution in water + 3-methylpyridine.

    PubMed

    Visak, Zoran P; Szydlowski, Jerzy; Rebelo, Luís P N

    2006-01-26

    In a previous work (J. Phys. Chem. B 2003, 107, 9837), we reported liquid-liquid phase splitting at negative pressures in mixtures of H2O + D2O + 3-methylpyridine (3-MP) at the limit of pure H2O as the solvent, thus extending for the first time the L-L phase diagrams to this metastable region. We showed that there is an intimate relation between pressure and solvent deuterium content. Isotopic substitution (H/D) in water provokes subtle entropic effects that, in turn, trigger a significant pressure shift, opening a pressure-wide miscibility window of as much as 1600 bar. Isotope effects are quantum in origin. Therefore, a model that is both pressure-dependent and considers quantization constitutes a necessary tool if one wishes to fully describe the p, T, x critical demixing in these systems. In the current work, the statistical-mechanical theory of isotope effects is combined with a compressible pressure-dependent model. This combination enabled us to predict successfully the overall L-L phase diagram via differences in the vibrational mode frequencies of water on its transfer from the pure state to that of dilution in 3-MP: each of the three librational modes undergoes a calculated red-shift of -(250 ± 30) cm⁻¹, while the overall internal frequencies contribution is estimated as a total +(400 ± 25) cm⁻¹ blue-shift.

  11. Statistical treatment and preliminary interpretation of chemical data from a uranium deposit in the northeast part of the Church Rock area, Gallup mining district, New Mexico

    USGS Publications Warehouse

    Spirakis, C.S.; Pierson, C.T.; Santos, E.S.; Fishman, N.S.

    1983-01-01

    Statistical treatment of analytical data from 106 samples of uranium-mineralized and unmineralized or weakly mineralized rocks of the Morrison Formation from the northeastern part of the Church Rock area of the Grants uranium region indicates that along with uranium, the deposits in the northeast Church Rock area are enriched in barium, sulfur, sodium, vanadium and equivalent uranium. Selenium and molybdenum are sporadically enriched in the deposits and calcium, manganese, strontium, and yttrium are depleted. Unlike the primary deposits of the San Juan Basin, the deposits in the northeast part of the Church Rock area contain little organic carbon and several elements that are characteristically enriched in the primary deposits are not enriched or are enriched to a much lesser degree in the Church Rock deposits. The suite of elements associated with the deposits in the northeast part of the Church Rock area is also different from the suite of elements associated with the redistributed deposits in the Ambrosia Lake district. This suggests that the genesis of the Church Rock deposits is different, at least in part, from the genesis of the primary deposits of the San Juan Basin or the redistributed deposits at Ambrosia Lake.

  12. Palaeomagnetic analysis on pottery as indicator of the pyroclastic flow deposits temperature: new data and statistical interpretation from the Minoan eruption of Santorini, Greece

    NASA Astrophysics Data System (ADS)

    Tema, E.; Zanella, E.; Pavón-Carrasco, F. J.; Kondopoulou, D.; Pavlides, S.

    2015-10-01

    We present the results of palaeomagnetic analysis on Late Bronze Age pottery from Santorini, carried out in order to estimate the thermal effect of the Minoan eruption on the pre-Minoan habitation level. A total of 170 specimens from 108 ceramic fragments have been studied. The ceramics were collected from the surface of the pre-Minoan palaeosol at six different sites, including also samples from the Akrotiri archaeological site. The deposition temperatures of the first pyroclastic products have been estimated from the maximum overlap of the re-heating temperature intervals given by the individual fragments at site level. A new statistical elaboration of the temperature data has also been proposed, calculating the re-heating temperatures at each site at the 95 per cent probability level. The obtained results show that the precursor tephra layer and the first pumice fall of the eruption were hot enough to re-heat the underlying ceramics to temperatures of 160-230 °C at the non-inhabited sites, while the temperatures recorded inside the Akrotiri village are slightly lower, varying from 130 to 200 °C. The decrease of the temperatures registered in the human settlements suggests that there was some interaction between the buildings and the pumice fallout deposits, while the building debris layer caused by the preceding and syn-eruption earthquakes probably also contributed to the decrease of the recorded re-heating temperatures.
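
    The site-level estimate, the temperature window shared by the largest number of per-fragment re-heating intervals, amounts to a maximum-overlap computation. A toy sketch with invented intervals:

```python
# Toy sketch: find the temperature window covered by the most per-fragment
# re-heating intervals (a maximum-overlap sweep). All values are invented.
intervals = [(150, 240), (170, 260), (160, 230), (180, 250)]  # degrees C

events = []
for lo, hi in intervals:
    events.append((lo, +1))   # interval opens
    events.append((hi, -1))   # interval closes
events.sort()

depth = best_depth = 0
window_start = None
best_window = None
for t, step in events:
    depth += step
    if depth > best_depth:
        best_depth, window_start = depth, t
    elif best_window is None and step < 0 and depth == best_depth - 1:
        best_window = (window_start, t)   # first close after the peak ends the window

print(f"{best_depth}/{len(intervals)} fragments agree on {best_window} deg C")
```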

  13. Interpreting Idioms.

    ERIC Educational Resources Information Center

    Kemper, Susan; Estill, Robert

    A study investigated the immediate comprehension processes involved in the interpretation of English idiomatic expressions. Idioms such as "bury the hatchet" were presented to 48 college students in sentential contexts that either biased the subject toward a literal or a figurative interpretation or left the interpretation ambiguous. In control…

  14. Statistical Analysis and Interpretation of Building Characterization, Indoor Environmental Quality Monitoring and Energy Usage Data from Office Buildings and Classrooms in the United States

    SciTech Connect

    Linda Stetzenbach; Lauren Nemnich; Davor Novosel

    2009-08-31

    Three independent tasks were performed (Stetzenbach 2008, Stetzenbach 2008b, Stetzenbach 2009) to measure a variety of parameters in normative buildings across the United States. For each of these tasks, 10 buildings were selected as normative indoor environments. Task 1 focused on office buildings, Task 13 on public schools, and Task 0606 on high-performance buildings. To perform this task it was necessary to restructure the database for the Indoor Environmental Quality (IEQ) data and the sound measurements, as several issues were identified and resolved prior to and during the transfer of these data sets into SPSS. During overview discussions with the statistician engaged for this task, it was determined that, because the indoor zones (1-6) were selected independently within each task, zones were not related by location across tasks. Therefore, no comparison would be valid across zones for the 30 buildings, so the by-location (zone) data were limited to three analysis sets of the buildings within each task. In addition, different collection procedures for lighting were used in Task 0606, as compared to Tasks 01 and 13, to improve sample collection. Therefore, these data sets could not be merged and compared, so by-day effects were run separately for Task 0606 and only Task 01 and 13 data were merged. Results of the statistical analysis of the IEQ parameters show that statistically significant differences were found among days and zones for all tasks, although no differences were found by day for the Draft Rate data from Task 0606 (p>0.05). Thursday measurements of IEQ parameters were significantly different from Tuesday and most Wednesday measures for all variables of Tasks 1 and 13. Data for all three days appeared to vary for Operative Temperature, whereas only Tuesday and Thursday differed for Draft Rate 1m. Although no Draft Rate measures within Task 0606 were found to differ significantly by day, Temperature measurements for Tuesday and

  15. Landslides triggered by the 12 January 2010 Port-au-Prince, Haiti, Mw = 7.0 earthquake: visual interpretation, inventory compiling, and spatial distribution statistical analysis

    NASA Astrophysics Data System (ADS)

    Xu, C.; Shyu, J. B. H.; Xu, X.

    2014-07-01

    The 12 January 2010 Port-au-Prince, Haiti, earthquake (Mw = 7.0) triggered tens of thousands of landslides. The purpose of this study is to investigate the correlations of the occurrence of landslides and the thicknesses of their erosion with topographic, geologic, and seismic parameters. A total of 30 828 landslides triggered by the earthquake covered a total area of 15.736 km2, distributed over an area of more than 3000 km2, and the volume of landslide accumulation materials is estimated to be about 29 700 000 m3. These landslides are of various types, mostly shallow disrupted landslides and rock falls, but they also include coherent deep-seated landslides and rock slides. The landslides were delineated using pre- and post-earthquake high-resolution satellite images. Spatial distribution maps and contour maps of landslide number density, landslide area percentage, and landslide erosion thickness were constructed in order to analyze the spatial distribution patterns of co-seismic landslides. Statistical analyses of the size distribution and morphometric parameters of co-seismic landslides were carried out and compared with other earthquake events in the world. Four proxies of co-seismic landslide abundance, including landslide centroid number density (LCND), landslide top number density (LTND), landslide area percentage (LAP), and landslide erosion thickness (LET), were used to correlate co-seismic landslides with various environmental parameters. These parameters include elevation, slope angle, slope aspect, slope curvature, topographic position, distance from drainages, lithology, distance from the epicenter, distance from the Enriquillo-Plantain Garden fault, distance along the fault, and peak ground acceleration (PGA). A comparison of these impact parameters on co-seismic landslides shows that slope angle is the strongest impact parameter on co-seismic landslide occurrence. Our co-seismic landslide inventory is much more detailed than other inventories in several

  16. Landslides triggered by the 12 January 2010 Mw 7.0 Port-au-Prince, Haiti, earthquake: visual interpretation, inventory compiling and spatial distribution statistical analysis

    NASA Astrophysics Data System (ADS)

    Xu, C.; Shyu, J. B. H.; Xu, X.-W.

    2014-02-01

    The 12 January 2010 Port-au-Prince, Haiti, earthquake (Mw 7.0) triggered tens of thousands of landslides. The purpose of this study is to investigate the correlations of the occurrence of landslides and their erosion thicknesses with topographic factors, seismic parameters, and their distance from roads. A total of 30 828 landslides triggered by the earthquake covered a total area of 15.736 km2, distributed over an area of more than 3000 km2, and the volume of landslide accumulation materials is estimated to be about 29 700 000 m3. These landslides are of various types, mostly shallow disrupted landslides and rock falls, but they also include coherent deep-seated landslides and rock slides. The landslides were delineated using pre- and post-earthquake high-resolution satellite images. Spatial distribution maps and contour maps of landslide number density, landslide area percentage, and landslide erosion thickness were constructed in order to analyze the spatial distribution patterns of co-seismic landslides. Statistical analyses of the size distribution and morphometric parameters of co-seismic landslides were carried out and compared with other earthquake events in the world. Four proxies of co-seismic landslide abundance, including landslide centroid number density (LCND), landslide top number density (LTND), landslide area percentage (LAP), and landslide erosion thickness (LET), were used to correlate co-seismic landslides with various landslide controlling parameters. These controlling parameters include elevation, slope angle, slope aspect, slope curvature, topographic position, distance from drainages, lithology, distance from the epicenter, distance from the Enriquillo-Plantain Garden fault, distance along the fault, and peak ground acceleration (PGA). A comparison of these impact parameters on co-seismic landslides shows that slope angle is the strongest impact parameter on co-seismic landslide occurrence. Our co-seismic landslide inventory is much more
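
    Two of the four abundance proxies are simple ratios per map unit. A toy sketch of landslide area percentage (LAP) and centroid number density (LCND) for a single analysis cell, with invented numbers:

```python
# Toy sketch of two co-seismic landslide abundance proxies for one map cell.
# LAP = landslide area as a percentage of cell area; LCND = centroids per km^2.
cell_area_km2 = 1.0                                  # a 1 km x 1 km analysis cell
landslide_areas_m2 = [1200.0, 540.0, 8300.0, 75.0]   # invented polygon areas

lap = 100.0 * sum(landslide_areas_m2) / (cell_area_km2 * 1e6)
lcnd = len(landslide_areas_m2) / cell_area_km2       # one centroid per polygon

print(f"LAP = {lap:.2f} %, LCND = {lcnd:.0f} centroids/km^2")
```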

  17. Fractures network analysis and interpretation in carbonate rocks using a multi-criteria statistical approach. Case study of Jebal Chamsi and Jebal Belkhir, South-western part of Tunisia

    NASA Astrophysics Data System (ADS)

    Msaddek, Mohamed Haythem; Moumni, Yahya; Chenini, Ismail; Mercier, Eric; Dlala, Mahmoud

    2016-11-01

    The quantitative analysis of fractures in carbonate rocks across termination folds is important for understanding the distribution and arrangement of fracture networks. In this study, we performed a quantitative analysis and interpretation of fracture networks to identify the fracture network type. To this end, we used a multi-criteria statistical analysis. The distribution of directional families in all measured stations and their elemental distribution were examined first. We then analyzed the directional criteria for each pair and triplet of neighbouring stations. Finally, elemental analyses of fracture families crossing others were carried out. This methodology was applied to the folds of the Jebal Chamsi and Jebal Belkhir areas in south-western Tunisia, characterized by simple folds of carbonate geological formations. The application of the global and elemental statistical analysis criteria of directional families shows a random arrangement of fractures. However, the elemental analysis of two and three neighbouring stations for families crossing one another shows a pseudo-organization of fracture arrangements.

  18. Interpretive Experiments

    ERIC Educational Resources Information Center

    DeHaan, Frank, Ed.

    1977-01-01

    Describes an interpretative experiment involving the application of symmetry and temperature-dependent proton and fluorine nmr spectroscopy to the solution of structural and kinetic problems in coordination chemistry. (MLH)

  19. Interpreting Bones.

    ERIC Educational Resources Information Center

    Weymouth, Patricia P.

    1986-01-01

    Describes an activity which introduces students to the nature and challenges of paleoanthropology. In the exercise, students identify diagrammed bones and make interpretations about the creature. Presents questions and tasks employed in the lesson. (ML)

  20. Ambulatory blood pressure monitoring during pregnancy with a new, small, easily concealed monitor.

    PubMed

    Tape, T G; Rayburn, W F; Bremer, K D; Schnoor, T A

    1994-12-01

    Before establishing the utility of ambulatory blood pressure monitoring during pregnancy, we evaluated the accuracy of a small, easily concealed monitor. The 59 normotensive pregnant patients were between 13 and 26 gestational weeks. For each monitor reading, two trained observers independently and simultaneously recorded blood pressures using a mercury manometer connected to the monitor cuff. Seven readings in three positions (sitting upright, semirecumbent, standing) were performed on each patient. Averaged differences between the observers' and monitor readings varied from -2.2 to -0.9 mm Hg (systolic) and from -2.8 to -0.6 (fifth-phase diastolic), indicating slight but clinically unimportant overestimation by the monitor. Correlations between averaged observers' readings and the monitor ranged from 0.79 to 0.92 (systolic) and from 0.85 to 0.92 (fifth-phase diastolic). Overall, the observers agreed with the monitor within 5 mm Hg on 94% of systolic readings and 99% of fifth-phase diastolic readings. There was no statistically significant difference in accuracy with changes in body position. We conclude that this small, quiet, noninvasive device accurately determined blood pressures during pregnancy.

  1. Predictive model for delayed graft function based on easily available pre-renal transplant variables.

    PubMed

    Zaza, Gianluigi; Ferraro, Pietro Manuel; Tessari, Gianpaolo; Sandrini, Silvio; Scolari, Maria Piera; Capelli, Irene; Minetti, Enrico; Gesualdo, Loreto; Girolomoni, Giampiero; Gambaro, Giovanni; Lupo, Antonio; Boschiero, Luigino

    2015-03-01

    Identification of pre-transplant factors influencing delayed graft function (DGF) could have an important clinical impact. It would allow clinicians to identify, at an early stage, dialyzed chronic kidney disease (CKD) patients eligible for special transplant programs, preventive therapeutic strategies, and specific post-transplant immunosuppressive treatments. To achieve these objectives, we retrospectively analyzed the main demographic and clinical features, follow-up events, and outcomes registered in a large dedicated dataset of 2,755 patients compiled collaboratively by four Italian renal/transplant units. The years of transplant ranged from 1984 to 2012. Statistical analysis clearly demonstrated that some recipient characteristics at the time of transplantation (age and body weight) and dialysis-related variables (modality and duration) were significantly associated with DGF development (p ≤ 0.001). The area under the receiver-operating characteristic (ROC) curve of the final model based on the four identified variables predicting DGF was 0.63 (95% CI 0.61, 0.65). Additionally, deciles of the score were significantly associated with the incidence of DGF (p value for trend <0.001). In conclusion, we identified a pre-operative predictive model for DGF, based on inexpensive and easily available variables, that is potentially useful in routine clinical practice in most Italian and European dialysis units.
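
    A minimal sketch of the kind of model the abstract describes: four pre-transplant predictors, a logistic regression, and a ROC AUC. All data below are simulated; the variable names follow the abstract, and the coefficients are invented.

```python
# Minimal sketch: four pre-transplant variables -> DGF risk model -> ROC AUC.
# Synthetic data only; the published model (AUC 0.63) was fit on real records.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(42)
n = 2755
X = np.column_stack([
    rng.normal(55, 12, n),     # recipient age (years)
    rng.normal(72, 14, n),     # body weight (kg)
    rng.integers(0, 2, n),     # dialysis modality (0/1)
    rng.normal(40, 25, n),     # dialysis duration (months)
])
# Outcome loosely tied to the predictors (invented coefficients)
logit = -4 + 0.02 * X[:, 0] + 0.01 * X[:, 1] + 0.3 * X[:, 2] + 0.01 * X[:, 3]
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

model = LogisticRegression(max_iter=1000).fit(X, y)
auc = roc_auc_score(y, model.predict_proba(X)[:, 1])
print(f"ROC AUC: {auc:.2f}")
```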

  2. Interpreting Association from Graphical Displays

    ERIC Educational Resources Information Center

    Fitzallen, Noleine

    2016-01-01

    Research that has explored students' interpretations of graphical representations has not extended to include how students apply understanding of particular statistical concepts related to one graphical representation to interpret different representations. This paper reports on the way in which students' understanding of covariation, evidenced…

  3. Interpreting Evidence.

    ERIC Educational Resources Information Center

    Munsart, Craig A.

    1993-01-01

    Presents an activity that allows students to experience the type of discovery process that paleontologists necessarily followed during the early dinosaur explorations. Students are read parts of a story taken from the "American Journal of Science" and interpret the evidence leading to the discovery of Triceratops and Stegosaurus. (PR)

  4. Performing Interpretation

    ERIC Educational Resources Information Center

    Kothe, Elsa Lenz; Berard, Marie-France

    2013-01-01

    Utilizing a/r/tographic methodology to interrogate interpretive acts in museums, multiple areas of inquiry are raised in this paper, including: which knowledge is assigned the greatest value when preparing a gallery talk; what lies outside of disciplinary knowledge; how invitations to participate invite and disinvite in the same gesture; and what…

  5. Interpreting Metonymy.

    ERIC Educational Resources Information Center

    Pankhurst, Anne

    1994-01-01

    This paper examines some of the problems associated with interpreting metonymy, a figure of speech in which an attribute or commonly associated feature is used to name or designate something. After defining metonymy and outlining the principles of metonymy, the paper explains the differences between metonymy, synecdoche, and metaphor. It is…

  6. SLAR image interpretation keys for geographic analysis

    NASA Technical Reports Server (NTRS)

    Coiner, J. C.

    1972-01-01

    Interpretation keys are suggested as an easily implemented interpretation model through which side-looking airborne radar (SLAR) imagery could become a more widely used data source in geoscience and agriculture. Interpretation problems faced by researchers wishing to employ SLAR are specifically described, and the use of various types of image interpretation keys to overcome these problems is suggested. With examples drawn from agriculture and vegetation mapping, direct and associative dichotomous image interpretation keys are discussed and methods of constructing keys are outlined. Initial testing of the keys, key-based automated decision rules, and the role of the keys in an information system for agriculture are developed.

  7. Interpretive Medicine

    PubMed Central

    Reeve, Joanne

    2010-01-01

    Patient-centredness is a core value of general practice; it is defined as the interpersonal processes that support the holistic care of individuals. To date, efforts to demonstrate their relationship to patient outcomes have been disappointing, whilst some studies suggest values may be more rhetoric than reality. Contextual issues influence the quality of patient-centred consultations, impacting on outcomes. The legitimate use of knowledge, or evidence, is a defining aspect of modern practice, and has implications for patient-centredness. Based on a critical review of the literature, on my own empirical research, and on reflections from my clinical practice, I critique current models of the use of knowledge in supporting individualised care. Evidence-Based Medicine (EBM), and its implementation within health policy as Scientific Bureaucratic Medicine (SBM), define best evidence in terms of an epistemological emphasis on scientific knowledge over clinical experience. It provides objective knowledge of disease, including quantitative estimates of the certainty of that knowledge. Whilst arguably appropriate for secondary care, involving episodic care of selected populations referred in for specialist diagnosis and treatment of disease, application to general practice can be questioned given the complex, dynamic and uncertain nature of much of the illness that is treated. I propose that general practice is better described by a model of Interpretive Medicine (IM): the critical, thoughtful, professional use of an appropriate range of knowledges in the dynamic, shared exploration and interpretation of individual illness experience, in order to support the creative capacity of individuals in maintaining their daily lives. Whilst the generation of interpreted knowledge is an essential part of daily general practice, the profession does not have an adequate framework by which this activity can be externally judged to have been done well. Drawing on theory related to the

  8. Statistical Inference: The Big Picture.

    PubMed

    Kass, Robert E

    2011-02-01

    Statistics has moved beyond the frequentist-Bayesian controversies of the past. Where does this leave our ability to interpret results? I suggest that a philosophy compatible with statistical practice, labelled here statistical pragmatism, serves as a foundation for inference. Statistical pragmatism is inclusive and emphasizes the assumptions that connect statistical models with observed data. I argue that introductory courses often mis-characterize the process of statistical inference and I propose an alternative "big picture" depiction.

  9. Inventory Control: An Inexpensive and Easily Constructed Device for Quantitative Conductivity Experiments.

    ERIC Educational Resources Information Center

    Rettich, Timothy R.; Battino, Rubin

    1989-01-01

    Presents a low cost system with easily replaced electrodes for use in general chemistry. Notes the accuracy and wide applicability permit easy use in physical or quantitative chemistry experiments. Provides schematic, theory, and helpful suggestions. (MVL)

  10. An Easily Constructed Model of Twin Octahedrons Having a Common Line.

    ERIC Educational Resources Information Center

    Yamana, Shukichi; Kawaguchi, Makoto

    1984-01-01

    A model of twin octahedrons having a common line which is useful for teaching stereochemistry (especially that of complex ions) can be made easily by using a sealed, empty envelope. The steps necessary to accomplish this task are presented. (JN)

  11. LED champing: statistically blessed?

    PubMed

    Wang, Zhuo

    2015-06-10

    LED champing (smart mixing of individual LEDs to match the desired color and lumens) and color mixing strategies have been widely used to maintain the color consistency of light engines. Light engines with champed LEDs can easily achieve color consistency within a couple of MacAdam steps, even starting from widely distributed LEDs. From a statistical point of view, the distributions of the color coordinates and the flux after champing are studied. The related statistical parameters are derived, which facilitate process improvements such as Six Sigma and are instrumental to statistical quality control for mass production.
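
    The distributional question, what the mixed chromaticity looks like after champing, can be illustrated with a small Monte Carlo sketch. The Gaussian spread of the component LEDs and the tail-pairing strategy below are assumptions for illustration, not the paper's method.

```python
# Monte Carlo sketch of champing: pair LEDs from opposite tails of the
# distribution so each pair's mixed chromaticity lands near the target.
# Component distribution parameters are invented for illustration.
import numpy as np

rng = np.random.default_rng(7)
n = 10_000
x = rng.normal(0.445, 0.004, n)      # chromaticity x of individual LEDs

x_sorted = np.sort(x)
pair_mean = (x_sorted[: n // 2] + x_sorted[::-1][: n // 2]) / 2  # low paired with high

print(f"single LED:   mean {x.mean():.4f}, std {x.std():.5f}")
print(f"champed pair: mean {pair_mean.mean():.4f}, std {pair_mean.std():.5f}")
```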

  12. Statistical Neurodynamics.

    NASA Astrophysics Data System (ADS)

    Paine, Gregory Harold

    1982-03-01

    The primary objective of the thesis is to explore the dynamical properties of small nerve networks by means of the methods of statistical mechanics. To this end, a general formalism is developed and applied to elementary groupings of model neurons which are driven by either constant (steady state) or nonconstant (nonsteady state) forces. Neuronal models described by a system of coupled, nonlinear, first-order, ordinary differential equations are considered. A linearized form of the neuronal equations is studied in detail. A Lagrange function corresponding to the linear neural network is constructed which, through a Legendre transformation, provides a constant of motion. By invoking the Maximum-Entropy Principle with the single integral of motion as a constraint, a probability distribution function for the network in a steady state can be obtained. The formalism is implemented for some simple networks driven by a constant force; accordingly, the analysis focuses on a study of fluctuations about the steady state. In particular, a network composed of N noninteracting neurons, termed Free Thinkers, is considered in detail, with a view to interpretation and numerical estimation of the Lagrange multiplier corresponding to the constant of motion. As an archetypical example of a net of interacting neurons, the classical neural oscillator, consisting of two mutually inhibitory neurons, is investigated. It is further shown that in the case of a network driven by a nonconstant force, the Maximum-Entropy Principle can be applied to determine a probability distribution functional describing the network in a nonsteady state. The above examples are reconsidered with nonconstant driving forces which produce small deviations from the steady state. Numerical studies are performed on simplified models of two physical systems: the starfish central nervous system and the mammalian olfactory bulb. Discussions are given as to how statistical neurodynamics can be used to gain a better

  13. The Validity of Seven Easily Obtainable Economic and Demographic Predictors of Achievement Test Performance.

    ERIC Educational Resources Information Center

    May, Robert J., Jr.; And Others

    1978-01-01

    Seven easily obtainable background variables, such as number of persons, rooms, or cars per family dwelling; kindergarten attendance; and sex were found to have a multiple correlation of .52 with a standard achievement test for a large sample of fourth grade pupils in a metropolitan school district. (JKS)

  14. Invention Activities Support Statistical Reasoning

    ERIC Educational Resources Information Center

    Smith, Carmen Petrick; Kenlan, Kris

    2016-01-01

    Students' experiences with statistics and data analysis in middle school are often limited to little more than making and interpreting graphs. Although students may develop fluency in statistical procedures and vocabulary, they frequently lack the skills necessary to apply statistical reasoning in situations other than clear-cut textbook examples.…

  15. Automatic interpretation of digital maps

    NASA Astrophysics Data System (ADS)

    Walter, Volker; Luo, Fen

    In the past, the availability and/or the acquisition of spatial data were often the main problems of the realization of spatial applications. Meanwhile this situation has changed: on one hand, comprehensive spatial datasets already exist and on the other hand, new sensor technologies have the ability to capture fast and with high quality large amounts of spatial data. More and more responsible for the increasing accessibility of spatial data are also collaborative mapping techniques which enable users to create maps by themselves and to make them available in the internet. However, the potential of this diversity of spatial data can only hardly be utilized. Especially maps in the internet are represented very often only with graphical elements and no explicit information about the map's scale, extension and content is available. Nevertheless, humans are able to extract this information and to interpret maps. For example, it is possible for a human to distinguish between rural and industrial areas only by looking at the objects' geometries. Furthermore, a human can easily identify and group map objects that belong together. Also the type, scale and extension of a map can be identified under certain conditions only by looking at the objects' geometries. All these examples can be subsumed under the term "map interpretation". In this paper it is discussed how map interpretation can be automated and how automatic map interpretation can be used in order to support other processes. The different kinds of automatic map interpretation are discussed and two approaches are shown in detail.

  16. Semantic Interpretation of An Artificial Neural Network

    DTIC Science & Technology

    1995-12-01

    success for stock market analysis/prediction is artificial neural networks. However, knowledge embedded in the neural network is not easily translated...interpret neural network knowledge. The first, called Knowledge Math, extends the use of connection weights, generating rules for general (i.e. non-binary

  17. Revisiting the statistical analysis of pyroclast density and porosity data

    NASA Astrophysics Data System (ADS)

    Bernard, B.; Kueppers, U.; Ortiz, H.

    2015-07-01

    Explosive volcanic eruptions are commonly characterized based on a thorough analysis of the generated deposits. Amongst other characteristics in physical volcanology, the density and porosity of juvenile clasts are some of the parameters most frequently used to constrain eruptive dynamics. In this study, we evaluate the sensitivity of density and porosity data to statistical methods and introduce a weighting parameter to correct issues raised by the use of frequency analysis. Results of textural investigation can be biased by clast selection. Using the statistical tools presented here, the meaningfulness of a conclusion can easily be checked for any data set. This is necessary to define whether or not a sample has met the requirements for statistical relevance, i.e., whether a data set is large enough to allow for reproducible results. Graphical statistics, similar to those used for grain-size analysis, are used to describe density and porosity distributions. This approach helps with the interpretation of volcanic deposits. To illustrate this methodology, we chose two large data sets: (1) directed blast deposits of the 3640-3510 BC eruption of Chachimbiro volcano (Ecuador) and (2) block-and-ash-flow deposits of the 1990-1995 eruption of Unzen volcano (Japan). We propose incorporating this analysis into future investigations to check the objectivity of results achieved by different working groups and to guarantee the meaningfulness of the interpretation.
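
    The graphical statistics proposed for density distributions, weighted analogues of the percentile measures used in grain-size analysis, can be sketched as a weighted-percentile computation; the densities and weights below are invented for illustration.

```python
# Sketch: weighted graphical statistics (16th/50th/84th percentiles) for a
# pyroclast density sample, analogous to grain-size analysis. Values invented.
import numpy as np

densities = np.array([800., 950., 1020., 1100., 1250., 1400.])  # kg/m^3
weights   = np.array([ 2.,   5.,    8.,    6.,    3.,    1. ])  # e.g. clast counts

def weighted_percentile(values, weights, q):
    order = np.argsort(values)
    v, w = values[order], weights[order]
    cum = np.cumsum(w) - 0.5 * w                 # midpoint cumulative weights
    return float(np.interp(q / 100.0 * w.sum(), cum, v))

for q in (16, 50, 84):
    print(f"P{q}: {weighted_percentile(densities, weights, q):.0f} kg/m^3")
```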

  18. Statistical Reform in School Psychology Research: A Synthesis

    ERIC Educational Resources Information Center

    Swaminathan, Hariharan; Rogers, H. Jane

    2007-01-01

    Statistical reform in school psychology research is discussed in terms of research designs, measurement issues, statistical modeling and analysis procedures, interpretation and reporting of statistical results, and finally statistics education.

  19. Synthesis, Characterization, to application of water soluble and easily removable cationic pressure sensitive adhesives

    SciTech Connect

    Institute of Paper Science Technology

    2004-01-30

    In recent years, the world has expressed an increasing interest in the recycling of waste paper to supplement the use of virgin fiber as a way to protect the environment. Statistics show that major countries are increasing their use of recycled paper. For example, from 1991 to 1996, the U.S. increased its recovered paper utilization rate from 31% to 39%, Germany went from 50% to 60%, the UK went from 60% to 70%, France increased from 46% to 49%, and China went from 32% to 35% [1]. As recycled fiber levels and water system closures both increase, recycled product quality will need to improve in order for recycled products to compete with products made from virgin fiber [2]. The use of recycled fiber has introduced an increasing level of metal, plastic, and adhesive contamination into the papermaking process, which has added to the complexity of the already overwhelming task of providing a uniform and clean recycled furnish. The most harmful of these contaminants is a mixture of adhesives and polymeric substances that are commonly known as stickies. Stickies, which enter the mill with the pulp furnish, are not easily removed from the repulper and become more difficult to remove the further down the system they get. This can be detrimental to final product quality. Stickies are hydrophobic, tacky, polymeric materials that are introduced into the papermaking system from a mixture of recycled fiber sources. The properties of stickies are very similar to those of the fibers used in papermaking, viz. size, density, hydrophobicity, and electrokinetic charge. This reduces the probability of their removal by conventional separation processes, such as screening and cleaning, which are based on such properties. Also, their physical and chemical structure allows them to extrude through screens and attach to fibers, process equipment, wires, and felts. Stickies can break down and then reagglomerate and appear at seemingly any place in the mill. When subjected to a number of factors including changes

  20. Summary and interpretive synthesis

    SciTech Connect

    1995-05-01

    This chapter summarizes the major advances made through our integrated geological studies of the Lisburne Group in northern Alaska. The depositional history of the Lisburne Group is discussed in a framework of depositional sequence stratigraphy. Although individual parasequences (small-scale carbonate cycles) of the Wahoo Limestone cannot be correlated with certainty, parasequence sets can be interpreted as different systems tracts within the large-scale depositional sequences, providing insights on the paleoenvironments, paleogeography and platform geometry. Conodont biostratigraphy precisely established the position of the Mississippian-Pennsylvanian boundary within an important reference section, where established foraminiferal biostratigraphy is inconsistent with respect to conodont-based time-rock boundaries. However, existing Carboniferous conodont zonations are not readily applicable because most zonal indicators are absent, so a local zonation scheme was developed. Diagenetic studies of the Lisburne Group recognized nineteen subaerial exposure surfaces and developed a cement stratigraphy that includes: early cements associated with subaerial exposure surfaces in the Lisburne Group; cements associated with the sub-Permian unconformity; and later burial cements. Subaerial exposure surfaces in the Alapah Limestone are easily explained, being associated with peritidal environments at the boundaries of Sequence A. The Lisburne exposed in ANWR is generally tightly cemented and supermature, but could still be a good reservoir target in the adjacent subsurface of ANWR given the appropriate diagenetic, deformational and thermal history. Our ongoing research on the Lisburne Group will hopefully provide additional insights in future publications.

  1. Hypothesis Formulation, Model Interpretation, and Model Equivalence: Implications of a Mereological Causal Interpretation of Structural Equation Models

    ERIC Educational Resources Information Center

    Markus, Keith A.

    2008-01-01

    One can distinguish statistical models used in causal modeling from the causal interpretations that align them with substantive hypotheses. Causal modeling typically assumes an efficient causal interpretation of the statistical model. Causal modeling can also make use of mereological causal interpretations in which the state of the parts…

  2. Spider phobics more easily see a spider in morphed schematic pictures

    PubMed Central

    Kolassa, Iris-Tatjana; Buchmann, Arlette; Lauche, Romy; Kolassa, Stephan; Partchev, Ivailo; Miltner, Wolfgang HR; Musial, Frauke

    2007-01-01

    Background: Individuals with social phobia are more likely to misinterpret ambiguous social situations as more threatening, i.e. they show an interpretive bias. This study investigated whether such a bias also exists in specific phobia.
    Methods: Individuals with spider phobia or social phobia, spider aficionados, and non-phobic controls saw morphed stimuli that gradually transformed from a schematic picture of a flower into a schematic picture of a spider by shifting the outlines of the petals until they turned into spider legs. Participants' task was to decide whether each stimulus was more similar to a spider, a flower, or to neither object while EEG was recorded.
    Results: An interpretive bias was found in spider phobia on the behavioral level: with the first opening of the petals of the flower anchor, spider phobics rated the stimuli as more unpleasant and arousing than the control groups and showed an elevated latent trait to classify a stimulus as a spider and a response-time advantage for spider-like stimuli. No cortical ERP correlates of this interpretive bias could be identified. However, consistent with previous studies, social and spider phobic persons exhibited generally enhanced visual P1 amplitudes, indicative of hypervigilance in phobia.
    Conclusion: Results suggest an interpretive bias and generalization of phobia-specific responses in specific phobia. Similar effects have been observed in other anxiety disorders, such as social phobia and posttraumatic stress disorder. PMID:18021433

  3. Morphological Characterization of a New and Easily Recognizable Nuclear Male Sterile Mutant of Sorghum (Sorghum bicolor)

    PubMed Central

    Xin, Zhanguo; Huang, Jian; Smith, Ashley R.; Chen, Junping; Burke, John; Sattler, Scott E.

    2017-01-01

    Sorghum (Sorghum bicolor L. Moench) is one of the most important grain crops in the world. The nuclear male sterility (NMS) trait, which is caused by mutations in nuclear genes, is valuable for hybrid breeding and genetic studies. Several NMS mutants have been reported previously, but none of them were well characterized. Here, we present our detailed morphological characterization of a new and easily recognizable NMS sorghum mutant, male sterile 8 (ms8), isolated from the elite inbred BTx623 mutagenized by ethyl methane sulfonate (EMS). Our results show that the ms8 mutant phenotype was caused by a mutation in a single recessive nuclear gene that is different from all available NMS loci reported in sorghum. In fertile sorghum plants, yellow anthers appeared first during anthesis, while in the ms8 mutant, white hairy stigmas emerged first and only small white anthers were observed, making ms8 plants easily recognizable when flowering. Ovary development and seed production after manual pollination are normal in the ms8 mutant, indicating it is female fertile and male sterile only. We found that ms8 anthers did not produce pollen grains. Further analysis revealed that ms8 anthers were defective in tapetum development, which led to the arrest of pollen formation. As a stable male sterile mutant across different environments, greenhouses, and fields in different locations, the ms8 mutant could be a useful breeding tool. Moreover, ms8 might be an important tool for elucidating male gametophyte development in sorghum and other plants. PMID:28052078

  4. Descriptive statistics.

    PubMed

    Shi, Runhua; McLarty, Jerry W

    2009-10-01

    In this article, we introduced basic concepts of statistics, types of distributions, and descriptive statistics. A few examples were also provided. The basic concepts presented herein are only a fraction of the concepts related to descriptive statistics. Also, there are many commonly used distributions not presented herein, such as the Poisson distribution for rare events, and the exponential, F, and logistic distributions. More information can be found in many statistics books and publications.

  5. Statistical Software.

    ERIC Educational Resources Information Center

    Callamaras, Peter

    1983-01-01

    This buyer's guide to seven major types of statistics software packages for microcomputers reviews Edu-Ware Statistics 3.0; Financial Planning; Speed Stat; Statistics with DAISY; Human Systems Dynamics package of Stats Plus, ANOVA II, and REGRESS II; Maxistat; and Moore-Barnes' MBC Test Construction and MBC Correlation. (MBR)

  6. Statistical Diversions

    ERIC Educational Resources Information Center

    Petocz, Peter; Sowey, Eric

    2008-01-01

    As a branch of knowledge, Statistics is ubiquitous and its applications can be found in (almost) every field of human endeavour. In this article, the authors track down the possible source of the link between the "Siren song" and applications of Statistics. Answers to their previous five questions and five new questions on Statistics are presented.

  7. Interpreting correlations in biosequences

    NASA Astrophysics Data System (ADS)

    Herzel, H.; Trifonov, E. N.; Weiss, O.; Große, I.

    Understanding the complex organization of genomes as well as predicting the location of genes and the possible structure of the gene products are some of the most important problems in current molecular biology. Many statistical techniques are used to address these issues; correlation functions play a central role among them. This paper is based on an analysis of the decay of the entire 4×4-dimensional covariance matrix of DNA sequences. We apply this covariance analysis to human chromosomal regions, yeast DNA, and bacterial genomes and interpret the three most pronounced statistical features - long-range correlations, a period of 3, and a period of 10-11 basepairs - using known biological facts about the structure of genomes. For example, we relate the slowly decaying long-range G+C correlations to dispersed repeats and CpG islands. We show quantitatively that the 3-basepair periodicity is due to the nonuniformity of codon usage in protein-coding segments. We finally show that periodicities of 10-11 basepairs in yeast DNA originate from an alternation of hydrophobic and hydrophilic amino acids in protein sequences.
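
    As an illustration of the kind of correlation analysis described above, the short Python sketch below estimates one entry of the lag-k base covariance matrix, C(a,b; k) = P(s_i = a, s_{i+k} = b) - P(a)P(b), on a synthetic sequence with an artificial period-3 bias. The sequence and the bias are invented stand-ins, not the genomic data the authors analyzed:

        from collections import Counter
        import random

        def covariance_matrix(seq, k, alphabet="ACGT"):
            # Lag-k covariance: C[(a, b)] = P(s_i = a, s_{i+k} = b) - P(a) * P(b)
            n = len(seq) - k
            pairs = Counter((seq[i], seq[i + k]) for i in range(n))
            freq = {a: seq.count(a) / len(seq) for a in alphabet}
            return {(a, b): pairs[(a, b)] / n - freq[a] * freq[b]
                    for a in alphabet for b in alphabet}

        random.seed(0)
        # Synthetic stand-in: G-rich at every third position (a fake period-3 signal)
        seq = "".join(random.choice("GGGA" if i % 3 == 0 else "ACGT")
                      for i in range(30000))
        for k in (1, 2, 3, 6):
            print(k, round(covariance_matrix(seq, k)[("G", "G")], 4))

    On this toy input the G-G covariance is near zero at lags 1 and 2 but clearly positive at lags 3 and 6, the same signature the authors trace to codon-usage nonuniformity in real coding DNA.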

  8. A 2D zinc-organic network being easily exfoliated into isolated sheets

    NASA Astrophysics Data System (ADS)

    Yu, Guihong; Li, Ruiqing; Leng, Zhihua; Gan, Shucai

    2016-08-01

    A metal-organic aggregate, namely {Zn2Cl2(BBC)}n (BBC = 4,4‧,4‧‧-(benzene-1,3,5-triyl-tris(benzene-4,1-diyl))tribenzoate), was obtained by solvothermal synthesis. Its structure features Zn2(COO)3 paddle-wheels with two chloride anions in axial positions and hexagonal pores in the layers. The exclusion of water from the precursor and the solvent plays a crucial role in the formation of the target compound. This compound can be easily dissolved in alkaline solution and exfoliated into isolated sheets, which suggests a novel route for the preparation of 2D materials.

  9. Solving block linear systems with low-rank off-diagonal blocks is easily parallelizable

    SciTech Connect

    Menkov, V.

    1996-12-31

    An easily and efficiently parallelizable direct method is given for solving a block linear system Bx = y, where B = D + Q is the sum of a non-singular block diagonal matrix D and a matrix Q with low-rank blocks. This implicitly defines a new preconditioning method with an operation count close to the cost of calculating a matrix-vector product Qw for some w, plus at most twice that cost. When implemented on a parallel machine, the processor utilization can be as good as that of those operations. Order estimates are given for the general case, and an implementation is compared to block SSOR preconditioning.
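
    The abstract does not spell out the solver, but the structure B = D + Q with low-rank Q is the classical setting of the Woodbury identity. The Python sketch below illustrates that standard identity (not necessarily Menkov's exact algorithm), assuming Q is available in factored form Q = U V^T; the repeated solves with the block diagonal D are the part that parallelizes naturally across blocks:

        import numpy as np

        def solve_low_rank_update(D, U, V, y):
            # Solve (D + U V^T) x = y via the Woodbury identity:
            # x = D^{-1}y - D^{-1}U (I + V^T D^{-1} U)^{-1} V^T D^{-1} y
            Dinv_y = np.linalg.solve(D, y)      # block solves, parallel across blocks
            Dinv_U = np.linalg.solve(D, U)
            r = U.shape[1]
            core = np.eye(r) + V.T @ Dinv_U     # small r x r system, r << n
            return Dinv_y - Dinv_U @ np.linalg.solve(core, V.T @ Dinv_y)

        # Check against a dense solve on a random instance
        rng = np.random.default_rng(1)
        n, r = 200, 4
        D = np.diag(rng.uniform(1, 2, n))       # stand-in for a block diagonal D
        U, V = rng.standard_normal((n, r)), rng.standard_normal((n, r))
        y = rng.standard_normal(n)
        x = solve_low_rank_update(D, U, V, y)
        print(np.allclose((D + U @ V.T) @ x, y))   # True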

  10. Easily processable multimodal spectral converters based on metal oxide/organic-inorganic hybrid nanocomposites.

    PubMed

    Julián-López, Beatriz; Gonell, Francisco; Lima, Patricia P; Freitas, Vânia T; André, Paulo S; Carlos, Luis D; Ferreira, Rute A S

    2015-10-09

    This manuscript reports the synthesis and characterization of the first organic-inorganic hybrid material exhibiting efficient multimodal spectral converting properties. The nanocomposite, made of Er(3+), Yb(3+) codoped zirconia nanoparticles (NPs) entrapped in a di-ureasil d-U(600) hybrid matrix, is prepared by an easy two-step sol-gel synthesis leading to homogeneous and transparent materials that can be very easily processed as monolith or film. Extensive structural characterization reveals that zirconia nanocrystals of 10-20 nm in size are efficiently dispersed into the hybrid matrix and that the local structure of the di-ureasil is not affected by the presence of the NPs. A significant enhancement in the refractive index of the di-ureasil matrix with the incorporation of the ZrO2 nanocrystals is observed. The optical study demonstrates that luminescent properties of both constituents are perfectly preserved in the final hybrid. Thus, the material displays a white-light photoluminescence from the di-ureasil component upon excitation at UV/visible radiation and also intense green and red emissions from the Er(3+)- and Yb(3+)-doped NPs after NIR excitation. The dynamics of the optical processes were also studied as a function of the lanthanide content and the thickness of the films. Our results indicate that these luminescent hybrids represent a low-cost, environmentally friendly, size-controlled, easily processed and chemically stable alternative material to be used in light harvesting devices such as luminescent solar concentrators, optical fibres and sensors. Furthermore, this synthetic approach can be extended to a wide variety of luminescent NPs entrapped in hybrid matrices, thus leading to multifunctional and versatile materials for efficient tuneable nonlinear optical nanodevices.

  11. Easily processable multimodal spectral converters based on metal oxide/organic—inorganic hybrid nanocomposites

    NASA Astrophysics Data System (ADS)

    Julián-López, Beatriz; Gonell, Francisco; Lima, Patricia P.; Freitas, Vânia T.; André, Paulo S.; Carlos, Luis D.; Ferreira, Rute A. S.

    2015-10-01

    This manuscript reports the synthesis and characterization of the first organic-inorganic hybrid material exhibiting efficient multimodal spectral converting properties. The nanocomposite, made of Er3+, Yb3+ codoped zirconia nanoparticles (NPs) entrapped in a di-ureasil d-U(600) hybrid matrix, is prepared by an easy two-step sol-gel synthesis leading to homogeneous and transparent materials that can be very easily processed as monolith or film. Extensive structural characterization reveals that zirconia nanocrystals of 10-20 nm in size are efficiently dispersed into the hybrid matrix and that the local structure of the di-ureasil is not affected by the presence of the NPs. A significant enhancement in the refractive index of the di-ureasil matrix with the incorporation of the ZrO2 nanocrystals is observed. The optical study demonstrates that luminescent properties of both constituents are perfectly preserved in the final hybrid. Thus, the material displays a white-light photoluminescence from the di-ureasil component upon excitation at UV/visible radiation and also intense green and red emissions from the Er3+- and Yb3+-doped NPs after NIR excitation. The dynamics of the optical processes were also studied as a function of the lanthanide content and the thickness of the films. Our results indicate that these luminescent hybrids represent a low-cost, environmentally friendly, size-controlled, easily processed and chemically stable alternative material to be used in light harvesting devices such as luminescent solar concentrators, optical fibres and sensors. Furthermore, this synthetic approach can be extended to a wide variety of luminescent NPs entrapped in hybrid matrices, thus leading to multifunctional and versatile materials for efficient tuneable nonlinear optical nanodevices.

  12. SOCR: Statistics Online Computational Resource

    ERIC Educational Resources Information Center

    Dinov, Ivo D.

    2006-01-01

    The need for hands-on computer laboratory experience in undergraduate and graduate statistics education has been firmly established in the past decade. As a result a number of attempts have been undertaken to develop novel approaches for problem-driven statistical thinking, data analysis and result interpretation. In this paper we describe an…

  13. Revisiting the statistical analysis of pyroclast density and porosity data

    NASA Astrophysics Data System (ADS)

    Bernard, B.; Kueppers, U.; Ortiz, H.

    2015-03-01

    Explosive volcanic eruptions are commonly characterized based on a thorough analysis of the generated deposits. Amongst other characteristics in physical volcanology, density and porosity of juvenile clasts are some of the most frequently used measurements to constrain eruptive dynamics. In this study, we evaluate the sensitivity of density and porosity data and introduce a weighting parameter to correct issues raised by the use of frequency analysis. Results of textural investigation can be biased by clast selection. Using the statistical tools presented here, the meaningfulness of a conclusion can easily be checked for any dataset. This is necessary to define whether or not a sample has met the requirements for statistical relevance, i.e. whether a dataset is large enough to allow for reproducible results. Graphical statistics are used to describe density and porosity distributions, similar to those used for grain-size analysis. This approach helps with the interpretation of volcanic deposits. To illustrate this methodology we chose two large datasets: (1) directed blast deposits of the 3640-3510 BC eruption of Chachimbiro volcano (Ecuador) and (2) block-and-ash-flow deposits of the 1990-1995 eruption of Unzen volcano (Japan). We propose the use of this analysis in future investigations to check the objectivity of results achieved by different working groups and to guarantee the meaningfulness of the interpretation.
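
    The paper's specific weighting parameter and datasets are not reproduced here, but the flavor of weighted graphical statistics on clast densities is easy to sketch. The following Python fragment, using invented density values and weights, computes a weighted median and an Inman-style graphical sorting measure, (P84 - P16)/2, analogous to the grain-size statistics mentioned above:

        import numpy as np

        def weighted_percentile(values, weights, q):
            # q-th percentile (0-100) of `values` under sample `weights`,
            # using the cumulative-weight midpoint rule
            order = np.argsort(values)
            v, w = np.asarray(values)[order], np.asarray(weights)[order]
            cum = np.cumsum(w) - 0.5 * w
            return np.interp(q / 100 * np.sum(w), cum, v)

        # Hypothetical clast densities (kg/m^3), weighted e.g. by clast mass
        rho = np.array([900, 1100, 1300, 1500, 1700, 2000, 2400])
        wts = np.array([1.0, 2.0, 3.0, 3.0, 2.0, 1.5, 0.5])
        med = weighted_percentile(rho, wts, 50)
        sort = (weighted_percentile(rho, wts, 84) - weighted_percentile(rho, wts, 16)) / 2
        print(f"weighted median = {med:.0f} kg/m3, graphical sorting = {sort:.0f} kg/m3")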

  14. Statistics 101 for Radiologists.

    PubMed

    Anvari, Arash; Halpern, Elkan F; Samir, Anthony E

    2015-10-01

    Diagnostic tests have wide clinical applications, including screening, diagnosis, measuring treatment effect, and determining prognosis. Interpreting diagnostic test results requires an understanding of key statistical concepts used to evaluate test efficacy. This review explains descriptive statistics and discusses probability, including mutually exclusive and independent events and conditional probability. In the inferential statistics section, a statistical perspective on study design is provided, together with an explanation of how to select appropriate statistical tests. Key concepts in recruiting study samples are discussed, including representativeness and random sampling. Variable types are defined, including predictor, outcome, and covariate variables, and the relationship of these variables to one another. In the hypothesis testing section, we explain how to determine if observed differences between groups are likely to be due to chance. We explain type I and II errors, statistical significance, and study power, followed by an explanation of effect sizes and how confidence intervals can be used to generalize observed effect sizes to the larger population. Statistical tests are explained in four categories: t tests and analysis of variance, proportion analysis tests, nonparametric tests, and regression techniques. We discuss sensitivity, specificity, accuracy, receiver operating characteristic analysis, and likelihood ratios. Measures of reliability and agreement, including κ statistics, intraclass correlation coefficients, and Bland-Altman graphs and analysis, are introduced.
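
    Several of the quantities this review covers - sensitivity, specificity, accuracy, and likelihood ratios - follow directly from a 2x2 table of test results against a reference standard. A minimal Python sketch with hypothetical counts:

        def diagnostic_metrics(tp, fp, fn, tn):
            # Standard 2x2 diagnostic-test summaries
            sens = tp / (tp + fn)                  # sensitivity (true positive rate)
            spec = tn / (tn + fp)                  # specificity (true negative rate)
            acc = (tp + tn) / (tp + fp + fn + tn)  # overall accuracy
            return {"sensitivity": sens, "specificity": spec, "accuracy": acc,
                    "LR+": sens / (1 - spec),      # positive likelihood ratio
                    "LR-": (1 - sens) / spec}      # negative likelihood ratio

        # Hypothetical study: 80 true positives, 10 false positives, etc.
        print(diagnostic_metrics(tp=80, fp=10, fn=20, tn=90))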

  15. GoCxx: a tool to easily leverage C++ legacy code for multicore-friendly Go libraries and frameworks

    NASA Astrophysics Data System (ADS)

    Binet, Sébastien

    2012-12-01

    Current HENP libraries and frameworks were written before multicore systems became widely deployed and used. From this environment, a ‘single-thread’ processing model naturally emerged, but the implicit assumptions it encouraged are greatly impairing our abilities to scale in a multicore/manycore world. Writing scalable code in C++ for multicore architectures, while doable, is no panacea. Sure, C++11 will improve on the current situation (by standardizing on std::thread, introducing lambda functions and defining a memory model) but it will do so at the price of complicating further an already quite sophisticated language. This level of sophistication has probably already strongly motivated analysis groups to migrate to CPython, hoping for its current limitations with respect to multicore scalability to be either lifted (Global Interpreter Lock removal) or for the advent of a new Python VM better tailored for this kind of environment (PyPy, Jython, …). Could HENP migrate to a language with none of the deficiencies of C++ (build time, deployment, low-level tools for concurrency) and with the fast turn-around time, simplicity and ease of coding of Python? This paper will try to make the case for Go - a young open source language with built-in facilities to easily express and expose concurrency - being such a language. We introduce GoCxx, a tool leveraging gcc-xml's output to automate the tedious work of creating Go wrappers for foreign languages, a critical task for any language wishing to leverage legacy and field-tested code. We will conclude with the first results of applying GoCxx to real C++ code.

  16. Interpretation of Confidence Interval Facing the Conflict

    ERIC Educational Resources Information Center

    Andrade, Luisa; Fernández, Felipe

    2016-01-01

    As literature has reported, it is usual that university students in statistics courses, and even statistics teachers, interpret the confidence level associated with a confidence interval as the probability that the parameter value will be between the lower and upper interval limits. To confront this misconception, class activities have been…
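
    The misconception is often countered with a coverage simulation: the confidence level is a long-run property of the interval-generating procedure, not the probability that one particular computed interval contains the parameter. A minimal Python sketch of such a demonstration (an illustration, not the class activities the article develops):

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(42)
        mu, sigma, n, reps = 10.0, 2.0, 25, 10_000
        covered = 0
        for _ in range(reps):
            sample = rng.normal(mu, sigma, n)
            half = stats.t.ppf(0.975, n - 1) * sample.std(ddof=1) / np.sqrt(n)
            m = sample.mean()
            covered += (m - half <= mu <= m + half)
        # ~95% of the *intervals* cover mu; any single interval either does or doesn't
        print(f"coverage: {covered / reps:.3f}")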

  17. Shaft seals with an easily removable cylinder holder for low-pressure steam turbines

    NASA Astrophysics Data System (ADS)

    Zakharov, A. E.; Rodionov, D. A.; Pimenov, E. V.; Sobolev, A. S.

    2016-01-01

    The article is devoted to problems that occur during the operation of the shaft seals (SS) of turbine low-pressure cylinders (LPCs), particularly their bearings. The problems arising from the deterioration of the oil-protecting rings of SS and bearings, and the consequences to which they can lead, are considered. The existing types of SS housing construction are reviewed and their operational features are specified. A new SS construction type with an easily removable holder is presented, and the construction of its main elements is described. The sequence of operations for repair personnel when restoring the spacings of the new SS type is proposed. A comparative analysis of the new and the existing SS construction types is carried out. The assessment results for the efficiency, the operational convenience, and the economic effect after installation of the new type of seals are given. Conclusions about the prospects of the proposed construction are drawn from the results of the comparative analysis and the assessment. The main advantage of this design is the possibility of restoring the spacings both in the SS and in the oil-protecting rings during a short-term stop of a turbine, even without its cooling. This construction was successfully tested on a working K-300-23.5 LMP turbine, and its adaptation to other turbines is quite possible.

  18. Solar-assisted photodegradation of isoproturon over easily recoverable titania catalysts.

    PubMed

    Tolosana-Moranchel, A; Carbajo, J; Faraldos, M; Bahamonde, A

    2017-01-27

    An easily recoverable homemade TiO2 catalyst (GICA-1) was evaluated across the overall photodegradation process (understood as photocatalytic efficiency plus the catalyst recovery step) in the solar light-assisted photodegradation of isoproturon, together with its reuse in two consecutive cycles. The overall feasibility was compared to that of commercial TiO2 P25. The homemade GICA-1 catalyst presented better sedimentation efficiency than TiO2 P25 at all studied pHs, which could be explained by its larger average hydrodynamic particle size (3 μm) and other physicochemical surface properties. The evaluation of the overall process (isoproturon photo-oxidation + catalyst recovery) revealed the strengths of the homemade GICA-1 titania catalyst: total removal of isoproturon in less than 60 min, easy recovery by sedimentation, and reusability in two consecutive cycles without any loss of photocatalytic efficiency. Therefore, considering the whole photocatalytic cycle (good performance in photodegradation plus the catalyst recovery step), the homemade GICA-1 photocatalyst proved more affordable than commercial TiO2 P25.

  19. Measurement of thermal properties of white radish (R. raphanistrum) using easily constructed probes

    PubMed Central

    Obot, Mfrekemfon Samuel; Li, Changcheng; Fang, Ting; Chen, Jinquan

    2017-01-01

    Thermal properties are necessary for the design and control of processes and storage facilities for food materials. This study proposes the measurement of thermal properties using easily constructed probes, with specific heat capacity calculated, as opposed to the use of a Differential Scanning Calorimeter (DSC) or other methods. These probes were constructed and used to measure the thermal properties of white radish over the temperature range of 80-20°C and moisture contents of 91-6.1% wb. Results showed the thermal properties were within the ranges of 0.71-0.111 W m-1 °C-1 for thermal conductivity, 1.869×10-7-0.72×10-8 m2 s-1 for thermal diffusivity, and 4.316-1.977 kJ kg-1 °C-1 for specific heat capacity. These results agree with reports for similar products studied using DSC and commercially available line heat source probes. Empirical models were developed for each property through linear multiple regressions. The data generated would be useful in the modeling and control of processing and in equipment design. PMID:28288175
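
    The regression models themselves are not given in the abstract, so as a hedged sketch of the stated approach, the Python fragment below fits a hypothetical linear model k = b0 + b1*T + b2*M to invented (temperature, moisture, conductivity) triples by least squares:

        import numpy as np

        # Hypothetical (temperature degC, moisture % wb, conductivity W m-1 C-1) data
        T = np.array([20.0, 40.0, 60.0, 80.0, 20.0, 60.0])
        M = np.array([91.0, 75.0, 50.0, 30.0, 60.0, 10.0])
        k = np.array([0.62, 0.55, 0.40, 0.25, 0.48, 0.13])

        X = np.column_stack([np.ones_like(T), T, M])   # design matrix with intercept
        beta, *_ = np.linalg.lstsq(X, k, rcond=None)   # least-squares coefficients
        print("k = {:.3f} + {:.4f}*T + {:.4f}*M".format(*beta))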

  20. Efficient transformation of grease to biodiesel using highly active and easily recyclable magnetic nanobiocatalyst aggregates.

    PubMed

    Ngo, Thao P N; Li, Aitao; Tiew, Kang W; Li, Zhi

    2013-10-01

    Green and efficient production of biodiesel (FAME) from waste grease containing a high amount of free fatty acid (FFA) was achieved by using novel magnetic nanobiocatalyst aggregates (MNA). Thermomyces lanuginosus lipase (TLL) and Candida antarctica lipase B (CALB) were covalently immobilized on core-shell structured iron oxide magnetic nanoparticles (80 nm), respectively, followed by freeze-drying to give MNA (13-17 μm) with high yield (80-89%) and high enzyme loading (61 mg TLL or 22 mg CALB per gram MNA). MNA TL showed the best performance among immobilized enzymes known thus far for the production of FAME from grease (17 wt.% FFA) with methanol, giving 99% yield in 12 h (3.3 wt.% catalyst). MNA TL was easily separated under a magnetic field and reused, retaining 88% productivity in the 11th cycle. MNA CA converted >97% of the FFA in grease (17 wt.% FFA) to FAME in 12 h (0.45 wt.% catalyst), being useful in a two-step transformation of grease to biodiesel.

  1. Easily regenerable solid adsorbents based on polyamines for carbon dioxide capture from the air.

    PubMed

    Goeppert, Alain; Zhang, Hang; Czaun, Miklos; May, Robert B; Prakash, G K Surya; Olah, George A; Narayanan, S R

    2014-05-01

    Adsorbents prepared easily by impregnation of fumed silica with polyethylenimine (PEI) are promising candidates for the capture of CO2 directly from the air. These inexpensive adsorbents have high CO2 adsorption capacity at ambient temperature and can be regenerated in repeated cycles under mild conditions. Despite the very low CO2 concentration, they are able to efficiently scrub all CO2 out of the air in the initial hours of the experiments. The influence of parameters such as PEI loading, adsorption and desorption temperature, particle size, and PEI molecular weight on the adsorption behavior was investigated. The mild regeneration temperatures required could allow the use of waste heat available in many industrial processes, as well as solar heat. CO2 adsorption from the air has a number of applications. Removal of CO2 from a closed environment, such as a submarine or a space vehicle, is essential for life support. The supply of CO2-free air is also critical for alkaline fuel cells and batteries. Direct air capture of CO2 could also help mitigate rising concerns about atmospheric CO2 concentration and associated climatic changes while, at the same time, providing the first step for an anthropogenic carbon cycle.

  2. Easily separated silver nanoparticle-decorated magnetic graphene oxide: Synthesis and high antibacterial activity.

    PubMed

    Zhang, Huai-Zhi; Zhang, Chang; Zeng, Guang-Ming; Gong, Ji-Lai; Ou, Xiao-Ming; Huan, Shuang-Yan

    2016-06-01

    Silver nanoparticle-decorated magnetic graphene oxide (MGO-Ag) was synthesized by doping silver and Fe3O4 nanoparticles on the surface of GO and was used as an antibacterial agent. MGO-Ag was characterized by scanning electron microscopy (SEM), transmission electron microscopy (TEM), energy dispersive X-ray spectroscopy (EDS), X-ray diffraction (XRD), Raman spectroscopy and magnetic property tests. It was found that the magnetic iron oxide nanoparticles and nano-Ag were well dispersed on the graphene oxide, and MGO-Ag exhibited excellent antibacterial activity against Escherichia coli and Staphylococcus aureus. Several factors affecting the antibacterial performance of MGO-Ag were investigated, such as temperature, time, pH and bacterial concentration. We also found that MGO-Ag maintained high inactivation rates after six cycles of use and could be separated easily after the antibacterial process. Moreover, the antibacterial mechanism is discussed, and the synergistic effect of GO, Fe3O4 nanoparticles and nano-Ag accounted for the high inactivation capability of MGO-Ag.

  3. An easily reversible structural change underlies mechanisms enabling desert crust cyanobacteria to survive desiccation.

    PubMed

    Bar-Eyal, Leeat; Eisenberg, Ido; Faust, Adam; Raanan, Hagai; Nevo, Reinat; Rappaport, Fabrice; Krieger-Liszkay, Anja; Sétif, Pierre; Thurotte, Adrien; Reich, Ziv; Kaplan, Aaron; Ohad, Itzhak; Paltiel, Yossi; Keren, Nir

    2015-10-01

    Biological desert sand crusts are the foundation of desert ecosystems, stabilizing the sands and allowing colonization by higher order organisms. The first colonizers of the desert sands are cyanobacteria. Facing the harsh conditions of the desert, these organisms must withstand frequent desiccation-hydration cycles, combined with high light intensities. Here, we characterize structural and functional modifications to the photosynthetic apparatus that enable a cyanobacterium, Leptolyngbya sp., to thrive under these conditions. Using multiple in vivo spectroscopic and imaging techniques, we identified two complementary mechanisms for dissipating absorbed energy in the desiccated state. The first mechanism involves the reorganization of the phycobilisome antenna system, increasing excitonic coupling between antenna components. This provides better energy dissipation in the antenna rather than directed exciton transfer to the reaction center. The second mechanism is driven by constriction of the thylakoid lumen which limits diffusion of plastocyanin to P700. The accumulation of P700(+) not only prevents light-induced charge separation but also efficiently quenches excitation energy. These protection mechanisms employ existing components of the photosynthetic apparatus, forming two distinct functional modes. Small changes in the structure of the thylakoid membranes are sufficient for quenching of all absorbed energy in the desiccated state, protecting the photosynthetic apparatus from photoinhibitory damage. These changes can be easily reversed upon rehydration, returning the system to its high photosynthetic quantum efficiency.

  4. Optimal Design of Multitype Groundwater Monitoring Networks Using Easily Accessible Tools.

    PubMed

    Wöhling, Thomas; Geiges, Andreas; Nowak, Wolfgang

    2016-11-01

    Monitoring networks are expensive to establish and to maintain. In this paper, we extend an existing data-worth estimation method from the suite of PEST utilities with a global optimization method for optimal sensor placement (called optimal design) in groundwater monitoring networks. Design optimization can include multiple simultaneous sensor locations and multiple sensor types; both location and sensor type are treated simultaneously as decision variables. Our method combines linear uncertainty quantification and a modified genetic algorithm for discrete multilocation, multitype search. The efficiency of the global optimization is enhanced by an archive of past samples and parallel computing. We demonstrate our methodology for a groundwater monitoring network at the Steinlach experimental site, south-western Germany, which has been established to monitor river-groundwater exchange processes. The target of optimization is the best possible exploration for minimum variance in predicting the mean travel time of the hyporheic exchange. Our results demonstrate that the information gain of monitoring network designs can be explored efficiently, and with easily accessible tools, prior to taking new field measurements or installing additional measurement points. The proposed methods proved to be efficient and can be applied for model-based optimal design of any type of monitoring network in approximately linear systems. Our key contributions are (1) the use of easy-to-implement tools for an otherwise complex task and (2) the consideration of data-worth interdependencies in the simultaneous optimization of multiple sensor locations and sensor types.
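
    The authors' PEST-based workflow is not reproduced here, but the core idea - a genetic algorithm searching over discrete sensor placements to minimize a predictive-variance objective - can be sketched in a few lines of Python. Everything below (the data-worth scores, the redundancy penalty, the parameters) is invented for illustration:

        import random

        random.seed(3)
        n_candidates, n_sensors, pop_size, gens = 30, 4, 40, 60
        # Synthetic per-location data-worth scores
        worth = [random.random() for _ in range(n_candidates)]

        def objective(design):
            # Lower is better: reward total data worth, penalize clustered sensors
            gain = sum(worth[i] for i in design)
            redundancy = sum(1.0 / (1 + abs(a - b))
                             for a in design for b in design if a < b)
            return -(gain - 0.3 * redundancy)

        def mutate(design):
            # Swap one chosen location for a random candidate
            d = set(design)
            d.discard(random.choice(list(d)))
            while len(d) < n_sensors:
                d.add(random.randrange(n_candidates))
            return tuple(sorted(d))

        pop = [tuple(sorted(random.sample(range(n_candidates), n_sensors)))
               for _ in range(pop_size)]
        for _ in range(gens):
            pop.sort(key=objective)
            elite = pop[: pop_size // 4]           # keep the best quarter
            pop = elite + [mutate(random.choice(elite))
                           for _ in range(pop_size - len(elite))]
        print("best design:", min(pop, key=objective))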

  5. Easily Regenerable Solid Adsorbents Based on Polyamines for Carbon Dioxide Capture from the Air

    SciTech Connect

    Goeppert, A; Zhang, H; Czaun, M; May, RB; Prakash, GKS; Olah, GA; Narayanan, SR

    2014-03-18

    Adsorbents prepared easily by impregnation of fumed silica with polyethylenimine (PEI) are promising candidates for the capture of CO2 directly from the air. These inexpensive adsorbents have high CO2 adsorption capacity at ambient temperature and can be regenerated in repeated cycles under mild conditions. Despite the very low CO2 concentration, they are able to efficiently scrub all CO2 out of the air in the initial hours of the experiments. The influence of parameters such as PEI loading, adsorption and desorption temperature, particle size, and PEI molecular weight on the adsorption behavior was investigated. The mild regeneration temperatures required could allow the use of waste heat available in many industrial processes, as well as solar heat. CO2 adsorption from the air has a number of applications. Removal of CO2 from a closed environment, such as a submarine or a space vehicle, is essential for life support. The supply of CO2-free air is also critical for alkaline fuel cells and batteries. Direct air capture of CO2 could also help mitigate rising concerns about atmospheric CO2 concentration and associated climatic changes while, at the same time, providing the first step for an anthropogenic carbon cycle.

  6. 11C=O Bonds Made Easily for Positron Emission Tomography Radiopharmaceuticals

    PubMed Central

    Rotstein, Benjamin H.; Liang, Steven H.; Placzek, Michael S.; Hooker, Jacob M.; Gee, Antony D.; Dollé, Frédéric; Wilson, Alan A.; Vasdev, Neil

    2016-01-01

    The positron-emitting radionuclide carbon-11 (11C, t1/2 = 20.3 minutes) possesses the unique potential for radiolabeling of any biological, naturally occurring, or synthetic organic molecule for in vivo positron emission tomography (PET) imaging. Carbon-11 is most often incorporated into small molecules by methylation of alcohol, thiol, amine or carboxylic acid precursors using [11C]methyl iodide or [11C]methyl triflate (generated from [11C]CO2). Consequently, small molecules that lack an easily substituted 11C-methyl group are often considered to have non-obvious strategies for radiolabeling and require a more customized approach. [11C]Carbon dioxide, [11C]carbon monoxide, [11C]cyanide, and [11C]phosgene represent alternative carbon-11 reactants to enable 11C-carbonylation. Methodologies developed for preparation of 11C-carbonyl groups have had a tremendous impact on the development of novel PET radiopharmaceuticals and provided key tools for clinical research. 11C-Carbonyl radiopharmaceuticals based on labeled carboxylic acids, amides, carbamates, and ureas now account for a substantial number of important imaging agents that have seen translation to higher species and clinical research of previously inaccessible targets, which is a testament to the creativity, utility, and practicality of the underlying radiochemistry. PMID:27276357

  7. [Sixty years ago, cell cultures finally permitted the poliomyelitis virus to multiply easily].

    PubMed

    Chastel, Claude

    2009-01-01

    In 1949, three American virologists, John F. Enders, Thomas H. Weller and Frederick C. Robbins, from the Harvard Medical School and working at the Children's Medical Center, Boston, Mass., provoked a true revolution in virology: they succeeded in readily multiplying the three poliomyelitis viruses in vitro, in non-nervous cell cultures. A few years afterwards (1954), they were collectively honoured with the Nobel Prize in Physiology or Medicine. This discovery not only quickly led to the production of efficient poliomyelitis vaccines (J. E. Salk, 1953; A. B. Sabin, 1955), but also made it possible to easily isolate a number of already known viruses (measles, rubella, mumps, herpes simplex and herpes zoster) and until then totally unknown viruses (adenovirus, echovirus, cytomegalovirus). These advances significantly contributed to improving the diagnosis, sanitary surveillance and vaccinal prophylaxis of human and animal viral diseases. Moreover, cell culture techniques have also benefited other domains of fundamental biology, such as cellular biology, genetics, cancerology, reproductive biology and regenerative medicine.

  8. Ectopic spleen: An easily identifiable but commonly undiagnosed entity until manifestation of complications

    PubMed Central

    Blouhos, Konstantinos; Boulas, Konstantinos A.; Salpigktidis, Ilias; Barettas, Nikolaos; Hatzigeorgiadis, Anestis

    2014-01-01

    INTRODUCTION Ectopic spleen is an uncommon clinical entity, as splenectomy for the treatment of ectopic spleens accounts for less than 0.25% of splenectomies. The most common age of presentation is childhood, especially under 1 year of age, followed by the third decade of life. PRESENTATION OF CASE The present report refers to a patient with torsion of a pelvic spleen treated with splenectomy. The patient exhibited a period of vague intermittent lower abdominal pain lasting 65 days, followed by a period of constant left lower quadrant pain of increasing severity lasting 6 days. During the first 65 days, the vague pain was attributed to progressive torsion of the spleen, which resulted in venous congestion. During the last 6 days, the exacerbation of pain was attributed to irreducible torsion, infarction of the arterial supply, acute ischemia, strangulation and rupture of the gangrenous spleen. Diagnosis was made by CT, which revealed absence of the spleen from its normal position, a homogeneous pelvic mass with no contrast enhancement, and free blood in the peritoneal cavity, and was confirmed by laparotomy. DISCUSSION Clinical manifestations of ectopic spleen vary from asymptomatic to abdominal emergency. Symptoms are most commonly attributed to complications related to torsion. Operative management, including splenopexy or splenectomy, is the treatment of choice in uncomplicated and complicated cases, because conservative treatment of an asymptomatic ectopic spleen is associated with a complication rate of 65%. CONCLUSION Although an ectopic spleen can be easily identified on clinical examination, it is commonly misdiagnosed until the manifestation of complications in adulthood. PMID:24973525

  9. Open Window: When Easily Identifiable Genomes and Traits Are in the Public Domain

    PubMed Central

    Angrist, Misha

    2014-01-01

    “One can't be of an enquiring and experimental nature, and still be very sensible.” - Charles Fort [1] As the costs of personal genetic testing “self-quantification” fall, publicly accessible databases housing people's genotypic and phenotypic information are gradually increasing in number and scope. The latest entrant is openSNP, which allows participants to upload their personal genetic/genomic and self-reported phenotypic data. I believe the emergence of such open repositories of human biological data is a natural reflection of inquisitive and digitally literate people's desires to make genomic and phenotypic information more easily available to a community beyond the research establishment. Such unfettered databases hold the promise of contributing mightily to science, science education and medicine. That said, in an age of increasingly widespread governmental and corporate surveillance, we would do well to be mindful that genomic DNA is uniquely identifying. Participants in open biological databases are engaged in a real-time experiment whose outcome is unknown. PMID:24647311

  10. Statistical Diversions

    ERIC Educational Resources Information Center

    Petocz, Peter; Sowey, Eric

    2008-01-01

    In this article, the authors focus on hypothesis testing--that peculiarly statistical way of deciding things. Statistical methods for testing hypotheses were developed in the 1920s and 1930s by some of the most famous statisticians, in particular Ronald Fisher, Jerzy Neyman and Egon Pearson, who laid the foundations of almost all modern methods of…

  11. [Is an optimistic memory less easily influenced by negative than by positive emotions?].

    PubMed

    Beneyto Molina, Vicent Blai; Fernández-Abascal, Enrique García

    2012-05-01

    This work examines whether a positive personality trait, such as optimism, can reduce bias in the differential recall of words after the induction of a specific emotion. A list of words of various emotional valences was shown to a group of 59 subjects, after which a specific emotional state was induced. Subsequently, the subjects were asked to recall the list of words. The results indicated that less optimistic subjects tended to recall and recognize a greater number of negative words when in a negative emotional condition. Statistical significance was reached for the female group's recognition of negative words when experiencing negative emotion.

  12. The Sclerotic Scatter Limbal Arc Is More Easily Elicited under Mesopic Rather Than Photopic Conditions

    PubMed Central

    Denion, Eric; Lux, Anne-Laure; Mouriaux, Frédéric; Béraud, Guillaume

    2016-01-01

    Introduction We aimed to determine the limbal lighting illuminance thresholds (LLITs) required to trigger perception of sclerotic scatter at the opposite non-illuminated limbus (i.e. perception of a light limbal scleral arc) under different levels of ambient lighting illuminance (ALI). Material and Methods Twenty healthy volunteers were enrolled. The iris shade (light or dark) was graded by retrieving the median value of the pixels of a pre-determined zone of a gray-level iris photograph. Mean keratometry and central corneal pachymetry were recorded. Each subject was asked to lie down, and the ALI at eye level was set to mesopic values (10, 20, 40 lux), then photopic values (60, 80, 100, 150, 200 lux). For each ALI level, a light beam of gradually increasing illuminance was applied to the right temporal limbus until the LLIT was reached, i.e. the level required to produce the faint light arc that is characteristic of sclerotic scatter at the nasal limbus. Results After log-log transformation, a linear relationship between the logarithm of ALI and the logarithm of the LLIT was found (p<0.001), a 10% increase in ALI being associated with an average increase in the LLIT of 28.9%. Higher keratometry values were associated with higher LLIT values (p = 0.008) under low ALI levels, but the coefficient of the interaction was very small, representing a very limited effect. Iris shade and central corneal thickness values were not significantly associated with the LLIT. We also developed a censored linear model for ALI values ≤ 40 lux, showing a linear relationship between ALI and the LLIT, in which the LLIT value was 34.4 times greater than the ALI value. Conclusion Sclerotic scatter is more easily elicited under mesopic conditions than under photopic conditions and requires the LLIT value to be much higher than the ALI value, i.e. it requires extreme contrast. PMID:26964096
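
    The reported elasticity can be checked with a one-line back-calculation: in a log-log model log(LLIT) = a + b*log(ALI), a fixed percentage change in ALI multiplies the LLIT by a fixed factor, ALI_factor**b. A small Python sketch recovering the slope implied by the reported +10% ALI / +28.9% LLIT pairing:

        import math

        # Slope b implied by: a 1.10x change in ALI gives a 1.289x change in LLIT
        b = math.log(1.289) / math.log(1.10)
        print(f"implied log-log slope b = {b:.2f}")            # about 2.66
        # Sanity check: a 10% ALI increase under this slope
        print(f"LLIT increase = {(1.10 ** b - 1) * 100:.1f}%")  # about 28.9%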

  13. Clearly written, easily comprehended? The readability of websites providing information on epilepsy.

    PubMed

    Brigo, Francesco; Otte, Willem M; Igwe, Stanley C; Tezzon, Frediano; Nardone, Raffaele

    2015-03-01

    There is a general need for high-quality, easily accessible, and comprehensive health-care information on epilepsy to better inform the general population about this highly stigmatized neurological disorder. The aim of this study was to evaluate the health literacy level of eight popular English-language websites that provide information on epilepsy, in quantitative terms of readability. Educational epilepsy material on these websites, including 41 Wikipedia articles, was analyzed for its overall level of readability and the corresponding academic grade level needed to comprehend the published texts on first reading. The Flesch Reading Ease (FRE) was used to assess ease of comprehension, while the Gunning Fog Index, Coleman-Liau Index, Flesch-Kincaid Grade Level, Automated Readability Index, and Simple Measure of Gobbledygook scales estimated the corresponding academic grade level needed for comprehension. The average readability of the websites yielded results indicative of a difficult-to-fairly-difficult readability level (FRE results: 44.0±8.2), with text readability corresponding to an 11th academic grade level (11.3±1.9). The average FRE score of the Wikipedia articles was indicative of a difficult readability level (25.6±9.5), with the other readability scales yielding results corresponding to a 14th grade level (14.3±1.7). Popular websites providing information on epilepsy, including Wikipedia, often demonstrate a low level of readability. This could be ameliorated by increasing access to clear and concise online information on epilepsy and health in general. Short "basic" summaries targeted at patients and nonmedical users should be added to articles published on specialist websites and Wikipedia to improve readability.
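
    The FRE score used here has a simple closed form: FRE = 206.835 - 1.015*(words/sentences) - 84.6*(syllables/words). The Python sketch below implements it with a crude vowel-group syllable heuristic, so its scores are rough approximations rather than the validated counts used by dedicated readability tools:

        import re

        def flesch_reading_ease(text):
            # FRE = 206.835 - 1.015*(words/sentences) - 84.6*(syllables/words)
            sentences = max(1, len(re.findall(r"[.!?]+", text)))
            words = re.findall(r"[A-Za-z']+", text)
            # Approximate syllables by counting vowel groups in each word
            syllables = sum(max(1, len(re.findall(r"[aeiouy]+", w.lower())))
                            for w in words)
            n = max(1, len(words))
            return 206.835 - 1.015 * (n / sentences) - 84.6 * (syllables / n)

        sample = ("Epilepsy is a brain disorder. It causes repeated seizures. "
                  "Doctors can often treat it with medicine.")
        print(round(flesch_reading_ease(sample), 1))   # higher = easier to read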

  14. Superomniphobic and easily repairable coatings on copper substrates based on simple immersion or spray processes.

    PubMed

    Rangel, Thomaz C; Michels, Alexandre F; Horowitz, Flávio; Weibel, Daniel E

    2015-03-24

    Textures that resemble typical fern or bracken plant species (dendrite structures) were fabricated for liquid repellency by dipping copper substrates in a single-step process into solutions containing AgNO3, or by simple spray application. Superhydrophobic surfaces were produced using a solution containing AgNO3 and trimethoxypropylsilane (TMPSi), and superomniphobic surfaces were produced by a two-step procedure, immersing the copper substrate in a AgNO3 solution and, after that, in a solution containing 1H,1H,2H,2H-perfluorodecyltriethoxysilane (PFDTES). The same simple functionalization processes can also be used when the superomniphobic surfaces are destroyed by mechanical stress: by immersion of the damaged surfaces in the above solutions, or by the spray method and soft heating, the copper substrates could be easily repaired, regenerating the surfaces' superrepellency to liquids. The micro- and nanoroughness structures generated on the copper surfaces by the deposition of silver dendrites functionalized with TMPSi presented apparent contact angles greater than 150° with a contact angle hysteresis lower than 10° when water was used as the test liquid. To avoid total wettability with very low surface tension liquids, such as rapeseed oil and hexadecane, a thin perfluorinated coating of poly(tetrafluoroethylene) (PTFE), produced by physical vapor deposition, was used. A more efficient perfluorinated coating was obtained when PFDTES was used. The superomniphobic surfaces produced apparent contact angles above 150° with all of the tested liquids, including hexadecane, although the contact angle hysteresis with this liquid was above 10°. The coupling of dendritic structures with TMPSi/PTFE or directly with PFDTES coatings was responsible for the superrepellency of the as-prepared surfaces. These simple, fast, and reliable procedures allow the large-area, cost-effective fabrication of superrepellent surfaces on copper substrates for various industrial applications.

  15. Interpretation in Sweden.

    ERIC Educational Resources Information Center

    Hultman, Sven-G.

    1987-01-01

    Describes some of the interpretive developments underway in Sweden. Discusses some programs in both natural and cultural interpretation. Calls for increasing the purpose and content of heritage preservation and conservation to the general public. (TW)

  16. U-interpreter

    SciTech Connect

    Arvind; Gostelow, K.P.

    1982-02-01

    The author argues that by giving a unique name to every activity generated during a computation, the u-interpreter can provide greater concurrency in the interpretation of data flow graphs. 19 references.

  17. Quick Statistics

    MedlinePlus

    ... population, or about 25 million Americans, has experienced tinnitus lasting at least five minutes in the past ... by NIDCD Epidemiology and Statistics Program staff: (1) tinnitus prevalence was obtained from the 2008 National Health ...

  18. Perception in statistical graphics

    NASA Astrophysics Data System (ADS)

    VanderPlas, Susan Ruth

    There has been quite a bit of research on statistical graphics and visualization, generally focused on new types of graphics, new software to create graphics, interactivity, and usability studies. Our ability to interpret and use statistical graphics hinges on the interface between the graph itself and the brain that perceives and interprets it, and there is substantially less research on the interplay between graph, eye, brain, and mind than is sufficient to understand the nature of these relationships. The goal of the work presented here is to further explore the interplay between a static graph, the translation of that graph from paper to mental representation (the journey from eye to brain), and the mental processes that operate on that graph once it is transferred into memory (mind). Understanding the perception of statistical graphics should allow researchers to create more effective graphs which produce fewer distortions and viewer errors while reducing the cognitive load necessary to understand the information presented in the graph. Taken together, these experiments should lay a foundation for exploring the perception of statistical graphics. There has been considerable research into the accuracy of numerical judgments viewers make from graphs, and these studies are useful, but it is more effective to understand how errors in these judgments occur so that the root cause of the error can be addressed directly. Understanding how visual reasoning relates to the ability to make judgments from graphs allows us to tailor graphics to particular target audiences. In addition, understanding the hierarchy of salient features in statistical graphics allows us to clearly communicate the important message from data or statistical models by constructing graphics which are designed specifically for the perceptual system.

  19. Interpreting. PEPNet Tipsheet

    ERIC Educational Resources Information Center

    Darroch, Kathleen

    2010-01-01

    An interpreter's role is to facilitate communication and convey all auditory and signed information so that both hearing and deaf individuals may fully interact. The common types of services provided by interpreters are: (1) American Sign Language (ASL) Interpretation--a visual-gestural language with its own linguistic features; (2) Sign Language…

  20. Interpreting. NETAC Teacher Tipsheet.

    ERIC Educational Resources Information Center

    Darroch, Kathy; Marshall, Liza

    This tipsheet explains that an interpreter's role is to facilitate communication and convey all auditory and signed information so that individuals with and without hearing may fully interact. It outlines the common types of services provided by interpreters, and discusses principles guiding the professional behaviors of interpreters. When working…

  1. Basic statistics in cell biology.

    PubMed

    Vaux, David L

    2014-01-01

    The physicist Ernest Rutherford said, "If your experiment needs statistics, you ought to have done a better experiment." Although this aphorism remains true for much of today's research in cell biology, a basic understanding of statistics can be useful to cell biologists to help in monitoring the conduct of their experiments, in interpreting the results, in presenting them in publications, and when critically evaluating research by others. However, training in statistics is often focused on the sophisticated needs of clinical researchers, psychologists, and epidemiologists, whose conclusions depend wholly on statistics, rather than the practical needs of cell biologists, whose experiments often provide evidence that is not statistical in nature. This review describes some of the basic statistical principles that may be of use to experimental biologists, but it does not cover the sophisticated statistics needed for papers that contain evidence of no other kind.

  2. Interpreting Abstract Interpretations in Membership Equational Logic

    NASA Technical Reports Server (NTRS)

    Fischer, Bernd; Rosu, Grigore

    2001-01-01

    We present a logical framework in which abstract interpretations can be naturally specified and then verified. Our approach is based on membership equational logic, which extends equational logics by membership axioms asserting that a term has a certain sort. We represent an abstract interpretation as a membership equational logic specification, usually as an overloaded order-sorted signature with membership axioms. It turns out that, for any term, its least sort over this specification corresponds to its most concrete abstract value. Maude implements membership equational logic and provides mechanisms to calculate the least sort of a term efficiently. We first show how Maude can be used to get prototyping of abstract interpretations "for free." Building on the meta-logic facilities of Maude, we further develop a tool that automatically checks an abstract interpretation against a set of user-defined properties. This can be used to select an appropriate abstract interpretation, to characterize the specified loss of information during abstraction, and to compare different abstractions with each other.
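
    The paper's machinery is Maude and membership equational logic, which is not reproduced here; as a language-neutral toy of what an abstract interpretation is (an abstract domain plus sound abstract operators), the Python sketch below evaluates arithmetic over the sign domain. All names are illustrative:

        # Toy abstract interpretation over the sign domain {NEG, ZERO, POS, TOP}
        NEG, ZERO, POS, TOP = "neg", "zero", "pos", "top"

        def alpha(n):
            # Abstraction: map a concrete integer to its sign
            return ZERO if n == 0 else (POS if n > 0 else NEG)

        def add(a, b):
            # Sound abstract addition: TOP when the sign is not determined
            if ZERO in (a, b):
                return b if a == ZERO else a
            return a if a == b else TOP

        def mul(a, b):
            if ZERO in (a, b):
                return ZERO
            if TOP in (a, b):
                return TOP
            return POS if a == b else NEG

        # (-3) * (-5) + 0  is abstractly POS, without ever computing 15
        print(add(mul(alpha(-3), alpha(-5)), alpha(0)))   # pos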

  3. Preparation and Use of an Easily Constructed, Inexpensive Chamber for Viewing Courtship Behaviors of Fruit Flies, Drosophila sp.

    ERIC Educational Resources Information Center

    Christensen, Timothy J.; Labov, Jay B.

    1997-01-01

    Details the construction of a viewing chamber for fruit flies that connects to a dissecting microscope and features a design that enables students to easily move fruit flies in and out of the chamber. (DDR)

  4. Could Psoriatic Arthritis Be Easily Diagnosed from Current Suspicious Physical Findings in the Dermatology Clinic?

    PubMed Central

    Choi, Jee Woong; Kim, Bo Ri; Seo, Eunmi

    2017-01-01

    Background The prevalence and clinical characteristics of psoriatic arthritis (PsA) in patients with psoriasis are not well described in Asian populations, including Koreans. Objective The purpose of this study was to investigate the prevalence of PsA by using the classification of psoriatic arthritis (CASPAR) criteria on the basis of physical examination only, as well as its correlation with psoriasis severity and other medical conditions, including nail psoriasis. Methods A single-center, cross-sectional observational cohort study was conducted, and the included patients were evaluated for PsA according to the CASPAR criteria. The psoriasis area severity index (PASI) and the nail psoriasis severity index (NAPSI) were calculated. Results The prevalence of PsA in patients with psoriasis in Korea was 13.5%. Logistic regression identified hyperlipidemia and localized pustular psoriasis as significant predictors of PsA. The PASI score was significantly higher in PsA patients than in those with psoriasis alone (p=0.014). Psoriatic nail involvement was found in 85.5% of the study population, and all PsA patients had nail psoriasis. The mean NAPSI score was higher in patients with PsA; however, the difference was not statistically significant. Conclusion There was a close relation between psoriasis severity and PsA, although nail psoriasis severity was not related to PsA status. Dermatologists can diagnose PsA from current physical findings by using the CASPAR criteria. To validate the CASPAR criteria for PsA diagnosis, the definition of nail psoriasis clinical types and severity in the CASPAR criteria should be reviewed again. PMID:28223746

  5. Assessing compositional variability through graphical analysis and Bayesian statistical approaches: case studies on transgenic crops.

    PubMed

    Harrigan, George G; Harrison, Jay M

    2012-01-01

    New transgenic (GM) crops are subjected to extensive safety assessments that include compositional comparisons with conventional counterparts as a cornerstone of the process. The influence of germplasm, location, environment, and agronomic treatments on compositional variability is, however, often obscured in these pair-wise comparisons. Furthermore, classical statistical significance testing can often provide an incomplete and over-simplified summary of highly responsive variables such as crop composition. In order to more clearly describe the influence of the numerous sources of compositional variation we present an introduction to two alternative but complementary approaches to data analysis and interpretation. These include i) exploratory data analysis (EDA) with its emphasis on visualization and graphics-based approaches and ii) Bayesian statistical methodology that provides easily interpretable and meaningful evaluations of data in terms of probability distributions. The EDA case-studies include analyses of herbicide-tolerant GM soybean and insect-protected GM maize and soybean. Bayesian approaches are presented in an analysis of herbicide-tolerant GM soybean. Advantages of these approaches over classical frequentist significance testing include the more direct interpretation of results in terms of probabilities pertaining to quantities of interest and no confusion over the application of corrections for multiple comparisons. It is concluded that a standardized framework for these methodologies could provide specific advantages through enhanced clarity of presentation and interpretation in comparative assessments of crop composition.
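
    The paper's Bayesian analyses are not reproduced here, but the style of conclusion it advocates - direct probability statements about quantities of interest - can be sketched under a simple normal model with the standard noninformative prior, for which the posterior of each group mean is a shifted, scaled t-distribution. The soybean numbers below are invented:

        import numpy as np

        rng = np.random.default_rng(7)

        def posterior_mean_draws(x, size=100_000):
            # Normal model, standard noninformative prior:
            # mu ~ xbar + (s / sqrt(n)) * t_{n-1}
            x = np.asarray(x, dtype=float)
            n = len(x)
            return x.mean() + x.std(ddof=1) / np.sqrt(n) * rng.standard_t(n - 1, size)

        # Hypothetical protein content (%) in GM vs conventional soybean samples
        gm = [37.1, 36.4, 38.0, 37.5, 36.9, 37.8]
        conv = [36.2, 36.8, 35.9, 36.5, 37.0, 36.1]
        diff = posterior_mean_draws(gm) - posterior_mean_draws(conv)
        print(f"P(difference > 0)      = {np.mean(diff > 0):.3f}")
        print(f"P(|difference| > 1.0%) = {np.mean(np.abs(diff) > 1.0):.3f}")

    Statements of this form ("the probability that the mean difference exceeds 1 percentage point is ...") are the directly interpretable outputs the authors contrast with p values and multiplicity-corrected significance tests.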

  6. Interpretation biases in paranoia.

    PubMed

    Savulich, George; Freeman, Daniel; Shergill, Sukhi; Yiend, Jenny

    2015-01-01

    Information in the environment is frequently ambiguous in meaning. Emotional ambiguity, such as the stare of a stranger, or the scream of a child, encompasses possible good or bad emotional consequences. Those with elevated vulnerability to affective disorders tend to interpret such material more negatively than those without, a phenomenon known as "negative interpretation bias." In this study we examined the relationship between vulnerability to psychosis, measured by trait paranoia, and interpretation bias. One set of material permitted broadly positive/negative (valenced) interpretations, while another allowed more or less paranoid interpretations, allowing us to also investigate the content specificity of interpretation biases associated with paranoia. Regression analyses (n=70) revealed that trait paranoia, trait anxiety, and cognitive inflexibility predicted paranoid interpretation bias, whereas trait anxiety and cognitive inflexibility predicted negative interpretation bias. In a group comparison those with high levels of trait paranoia were negatively biased in their interpretations of ambiguous information relative to those with low trait paranoia, and this effect was most pronounced for material directly related to paranoid concerns. Together these data suggest that a negative interpretation bias occurs in those with elevated vulnerability to paranoia, and that this bias may be strongest for material matching paranoid beliefs. We conclude that content-specific biases may be important in the cause and maintenance of paranoid symptoms.

  7. Statistics Revelations

    ERIC Educational Resources Information Center

    Chicot, Katie; Holmes, Hilary

    2012-01-01

    The use, and misuse, of statistics is commonplace, yet in the printed format data representations can be either over simplified, supposedly for impact, or so complex as to lead to boredom, supposedly for completeness and accuracy. In this article the link to the video clip shows how dynamic visual representations can enliven and enhance the…

  8. Statistical Inference

    NASA Astrophysics Data System (ADS)

    Khan, Shahjahan

    Often scientific information on various data generating processes is presented in the form of numerical and categorical data. Except for some very rare occasions, such data generally represent a small part of the population, or selected outcomes of some data generating process. Although valuable and useful information is lurking in the array of scientific data, it is generally unavailable to users. Appropriate statistical methods are essential to reveal the hidden "jewels" in the mess of raw data. Exploratory data analysis methods are used to uncover such valuable characteristics of the observed data. Statistical inference provides techniques for making valid conclusions about the unknown characteristics or parameters of the population from which scientifically drawn sample data are selected. Usually, statistical inference includes the estimation of population parameters as well as the performance of tests of hypotheses on the parameters. However, prediction of future responses and determination of the prediction distributions are also part of statistical inference. Both Classical (or frequentist) and Bayesian approaches are used in statistical inference. The commonly used Classical approach is based on the sample data alone. In contrast, the increasingly popular Bayesian approach uses a prior distribution on the parameters along with the sample data to make inferences. Non-parametric and robust methods are also used in situations where commonly used model assumptions are unsupported. In this chapter, we cover the philosophical and methodological aspects of both the Classical and Bayesian approaches. Moreover, some aspects of predictive inference are also included. In the absence of any evidence to support assumptions regarding the distribution of the underlying population, or if the variable is measured only on an ordinal scale, non-parametric methods are used. Robust methods are employed to avoid any significant changes in the results due to deviations from the model

  10. Misuse of statistics in surgical literature

    PubMed Central

    Ronna, Brenden; Robbins, Riann B.

    2016-01-01

    Statistical analyses are a key part of biomedical research. Traditionally surgical research has relied upon a few statistical methods for evaluation and interpretation of data to improve clinical practice. As research methods have increased in both rigor and complexity, statistical analyses and interpretation have fallen behind. Some evidence suggests that surgical research studies are being designed and analyzed improperly given the specific study question. The goal of this article is to discuss the complexities of surgical research analyses and interpretation, and provide some resources to aid in these processes. PMID:27621909

  11. [Blood proteins in African trypanosomiasis: variations and statistical interpretations].

    PubMed

    Cailliez, M; Poupin, F; Pages, J P; Savel, J

    1982-01-01

    The estimation of blood orosomucoid, haptoglobin, C-reactive protein and immunoglobulin levels has enabled us to demonstrate a specific protein profile in human African trypanosomiasis, as compared with that of other parasitic diseases and with a healthy African reference group. Computerized data processing by principal component analysis provides a valuable tool for epidemiological surveys.

  12. Interpretation of psychophysics response curves using statistical physics.

    PubMed

    Knani, S; Khalfaoui, M; Hachicha, M A; Mathlouthi, M; Ben Lamine, A

    2014-05-15

    Experimental gustatory curves have been fitted for four sugars (sucrose, fructose, glucose and maltitol) using a double layer adsorption model. Three parameters of the model are fitted, namely the number of molecules per site n, the maximum response RM and the concentration at half saturation C1/2. The behaviours of these parameters are discussed in relation to each molecule's characteristics. Starting from the double layer adsorption model, we additionally determined the adsorption energy of each molecule on taste receptor sites. The use of the threshold expression allowed us to gain information about the adsorption occupation rate of a receptor site that fires a minimal response in the gustatory nerve. Finally, by means of this model we could calculate the configurational entropy of the adsorption system, which can describe the order and disorder of the adsorbent surface.

  13. A Novel Statistical Analysis and Interpretation of Flow Cytometry Data

    DTIC Science & Technology

    2013-07-05

    Center for Research in Scientific Computation and Center for Quantitative Sciences in Biomedicine, North Carolina State University, Raleigh, NC 27695-8212, USA; ICREA Infection Biology Laboratory, Department of Experimental and Health Sciences, Universitat Pompeu Fabra, 08003 Barcelona, Spain.

  14. A Novel Statistical Analysis and Interpretation of Flow Cytometry Data

    DTIC Science & Technology

    2013-03-31

    Center for Research in Scientific Computation and Center for Quantitative Sciences in Biomedicine, North Carolina State University, Raleigh, NC 27695-8212; Cristina Peligero, Jordi Argilaguet, and Andreas Meyerhans, ICREA Infection Biology Lab, Department of Experimental and Health Sciences, Universitat Pompeu Fabra, 08003 Barcelona, Spain. … the fast computational approaches as described in [27]. It is also shown how the new model can be compared with older label-structured models such as

  15. The Statistical Literacy Needed to Interpret School Assessment Data

    ERIC Educational Resources Information Center

    Chick, Helen; Pierce, Robyn

    2013-01-01

    State-wide and national testing in areas such as literacy and numeracy produces reports containing graphs and tables illustrating school and individual performance. These are intended to inform teachers, principals, and education organisations about student and school outcomes, to guide change and improvement. Given the complexity of the…

  16. [Descriptive statistics].

    PubMed

    Rendón-Macías, Mario Enrique; Villasís-Keever, Miguel Ángel; Miranda-Novales, María Guadalupe

    2016-01-01

    Descriptive statistics is the branch of statistics that gives recommendations on how to summarize research data clearly and simply in tables, figures, charts, or graphs. Before performing a descriptive analysis it is paramount to define the goal or goals of the analysis, and to identify the measurement scales of the different variables recorded in the study. Tables and charts aim to provide timely information on the results of an investigation. Graphs show trends and can be histograms, pie charts, "box and whiskers" plots, line graphs, or scatter plots. Images serve as examples to reinforce concepts or facts. The choice of a table, chart, graph, or image must be based on the study objectives. It is usually not recommended to use more than seven of them in an article, depending on its length.
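
    As a minimal illustration of matching the summary to the measurement scale (our own sketch; the variables and values are invented), in Python:

        import pandas as pd

        df = pd.DataFrame({
            "age":   [34, 45, 29, 61, 50, 38],           # quantitative
            "sex":   ["F", "M", "F", "M", "F", "M"],     # nominal
            "stage": ["I", "II", "I", "III", "II", "I"], # ordinal
        })

        # Identify each variable's measurement scale, then pick the summary:
        print(df["age"].describe())      # mean, quartiles, spread for numeric data
        print(df["sex"].value_counts())  # frequency table for nominal data
        print(df["stage"].value_counts().sort_index())  # ordered frequencies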

  17. Order Statistics and Nonparametric Statistics.

    DTIC Science & Technology

    2014-09-26

    Topics investigated include the following: the probability that a fuze will fire; moving order statistics; distribution theory and properties of the...problem posed by an Army scientist: a fuze will fire when at least n-1 (or n-2) of n detonators function within time span t. What is the probability of

  18. Statistical Optics

    NASA Astrophysics Data System (ADS)

    Goodman, Joseph W.

    2000-07-01

    The Wiley Classics Library consists of selected books that have become recognized classics in their respective fields. With these new unabridged and inexpensive editions, Wiley hopes to extend the life of these important works by making them available to future generations of mathematicians and scientists. Currently available in the Series: T. W. Anderson The Statistical Analysis of Time Series T. S. Arthanari & Yadolah Dodge Mathematical Programming in Statistics Emil Artin Geometric Algebra Norman T. J. Bailey The Elements of Stochastic Processes with Applications to the Natural Sciences Robert G. Bartle The Elements of Integration and Lebesgue Measure George E. P. Box & Norman R. Draper Evolutionary Operation: A Statistical Method for Process Improvement George E. P. Box & George C. Tiao Bayesian Inference in Statistical Analysis R. W. Carter Finite Groups of Lie Type: Conjugacy Classes and Complex Characters R. W. Carter Simple Groups of Lie Type William G. Cochran & Gertrude M. Cox Experimental Designs, Second Edition Richard Courant Differential and Integral Calculus, Volume I Richard Courant Differential and Integral Calculus, Volume II Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume I Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume II D. R. Cox Planning of Experiments Harold S. M. Coxeter Introduction to Geometry, Second Edition Charles W. Curtis & Irving Reiner Representation Theory of Finite Groups and Associative Algebras Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume I Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume II Cuthbert Daniel Fitting Equations to Data: Computer Analysis of Multifactor Data, Second Edition Bruno de Finetti Theory of Probability, Volume I Bruno de Finetti Theory of Probability, Volume 2 W. Edwards Deming Sample Design in Business Research

  19. Higher Education Interpreting.

    ERIC Educational Resources Information Center

    Woll, Bencie; Porcari li Destri, Giulia

    This paper discusses issues related to the training and provision of interpreters for deaf students at institutions of higher education in the United Kingdom. Background information provided notes the increasing numbers of deaf and partially hearing students, the existence of funding to pay for interpreters, and trends in the availability of…

  20. Working with Educational Interpreters.

    ERIC Educational Resources Information Center

    Seal, Brenda C.

    2000-01-01

    This article addresses the mutual needs of speech-language pathologists and educational interpreters in providing services to their students. Guidelines supported by recent research reports and survey data collected from interpreters are offered to speech-language pathologists as ways to improve the working relationship with educational…

  1. Lies or Misuse?: Comment on “Lies, Damned Lies, and Statistics (in Geology)”

    NASA Astrophysics Data System (ADS)

    Tseng, Chih-Yuan; Chen, Chien-Chih

    2011-02-01

    To demonstrate a concern in geological interpretation after statistical hypothesis testing, writing that “geological hypotheses are never ‘true’—they will always be rejected if lots of data are available,” P. Vermeesch (Eos, 90(47), 443, doi:10.1029/2009EO470004, 2009) considers a null hypothesis H0 of earthquake occurrences not depending on the day of the week. He found that his testing result rejects H0, and he argues that the hypothesis testing does not reveal any geological significance. We argue that his conclusion basically demonstrates a Type I statistical error, where the null hypothesis is rejected despite being true. Because the use of hypothesis testing crucially relies on three criteria—the correct null hypothesis, a plausible probability distribution, and an appropriate testing statistic—one will easily obtain an incorrect interpretation of statistical significance if one of these criteria is not met. Vermeesch's argument does not exhaustively address whether the last two criteria are met and is insufficient to claim that statistically the hypothesis should be rejected.
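
    The point is easy to reproduce numerically. In the following minimal sketch (our construction, not Vermeesch's data or analysis), a geologically meaningless 2% excess of events on one weekday drives a chi-square test of H0 to an extreme p-value once the catalog is large:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        n = 1_000_000                 # hypothetical large earthquake catalog
        p = np.full(7, 1.0 / 7.0)
        p[0] *= 1.02                  # tiny (2%) relative excess on one weekday
        p /= p.sum()
        observed = rng.multinomial(n, p)

        chi2, pval = stats.chisquare(observed)  # H0: uniform over weekdays
        print(chi2, pval)  # H0 is decisively rejected despite a negligible effect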

  2. Can rare SAT formulae be easily recognized? On the efficiency of message-passing algorithms for K-SAT at large clause-to-variable ratios

    NASA Astrophysics Data System (ADS)

    Altarelli, Fabrizio; Monasson, Rémi; Zamponi, Francesco

    2007-02-01

    For large clause-to-variable ratios, typical K-SAT instances drawn from the uniform distribution have no solution. We argue, based on statistical mechanics calculations using the replica and cavity methods, that rare satisfiable instances from the uniform distribution are very similar to typical instances drawn from the so-called planted distribution, where instances are chosen uniformly among those that admit a given solution. It then follows, from a recent article by Feige, Mossel and Vilenchik (2006 Complete convergence of message passing algorithms for some satisfiability problems Proc. Random 2006 pp 339-50), that these rare instances can be easily recognized (in O(log N) time and with probability close to 1) by a simple message-passing algorithm.
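
    For illustration, the planted distribution is straightforward to sample by rejection (a minimal sketch of our own; the function and variable names are invented): draw 3-clauses uniformly and keep only those satisfied by a hidden assignment.

        import random

        def planted_3sat(n_vars, n_clauses, seed=0):
            rng = random.Random(seed)
            hidden = [rng.choice([True, False]) for _ in range(n_vars)]
            clauses = []
            while len(clauses) < n_clauses:
                vs = rng.sample(range(n_vars), 3)
                clause = [(v, rng.choice([True, False])) for v in vs]  # (variable, sign)
                # Rejection step: keep the clause only if the hidden
                # assignment satisfies at least one of its literals.
                if any(hidden[v] == sign for v, sign in clause):
                    clauses.append(clause)
            return hidden, clauses

        hidden, clauses = planted_3sat(n_vars=50, n_clauses=500)
        assert all(any(hidden[v] == s for v, s in c) for c in clauses)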

  3. Neural network classification - A Bayesian interpretation

    NASA Technical Reports Server (NTRS)

    Wan, Eric A.

    1990-01-01

    The relationship between minimizing a mean squared error and finding the optimal Bayesian classifier is reviewed. This provides a theoretical interpretation for the process by which neural networks are used in classification. A number of confidence measures are proposed to evaluate the performance of the neural network classifier within a statistical framework.

  4. Theory Interpretations in PVS

    NASA Technical Reports Server (NTRS)

    Owre, Sam; Shankar, Natarajan; Butler, Ricky W. (Technical Monitor)

    2001-01-01

    The purpose of this task was to provide a mechanism for theory interpretations in a prototype verification system (PVS) so that it is possible to demonstrate the consistency of a theory by exhibiting an interpretation that validates the axioms. The mechanization makes it possible to show that one collection of theories is correctly interpreted by another collection of theories under a user-specified interpretation for the uninterpreted types and constants. A theory instance is generated and imported, while the axiom instances are generated as proof obligations to ensure that the interpretation is valid. Interpretations can be used to show that an implementation is a correct refinement of a specification, that an axiomatically defined specification is consistent, or that an axiomatically defined specification captures its intended models. In addition, the theory parameter mechanism has been extended with a notion of theory as parameter so that a theory instance can be given as an actual parameter to an imported theory. Theory interpretations can thus be used to refine an abstract specification or to demonstrate the consistency of an axiomatic theory. In this report we describe the mechanism in detail. This extension is a part of PVS version 3.0, which will be publicly released in mid-2001.

  5. Working With Educational Interpreters.

    PubMed

    Seal, Brenda C

    2000-01-01

    Increasing numbers of students who are deaf or hard of hearing are being educated in their local schools. Accommodations frequently made for these students include the provision of educational interpreting services. Educational interpreters serve to equalize the source language or source communication mode (usually spoken English) with a target language or target mode (either sign language, cued speech, or oral transliterating). Educational interpreters' expertise in sign language or cued speech will likely exceed that of speech-language pathologists, whose expertise in speech and language development and in discourse demands of the classroom will likely exceed that of the educational interpreters. This article addresses the mutual needs of speech-language pathologists and educational interpreters in providing services to their students. Guidelines supported by recent research reports and survey data collected from interpreters are offered to speech-language pathologists as ways to improve the working relationships with educational interpreters in three areas: (a) evaluating a student's communication skills, (b) establishing treatment goals and intervening to meet those goals, and

  6. Double copper sheath multiconductor instrumentation cable is durable and easily installed in high thermal or nuclear radiation area

    NASA Technical Reports Server (NTRS)

    Mc Crae, A. W., Jr.

    1967-01-01

    Multiconductor instrumentation cable in which the conducting wires are routed through two concentric copper tube sheaths, employing a compressed insulator between the conductors and between the inner and outer sheaths, is durable and easily installed in high thermal or nuclear radiation area. The double sheath is a barrier against moisture, abrasion, and vibration.

  7. Smartphones for post-event analysis: a low-cost and easily accessible approach for mapping natural hazards

    NASA Astrophysics Data System (ADS)

    Tarolli, Paolo; Prosdocimi, Massimo; Sofia, Giulia; Dalla Fontana, Giancarlo

    2015-04-01

    A real opportunity and challenge for hazard mapping is offered by the use of smartphones and a low-cost, flexible photogrammetric technique ('Structure-from-Motion', SfM). Unlike traditional photogrammetric methods, SfM allows three-dimensional geometries (Digital Surface Models, DSMs) to be reconstructed from randomly acquired images. The images can be acquired by standalone digital cameras (compact or reflex), or even by smartphones' built-in cameras. This represents a "revolutionary" advance compared with more expensive technologies and applications (e.g. Terrestrial Laser Scanner TLS, airborne lidar) (Tarolli, 2014). Through fast, simple and consecutive field surveys, anyone with a smartphone can take numerous pictures of the same study area. This way, high-resolution and multi-temporal DSMs may be obtained and used to better monitor and understand erosion and deposition processes. Furthermore, these topographic data can also make it easier to quantify the volumes of material eroded by landslides and to recognize the major critical issues that usually arise during a natural hazard (e.g. river bank erosion and/or collapse due to floods). In this work we considered several case studies located in different environmental contexts of Italy, where extensive photosets were obtained using smartphones. TLS data were also considered in the analysis as a benchmark for comparison with SfM data. Digital Surface Models (DSMs) derived from SfM at centimeter grid-cell resolution proved effective for automatically recognizing areas subject to surface instabilities and for quantitatively estimating erosion and deposition volumes, for example. Morphometric indexes such as landform curvature and surface roughness, and statistical thresholds (e.g. standard deviation) of these indices, served as the basis for the proposed analyses. The results indicate that the SfM technique on smartphones offers a fast, simple and affordable alternative to lidar

  8. An interpretation of the nitrogen deficiency in comets

    NASA Astrophysics Data System (ADS)

    Iro, Nicolas; Gautier, Daniel; Hersant, Franck; Bockelée-Morvan, Dominique; Lunine, Jonathan I.

    2003-02-01

    We propose an interpretation of the composition of volatiles observed in comets based on their trapping in the form of clathrate hydrates in the solar nebula. The formation of clathrates is calculated from the statistical thermodynamics of Lunine and Stevenson (1985, Astrophys. J. Suppl. 58, 493-531), and occurs in an evolutionary turbulent solar nebula described by the model of Hersant et al. (2001, Astrophys. J. 554, 391-407). It is assumed that clathrate hydrates were incorporated into the icy grains that formed cometesimals. The strong depletion of the N2 molecule with respect to CO observed in some comets is explained by the fact that CO forms clathrate hydrates much more easily than does N2. The efficiency of this depletion, as well as the amount of trapped CO, depends upon the amount of water ice available in the region where the clathration took place. This might explain the diversity of CO abundances observed in comets. The same theory, applied to the trapping of volatiles around 5 AU, explains the enrichments in Ar, Kr, Xe, C, and N with respect to the solar abundance measured in the deep troposphere of Jupiter (Gautier et al., 2001a,b).

  9. Customizable tool for ecological data entry, assessment, monitoring, and interpretation

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The Database for Inventory, Monitoring and Assessment (DIMA) is a highly customizable tool for data entry, assessment, monitoring, and interpretation. DIMA is a Microsoft Access database that can easily be used without Access knowledge and is available at no cost. Data can be entered for common, nat...

  10. Large, Easily Deployable Structures

    NASA Technical Reports Server (NTRS)

    Agan, W. E.

    1983-01-01

    Study of concepts for large space structures will interest those designing scaffolding, radio towers, rescue equipment, and prefabricated shelters. Double-fold, double-cell module was selected for further design and for zero gravity testing. Concept is viable for deployment by humans outside space vehicle as well as by remotely operated manipulator.

  11. Modification of codes NUALGAM and BREMRAD. Volume 3: Statistical considerations of the Monte Carlo method

    NASA Technical Reports Server (NTRS)

    Firstenberg, H.

    1971-01-01

    The statistics of the Monte Carlo method are considered relative to the interpretation of the NUGAM2 and NUGAM3 computer code results. A numerical experiment using the NUGAM2 code is presented and the results are statistically interpreted.
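
    The core of such a statistical interpretation is that every Monte Carlo estimate carries a sampling error that shrinks as 1/sqrt(N). A minimal sketch of this idea (ours, unrelated to the NUGAM codes; the toy problem is invented):

        import numpy as np

        rng = np.random.default_rng(7)
        N = 100_000
        # Toy transport question: fraction of particles penetrating a slab of
        # 3 mean free paths without a collision (exact answer: exp(-3)).
        depths = rng.exponential(scale=1.0, size=N)
        hits = depths > 3.0

        p_hat = hits.mean()
        se = np.sqrt(p_hat * (1 - p_hat) / N)   # binomial standard error
        print(p_hat, "+/-", se, "exact:", np.exp(-3.0))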

  12. Programs for Training Interpreters.

    ERIC Educational Resources Information Center

    American Annals of the Deaf, 2003

    2003-01-01

    This listing provides directory information on U.S. programs for training interpreters for individuals with deafness. Schools are listed by state and include director and degree information. (Author/CR)

  13. Interpreting the X(5568)

    NASA Astrophysics Data System (ADS)

    Burns, T. J.; Swanson, E. S.

    2016-09-01

    A variety of options for interpreting the DØ state, X(5568), are examined. We find that threshold, cusp, molecular, and tetraquark models are all unfavoured. Several experimental tests for unravelling the nature of the signal are suggested.

  14. Interpretation of dental radiographs.

    PubMed

    Woodward, Tony M

    2009-02-01

    Interpretation of dental radiographs is fairly straightforward, with a handful of common patterns making up the majority of pathology. This article covers normal radiographic anatomy, endodontic disease, periodontal disease, neoplastic changes, tooth resorption, caries, and radiographic signs of oral trauma.

  15. Interpretation of Airphotos and Remotely Sensed Imagery

    NASA Astrophysics Data System (ADS)

    Ainsworth, Thomas L.; Jansen, Robert

    With the proliferation of easily accessible remotely sensed imagery over the last several years, image analysts from a wide variety of working environments are in high demand. These analysts do not always have advanced technical backgrounds in science. Robert Arnold's useful and timely laboratory manual serves as an adequate introduction to interpreting remotely sensed photographs and imagery. The book poses a graduated set of examples and questions with a generally increasing but low level of sophistication. It is easy to read, and considerable care has been exercised in the layout of the subject index and overall organization of the manual.

  16. Nationally consistent and easily-implemented approach to evaluate littoral-riparian habitat quality in lakes and reservoirs

    EPA Science Inventory

    The National Lakes Assessment (NLA) and other lake survey and monitoring efforts increasingly rely upon biological assemblage data to define lake condition. Information concerning the multiple dimensions of physical and chemical habitat is necessary to interpret this biological ...

  17. Polysomnography methods and interpretations.

    PubMed

    Rundell, O H; Jones, R K

    1990-08-01

    As the field of sleep disorders medicine continues to mature, appropriate diagnostic techniques are becoming properly defined and standardized. This article focuses principally upon diagnostic testing for sleep apnea, although other sleep disorders are discussed briefly. When interpreting a polysomnogram, one must consider a number of complex variables. A critical discussion of the methods for adequately measuring these variables is provided together with guidelines for appropriate interpretation.

  18. Interpreter-mediated dentistry.

    PubMed

    Bridges, Susan; Drew, Paul; Zayts, Olga; McGrath, Colman; Yiu, Cynthia K Y; Wong, H M; Au, T K F

    2015-05-01

    The global movements of healthcare professionals and patient populations have increased the complexities of medical interactions at the point of service. This study examines interpreter mediated talk in cross-cultural general dentistry in Hong Kong where assisting para-professionals, in this case bilingual or multilingual Dental Surgery Assistants (DSAs), perform the dual capabilities of clinical assistant and interpreter. An initial language use survey was conducted with Polyclinic DSAs (n = 41) using a logbook approach to provide self-report data on language use in clinics. Frequencies of mean scores using a 10-point visual analogue scale (VAS) indicated that the majority of DSAs spoke mainly Cantonese in clinics and interpreted for postgraduates and professors. Conversation Analysis (CA) examined recipient design across a corpus (n = 23) of video-recorded review consultations between non-Cantonese speaking expatriate dentists and their Cantonese L1 patients. Three patterns of mediated interpreting indicated were: dentist designated expansions; dentist initiated interpretations; and assistant initiated interpretations to both the dentist and patient. The third, rather than being perceived as negative, was found to be framed either in response to patient difficulties or within the specific task routines of general dentistry. The findings illustrate trends in dentistry towards personalized care and patient empowerment as a reaction to product delivery approaches to patient management. Implications are indicated for both treatment adherence and the education of dental professionals.

  19. SOCR: Statistics Online Computational Resource

    PubMed Central

    Dinov, Ivo D.

    2011-01-01

    The need for hands-on computer laboratory experience in undergraduate and graduate statistics education has been firmly established in the past decade. As a result a number of attempts have been undertaken to develop novel approaches for problem-driven statistical thinking, data analysis and result interpretation. In this paper we describe an integrated educational web-based framework for: interactive distribution modeling, virtual online probability experimentation, statistical data analysis, visualization and integration. Following years of experience in statistical teaching at all college levels using established licensed statistical software packages, like STATA, S-PLUS, R, SPSS, SAS, Systat, etc., we have attempted to engineer a new statistics education environment, the Statistics Online Computational Resource (SOCR). This resource performs many of the standard types of statistical analysis, much like other classical tools. In addition, it is designed in a plug-in object-oriented architecture and is completely platform independent, web-based, interactive, extensible and secure. Over the past 4 years we have tested, fine-tuned and reanalyzed the SOCR framework in many of our undergraduate and graduate probability and statistics courses and have evidence that SOCR resources build students' intuition and enhance their learning. PMID:21451741

  20. Hold My Calls: An Activity for Introducing the Statistical Process

    ERIC Educational Resources Information Center

    Abel, Todd; Poling, Lisa

    2015-01-01

    Working with practicing teachers, this article demonstrates, through the facilitation of a statistical activity, how to introduce and investigate the unique qualities of the statistical process including: formulate a question, collect data, analyze data, and interpret data.

  1. Statistics in fusion experiments

    NASA Astrophysics Data System (ADS)

    McNeill, D. H.

    1997-11-01

    Since the reasons for the variability in data from plasma experiments are often unknown or uncontrollable, statistical methods must be applied. Reliable interpretation and public accountability require full data sets. Two examples of data misrepresentation at PPPL are analyzed: (1) Te > 100 eV on the S-1 spheromak (M. Yamada, Nucl. Fusion 25, 1327 (1985); reports to DoE; etc.). The reported high values (statistical artifacts of Thomson scattering measurements) were selected from a mass of data with an average of 40 eV or less. "Correlated" spectroscopic data were meaningless. (2) Extrapolation to Q >= 0.5 for DT in TFTR (D. Meade et al., IAEA Baltimore (1990), V. 1, p. 9; H. P. Furth, Statements to U. S. Congress (1989)). The DD yield used there was the highest through 1990 (>= 50% above average) and the DT to DD power ratio used was about twice any published value. Average DD yields and published yield ratios scale to Q < 0.15 for DT, in accord with the observed performance over the last 3 1/2 years. Press reports of outlier data from TFTR have obscured the fact that the DT behavior follows from trivial scaling of the DD data. Good practice in future fusion research would have confidence intervals and other descriptive statistics accompanying reported numerical values (cf. JAMA).

  2. Is the statistic value all we should care about in neuroimaging?

    PubMed

    Chen, Gang; Taylor, Paul A; Cox, Robert W

    2017-02-15

    Here we address an important issue that has been embedded within the neuroimaging community for a long time: the absence of effect estimates in results reporting in the literature. The statistic value itself, as a dimensionless measure, does not provide information on the biophysical interpretation of a study, and it certainly does not represent the whole picture of a study. Unfortunately, in contrast to standard practice in most scientific fields, effect (or amplitude) estimates are usually not provided in most results reporting in the current neuroimaging publications and presentations. Possible reasons underlying this general trend include (1) lack of general awareness, (2) software limitations, (3) inaccurate estimation of the BOLD response, and (4) poor modeling due to our relatively limited understanding of FMRI signal components. However, as we discuss here, such reporting damages the reliability and interpretability of the scientific findings themselves, and there is in fact no overwhelming reason for such a practice to persist. In order to promote meaningful interpretation, cross validation, reproducibility, meta and power analyses in neuroimaging, we strongly suggest that, as part of good scientific practice, effect estimates should be reported together with their corresponding statistic values. We provide several easily adaptable recommendations for facilitating this process.
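
    The recommendation is easy to follow in practice. A minimal regression sketch (our illustration, not the authors' pipeline; the data are simulated) reports the effect estimate and its standard error alongside the t-statistic:

        import numpy as np

        rng = np.random.default_rng(3)
        x = rng.normal(size=200)             # e.g., a task regressor
        y = 0.8 * x + rng.normal(size=200)   # e.g., a BOLD-like response

        X = np.column_stack([np.ones_like(x), x])
        beta, res, *_ = np.linalg.lstsq(X, y, rcond=None)
        dof = len(y) - X.shape[1]
        sigma2 = res[0] / dof
        se = np.sqrt(sigma2 * np.linalg.inv(X.T @ X)[1, 1])
        t = beta[1] / se

        # Report the effect (in the data's units) together with the statistic.
        print(f"effect = {beta[1]:.3f} (SE {se:.3f}), t({dof}) = {t:.2f}")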

  3. Geological interpretation of a Gemini photo

    USGS Publications Warehouse

    Hemphill, William R.; Danilchik, Walter

    1968-01-01

    Study of the Gemini V photograph of the Salt Range and Potwar Plateau, West Pakistan, indicates that small-scale orbital photographs permit recognition of the regional continuity of some geologic features, particularly faults and folds that could be easily overlooked on conventional air photographs of larger scale. Some stratigraphic relationships can also be recognized on the orbital photograph, but with only minimal previous geologic knowledge of the area, these interpretations are less conclusive or reliable than the interpretation of structure. It is suggested that improved atmospheric penetration could be achieved through the use of color infrared film. Photographic expression of topography could also be improved by deliberately photographing some areas during periods of low sun angle.

  4. Highly concentrated synthesis of copper-zinc-tin-sulfide nanocrystals with easily decomposable capping molecules for printed photovoltaic applications.

    PubMed

    Kim, Youngwoo; Woo, Kyoohee; Kim, Inhyuk; Cho, Yong Soo; Jeong, Sunho; Moon, Jooho

    2013-11-07

    Among various candidate materials, Cu2ZnSnS4 (CZTS) is a promising earth-abundant semiconductor for low-cost thin film solar cells. We report a facile, less toxic, highly concentrated synthetic method utilizing the heretofore unrecognized, easily decomposable capping ligand of triphenylphosphate, where phase-pure, single-crystalline, and well-dispersed colloidal CZTS nanocrystals were obtained. The favorable influence of the easily decomposable capping ligand on the microstructural evolution of device-quality CZTS absorber layers was clarified based on a comparative study with commonly used oleylamine-capped CZTS nanoparticles. The resulting CZTS nanoparticles enabled us to produce a dense and crack-free absorbing layer through annealing under a N2 + H2S (4%) atmosphere, demonstrating a solar cell with an efficiency of 3.6% under AM 1.5 illumination.

  5. Ta3N5-Pt nonwoven cloth with hierarchical nanopores as efficient and easily recyclable macroscale photocatalysts

    PubMed Central

    Li, Shijie; Zhang, Lisha; Wang, Huanli; Chen, Zhigang; Hu, Junqing; Xu, Kaibing; Liu, Jianshe

    2014-01-01

    Traditional nanosized photocatalysts usually have high photocatalytic activity but can not be efficiently recycled. Film-shaped photocatalysts on the substrates can be easily recycled, but they have low surface area and/or high production cost. To solve these problems, we report on the design and preparation of efficient and easily recyclable macroscale photocatalysts with nanostructure by using Ta3N5 as a model semiconductor. Ta3N5-Pt nonwoven cloth has been prepared by an electrospinning-calcination-nitridation-wet impregnation method, and it is composed of Ta3N5 fibers with diameter of 150–200 nm and hierarchical pores. Furthermore, these fibers are constructed from Ta3N5 nanoparticles with diameter of ~25 nm which are decorated with Pt nanoparticles with diameter of ~2.5 nm. Importantly, Ta3N5-Pt cloth can be used as an efficient and easily recyclable macroscale photocatalyst with wide visible-light response, for the degradation of methylene blue and parachlorophenol, probably resulting in a very promising application as “photocatalyst dam” for the polluted river. PMID:24496147

  6. Ta3N5-Pt nonwoven cloth with hierarchical nanopores as efficient and easily recyclable macroscale photocatalysts

    NASA Astrophysics Data System (ADS)

    Li, Shijie; Zhang, Lisha; Wang, Huanli; Chen, Zhigang; Hu, Junqing; Xu, Kaibing; Liu, Jianshe

    2014-02-01

    Traditional nanosized photocatalysts usually have high photocatalytic activity but can not be efficiently recycled. Film-shaped photocatalysts on the substrates can be easily recycled, but they have low surface area and/or high production cost. To solve these problems, we report on the design and preparation of efficient and easily recyclable macroscale photocatalysts with nanostructure by using Ta3N5 as a model semiconductor. Ta3N5-Pt nonwoven cloth has been prepared by an electrospinning-calcination-nitridation-wet impregnation method, and it is composed of Ta3N5 fibers with diameter of 150-200 nm and hierarchical pores. Furthermore, these fibers are constructed from Ta3N5 nanoparticles with diameter of ~25 nm which are decorated with Pt nanoparticles with diameter of ~2.5 nm. Importantly, Ta3N5-Pt cloth can be used as an efficient and easily recyclable macroscale photocatalyst with wide visible-light response, for the degradation of methylene blue and parachlorophenol, probably resulting in a very promising application as ``photocatalyst dam'' for the polluted river.

  7. Ta3N5-Pt nonwoven cloth with hierarchical nanopores as efficient and easily recyclable macroscale photocatalysts.

    PubMed

    Li, Shijie; Zhang, Lisha; Wang, Huanli; Chen, Zhigang; Hu, Junqing; Xu, Kaibing; Liu, Jianshe

    2014-02-05

    Traditional nanosized photocatalysts usually have high photocatalytic activity but can not be efficiently recycled. Film-shaped photocatalysts on the substrates can be easily recycled, but they have low surface area and/or high production cost. To solve these problems, we report on the design and preparation of efficient and easily recyclable macroscale photocatalysts with nanostructure by using Ta3N5 as a model semiconductor. Ta3N5-Pt nonwoven cloth has been prepared by an electrospinning-calcination-nitridation-wet impregnation method, and it is composed of Ta3N5 fibers with diameter of 150-200 nm and hierarchical pores. Furthermore, these fibers are constructed from Ta3N5 nanoparticles with diameter of ~25 nm which are decorated with Pt nanoparticles with diameter of ~2.5 nm. Importantly, Ta3N5-Pt cloth can be used as an efficient and easily recyclable macroscale photocatalyst with wide visible-light response, for the degradation of methylene blue and parachlorophenol, probably resulting in a very promising application as "photocatalyst dam" for the polluted river.

  8. Considerations When Working with Interpreters.

    ERIC Educational Resources Information Center

    Hwa-Froelich, Deborah A.; Westby, Carol E.

    2003-01-01

    This article describes the current training and certification procedures in place for linguistic interpreters, the continuum of interpreter roles, and how interpreters' perspectives may influence the interpretive interaction. The specific skills needed for interpreting in either health care or educational settings are identified. A table compares…

  9. Stupid statistics!

    PubMed

    Tellinghuisen, Joel

    2008-01-01

    The method of least squares is probably the most powerful data analysis tool available to scientists. Toward a fuller appreciation of that power, this work begins with an elementary review of statistics fundamentals, and then progressively increases in sophistication as the coverage is extended to the theory and practice of linear and nonlinear least squares. The results are illustrated in application to data analysis problems important in the life sciences. The review of fundamentals includes the role of sampling and its connection to probability distributions, the Central Limit Theorem, and the importance of finite variance. Linear least squares are presented using matrix notation, and the significance of the key probability distributions (Gaussian, chi-square, and t) is illustrated with Monte Carlo calculations. The meaning of correlation is discussed, including its role in the propagation of error. When the data themselves are correlated, special methods are needed for the fitting, as they are also when fitting with constraints. Nonlinear fitting gives rise to nonnormal parameter distributions, but the 10% Rule of Thumb suggests that such problems will be insignificant when the parameter is sufficiently well determined. Illustrations include calibration with linear and nonlinear response functions, the dangers inherent in fitting inverted data (e.g., Lineweaver-Burk equation), an analysis of the reliability of the van't Hoff analysis, the problem of correlated data in the Guggenheim method, and the optimization of isothermal titration calorimetry procedures using the variance-covariance matrix for experiment design. The work concludes with illustrations on assessing and presenting results.
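
    As a small worked example of the linear case in matrix notation (ours, not from the text; the calibration data are invented), weighted least squares yields both the parameters and the variance-covariance matrix used for error propagation and experiment design:

        import numpy as np

        x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
        y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])
        sigma = np.full_like(y, 0.2)        # known measurement uncertainties

        X = np.column_stack([np.ones_like(x), x])  # design matrix for y = a + b*x
        W = np.diag(1.0 / sigma**2)                # weights = 1 / variance

        cov = np.linalg.inv(X.T @ W @ X)    # variance-covariance matrix of (a, b)
        a, b = cov @ X.T @ W @ y            # weighted least-squares estimates

        print("a, b       =", a, b)
        print("std errors =", np.sqrt(np.diag(cov)))
        print("corr(a, b) =", cov[0, 1] / np.sqrt(cov[0, 0] * cov[1, 1]))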

  10. Screencast Tutorials Enhance Student Learning of Statistics

    ERIC Educational Resources Information Center

    Lloyd, Steven A.; Robertson, Chuck L.

    2012-01-01

    Although the use of computer-assisted instruction has rapidly increased, there is little empirical research evaluating these technologies, specifically within the context of teaching statistics. The authors assessed the effect of screencast tutorials on learning outcomes, including statistical knowledge, application, and interpretation. Students…

  11. How to spot a statistical problem: advice for a non-statistical reviewer.

    PubMed

    Greenwood, Darren C; Freeman, Jennifer V

    2015-11-02

    Statistical analyses presented in general medical journals are becoming increasingly sophisticated. BMC Medicine relies on subject reviewers to indicate when a statistical review is required. We consider this policy and provide guidance on when to recommend a manuscript for statistical evaluation. Indicators for statistical review include insufficient detail in methods or results, some common statistical issues and interpretation not based on the presented evidence. Reviewers are required to ensure that the manuscript is methodologically sound and clearly written. Within that context, they are expected to provide constructive feedback and opinion on the statistical design, analysis, presentation and interpretation. If reviewers lack the appropriate background to positively confirm the appropriateness of any of the manuscript's statistical aspects, they are encouraged to recommend it for expert statistical review.

  12. Statistical ecology comes of age.

    PubMed

    Gimenez, Olivier; Buckland, Stephen T; Morgan, Byron J T; Bez, Nicolas; Bertrand, Sophie; Choquet, Rémi; Dray, Stéphane; Etienne, Marie-Pierre; Fewster, Rachel; Gosselin, Frédéric; Mérigot, Bastien; Monestiez, Pascal; Morales, Juan M; Mortier, Frédéric; Munoz, François; Ovaskainen, Otso; Pavoine, Sandrine; Pradel, Roger; Schurr, Frank M; Thomas, Len; Thuiller, Wilfried; Trenkel, Verena; de Valpine, Perry; Rexstad, Eric

    2014-12-01

    The desire to predict the consequences of global environmental change has been the driver towards more realistic models embracing the variability and uncertainties inherent in ecology. Statistical ecology has gelled over the past decade as a discipline that moves away from describing patterns towards modelling the ecological processes that generate these patterns. Following the fourth International Statistical Ecology Conference (1-4 July 2014) in Montpellier, France, we analyse current trends in statistical ecology. Important advances in the analysis of individual movement, and in the modelling of population dynamics and species distributions, are made possible by the increasing use of hierarchical and hidden process models. Exciting research perspectives include the development of methods to interpret citizen science data and of efficient, flexible computational algorithms for model fitting. Statistical ecology has come of age: it now provides a general and mathematically rigorous framework linking ecological theory and empirical data.

  13. The ADAMS interactive interpreter

    SciTech Connect

    Rietscha, E.R.

    1990-12-17

    The ADAMS (Advanced DAta Management System) project is exploring next generation database technology. Database management does not follow the usual programming paradigm. Instead, the database dictionary provides an additional name space environment that should be interactively created and tested before writing application code. This document describes the implementation and operation of the ADAMS Interpreter, an interactive interface to the ADAMS data dictionary and runtime system. The Interpreter executes individual statements of the ADAMS Interface Language, providing a fast, interactive mechanism to define and access persistent databases. 5 refs.

  14. Statistical methods to monitor the West Valley off-gas system

    SciTech Connect

    Eggett, D.L.

    1990-10-01

    The off-gas system for the ceramic melter operated at the West Valley Demonstration Project at West Valley, NY, is monitored during melter operation. A one-at-a-time method of monitoring the parameters of the off-gas system is not statistically sound. Therefore, multivariate statistical methods appropriate for the monitoring of many correlated parameters will be used. Monitoring a large number of parameters increases the probability of a false out-of-control signal. If the parameters being monitored are statistically independent, the control limits can be easily adjusted to obtain the desired probability of a false out-of-control signal. However, a high degree of correlation generally exists among the parameters being monitored in the off-gas system. This makes it very difficult to control the probability of false calls (saying the system is out-of-control when it is in-control or saying the system is in-control when it is actually out-of-control). The interpretation of the individual control charts is difficult in the presence of correlation among the variables. When a high degree of correlation exists, variable reduction techniques can be used to reduce the number of parameters. Principal components have been used as a variable reduction technique. The principal component (PC) scores have desirable statistical properties when the original variables are distributed as multivariate normals. Two statistics derived from the PC scores and used to form multivariate control charts are outlined and their distributional properties reviewed. 2 refs., 2 figs.
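
    A minimal sketch of the idea (our construction; the report's actual charts and data are not reproduced here) computes the two standard PC-score statistics, Hotelling's T^2 on the retained scores and the squared prediction error (SPE, or Q) on the residuals:

        import numpy as np

        rng = np.random.default_rng(5)
        # Training set: 200 in-control observations of 6 correlated parameters.
        base = rng.normal(size=(200, 2))
        X = np.hstack([base, base @ rng.normal(size=(2, 4))])
        X += 0.1 * rng.normal(size=(200, 6))

        mu, sd = X.mean(0), X.std(0)
        Z = (X - mu) / sd
        _, s, Vt = np.linalg.svd(Z, full_matrices=False)
        k = 2                                # retain two principal components
        P = Vt[:k].T                         # loadings
        lam = s[:k] ** 2 / (len(Z) - 1)      # variances of the PC scores

        def monitor(xnew):
            z = (xnew - mu) / sd
            t = z @ P                        # PC scores of the new observation
            T2 = np.sum(t**2 / lam)          # Hotelling's T^2 statistic
            spe = np.sum((z - t @ P.T)**2)   # residual (SPE / Q) statistic
            return T2, spe

        print(monitor(X[0]))                 # in-control point: both small
        print(monitor(X[0] + 3 * sd))        # shifted point: both jump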

  15. Plastic Surgery Statistics

    MedlinePlus

    Plastic surgery procedural statistics from the ... Annual reports: Plastic Surgery Statistics 2005, Plastic Surgery Statistics 2016, Plastic Surgery Statistics Stats Report 2016. National Clearinghouse of ...

  16. Ranald Macdonald and statistical inference.

    PubMed

    Smith, Philip T

    2009-05-01

    Ranald Roderick Macdonald (1945-2007) was an important contributor to mathematical psychology in the UK, as a referee and action editor for British Journal of Mathematical and Statistical Psychology and as a participant and organizer at the British Psychological Society's Mathematics, statistics and computing section meetings. This appreciation argues that his most important contribution was to the foundations of significance testing, where his concern about what information was relevant in interpreting the results of significance tests led him to be a persuasive advocate for the 'Weak Fisherian' form of hypothesis testing.

  17. Linking numbers, spin, and statistics of solitons

    NASA Technical Reports Server (NTRS)

    Wilczek, F.; Zee, A.

    1983-01-01

    The spin and statistics of solitons in the (2 + 1)- and (3 + 1)-dimensional nonlinear sigma models are considered. For the (2 + 1)-dimensional case, there is the possibility of fractional spin and exotic statistics; for 3 + 1 dimensions, the usual spin-statistics relation is demonstrated. The linking-number interpretation of the Hopf invariant and the use of suspension considerably simplify the analysis.

  18. Interpreting & Biomechanics. PEPNet Tipsheet

    ERIC Educational Resources Information Center

    PEPNet-Northeast, 2001

    2001-01-01

    Cumulative trauma disorder (CTD) refers to a collection of disorders associated with nerves, muscles, tendons, bones, and the neurovascular (nerves and related blood vessels) system. CTD symptoms may involve the neck, back, shoulders, arms, wrists, or hands. Interpreters with CTD may experience a variety of symptoms including: pain, joint…

  19. Psychosemantics and Simultaneous Interpretation.

    ERIC Educational Resources Information Center

    Le Ny, Jean-Francois

    A comprehension model of simultaneous interpretation activity raises three types of problems: structure of semantic information stored in long-term memory, modalities of input processing and specific restrictions due to situation. A useful concept of semantic mnesic structures includes: (1) a componential-predicative lexicon; (2) a propositional…

  20. Interpreting the Constitution.

    ERIC Educational Resources Information Center

    Brennan, William J., Jr.

    1987-01-01

    Discusses constitutional interpretations relating to capital punishment and protection of human dignity. Points out the document's effectiveness in creating a new society by adapting its principles to current problems and needs. Considers two views of the Constitution that lead to controversy over the legitimacy of judicial decisions. (PS)

  1. Fractal interpretation of intermittency

    SciTech Connect

    Hwa, R.C.

    1991-12-01

    The implications of intermittency in high-energy collisions are first discussed. A description of the fractal interpretation of intermittency follows. A basic quantity with asymptotic fractal behavior is introduced. It is then shown how the factorial moments and the G moments can be expressed in terms of it. The relationship between the intermittency indices and the fractal indices is made explicit.
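
    For reference, the normalized factorial moments in question, F_q = <n(n-1)...(n-q+1)>/<n>^q, are simple to compute from binned multiplicities; a minimal sketch (ours, with simulated Poisson data, for which F_q = 1):

        import numpy as np

        def factorial_moment(counts, q):
            """Normalized factorial moment F_q of bin multiplicities."""
            n = counts.astype(float)
            num = np.ones_like(n)
            for i in range(q):
                num *= np.clip(n - i, 0.0, None)   # n(n-1)...(n-q+1)
            return num.mean() / n.mean() ** q

        rng = np.random.default_rng(4)
        counts = rng.poisson(5.0, size=(10_000, 32))   # (events, bins)
        print(factorial_moment(counts, 2))  # ~1 for Poisson; >1 signals clustering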

  2. Comprehensions and Interpretations.

    ERIC Educational Resources Information Center

    Urquhart, Alexander H.

    1987-01-01

    Argues that second-language reading comprehension and its assessment can be usefully divided into two aspects: (1) comprehensions (different levels of comprehension the reader adopts to suit different purposes of reading); and (2) interpretations (different readings of the same text resulting from different background knowledge or preoccupations…

  3. Interpreting Contradictory Communications.

    ERIC Educational Resources Information Center

    Lightfoot, Cynthia

    Preschool children, elementary school students, and adults participated in a study that examined various processes used to interpret contradictory communications. A screening test determined that all subjects were capable of discriminating between contradictory and congruent communications. Subjects were presented with contradictory verbal-facial…

  4. Deafness and Interpreting.

    ERIC Educational Resources Information Center

    New Jersey State Dept. of Labor, Trenton. Div. of the Deaf.

    This paper explains how the hearing loss of deaf persons affects communication, describes methods deaf individuals use to communicate, and addresses the role of interpreters in the communication process. The volume covers: communication methods such as speechreading or lipreading, written notes, gestures, or sign language (American Sign Language,…

  5. Social Maladjustment: An Interpretation.

    ERIC Educational Resources Information Center

    Center, David B.

    The exclusionary term, "social maladjustment," the definition in Public Law 94-142 (the Education for All Handicapped Children Act) of serious emotional disturbance, has been an enigma for special education. This paper attempts to limit the interpretation of social maladjustment in order to counter effects of such decisions as…

  6. Sorting chromatic sextupoles for easily and effectively correcting second order chromaticity in the Relativistic Heavy Ion Collider

    SciTech Connect

    Luo,Y.; Tepikian, S.; Fischer, W.; Robert-Demolaize, G.; Trbojevic, D.

    2009-01-02

    Based on the contributions of the chromatic sextupole families to the half-integer resonance driving terms, we discuss how to sort the chromatic sextupoles in the arcs of the Relativistic Heavy Ion Collider (RHIC) to easily and effectively correct the second order chromaticities. We propose a method with 4 knobs corresponding to 4 pairs of chromatic sextupole families to correct the second order chromaticities online. Numerical simulation justifies this method, showing that it reduces the unbalance in the correction strengths of the sextupole families and avoids the reversal of sextupole polarities. Therefore, this method yields larger dynamic apertures for the proposed RHIC 2009 100 GeV polarized proton run lattices.

  7. Development and validation of a quick easily used biochemical assay for evaluating the viability of small immobile arthropods.

    PubMed

    Phillips, Craig B; Iline, Ilia I; Richards, Nicola K; Novoselov, Max; McNeill, Mark R

    2013-10-01

    Quickly, accurately, and easily assessing the efficacy of treatments to control sessile arthropods (e.g., scale insects) and stationary immature life stages (e.g., eggs and pupae) is problematic because it is difficult to tell whether treated organisms are alive or dead. Current approaches usually involve either maintaining organisms in the laboratory to observe them for development, gauging their response to physical stimulation, or assessing morphological characters such as turgidity and color. These can be slow, technically difficult, or subjective, and the validity of methods other than laboratory rearing has seldom been tested. Here, we describe the development and validation of a quick, easily used biochemical colorimetric assay for measuring the viability of arthropods that is sufficiently sensitive to test even very small organisms such as whitefly eggs. The assay was adapted from a technique for staining the enzyme hexokinase to signal the presence of adenosine triphosphate in viable specimens by reducing a tetrazolium salt to formazan. Basic laboratory facilities and skills are required for production of the stain, but no specialist equipment, expertise, or facilities are needed for its use.

  8. Evaluation of Psychotherapeutic Interpretations

    PubMed Central

    POGGE, DAVID L.; DOUGHER, MICHAEL J.

    1992-01-01

    If much psychotherapy literature goes unread and unused by therapists, one reason may be the apparent irrelevance of theory-derived hypotheses to actual practice. Methods that uncover tacit knowledge that practicing therapists already possess can provide the empirical basis for more relevant theories and the testing of more meaningful hypotheses. This study demonstrates application of the phenomenological method to the question of evaluating psychotherapy. To discover how experienced psychotherapists evaluate interpretations made in actual psychotherapy sessions, therapists were asked to evaluate such interpretations from videotapes; analysis of responses yielded a set of 10 dimensions of evaluation. Such methods offer both practical utility and a source of theoretical growth anchored in the real world of the practicing therapist. PMID:22700101

  9. Tips for Mental Health Interpretation

    ERIC Educational Resources Information Center

    Whitsett, Margaret

    2008-01-01

    This paper offers tips for working with interpreters in mental health settings. These tips include: (1) Using trained interpreters, not bilingual staff or community members; (2) Explaining "interpreting procedures" to the providers and clients; (3) Addressing the stigma associated with mental health that may influence interpreters; (4) Defining…

  10. Interpreting uncertainty terms.

    PubMed

    Holtgraves, Thomas

    2014-08-01

    Uncertainty terms (e.g., some, possible, good, etc.) are words that do not have a fixed referent and hence are relatively ambiguous. A model is proposed that specifies how, from the hearer's perspective, recognition of facework as a potential motive for the use of an uncertainty term results in a calibration of the intended meaning of that term. Four experiments are reported that examine the impact of face threat, and the variables that affect it (e.g., power), on the manner in which a variety of uncertainty terms (probability terms, quantifiers, frequency terms, etc.) are interpreted. Overall, the results demonstrate that increased face threat in a situation will result in a more negative interpretation of an utterance containing an uncertainty term. That the interpretation of so many different types of uncertainty terms is affected in the same way suggests the operation of a fundamental principle of language use, one with important implications for the communication of risk, subjective experience, and so on.

  11. Interpretation and creationism.

    PubMed

    Ahumada, J L

    1994-08-01

    This paper is an attempt to raise questions about certain underlying and implicit assumptions in some hermeneutic and narrative approaches to psychoanalysis. Starting from the view that Freud saw interpretation in the clinical setting as an attempt to unveil the analysand's psychic reality, it is argued that he envisaged that psychoanalysis aims to interpret what is real in the analysand's inner world--an empirical line of thought underpinned by the idea of analytic neutrality and an emphasis on the analysand's capacity to judge reality. By contrast, the tendency within the hermeneutic-narrative tradition is to demote psychic reality in favour of an emphasis on the analyst's capacity to interpret in order to help his analysand construct meaning. This approach may be said to put the analyst's words in the place of those of the Creator; in other words, it amounts to a 'verbal creationism', which the author argues is rooted in the idealistic philosophy of Hegel, Vico and Descartes and, further back, can be traced to the Book of Genesis--a conclusion causing the author to express some reservations.

  12. Statistical modeling of SAR images: a survey.

    PubMed

    Gao, Gui

    2010-01-01

    Statistical modeling is essential to SAR (Synthetic Aperture Radar) image interpretation. It aims to describe SAR images through statistical methods and to reveal the characteristics of these images. Moreover, statistical modeling can provide technical support for a comprehensive understanding of terrain scattering mechanisms, which helps in developing algorithms for effective image interpretation and credible image simulation. Numerous statistical models have been developed to describe SAR image data, and the purpose of this paper is to categorize and evaluate these models. We first summarize the development history and the current state of research in statistical modeling; the different SAR image models developed from the product model are then discussed in detail. Related issues are also addressed, and several promising directions for future research are outlined.
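
    To make the product model concrete, here is a minimal Python sketch (not from the survey; all parameters are illustrative): single-look SAR intensity simulated as the product of a gamma-distributed texture component and a unit-mean exponential speckle component, which yields the K-distributed intensity that the product-model family is built around.

```python
# Minimal product-model simulation: I = T * S with gamma texture and
# unit-mean exponential speckle gives K-distributed single-look intensity.
# Parameter values here are illustrative, not taken from the survey.
import numpy as np

rng = np.random.default_rng(0)

def k_distributed_intensity(size, nu=4.0, mean_intensity=1.0):
    """Single-look SAR intensity under the product model."""
    texture = rng.gamma(shape=nu, scale=mean_intensity / nu, size=size)  # E[T] = mean_intensity
    speckle = rng.exponential(scale=1.0, size=size)                      # unit-mean speckle
    return texture * speckle

img = k_distributed_intensity((256, 256), nu=4.0)
print(f"mean {img.mean():.3f}, variance {img.var():.3f}")  # heavier-tailed than exponential
```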

  13. The insignificance of statistical significance testing

    USGS Publications Warehouse

    Johnson, Douglas H.

    1999-01-01

    Despite their use in scientific journals such as The Journal of Wildlife Management, statistical hypothesis tests add very little value to the products of research. Indeed, they frequently confuse the interpretation of data. This paper describes how statistical hypothesis tests are often viewed, and then contrasts that interpretation with the correct one. I discuss the arbitrariness of P-values, conclusions that the null hypothesis is true, power analysis, and distinctions between statistical and biological significance. Statistical hypothesis testing, in which the null hypothesis about the properties of a population is almost always known a priori to be false, is contrasted with scientific hypothesis testing, which examines a credible null hypothesis about phenomena in nature. More meaningful alternatives are briefly outlined, including estimation and confidence intervals for determining the importance of factors, decision theory for guiding actions in the face of uncertainty, and Bayesian approaches to hypothesis testing and other statistical practices.
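
    The paper's distinction between statistical and biological significance is easy to demonstrate numerically. The following sketch (synthetic data, not from the paper) shows that with a large enough sample a biologically trivial difference in means yields a tiny P-value, while the confidence interval for the difference makes its unimportance plain.

```python
# Synthetic demonstration: a negligible mean difference becomes "highly
# significant" with a large sample, while the confidence interval shows
# how small the effect actually is.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 100_000
a = rng.normal(0.00, 1.0, size=n)
b = rng.normal(0.02, 1.0, size=n)           # true difference of 0.02 SD

t, p = stats.ttest_ind(a, b)
diff = b.mean() - a.mean()
se = np.sqrt(a.var(ddof=1) / n + b.var(ddof=1) / n)
print(f"P = {p:.1e}")                        # tiny, hence "significant"
print(f"difference = {diff:.4f}, 95% CI = ({diff - 1.96*se:.4f}, {diff + 1.96*se:.4f})")
```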

  14. Improved chitosan-mediated gene delivery based on easily dissociated chitosan polyplexes of highly defined chitosan oligomers.

    PubMed

    Köping-Höggård, M; Vårum, K M; Issa, M; Danielsen, S; Christensen, B E; Stokke, B T; Artursson, P

    2004-10-01

    Nonviral gene delivery systems based on conventional high-molecular-weight chitosans are efficient after lung administration in vivo, but have poor physical properties, such as aggregated shapes, low solubility at neutral pH, high viscosity at the concentrations used for in vivo delivery, and slow dissociation and release of plasmid DNA, resulting in a slow onset of action. We therefore developed highly effective nonviral gene delivery systems with improved physical properties from a series of chitosan oligomers ranging in molecular weight from 1.2 to 10 kDa. First, we established structure-property relationships with regard to polyplex formation and in vivo efficiency after lung administration to mice. In a second step, we isolated chitosan oligomers from a preferred oligomer fraction to obtain fractions, ranging from 10- to 50-mers, of more homogeneous size distributions with polydispersities from 1.01 to 1.09. Polyplexes based on chitosan oligomers dissociated more easily than those of a high-molecular-weight ultrapure chitosan (UPC, approximately a 1000-mer), and released pDNA in the presence of anionic heparin. The more easily dissociated polyplexes mediated a faster onset of action and gave higher gene expression both in 293 cells in vitro and after lung administration in vivo, as compared to the more stable UPC polyplexes. As early as 24 h after intratracheal administration, a 120- to 260-fold higher luciferase gene expression was observed in the mouse lung in vivo compared to UPC. The gene expression in the lung was comparable to that of PEI (respective AUCs of 2756+/-710 and 3320+/-871 pg luciferase x days/mg of total lung protein). In conclusion, a major improvement of chitosan-mediated nonviral gene delivery to the lung was obtained by using polyplexes of well-defined chitosan oligomers. Polyplexes of oligomer fractions also had physicochemical properties superior to those of commonly used high-molecular-weight UPC.

  15. Can retrohepatic tunnel be quickly and easily established for laparoscopic liver hanging maneuver by Goldfinger dissector in laparoscopic right hepatectomy?

    PubMed Central

    Cai, Liu-xin; Wei, Fang-qiang; Yu, Yi-chen; Cai, Xiu-jun

    2016-01-01

    Objective: The liver hanging maneuver (LHM) is rarely applied in laparoscopic right hepatectomy (LRH) because of the difficulty encountered in retrohepatic tunnel (RT) dissection and tape positioning. Thus far no report has detailed how to quickly and easily establish RT for laparoscopic LHM in LRH, nor has employment of the Goldfinger dissector to create a total RT been reported. This study’s aim was to evaluate the safety and feasibility of establishing RT for laparoscopic LHM using the Goldfinger dissector in LRH. Methods: Between March 2015 and July 2015, five consecutive patients underwent LRH via the caudal approach with laparoscopic LHM. A five-step strategy using the Goldfinger dissector to establish RT for laparoscopic LHM was adopted. Perioperative data were analyzed. Results: The median age of patients was 58 (range, 51–65) years. Surgery was performed for one intrahepatic lithiasis and four hepatocellular carcinomas with a median size of 90 (40–150) mm. The median operative time was 320 (282–358) min with a median blood loss of 200 (200–600) ml. Laparoscopic LHM was achieved in a median of 31 (21–62) min, and the median postoperative hospital stay was 14 (9–16) d. No transfusion or conversion was required, and no severe liver-related morbidity or death was observed. Conclusions: The Goldfinger dissector is a useful instrument for the establishment of RT. A five-step strategy using the Goldfinger dissector can quickly and easily facilitate an RT for a laparoscopic LHM in LRH. PMID:27604863

  16. MQSA National Statistics

    MedlinePlus

    Archived MQSA scorecard statistics are available for 2015, 2016, and 2017.

  17. Data Interpretation: Using Probability

    ERIC Educational Resources Information Center

    Drummond, Gordon B.; Vowler, Sarah L.

    2011-01-01

    Experimental data are analysed statistically to allow researchers to draw conclusions from a limited set of measurements. The hard fact is that researchers can never be certain that measurements from a sample will exactly reflect the properties of the entire group of possible candidates available to be studied (although using a sample is often the…

  18. Formalism and Interpretation in Quantum Theory

    NASA Astrophysics Data System (ADS)

    Wilce, Alexander

    2010-04-01

    Quantum Mechanics can be viewed as a linear dynamical theory having a familiar mathematical framework but a mysterious probabilistic interpretation, or as a probabilistic theory having a familiar interpretation but a mysterious formal framework. These points of view are usually taken to be somewhat in tension with one another. The first has generated a vast literature aiming at a “realistic” and “collapse-free” interpretation of quantum mechanics that will account for its statistical predictions. The second has generated an at least equally large literature aiming to derive, or at any rate motivate, the formal structure of quantum theory in probabilistically intelligible terms. In this paper I explore, in a preliminary way, the possibility that these two programmes have something to offer one another. In particular, I show that a version of the measurement problem occurs in essentially any non-classical probabilistic theory, and ask to what extent various interpretations of quantum mechanics continue to make sense in such a general setting. I make a start on answering this question in the case of a rudimentary version of the Everett interpretation.

  19. Cancer Survival: An Overview of Measures, Uses, and Interpretation

    PubMed Central

    Noone, Anne-Michelle; Howlader, Nadia; Cho, Hyunsoon; Keel, Gretchen E.; Garshell, Jessica; Woloshin, Steven; Schwartz, Lisa M.

    2014-01-01

    Survival statistics are of great interest to patients, clinicians, researchers, and policy makers. Although seemingly simple, survival can be confusing: there are many different survival measures, with a plethora of names, and statistical methods developed to answer different questions. This paper aims to describe and disseminate different survival measures and their interpretation in less technical language. In addition, we introduce templates to summarize cancer survival statistics organized by their specific purpose: research and policy versus prognosis and clinical decision making. PMID:25417231

  20. Enhancing the Teaching of Statistics: Portfolio Theory, an Application of Statistics in Finance

    ERIC Educational Resources Information Center

    Christou, Nicolas

    2008-01-01

    In this paper we present an application of statistics using real stock market data. Most, if not all, students have some familiarity with the stock market (or at least they have heard about it) and therefore can understand the problem easily. It is the real data analysis that students find interesting. Here we explore the building of efficient…
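
    As an illustration of the kind of computation such a course builds toward (synthetic returns, not the article's stock market data): estimating a covariance matrix from daily returns and solving for the global minimum-variance portfolio, whose weights are proportional to the inverse covariance matrix times a vector of ones, normalized to sum to one.

```python
# Global minimum-variance portfolio from hypothetical daily returns:
# w = inv(C) @ 1 / (1' @ inv(C) @ 1), where C is the covariance matrix.
import numpy as np

rng = np.random.default_rng(2)
returns = rng.normal(0.0005, 0.01, size=(250, 4))  # 250 days, 4 made-up stocks

C = np.cov(returns, rowvar=False)
ones = np.ones(C.shape[0])
w = np.linalg.solve(C, ones)
w /= w.sum()                                       # weights sum to 1

print("minimum-variance weights:", np.round(w, 3))
print("portfolio variance:", float(w @ C @ w))
```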

  1. Impact of Statistical Learning Methods on the Predictive Power of Multivariate Normal Tissue Complication Probability Models

    SciTech Connect

    Xu Chengjian; Schaaf, Arjen van der; Schilstra, Cornelis; Langendijk, Johannes A.; Veld, Aart A. van't

    2012-03-15

    Purpose: To study the impact of different statistical learning methods on the prediction performance of multivariate normal tissue complication probability (NTCP) models. Methods and Materials: In this study, three learning methods, stepwise selection, least absolute shrinkage and selection operator (LASSO), and Bayesian model averaging (BMA), were used to build NTCP models of xerostomia following radiotherapy treatment for head and neck cancer. Performance of each learning method was evaluated by a repeated cross-validation scheme in order to obtain a fair comparison among methods. Results: It was found that the LASSO and BMA methods produced models with significantly better predictive power than that of the stepwise selection method. Furthermore, the LASSO method yields an easily interpretable model as the stepwise method does, in contrast to the less intuitive BMA method. Conclusions: The commonly used stepwise selection method, which is simple to execute, may be insufficient for NTCP modeling. The LASSO method is recommended.
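
    As a rough illustration of the recommended LASSO approach (a generic sketch with synthetic stand-in data, not the study's xerostomia dataset or exact pipeline), an L1-penalized logistic regression with cross-validated penalty selection yields a sparse, easily interpretable predictor set:

```python
# Generic LASSO-style sketch (synthetic data): L1-penalized logistic
# regression with cross-validated penalty strength. Only two of the 20
# candidate predictors truly drive the outcome; the fit should keep
# roughly those and zero out the rest.
import numpy as np
from sklearn.linear_model import LogisticRegressionCV

rng = np.random.default_rng(3)
X = rng.normal(size=(200, 20))                      # 20 candidate predictors
logit = 1.2 * X[:, 0] - 0.8 * X[:, 1]               # only two matter
y = rng.random(200) < 1.0 / (1.0 + np.exp(-logit))  # binary complication outcome

model = LogisticRegressionCV(Cs=10, cv=5, penalty="l1", solver="liblinear").fit(X, y)
print("nonzero coefficients at indices:", np.flatnonzero(model.coef_[0]))
```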

  2. Interpreting Transistor Noise

    NASA Astrophysics Data System (ADS)

    Pospieszalski, M. W.

    2010-10-01

    The simple noise models of field effect and bipolar transistors reviewed in this article are quite useful in engineering practice, as illustrated by measured and modeled results. The exact and approximate expressions for the noise parameters of FETs and bipolar transistors reveal certain common noise properties and some general noise properties of both devices. The usefulness of these expressions in interpreting the dependence of measured noise parameters on frequency, bias, and temperature and, consequently, in checking of consistency of measured data has been demonstrated.

  3. Interpretation of the geoid

    NASA Technical Reports Server (NTRS)

    Runcorn, S. K.

    1985-01-01

    The first satellite geoid, determined by Izsak, was superposed on Uotila's geoid, which was based on surface gravity determinations. Good agreement was observed except over the Pacific area of the globe; the poor agreement there was interpreted as the result of inadequate observations. Many geoids have since been determined from satellite observations, including Doppler measurements. It is found that the geoid is the result of density differences in the mantle, maintained since the primeval Earth by its finite strength. Various models based on this assumption are developed.

  4. How to use and interpret hormone ratios.

    PubMed

    Sollberger, Silja; Ehlert, Ulrike

    2016-01-01

    Hormone ratios have become increasingly popular throughout the neuroendocrine literature since they offer a straightforward way to simultaneously analyze the effects of two interdependent hormones. However, the analysis of ratios is associated with statistical and interpretational concerns which have not been sufficiently considered in the context of endocrine research. The aim of this article, therefore, is to demonstrate and discuss these issues, and to suggest suitable ways to address them. In a first step, we use exemplary testosterone and cortisol data to illustrate that one major concern of ratios lies in their distribution and inherent asymmetry. As a consequence, results of parametric statistical analyses are affected by the ultimately arbitrary decision of which way around the ratio is computed (i.e., A/B or B/A). We suggest the use of non-parametric methods as well as the log-transformation of hormone ratios as appropriate methods to deal with these statistical problems. However, in a second step, we also discuss the complicated interpretation of ratios, and propose moderation analysis as an alternative and oftentimes more insightful approach to ratio analysis. In conclusion, we suggest that researchers carefully consider which statistical approach is best suited to investigate reciprocal hormone effects. With regard to the hormone ratio method, further research is needed to specify what exactly this index reflects on the biological level and in which cases it is a meaningful variable to analyze.
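
    The asymmetry the authors describe, and the proposed log-transformation, can be illustrated in a few lines of Python (synthetic hormone values, purely illustrative):

```python
# Synthetic illustration: A/B and B/A are differently skewed on the raw
# scale, but log(A/B) = -log(B/A), so the log-ratio is symmetric in the
# choice of numerator.
import numpy as np
from scipy.stats import skew

rng = np.random.default_rng(4)
testosterone = rng.lognormal(mean=4.0, sigma=0.4, size=50)
cortisol = rng.lognormal(mean=2.0, sigma=0.4, size=50)

tc, ct = testosterone / cortisol, cortisol / testosterone
print("raw-scale skewness:", round(skew(tc), 2), "vs", round(skew(ct), 2))
print("log-ratios mirror exactly:", np.allclose(np.log(tc), -np.log(ct)))
```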

  5. Structural interpretation of seismic data and inherent uncertainties

    NASA Astrophysics Data System (ADS)

    Bond, Clare

    2013-04-01

    Geoscience is perhaps unique in its reliance on incomplete datasets and on building knowledge from their interpretation. This interpretive basis is fundamental at all levels, from the creation of a geological map to the interpretation of remotely sensed data. To teach and better understand the uncertainties in dealing with incomplete data, we need to understand the strategies individual practitioners deploy that make them effective interpreters. The nature of interpretation is such that the interpreter must use their cognitive abilities in analysing the data to propose a sensible solution that is consistent not only with the original data but also with other knowledge and understanding. In a series of experiments, Bond et al. (2007, 2008, 2011, 2012) investigated the strategies and pitfalls of expert and non-expert interpretation of seismic images. These studies used large numbers of participants to provide a statistically sound basis for analysis of the results. The experiments showed that a wide variety of conceptual models were applied to single seismic datasets, highlighting not only spatial variations in fault placement but also whether interpreters thought the faults existed at all, or agreed on their sense of movement. Further, statistical analysis suggests that the strategies an interpreter employs are more important than expert knowledge per se in developing successful interpretations; experts are successful because of their application of these techniques. In a new set of experiments, a small number of experts are studied to determine how they use their cognitive and reasoning skills in the interpretation of 2D seismic profiles. Live video and practitioner commentary were used to track the evolving interpretation and to gain insight into their decision processes. The outputs of the study allow us to create an educational resource of expert interpretation through online video footage and commentary with

  6. [The telephone interpreter. A good alternative to the traditional interpreter].

    PubMed

    Karlsen, W B; Haabeth, A L

    1998-01-20

    In the health care system, good communication is of vital importance for correct information, diagnosis and treatment. When the health care worker and the patient have no common language, an interpreter is needed. In a small population the patient and the interpreter will often be acquainted. In small, local communities few professional interpreters are available. These communities can be served by offering a telephone interpreter, thus providing foreign citizens with a better health service. The presence of an interpreter reduces the possibility of being anonymous. The patient may withhold important information, or give incorrect information. By using a telephone interpreter, neither the patient nor the interpreter needs to know who the other person is. We found this to be a very good alternative in most cases, and sometimes a better solution. A good, loud-speaking telephone was needed. The interpreters were not as satisfied as the doctors and patients. Further development of the service is therefore required.

  7. An intentional interpretive perspective

    PubMed Central

    Neuman, Paul

    2004-01-01

    To the extent that the concept of intention has been addressed within behavior analysis, descriptions of intention have been general and have not specifically included important distinctions that differentiate a behavior-analytic approach from vernacular definitions of intention. A fundamental difference between a behavior-analytic approach and most other psychological approaches is that other approaches focus on the necessity of intentions to explain behavior, whereas a behavior-analytic approach is directed at understanding the interplay between behavior and environment. Behavior-analytic interpretations include the relations between the observer's behavior and the environment. From a behavior-analytic perspective, an analysis of the observer's interpretations of an individual's behavior is inherent in the subsequent attribution of intention. The present agenda is to provide a behavior-analytic account of attributing intention that identifies the establishing conditions for speaking of intention. Also addressed is the extent to which we speak of intentions when the observed individual's behavior is contingency shaped or under instructional control. PMID:22478417

  8. Physical interpretation of antigravity

    NASA Astrophysics Data System (ADS)

    Bars, Itzhak; James, Albin

    2016-02-01

    Geodesic incompleteness is a problem in both general relativity and string theory. The Weyl-invariant Standard Model coupled to general relativity (SM +GR ), and a similar treatment of string theory, are improved theories that are geodesically complete. A notable prediction of this approach is that there must be antigravity regions of spacetime connected to gravity regions through gravitational singularities such as those that occur in black holes and cosmological bang/crunch. Antigravity regions introduce apparent problems of ghosts that raise several questions of physical interpretation. It was shown that unitarity is not violated, but there may be an instability associated with negative kinetic energies in the antigravity regions. In this paper we show that the apparent problems can be resolved with the interpretation of the theory from the perspective of observers strictly in the gravity region. Such observers cannot experience the negative kinetic energy in antigravity directly, but can only detect in and out signals that interact with the antigravity region. This is no different from a spacetime black box for which the information about its interior is encoded in scattering amplitudes for in/out states at its exterior. Through examples we show that negative kinetic energy in antigravity presents no problems of principles but is an interesting topic for physical investigations of fundamental significance.

  9. Multicomponent statistical analysis to identify flow and transport processes in a highly-complex environment

    NASA Astrophysics Data System (ADS)

    Moeck, Christian; Radny, Dirk; Borer, Paul; Rothardt, Judith; Auckenthaler, Adrian; Berg, Michael; Schirmer, Mario

    2016-11-01

    A combined approach of multivariate statistical analysis, namely factor analysis (FA) and hierarchical cluster analysis (HCA), together with the interpretation of geochemical processes, stable water isotope data and organic micropollutants, was used to assess spatial patterns of water types in a study area in Switzerland where drinking water production lies close to several potential input pathways for contamination. To avoid drinking water contamination, artificial groundwater recharge with surface water into an aquifer is used to create a hydraulic barrier between the potential contamination pathways and the drinking water extraction wells. Inter-aquifer mixing is identified in the subsurface, where a large amount of artificially infiltrated surface water mixes with a smaller amount of water originating from the regional flow path in the vicinity of the extraction wells. The spatial distribution of the different water types can be estimated, and a conceptual understanding of the system is developed. Results of the multivariate statistical analysis are consistent with the information gained from the isotope data and the organic micropollutant analyses. The integrated approach, which uses different kinds of observations, can easily be transferred to a variety of hydrological settings to synthesise and evaluate large hydrochemical datasets; combining additional data with different information content is likewise conceivable and enables effective interpretation of hydrological processes. The approach leads to a sounder conceptual understanding of the system, which is the very basis for developing improved and sustainable water resources management practices.
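
    A minimal sketch of this FA-plus-HCA workflow under assumed, synthetic data (the real study used hydrochemical measurements, isotopes, and micropollutants) might look like this in Python:

```python
# Sketch: factor analysis to condense correlated (synthetic) hydrochemical
# variables, then Ward hierarchical clustering of the factor scores to
# assign samples to water types.
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from sklearn.decomposition import FactorAnalysis
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(5)
X = StandardScaler().fit_transform(rng.normal(size=(60, 8)))  # 60 samples, 8 variables

scores = FactorAnalysis(n_components=3, random_state=0).fit_transform(X)
tree = linkage(scores, method="ward")
water_type = fcluster(tree, t=3, criterion="maxclust")        # three water types
print("samples per water type:", np.bincount(water_type)[1:])
```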

  10. Reverse Causation and the Transactional Interpretation of Quantum Mechanics

    NASA Astrophysics Data System (ADS)

    Cramer, John G.

    2006-10-01

    In the first part of the paper we present the transactional interpretation of quantum mechanics, a method of viewing the formalism of quantum mechanics that provides a way of visualizing quantum events and experiments. In the second part, we present an EPR gedankenexperiment that appears to lead to observer-level reverse causation. A transactional analysis of the experiment is presented. It easily accounts for the reported observations but does not reveal any barriers to its modification for reverse causation.

  11. Monitoring and interpreting bioremediation effectiveness

    SciTech Connect

    Bragg, J.R.; Prince, R.C.; Harner, J.; Atlas, R.M.

    1993-12-31

    Following the Exxon Valdez oil spill in 1989, extensive research was conducted by the US Environmental Protection Agency and Exxon to develop and implement bioremediation techniques for oil spill cleanup. A key challenge of this program was to develop effective methods for monitoring and interpreting bioremediation effectiveness on extremely heterogeneous intertidal shorelines. Fertilizers were applied to shorelines at concentrations known to be safe, and the effectiveness achieved in accelerating biodegradation of oil residues was measured using several techniques. This paper describes the most definitive method identified, which monitors biodegradation loss by measuring changes in the ratios of hydrocarbons to hopane, a cycloalkane present in the oil that showed no measurable degradation. Rates of loss measured by the hopane ratio method have high levels of statistical confidence, and show that fertilizer addition stimulated biodegradation rates as much as fivefold. Multiple regression analyses of the data show that the concentration of fertilizer-derived nitrogen in interstitial pore water per unit of oil load was the most important parameter affecting biodegradation rate, and the results suggest that monitoring nitrogen concentrations in subsurface pore water is the preferred technique for determining fertilizer dosage and reapplication frequency.
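
    The hopane-ratio calculation lends itself to a one-line formula: because hopane is effectively conserved, the fraction of a hydrocarbon degraded can be estimated from the change in its hydrocarbon-to-hopane ratio. A small sketch with made-up numbers:

```python
# Hopane as a conserved internal marker: the fraction of an analyte lost
# to biodegradation follows from the drop in its analyte/hopane ratio.
# The ratios below are hypothetical.
def fraction_degraded(ratio_initial, ratio_now):
    """Fractional loss of a hydrocarbon, normalized to non-degrading hopane."""
    return 1.0 - ratio_now / ratio_initial

print(f"{fraction_degraded(12.0, 3.0):.0%} degraded")  # 75%
```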

  12. Statistics Poker: Reinforcing Basic Statistical Concepts

    ERIC Educational Resources Information Center

    Leech, Nancy L.

    2008-01-01

    Learning basic statistical concepts does not need to be tedious or dry; it can be fun and interesting through cooperative learning in the small-group activity of Statistics Poker. This article describes a teaching approach for reinforcing basic statistical concepts that can help students who have high anxiety and makes learning and reinforcing…

  13. Predict! Teaching Statistics Using Informal Statistical Inference

    ERIC Educational Resources Information Center

    Makar, Katie

    2013-01-01

    Statistics is one of the most widely used topics for everyday life in the school mathematics curriculum. Unfortunately, the statistics taught in schools focuses on calculations and procedures before students have a chance to see it as a useful and powerful tool. Researchers have found that a dominant view of statistics is as an assortment of tools…

  14. Changing Preservice Science Teachers' Views of Nature of Science: Why Some Conceptions May be More Easily Altered than Others

    NASA Astrophysics Data System (ADS)

    Mesci, Gunkut; Schwartz, Renee' S.

    2016-02-01

    The purpose of this study was to assess preservice teachers' views of Nature of Science (NOS), identify aspects that were challenging for conceptual change, and explore reasons why. This study particularly focused on why and how some concepts of NOS may be more easily altered than others. Fourteen preservice science teachers enrolled in a NOS and Science Inquiry course participated in this study. Data were collected by using a pre/post format with the Views of Nature of Science questionnaire (VNOS-270), the Views of Scientific Inquiry questionnaire (VOSI-270), follow-up interviews, and classroom artifacts. The results indicated that most students initially held naïve views about certain aspects of NOS like tentativeness and subjectivity. By the end of the semester, almost all students dramatically improved their understanding about almost all aspects of NOS. However, several students still struggled with certain aspects like the differences between scientific theory and law, tentativeness, and socio-cultural embeddedness. Results suggested that instructional, motivational, and socio-cultural factors may influence if and how students changed their views about targeted NOS aspects. Students thought that classroom activities, discussions, and readings were most helpful to improve their views about NOS. The findings from the research have the potential to translate as practical advice for teachers, science educators, and future researchers.

  15. Easily accessible polymer additives for tuning the crystal-growth of perovskite thin-films for highly efficient solar cells.

    PubMed

    Dong, Qingqing; Wang, Zhaowei; Zhang, Kaicheng; Yu, Hao; Huang, Peng; Liu, Xiaodong; Zhou, Yi; Chen, Ning; Song, Bo

    2016-03-14

    For perovskite solar cells (Pero-SCs), one of the key issues with respect to power conversion efficiency (PCE) is morphology control of the perovskite thin-films. In this study, an easily accessible additive, polyethylenimine (PEI), is utilized to tune the morphology of CH3NH3PbI3-xClx. With the addition of 1.00 wt% of PEI, the smoothness and crystallinity of the perovskite were greatly improved, as characterized by scanning electron microscopy (SEM) and X-ray diffraction (XRD). A peak PCE of 14.07% was achieved for the p-i-n type Pero-SC, a 26% increase compared to devices without the additive. Both photoluminescence (PL) and alternating-current impedance spectroscopy (ACIS) analyses corroborate the efficiency results after the addition of PEI. This study provides a low-cost polymer additive candidate for tuning the morphology of perovskite thin-films, and may point a way toward the mass production of Pero-SCs.

  16. Easily Accessible Thermotropic Hydrogen‐Bonded Columnar Discotic Liquid Crystals from Fatty Acid–Tris‐Benzoimidazolyl Benzene Complexes

    PubMed Central

    Lugger, Jody A. M.

    2016-01-01

    Abstract We report the formation of easily accessible hydrogen‐bonded columnar discotic liquid crystals (LCs) based on tris‐benzoimidazolyl benzene (TBIB) and commercially available fatty acids. By increasing the length of the fatty acid, the temperature range of liquid crystallinity was tuned. Introducing double bonds in octadecanoic acid lowered the crystallization temperature and increased the temperature range of the mesophase. Surprisingly, dimerized linoleic acid also forms an LC phase. When using branched aliphatic acids with the branching point close to the acid moiety, the mesophase was lost, whereas phosphonic acid or benzenesulfonic acid derivatives did have a mesophase, showing that the generality of this approach extends beyond carboxylic acids as the hydrogen‐bond donor. Furthermore, a polymerizable LC phase was obtained from mixtures of TBIB with a methacrylate‐bearing fatty acid, providing an approach for the fabrication of nanoporous polymer films if the methacrylate groups are polymerized. Finally, the higher solubility of methyl‐TBIB was used to suppress phase separation in stoichiometric mixtures of the template molecule with fatty acids. PMID:28032028

  17. Asynchronous interpretation of parallel microprograms

    SciTech Connect

    Bandman, O.L.

    1984-03-01

    In this article, the authors demonstrate how to pass from a given synchronous interpretation of a parallel microprogram to an equivalent asynchronous interpretation, and investigate the cost associated with the rejection of external synchronization in parallel microprogram structures.

  18. What Language Do Interpreters Speak?

    ERIC Educational Resources Information Center

    Parks, Gerald B.

    1982-01-01

    States that both the register and variety of an interpreter's speech are quite limited and analyzes the linguistic characteristics of "International English," the English used by interpreters at international conferences. (CFM)

  19. Generic interpreters and microprocessor verification

    NASA Technical Reports Server (NTRS)

    Windley, Phillip J.

    1990-01-01

    The following topics are covered in viewgraph form: (1) generic interpreters; (2) Viper microprocessors; (3) microprocessor verification; (4) determining correctness; (5) hierarchical decomposition; (6) interpreter theory; (7) AVM-1; (8) phase-level specification; and future work.

  20. Discussion of "interpretation and play".

    PubMed

    Pick, Irma Brenman

    2011-01-01

    This discussion addresses the conflict in technique between play versus interpretation. It further considers how the nature of the interpretation may be affected by a consideration of what is being projected into the analyst.

  1. 3-D Seismic Interpretation

    NASA Astrophysics Data System (ADS)

    Moore, Gregory F.

    2009-05-01

    This volume is a brief introduction aimed at those who wish to gain a basic and relatively quick understanding of the interpretation of three-dimensional (3-D) seismic reflection data. The book is well written, clearly illustrated, and easy to follow. Enough elementary mathematics is presented for a basic understanding of seismic methods, but more complex mathematical derivations are avoided. References are listed for readers interested in more advanced explanations. After a brief introduction, the book logically begins with a succinct chapter on modern 3-D seismic data acquisition and processing. Standard 3-D acquisition methods are presented, and an appendix expands on more recent acquisition techniques, such as multiple-azimuth and wide-azimuth acquisition. Although this chapter covers the basics of standard time processing quite well, there is only a single sentence about prestack depth imaging, and anisotropic processing is not mentioned at all, even though both techniques are now becoming standard.

  2. The Interhospital Interpreter Project.

    PubMed

    Wlodarczyk, K

    1998-05-01

    My family immigrated to Canada when I was a child. Soon after we arrived, I became ill with acute appendicitis. For days, I lay in pain while my parents searched desperately for a physician. But in the small Ontario town where we had moved, there were no cultural interpreters. My parents faced not just a language barrier, but the barrier of a lack of knowledge of the health care system. They were unaware that an ill person could seek treatment in the emergency department of a general hospital. My parents finally found someone who could help, and I was treated just hours before my appendix would have ruptured. I will never forget those days of pain and fear.

  3. Interpretation of galaxy counts

    SciTech Connect

    Tinsley, B.M.

    1980-10-01

    New models are presented for the interpretation of recent counts of galaxies to 24th magnitude, and predictions are shown to 28th magnitude for future comparison with data from the Space Telescope. The results supersede earlier, more schematic models by the author. Tyson and Jarvis found in their counts a "local" density enhancement at 17th magnitude, on comparison with the earlier models; the excess is no longer significant when a more realistic mixture of galaxy colors is used. Bruzual and Kron's conclusion that Kron's counts show evidence for evolution at faint magnitudes is confirmed, and it is predicted that some 23rd magnitude galaxies have redshifts greater than unity. These may include spheroidal systems, elliptical galaxies, and the bulges of early-type spirals and S0's, seen during their primeval rapid star formation.

  4. Interpreting bruises at necropsy

    PubMed Central

    Vanezis, P

    2001-01-01

    The accurate interpretation of bruising at necropsy is essential to understanding how a victim has been injured and assists the pathologist in a reliable reconstruction of the events leading to death. It is essential not only to assess the mechanism of production of a bruise, taking into account the type of impacting surface and the magnitude of force used, but also to estimate when the injury was caused. An account is given of the various methods used in the examination of bruises, particularly with respect to aging, as well as the factors that may affect their appearance. Differentiation from artefacts resulting from postmortem changes is also discussed in some detail. Key Words: bruising • necropsy • time of death • cause of death PMID:11328832

  5. A History of Oral Interpretation.

    ERIC Educational Resources Information Center

    Bahn, Eugene; Bahn, Margaret L.

    This historical account of the oral interpretation of literature establishes a chain of events comprehending 25 centuries of verbal tradition from the Homeric Age through 20th Century America. It deals in each era with the viewpoints and contributions of major historical figures to oral interpretation, as well as with oral interpretation's…

  6. Ohio Guidelines for Educational Interpreters.

    ERIC Educational Resources Information Center

    Ohio State Dept. of Education, Columbus. Div. of Special Education.

    This document presents Ohio's state guidelines to assist school districts in providing appropriate educational interpreting services for students who have hearing impairments. A section on the primary role of the educational interpreter considers: necessary knowledge and skills, modes of communication, interpreting environments, testing…

  7. Input of easily available organic C and N stimulates microbial decomposition of soil organic matter in arctic permafrost soil.

    PubMed

    Wild, Birgit; Schnecker, Jörg; Alves, Ricardo J Eloy; Barsukov, Pavel; Bárta, Jiří; Capek, Petr; Gentsch, Norman; Gittel, Antje; Guggenberger, Georg; Lashchinskiy, Nikolay; Mikutta, Robert; Rusalimova, Olga; Santrůčková, Hana; Shibistova, Olga; Urich, Tim; Watzka, Margarete; Zrazhevskaya, Galina; Richter, Andreas

    2014-08-01

    Rising temperatures in the Arctic can affect soil organic matter (SOM) decomposition directly and indirectly, by increasing plant primary production and thus the allocation of plant-derived organic compounds into the soil. Such compounds, for example root exudates or decaying fine roots, are easily available for microorganisms, and can alter the decomposition of older SOM ("priming effect"). We here report on a SOM priming experiment in the active layer of a permafrost soil from the central Siberian Arctic, comparing responses of organic topsoil, mineral subsoil, and cryoturbated subsoil material (i.e., poorly decomposed topsoil material subducted into the subsoil by freeze-thaw processes) to additions of (13)C-labeled glucose, cellulose, a mixture of amino acids, and protein (added at levels corresponding to approximately 1% of soil organic carbon). SOM decomposition in the topsoil was barely affected by higher availability of organic compounds, whereas SOM decomposition in both subsoil horizons responded strongly. In the mineral subsoil, SOM decomposition increased by a factor of two to three after any substrate addition (glucose, cellulose, amino acids, protein), suggesting that the microbial decomposer community was limited in energy to break down more complex components of SOM. In the cryoturbated horizon, SOM decomposition increased by a factor of two after addition of amino acids or protein, but was not significantly affected by glucose or cellulose, indicating nitrogen rather than energy limitation. Since the stimulation of SOM decomposition in cryoturbated material was not connected to microbial growth or to a change in microbial community composition, the additional nitrogen was likely invested in the production of extracellular enzymes required for SOM decomposition. Our findings provide a first mechanistic understanding of priming in permafrost soils and suggest that an increase in the availability of organic carbon or nitrogen, e.g., by increased plant

  8. Synchronized drumming enhances activity in the caudate and facilitates prosocial commitment--if the rhythm comes easily.

    PubMed

    Kokal, Idil; Engel, Annerose; Kirschner, Sebastian; Keysers, Christian

    2011-01-01

    Why does chanting, drumming or dancing together make people feel united? Here we investigate the neural mechanisms underlying interpersonal synchrony and its subsequent effects on prosocial behavior among synchronized individuals. We hypothesized that areas of the brain associated with the processing of reward would be active when individuals experience synchrony during drumming, and that these reward signals would increase prosocial behavior toward this synchronous drum partner. 18 female non-musicians were scanned with functional magnetic resonance imaging while they drummed a rhythm, in alternating blocks, with two different experimenters: one drumming in-synchrony and the other out-of-synchrony relative to the participant. In the last scanning part, which served as the experimental manipulation for the following prosocial behavioral test, one of the experimenters drummed with one half of the participants in-synchrony and with the other out-of-synchrony. After scanning, this experimenter "accidentally" dropped eight pencils, and the number of pencils collected by the participants was used as a measure of prosocial commitment. Results revealed that participants who mastered the novel rhythm easily before scanning showed increased activity in the caudate during synchronous drumming. The same area also responded to monetary reward in a localizer task with the same participants. The activity in the caudate during experiencing synchronous drumming also predicted the number of pencils the participants later collected to help the synchronous experimenter of the manipulation run. In addition, participants collected more pencils to help the experimenter when she had drummed in-synchrony than out-of-synchrony during the manipulation run. By showing an overlap in activated areas during synchronized drumming and monetary reward, our findings suggest that interpersonal synchrony is related to the brain's reward system.

  9. Transport of sewage molecular markers through saturated soil column and effect of easily biodegradable primary substrate on their removal.

    PubMed

    Foolad, Mahsa; Ong, Say Leong; Hu, Jiangyong

    2015-11-01

    Pharmaceutical and personal care products (PPCPs) and artificial sweeteners (ASs) are emerging organic contaminants (EOCs) in the aquatic environment. The presence of PPCPs and ASs in water bodies poses a potential ecological risk and health concern. It is therefore necessary to trace pollution sources by understanding the transport behavior of sewage molecular markers in the subsurface. The aim of this study was to evaluate the transport of nine selected molecular markers through saturated soil column experiments. The selected sewage molecular markers were six PPCPs, namely acetaminophen (ACT), carbamazepine (CBZ), caffeine (CF), crotamiton (CTMT), diethyltoluamide (DEET) and salicylic acid (SA), and three ASs, namely acesulfame (ACF), cyclamate (CYC) and saccharin (SAC). Results confirmed that ACF, CBZ, CTMT, CYC and SAC are suitable for use as sewage molecular markers, since they were almost stable against sorption and biodegradation during the soil column experiments. In contrast, the transport of ACT, CF and DEET was limited by both sorption and biodegradation, and 100% removal efficiency was achieved in the biotic column. Moreover, the effect of different concentrations of acetate (0-100 mg/L), serving as an easily biodegradable primary substrate, on the removal of PPCPs and ASs was also studied. Results showed a negative correlation (r^2 > 0.75) between the removal of several of the selected markers, including ACF, CF, ACT, CYC and SAC, and the acetate concentration. CTMT removal also decreased with the addition of acetate, but further increasing the acetate concentration did not affect its removal.

  10. Input of easily available organic C and N stimulates microbial decomposition of soil organic matter in arctic permafrost soil

    PubMed Central

    Wild, Birgit; Schnecker, Jörg; Alves, Ricardo J. Eloy; Barsukov, Pavel; Bárta, Jiří; Čapek, Petr; Gentsch, Norman; Gittel, Antje; Guggenberger, Georg; Lashchinskiy, Nikolay; Mikutta, Robert; Rusalimova, Olga; Šantrůčková, Hana; Shibistova, Olga; Urich, Tim; Watzka, Margarete; Zrazhevskaya, Galina; Richter, Andreas

    2014-01-01

    Rising temperatures in the Arctic can affect soil organic matter (SOM) decomposition directly and indirectly, by increasing plant primary production and thus the allocation of plant-derived organic compounds into the soil. Such compounds, for example root exudates or decaying fine roots, are easily available for microorganisms, and can alter the decomposition of older SOM (“priming effect”). We here report on a SOM priming experiment in the active layer of a permafrost soil from the central Siberian Arctic, comparing responses of organic topsoil, mineral subsoil, and cryoturbated subsoil material (i.e., poorly decomposed topsoil material subducted into the subsoil by freeze–thaw processes) to additions of 13C-labeled glucose, cellulose, a mixture of amino acids, and protein (added at levels corresponding to approximately 1% of soil organic carbon). SOM decomposition in the topsoil was barely affected by higher availability of organic compounds, whereas SOM decomposition in both subsoil horizons responded strongly. In the mineral subsoil, SOM decomposition increased by a factor of two to three after any substrate addition (glucose, cellulose, amino acids, protein), suggesting that the microbial decomposer community was limited in energy to break down more complex components of SOM. In the cryoturbated horizon, SOM decomposition increased by a factor of two after addition of amino acids or protein, but was not significantly affected by glucose or cellulose, indicating nitrogen rather than energy limitation. Since the stimulation of SOM decomposition in cryoturbated material was not connected to microbial growth or to a change in microbial community composition, the additional nitrogen was likely invested in the production of extracellular enzymes required for SOM decomposition. Our findings provide a first mechanistic understanding of priming in permafrost soils and suggest that an increase in the availability of organic carbon or nitrogen, e.g., by increased

  11. Applications of Statistical Tests in Hand Surgery

    PubMed Central

    Song, Jae W.; Haas, Ann; Chung, Kevin C.

    2015-01-01

    During the nineteenth century, with the emergence of public health as a goal to improve hygiene and conditions of the poor, statistics established itself as a distinct scientific field important for critically interpreting studies of public health concerns. During the twentieth century, statistics began to evolve mathematically and methodologically with hypothesis testing and experimental design. Today, the design of medical experiments centers around clinical trials and observational studies, and with the use of statistics, the collected data are summarized, weighed, and presented to direct both physicians and the public towards Evidence-Based Medicine. Having a basic understanding of statistics is mandatory in evaluating the validity of published literature and applying it to patient care. In this review, we aim to apply a practical approach in discussing basic statistical tests by providing a guide to choosing the correct statistical test along with examples relevant to hand surgery research. PMID:19969193
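
    As a hedged sketch of the kind of test-selection guidance such reviews offer (illustrative thresholds and synthetic data, not the authors' exact flowchart): for two independent groups, check normality first and fall back to a non-parametric test when the assumption fails.

```python
# Illustrative two-group test chooser (synthetic grip-strength data):
# Shapiro-Wilk normality check, then Student's t or Mann-Whitney U.
import numpy as np
from scipy import stats

def compare_two_groups(a, b, alpha=0.05):
    _, p_a = stats.shapiro(a)
    _, p_b = stats.shapiro(b)
    if p_a > alpha and p_b > alpha:      # both plausibly normal
        _, p = stats.ttest_ind(a, b)
        return "t-test", p
    _, p = stats.mannwhitneyu(a, b)
    return "Mann-Whitney U", p

rng = np.random.default_rng(6)
group_a = rng.normal(30, 5, size=25)     # hypothetical scores
group_b = rng.normal(34, 5, size=25)
print(compare_two_groups(group_a, group_b))
```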

  12. Role exchange in medical interpretation.

    PubMed

    White, Kari; Laws, M Barton

    2009-12-01

    Prior research has documented that medical interpreters engage in non-conduit roles during medical visits. However, agreement on the appropriateness of these roles and their impact on the medical encounter have not yet been achieved. The purpose of this study was to identify non-conduit behavior (role exchange), elucidate the various forms it takes among different types of interpreters, and assess its potential to affect clinical encounters. Using audiotapes from 13 pediatric outpatient visits, we found that "chance" and uncertified hospital interpreters engaged in role exchange by assuming the provider's role; the patient's role; and taking other non-interpretive roles such as socializing with mothers or acting in one's alternate professional role. These behaviors occurred frequently among both types of interpreters while the provider was actively engaged in conducting the medical visit. In most instances, the interpreter did not make his or her behavior transparent to either the provider or the mother. Implications for interpreter and provider training are discussed.

  13. Statistical Mechanics of Combinatorial Auctions

    NASA Astrophysics Data System (ADS)

    Galla, Tobias; Leone, Michele; Marsili, Matteo; Sellitto, Mauro; Weigt, Martin; Zecchina, Riccardo

    2006-09-01

    Combinatorial auctions are formulated as frustrated lattice gases on sparse random graphs, allowing the determination of the optimal revenue by methods of statistical physics. Transitions between computationally easy and hard regimes are found and interpreted in terms of the geometric structure of the space of solutions. We introduce an iterative algorithm to solve intermediate and large instances, and discuss competing states of optimal revenue and maximal number of satisfied bidders. The algorithm can be generalized to the hard phase and to more sophisticated auction protocols.
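
    The optimization problem underlying this statistical-physics treatment is winner determination: choose pairwise-disjoint bundles of items to maximize revenue. A brute-force Python sketch on a toy instance (not the paper's lattice-gas formulation) makes the objective concrete:

```python
# Brute-force winner determination for a toy combinatorial auction:
# pick pairwise-disjoint bundles maximizing total price.
from itertools import combinations

bids = [({"A", "B"}, 5.0), ({"B", "C"}, 4.0), ({"C"}, 3.0), ({"A"}, 2.0)]

def optimal_revenue(bids):
    best = 0.0
    for r in range(1, len(bids) + 1):
        for subset in combinations(bids, r):
            items = [item for bundle, _ in subset for item in bundle]
            if len(items) == len(set(items)):        # bundles must not overlap
                best = max(best, sum(price for _, price in subset))
    return best

print(optimal_revenue(bids))  # 8.0: award {A,B} at 5.0 and {C} at 3.0
```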

  14. Neuroendocrine Tumor: Statistics

    MedlinePlus

    Approved by the Cancer.Net Editorial Board, 11/ ... It is important to remember that statistics on how many people survive this type of ...

  15. Adrenal Gland Tumors: Statistics

    MedlinePlus

    Approved by the Cancer.Net Editorial Board, 03/ ... A primary adrenal gland tumor is very uncommon. Exact statistics are not available for this type of tumor ...

  16. PROBABILITY AND STATISTICS.

    DTIC Science & Technology

    (*STATISTICAL ANALYSIS, REPORTS), (*PROBABILITY, REPORTS), INFORMATION THEORY, DIFFERENTIAL EQUATIONS, STATISTICAL PROCESSES, STOCHASTIC PROCESSES, MULTIVARIATE ANALYSIS, DISTRIBUTION THEORY, DECISION THEORY, MEASURE THEORY, OPTIMIZATION

  17. Interpretation and display of research results

    PubMed Central

    Kulkarni, Dilip Kumar

    2016-01-01

    It is important to properly collect, code, clean and edit data before interpreting and displaying research results. Computers play a major role in different phases of research, from the conceptual, design and planning phases through data collection and analysis to publication. The main objective of data display is to summarize the characteristics of the data and to make them more comprehensible and meaningful. Data are usually presented in tables and graphs, with the choice depending on the type of data. This not only helps in understanding the behaviour of the data, but is also useful in choosing the appropriate statistical tests. PMID:27729693

  18. Exploring the Relationship Between Eye Movements and Electrocardiogram Interpretation Accuracy

    NASA Astrophysics Data System (ADS)

    Davies, Alan; Brown, Gavin; Vigo, Markel; Harper, Simon; Horseman, Laura; Splendiani, Bruno; Hill, Elspeth; Jay, Caroline

    2016-12-01

    Interpretation of electrocardiograms (ECGs) is a complex task involving visual inspection. This paper aims to improve understanding of how practitioners perceive ECGs, and to determine whether visual behaviour can indicate differences in interpretation accuracy. A group of healthcare practitioners (n = 31) who interpret ECGs as part of their clinical role were shown 11 commonly encountered ECGs on a computer screen. The participants’ eye movement data were recorded as they viewed the ECGs and attempted interpretation. The Jensen-Shannon distance was computed between two Markov chains, constructed from the transition matrices (visual shifts from and to ECG leads) of the correct and incorrect interpretation groups for each ECG. A permutation test was then used to compare this distance against 10,000 randomly shuffled groups made up of the same participants. The results demonstrated a statistically significant (α = 0.05) result in 5 of the 11 stimuli, showing that the gaze shift between the ECG leads differs between the groups making correct and incorrect interpretations and is therefore a factor in interpretation accuracy. The results shed further light on the relationship between visual behaviour and ECG interpretation accuracy, providing information that can be used to improve both human and automated interpretation approaches.
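
    A hedged reconstruction of this analysis pipeline on toy data (the row-averaged Jensen-Shannon distance and the reduced permutation count are simplifying assumptions, not the authors' exact procedure):

```python
# Toy version of the pipeline: lead-to-lead transition matrices per group,
# a Jensen-Shannon distance between them (averaged over matrix rows; one
# plausible reading of "distance between two Markov chains"), and a
# label-shuffling permutation test. All gaze data here are random.
import numpy as np
from scipy.spatial.distance import jensenshannon

rng = np.random.default_rng(7)

def transition_matrix(seq, n_leads=12):
    m = np.full((n_leads, n_leads), 1e-9)            # floor avoids empty rows
    for a, b in zip(seq[:-1], seq[1:]):
        m[a, b] += 1
    return m / m.sum(axis=1, keepdims=True)

def group_distance(group_a, group_b):
    ta = transition_matrix(np.concatenate(group_a))
    tb = transition_matrix(np.concatenate(group_b))
    return float(np.mean([jensenshannon(ra, rb) for ra, rb in zip(ta, tb)]))

scans = [rng.integers(0, 12, size=200) for _ in range(31)]   # one gaze sequence per participant
correct = rng.random(31) < 0.5                               # stand-in accuracy labels

def split(labels):
    return ([s for s, ok in zip(scans, labels) if ok],
            [s for s, ok in zip(scans, labels) if not ok])

observed = group_distance(*split(correct))
perm = [group_distance(*split(rng.permutation(correct))) for _ in range(1000)]  # 10,000 in the paper
print(f"observed {observed:.4f}, permutation p = {np.mean([d >= observed for d in perm]):.3f}")
```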

  19. Exploring the Relationship Between Eye Movements and Electrocardiogram Interpretation Accuracy

    PubMed Central

    Davies, Alan; Brown, Gavin; Vigo, Markel; Harper, Simon; Horseman, Laura; Splendiani, Bruno; Hill, Elspeth; Jay, Caroline

    2016-01-01

    Interpretation of electrocardiograms (ECGs) is a complex task involving visual inspection. This paper aims to improve understanding of how practitioners perceive ECGs, and to determine whether visual behaviour can indicate differences in interpretation accuracy. A group of healthcare practitioners (n = 31) who interpret ECGs as part of their clinical role were shown 11 commonly encountered ECGs on a computer screen. The participants’ eye movement data were recorded as they viewed the ECGs and attempted interpretation. The Jensen-Shannon distance was computed between two Markov chains, constructed from the transition matrices (visual shifts from and to ECG leads) of the correct and incorrect interpretation groups for each ECG. A permutation test was then used to compare this distance against 10,000 randomly shuffled groups made up of the same participants. The results demonstrated a statistically significant (α = 0.05) result in 5 of the 11 stimuli, showing that the gaze shift between the ECG leads differs between the groups making correct and incorrect interpretations and is therefore a factor in interpretation accuracy. The results shed further light on the relationship between visual behaviour and ECG interpretation accuracy, providing information that can be used to improve both human and automated interpretation approaches. PMID:27917921

  20. Interpretable Decision Sets: A Joint Framework for Description and Prediction

    PubMed Central

    Lakkaraju, Himabindu; Bach, Stephen H.; Leskovec, Jure

    2016-01-01

    One of the most important obstacles to deploying predictive models is the fact that humans do not understand and trust them. Knowing which variables are important in a model’s prediction and how they are combined can be very powerful in helping people understand and trust automatic decision making systems. Here we propose interpretable decision sets, a framework for building predictive models that are highly accurate, yet also highly interpretable. Decision sets are sets of independent if-then rules. Because each rule can be applied independently, decision sets are simple, concise, and easily interpretable. We formalize decision set learning through an objective function that simultaneously optimizes accuracy and interpretability of the rules. In particular, our approach learns short, accurate, and non-overlapping rules that cover the whole feature space and pay attention to small but important classes. Moreover, we prove that our objective is a non-monotone submodular function, which we efficiently optimize to find a near-optimal set of rules. Experiments show that interpretable decision sets are as accurate at classification as state-of-the-art machine learning techniques. They are also three times smaller on average than rule-based models learned by other methods. Finally, results of a user study show that people are able to answer multiple-choice questions about the decision boundaries of interpretable decision sets and write descriptions of classes based on them faster and more accurately than with other rule-based models that were designed for interpretability. Overall, our framework provides a new approach to interpretable machine learning that balances accuracy, interpretability, and computational efficiency. PMID:27853627

  1. Interpretable Decision Sets: A Joint Framework for Description and Prediction.

    PubMed

    Lakkaraju, Himabindu; Bach, Stephen H; Leskovec, Jure

    2016-08-01

    One of the most important obstacles to deploying predictive models is the fact that humans do not understand and trust them. Knowing which variables are important in a model's prediction and how they are combined can be very powerful in helping people understand and trust automatic decision making systems. Here we propose interpretable decision sets, a framework for building predictive models that are highly accurate, yet also highly interpretable. Decision sets are sets of independent if-then rules. Because each rule can be applied independently, decision sets are simple, concise, and easily interpretable. We formalize decision set learning through an objective function that simultaneously optimizes accuracy and interpretability of the rules. In particular, our approach learns short, accurate, and non-overlapping rules that cover the whole feature space and pay attention to small but important classes. Moreover, we prove that our objective is a non-monotone submodular function, which we efficiently optimize to find a near-optimal set of rules. Experiments show that interpretable decision sets are as accurate at classification as state-of-the-art machine learning techniques. They are also three times smaller on average than rule-based models learned by other methods. Finally, results of a user study show that people are able to answer multiple-choice questions about the decision boundaries of interpretable decision sets and write descriptions of classes based on them faster and more accurately than with other rule-based models that were designed for interpretability. Overall, our framework provides a new approach to interpretable machine learning that balances accuracy, interpretability, and computational efficiency.
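
    A loose, illustrative analogue of the idea (not the authors' submodular optimization) is a greedy loop that adds the single-predicate if-then rule with the best accuracy gain net of a complexity penalty:

```python
# Toy greedy decision-set builder: add the rule that most improves accuracy
# minus a per-rule complexity penalty; predict class 1 when any selected
# rule fires. Data and penalty are synthetic.
import numpy as np

rng = np.random.default_rng(8)
X = rng.integers(0, 2, size=(300, 5))                # five binary features
y = ((X[:, 0] == 1) | (X[:, 3] == 1)).astype(int)    # ground truth: two rules

candidates = [(f, v) for f in range(5) for v in (0, 1)]  # rule: "if X[f] == v then 1"

def predict(rules, X):
    if not rules:
        return np.zeros(len(X), dtype=int)
    return np.any([X[:, f] == v for f, v in rules], axis=0).astype(int)

def score(rules, penalty=0.02):
    return np.mean(predict(rules, X) == y) - penalty * len(rules)

selected = []
while True:
    gain, rule = max((score(selected + [r]) - score(selected), r)
                     for r in candidates if r not in selected)
    if gain <= 0:
        break
    selected.append(rule)

print("selected rules:", selected)  # typically recovers (0, 1) and (3, 1)
```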

  2. Structure of wet specimens in electron microscopy. Improved environmental chambers make it possible to examine wet specimens easily.

    PubMed

    Parsons, D F

    1974-11-01

    Several recent technological advances have increased the practicality and usefulness of the technique of electron microscopy of wet objects. (i) There have been gains in the effective penetration of high-voltage microscopes, scanning transmission microscopes, and high-voltage scanning microscopes. The extra effective penetration gives more scope for obtaining good images through film windows, gas, and liquid layers. (ii) Improved methods of obtaining contrast are available (especially dark field and inelastic filtering) that often make it possible to obtain sufficient contrast with wet unstained objects. (iii) Improved environmental chamber design makes it possible to insert and examine wet specimens as easily as dry specimens. The ultimate achievable resolution for wet objects in an environmental chamber will gradually become clear experimentally. Resolution is mainly a function of gas path, liquid and wet specimen thickness, specimen stage stability, acceleration voltage, and image mode (fixed or scanning beam) (13). Much depends on the development of the technique for controlling the thickness of extraneous water film around wet objects or the technique for depositing wet objects onto dry, hydrophobic support films. Although some loss of resolution due to water or gas scattering will always occur, an effective gain is anticipated in preserving the shape of individual molecules and preventing the partial collapse that usually occurs on drying or negative staining. The most basic question for biological electron microscopy is probably whether any living functions of cells can be observed so that the capabilities of the phase contrast and interference light microscopes can be extended. Investigators are now rapidly approaching a final answer to this question. The two limiting factors are (i) maintaining cell motility in spread cells immersed in thin layers of media and (ii) reducing beam radiation damage to an acceptable level. The use of sensitive emulsions and

  3. Statistical Reference Datasets

    National Institute of Standards and Technology Data Gateway

    Statistical Reference Datasets (Web, free access)   The Statistical Reference Datasets project is also supported by the Standard Reference Data Program. The purpose of this project is to improve the accuracy of statistical software by providing reference datasets with certified computational results that enable the objective evaluation of statistical software.

  4. Explorations in statistics: statistical facets of reproducibility.

    PubMed

    Curran-Everett, Douglas

    2016-06-01

    Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This eleventh installment of Explorations in Statistics explores statistical facets of reproducibility. If we obtain an experimental result that is scientifically meaningful and statistically unusual, we would like to know that our result reflects a general biological phenomenon that another researcher could reproduce if (s)he repeated our experiment. But more often than not, we may learn this researcher cannot replicate our result. The National Institutes of Health and the Federation of American Societies for Experimental Biology have created training modules and outlined strategies to help improve the reproducibility of research. These particular approaches are necessary, but they are not sufficient. The principles of hypothesis testing and estimation are inherent to the notion of reproducibility in science. If we want to improve the reproducibility of our research, then we need to rethink how we apply fundamental concepts of statistics to our science.

  5. Components of Simultaneous Interpreting: Comparing Interpreting with Shadowing and Paraphrasing

    ERIC Educational Resources Information Center

    Christoffels, Ingrid K.; de Groot, Annette M. B.

    2004-01-01

    Simultaneous interpreting is a complex task where the interpreter is routinely involved in comprehending, translating and producing language at the same time. This study assessed two components that are likely to be major sources of complexity in SI: The simultaneity of comprehension and production, and transformation of the input. Furthermore,…

  6. Geochemical Interpretation of Collision Volcanism

    NASA Astrophysics Data System (ADS)

    Pearce, Julian

    2014-05-01

    Collision volcanism can be defined as volcanism that takes place during an orogeny from the moment that continental subduction starts to the end of orogenic collapse. Its importance in the Geological Record is greatly underestimated as collision volcanics are easily misinterpreted as being of volcanic arc, extensional or mantle plume origin. There are many types of collision volcanic province: continent-island arc collision (e.g. Banda arc); continent-active margin collision (e.g. Tibet, Turkey-Iran); continent-rear-arc collision (e.g. Bolivia); continent-continent collision (e.g. Tuscany); and island arc-island arc collision (e.g. Taiwan). Superimposed on this variability is the fact that every orogeny is different in detail. Nonetheless, there is a general theme of cyclicity on different time scales. This starts with syn-collision volcanism resulting from the subduction of an ocean-continent transition and continental lithosphere, and continues through post-collision volcanism. The latter can be subdivided into orogenic volcanism, which is related to thickened crust, and post-orogenic, which is related to orogenic collapse. Typically, but not always, collision volcanism is preceded by normal arc volcanism and followed by normal intraplate volcanism. Identification and interpretation of collision volcanism in the Geologic Record is greatly facilitated if a dated stratigraphic sequence is present so that the petrogenic evolution can be traced. In any case, the basis of fingerprinting collision terranes is to use geochemical proxies for mantle and subduction fluxes, slab temperatures, and depths and degrees of melting. For example, syn-collision volcanism is characterized by a high subduction flux relative to mantle flux because of the high input flux of fusible sediment and crust coupled with limited mantle flow, and because of high slab temperatures resulting from the decrease in subduction rate. The resulting geochemical patterns are similar regardless of

  7. Interpreting Recoil for Undergraduate Students

    ERIC Educational Resources Information Center

    Elsayed, Tarek A.

    2012-01-01

    The phenomenon of recoil is usually explained to students in the context of Newton's third law. Typically, when a projectile is fired, the recoil of the launch mechanism is interpreted as a reaction to the ejection of the smaller projectile. The same phenomenon is also interpreted in the context of the conservation of linear momentum, which is…

  8. Interpretative Decision Making in Research.

    ERIC Educational Resources Information Center

    Maitland-Gholson, Jane; Ettinger, Linda F.

    1994-01-01

    Asserts that all research is interpretive and that reality is constructed at every stage of a research project. Explores three research constructs: (1) role of the researcher; (2) research questions; and (3) underlying assumptions of the researcher. Presents and applies a foundation for an interpretative research framework. (CFR)

  9. Museum Docents' Understanding of Interpretation

    ERIC Educational Resources Information Center

    Neill, Amanda C.

    2010-01-01

    The purpose of this qualitative research study was to explore docents' perceptions of their interpretive role in art museums and determine how those perceptions shape docents' practice. The objective was to better understand how docents conceive of their role and what shapes the interpretation they give on tours to the public. The conceptual…

  10. Judging Dramatic Interpretation: Textual Considerations.

    ERIC Educational Resources Information Center

    Manchester, Bruce B.

    The recent growth in popularity among college students of dramatic interpretation in forensic competition justifies an examination of textual considerations and resultant criteria important to the evaluation of dramatic literature. The first considerations of the student contemplating the dramatic interpretation event are the selection of material…

  11. Remote sensing and image interpretation

    NASA Technical Reports Server (NTRS)

    Lillesand, T. M.; Kiefer, R. W. (Principal Investigator)

    1979-01-01

    A textbook prepared primarily for use in introductory courses in remote sensing is presented. Topics covered include concepts and foundations of remote sensing; elements of photographic systems; introduction to airphoto interpretation; airphoto interpretation for terrain evaluation; photogrammetry; radiometric characteristics of aerial photographs; aerial thermography; multispectral scanning and spectral pattern recognition; microwave sensing; and remote sensing from space.

  12. Human Erythrocyte PIG-A Assay: An Easily Monitored Index of Gene Mutation Requiring Low Volume Blood Samples

    PubMed Central

    Dertinger, Stephen D.; Avlasevich, Svetlana L.; Bemis, Jeffrey C.; Chen, Yuhchyau; MacGregor, James T.

    2015-01-01

    This laboratory has previously described a method for scoring the incidence of rodent blood Pig-a mutant phenotype erythrocytes using immunomagnetic separation in conjunction with flow cytometric analysis (In Vivo MutaFlow®). The current work extends this approach to human blood. The frequencies of CD59- and CD55-negative reticulocytes (RETCD59−/CD55−) and erythrocytes (RBCCD59−/CD55−) serve as phenotypic reporters of PIG-A gene mutation. Immunomagnetic separation was found to provide an effective means of increasing the number of reticulocytes and erythrocytes evaluated. Technical replicates were utilized to provide a sufficient number of cells for precise scoring while at the same time controlling for procedural accuracy by allowing comparison of replicate values. Cold whole blood samples could be held for at least one week without affecting reticulocyte, RETCD59−/CD55−, or RBCCD59−/CD55− frequencies. Specimens from a total of 52 nonsmoking, self-reported healthy adult subjects were evaluated. The mean frequencies of RETCD59−/CD55− and RBCCD59−/CD55− were 6.0 × 10−6 and 2.9 × 10−6, respectively. The difference is consistent with a modest selective pressure against mutant phenotype erythrocytes in the circulation, and suggests advantages of studying both populations of erythrocytes. Whereas intra-subject variability was low, inter-subject variability was relatively high, with RETCD59−/CD55− frequencies differing by more than 30-fold. There was an apparent correlation between age and mutant cell frequencies. Taken together, the results indicate that the frequency of human PIG-A mutant phenotype cells can be efficiently and reliably estimated using a labeling and analysis protocol that is well established for rodent-based studies. The applicability of the assay across species, its simplicity and statistical power, and the relatively non-invasive nature of the assay should benefit myriad research areas involving DNA damage

  13. A4-4: Visualizing VDW Lab Results Data: Why You Should, and How You Can - Easily!

    PubMed Central

    Pardee, Roy

    2013-01-01

    Background/Aims One of the most difficult tasks facing the VDW lab results file implementer is figuring out which records from local lab data should be included in their VDW file, and which should be left out. These decisions frequently hinge on neat points of clinical science that are often outside the expertise of the programmer. We will describe a set of graphics that implementers can use to shed light on these decisions, useful during implementation, and afterwards as indication of data quality and even as documentation. Methods SAS’ new Statistical Graphics procedures allow unprecedented control and ease of use in the creation of descriptive graphs and charts. The ODS Graphics Designer utility, paired with the SGDESIGN procedure, makes it easy to create a single image composed of multiple different graphs, each of which can use its own dataset. These tools allowed us to create something of a “data report card” for each VDW test_type, depicting: (1) number of result records over time; (2) number of result values that are numeric vs. character, stratified by the local lab codes used; (3) distributions of numeric result values, by unit; (4) number of numeric result values, by unit and local lab code; (5) values of character results by local lab code. The report card is produced by a standard VDW program available from the author, which can be run at any VDW lab results-implementing site with access to SAS version 9.2 or greater. Results The graphics produced by this program allow both implementers and end-users to evaluate at a glance how cohesive the data from various different local codes are, how much data there is, how it waxes and wanes over time, whether the values are of the expected types, and whether units and character values are within valid values. Being able to depict all of this disparate information in a single, compact display allows users to glean insights that, for example, viewing a series of larger graphs separately would not afford
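
    For readers outside SAS, the same "report card" idea can be approximated in other plotting toolkits. A rough, hypothetical matplotlib analogue of a two-panel summary is sketched below (the original uses SAS ODS Graphics and the SGDESIGN procedure; the data here are invented):

```python
# Two panels of a toy "data report card": record volume over time and the
# distribution of numeric result values. Invented data stand in for VDW records.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
months = np.arange(24)
counts = rng.poisson(500, size=24)        # result records per month
values = rng.normal(5.0, 1.2, size=1000)  # numeric result values

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))
ax1.plot(months, counts)
ax1.set(title="Result records over time", xlabel="Month", ylabel="Records")
ax2.hist(values, bins=30)
ax2.set(title="Numeric result values", xlabel="Value", ylabel="Count")
fig.tight_layout()
fig.savefig("report_card.png")
```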

  14. Contemporary Interpretations of Robert Frank's "The Americans."

    ERIC Educational Resources Information Center

    Nesterenko, Alexander; Smith, C. Zoe

    1984-01-01

    Examines interpretation of Robert Frank's photographic essay (1) to discern the experiences evoked by the essay, (2) to establish the relationship between "projected" and "stated" interpretations, and (3) to determine the extent to which "stated" interpretations resemble "ideal" interpretations. (FL)

  15. Mathematical and statistical analysis

    NASA Technical Reports Server (NTRS)

    Houston, A. Glen

    1988-01-01

    The goal of the mathematical and statistical analysis component of RICIS is to research, develop, and evaluate mathematical and statistical techniques for aerospace technology applications. Specific research areas of interest include modeling, simulation, experiment design, reliability assessment, and numerical analysis.

  16. Uterine Cancer Statistics

    MedlinePlus


  17. Experiment in Elementary Statistics

    ERIC Educational Resources Information Center

    Fernando, P. C. B.

    1976-01-01

    Presents an undergraduate laboratory exercise in elementary statistics in which students verify empirically the various aspects of the Gaussian distribution. Sampling techniques and other commonly used statistical procedures are introduced. (CP)

  18. GUIDELINES FOR INTERPRETING RETINAL PHOTOGRAPHS AND CODING FINDINGS IN THE SUBMACULAR SURGERY TRIALS (SST)

    PubMed Central

    2005-01-01

    mottling of the retinal pigment epithelium with a subtle transition to normal retinal pigment epithelium or a very sharply demarcated, markedly hypopigmented area that was easily distinguished from the surrounding retinal pigment epithelium. κ statistics for interobserver reliability ranged from good (0.47) to excellent (1.00) for features graded at baseline and follow-up. Conclusions Although some of the definitions essential to the interpretation of the SST are similar to those used in the Macular Photocoagulation Study and randomized clinical trials of photodynamic therapy with verteporfin, this guideline provides new information regarding lesion components at baseline as well as standardized descriptions of lesions after submacular surgery. These descriptions from the SST assist in understanding what lesions were studied, when additional treatment was considered after surgery, and how anatomical results should be interpreted. PMID:15805900

  19. Interpreter services in pediatric nursing.

    PubMed

    Lehna, Carlee

    2005-01-01

    A critical part of every encounter between a pediatric nurse and a patient is obtaining accurate patient information. Unique obstacles are encountered when patients and their families have little or no understanding of the English language. Federal and state laws require health care systems that receive governmental funds to provide full language access to services. Both legal and ethical issues can arise when caring for non-English-speaking patients. Often, obtaining accurate patient information and a fully informed consent cannot be done without the use of an interpreter. The interpreter informs the family of all the risks and benefits of a specific avenue of care. When inappropriate interpreter services are used, such as when children in the family or other family members act as interpreters, concerns about accuracy, confidentiality, cultural congruency, and other issues may arise. The purpose of this article is to: (a) explore principles related to the use of medical interpreters, (b) examine different models of interpreter services, and (c) identify available resources to assist providers in accessing interpreter services (e.g., books, online resources, articles, and videos). The case study format will be used to illustrate key points.

  20. QUANTIFICATION AND INTERPRETATION OF TOTAL PETROLEUM HYDROCARBONS IN SEDIMENT SAMPLES BY A GC/MS METHOD AND COMPARISON WITH EPA 418.1 AND A RAPID FIELD METHOD

    EPA Science Inventory

    ABSTRACT: Total Petroleum hydrocarbons (TPH) as a lumped parameter can be easily and rapidly measured or monitored. Despite interpretational problems, it has become an accepted regulatory benchmark used widely to evaluate the extent of petroleum product contamination. Three cu...

  1. Ethics in Statistics

    ERIC Educational Resources Information Center

    Lenard, Christopher; McCarthy, Sally; Mills, Terence

    2014-01-01

    There are many different aspects of statistics. Statistics involves mathematics, computing, and applications to almost every field of endeavour. Each aspect provides an opportunity to spark someone's interest in the subject. In this paper we discuss some ethical aspects of statistics, and describe how an introduction to ethics has been…

  2. Teaching Statistics Using SAS.

    ERIC Educational Resources Information Center

    Mandeville, Garrett K.

    The Statistical Analysis System (SAS) is presented as the single most appropriate statistical package to use as an aid in teaching statistics. A brief review of literature in which SAS is compared to SPSS, BMDP, and other packages is followed by six examples which demonstrate features unique to SAS which have pedagogical utility. Of particular…

  3. Minnesota Health Statistics 1988.

    ERIC Educational Resources Information Center

    Minnesota State Dept. of Health, St. Paul.

    This document comprises the 1988 annual statistical report of the Minnesota Center for Health Statistics. After introductory technical notes on changes in format, sources of data, and geographic allocation of vital events, an overview is provided of vital health statistics in all areas. Thereafter, separate sections of the report provide tables…

  4. Mining significant substructure pairs for interpreting polypharmacology in drug-target network.

    PubMed

    Takigawa, Ichigaku; Tsuda, Koji; Mamitsuka, Hiroshi

    2011-02-23

    A current key feature of drug-target networks is that drugs often bind to multiple targets, known as polypharmacology or drug promiscuity. Recent literature has indicated that relatively small fragments in both drugs and targets are crucial in forming polypharmacology. We hypothesize that principles behind polypharmacology are embedded in paired fragments in molecular graphs and amino acid sequences of drug-target interactions. We developed a fast, scalable algorithm for mining significantly co-occurring subgraph-subsequence pairs from drug-target interactions. A noteworthy feature of our approach is that it captures significant paired patterns of subgraph and subsequence, whereas the literature has so far considered patterns of either drugs or targets only. Significant substructure pairs allow the grouping of drug-target interactions into clusters, covering approximately 75% of interactions containing approved drugs. These clusters were highly exclusive to each other, being statistically significant and logically implying that each cluster corresponds to a distinguished type of polypharmacology. These exclusive clusters cannot be easily obtained using either drug or target information alone, but are naturally found by highlighting significant substructure pairs in drug-target interactions. These results confirm the effectiveness of our method for interpreting polypharmacology in drug-target networks.
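
    The notion of a "significantly co-occurring" pair can be illustrated with a standard hypergeometric test. This is a generic sketch with invented counts, not the authors' scalable mining algorithm:

```python
# Significance of one subgraph-subsequence pair: given N interactions, of which
# K contain the drug fragment and n contain the target fragment, how surprising
# is observing k interactions containing both, under independence?
from scipy.stats import hypergeom

N, K, n, k = 10_000, 300, 250, 40       # invented counts
p_value = hypergeom.sf(k - 1, N, K, n)  # P(X >= k)
print(f"P(co-occurrence >= {k} by chance) = {p_value:.3g}")
```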

  5. Students' Interpretation of a Function Associated with a Real-Life Problem from Its Graph

    ERIC Educational Resources Information Center

    Mahir, Nevin

    2010-01-01

    The properties of a function such as limit, continuity, derivative, growth, or concavity can be determined more easily from its graph than by doing any algebraic operation. For this reason, it is important for students of mathematics to interpret some of the properties of a function from its graph. In this study, we investigated the competence of…

  6. Statistical Methods for Astronomy

    NASA Astrophysics Data System (ADS)

    Feigelson, Eric D.; Babu, G. Jogesh

    Statistical methodology, with deep roots in probability theory, provides quantitative procedures for extracting scientific knowledge from astronomical data and for testing astrophysical theory. In recent decades, statistics has enormously increased in scope and sophistication. After a historical perspective, this review outlines concepts of mathematical statistics, elements of probability theory, hypothesis tests, and point estimation. Least squares, maximum likelihood, and Bayesian approaches to statistical inference are outlined. Resampling methods, particularly the bootstrap, provide valuable procedures when distribution functions of statistics are not known. Several approaches to model selection and goodness of fit are considered.
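
    As an illustration of the resampling methods mentioned above, here is a minimal percentile-bootstrap confidence interval for a median; the skewed sample is simulated for the example:

```python
# Percentile bootstrap: resample with replacement, recompute the statistic,
# and read the CI off the empirical quantiles of the bootstrap distribution.
import numpy as np

rng = np.random.default_rng(42)
sample = rng.lognormal(mean=1.0, sigma=0.8, size=50)  # skewed toy data

boot = np.array([np.median(rng.choice(sample, size=sample.size, replace=True))
                 for _ in range(10_000)])
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"median = {np.median(sample):.2f}, 95% bootstrap CI = ({lo:.2f}, {hi:.2f})")
```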

  7. Faculty Salary Equity Cases: Combining Statistics with the Law

    ERIC Educational Resources Information Center

    Luna, Andrew L.

    2006-01-01

    Researchers have used many statistical models to determine whether an institution's faculty pay structure is equitable, with varying degrees of success. Little attention, however, has been given to court interpretations of statistical significance or to what variables courts have acknowledged should be used in an equity model. This article…

  8. Chi-Square Statistics, Tests of Hypothesis and Technology.

    ERIC Educational Resources Information Center

    Rochowicz, John A.

    The use of technology such as computers and programmable calculators enables students to find p-values and conduct tests of hypotheses in many different ways. Comprehension and interpretation of a research problem become the focus for statistical analysis. This paper describes how to calculate chisquare statistics and p-values for statistical…

  9. The Effect Size Statistic: Overview of Various Choices.

    ERIC Educational Resources Information Center

    Mahadevan, Lakshmi

    Over the years, methodologists have been recommending that researchers use magnitude of effect estimates in result interpretation to highlight the distinction between statistical and practical significance (cf. R. Kirk, 1996). A magnitude of effect statistic (i.e., effect size) tells to what degree the dependent variable can be controlled,…

  10. ALISE Library and Information Science Education Statistical Report, 1999.

    ERIC Educational Resources Information Center

    Daniel, Evelyn H., Ed.; Saye, Jerry D., Ed.

    This volume is the twentieth annual statistical report on library and information science (LIS) education published by the Association for Library and Information Science Education (ALISE). Its purpose is to compile, analyze, interpret, and report statistical (and other descriptive) information about library/information science programs offered by…

  11. What Is the next Trend in Usage Statistics in Libraries?

    ERIC Educational Resources Information Center

    King, Douglas

    2009-01-01

    In answering the question "What is the next trend in usage statistics in libraries?" an eclectic group of respondents has presented an assortment of possibilities, suggestions, complaints and, of course, questions of their own. Undoubtedly, usage statistics collection, interpretation, and application are areas of growth and increasing complexity…

  12. The Role of the Sampling Distribution in Understanding Statistical Inference

    ERIC Educational Resources Information Center

    Lipson, Kay

    2003-01-01

    Many statistics educators believe that few students develop the level of conceptual understanding essential for them to apply correctly the statistical techniques at their disposal and to interpret their outcomes appropriately. It is also commonly believed that the sampling distribution plays an important role in developing this understanding.…

  13. ENVIRONMENTAL PHOTOGRAPHIC INTERPRETATION CENTER (EPIC)

    EPA Science Inventory

    The Environmental Sciences Division (ESD) in the National Exposure Research Laboratory (NERL) of the Office of Research and Development provides remote sensing technical support including aerial photograph acquisition and interpretation to the EPA Program Offices, ORD Laboratorie...

  14. Car Troubles: An Interpretive Approach.

    ERIC Educational Resources Information Center

    Dawson, Leslie

    1995-01-01

    The growing amount of U.S. surface area being paved increases interpretive opportunities for teaching about the environmental impacts of automobiles. Provides methods and suggestions for educating high school students. Provides several computer graphics. (LZ)

  15. Securing wide appreciation of health statistics

    PubMed Central

    Pyrrait, A. M. DO Amaral; Aubenque, M. J.; Benjamin, B.; DE Groot, Meindert J. W.; Kohn, R.

    1954-01-01

    All the authors are agreed on the need for a certain publicizing of health statistics, but do Amaral Pyrrait points out that the medical profession prefers to convince itself rather than to be convinced. While there is great utility in articles and reviews in the professional press (especially for paramedical personnel) Aubenque, de Groot, and Kohn show how appreciation can effectively be secured by making statistics more easily understandable to the non-expert by, for instance, including readable commentaries in official publications, simplifying charts and tables, and preparing simple manuals on statistical methods. Aubenque and Kohn also stress the importance of linking health statistics to other economic and social information. Benjamin suggests that the principles of market research could to advantage be applied to health statistics to determine the precise needs of the “consumers”. At the same time, Aubenque points out that the value of the ultimate results must be clear to those who provide the data; for this, Kohn suggests that the enumerators must know exactly what is wanted and why. There is general agreement that some explanation of statistical methods and their uses should be given in the curricula of medical schools and that lectures and postgraduate courses should be arranged for practising physicians. PMID:13199668

  16. An Experimental Approach to Teaching and Learning Elementary Statistical Mechanics

    ERIC Educational Resources Information Center

    Ellis, Frank B.; Ellis, David C.

    2008-01-01

    Introductory statistical mechanics is studied for a simple two-state system using an inexpensive and easily built apparatus. A large variety of demonstrations, suitable for students in high school and introductory university chemistry courses, are possible. This article details demonstrations for exothermic and endothermic reactions, the dynamic…

  17. Students' Misconceptions of Statistical Inference: A Review of the Empirical Evidence from Research on Statistics Education

    ERIC Educational Resources Information Center

    Sotos, Ana Elisa Castro; Vanhoof, Stijn; Van den Noortgate, Wim; Onghena, Patrick

    2007-01-01

    A solid understanding of "inferential statistics" is of major importance for designing and interpreting empirical results in any scientific discipline. However, students are prone to many misconceptions regarding this topic. This article structurally summarizes and describes these misconceptions by presenting a systematic review of publications…

  18. Multi-scale structure and topological anomaly detection via a new network statistic: The onion decomposition

    PubMed Central

    Hébert-Dufresne, Laurent; Grochow, Joshua A.; Allard, Antoine

    2016-01-01

    We introduce a network statistic that measures structural properties at the micro-, meso-, and macroscopic scales, while still being easy to compute and interpretable at a glance. Our statistic, the onion spectrum, is based on the onion decomposition, which refines the k-core decomposition, a standard network fingerprinting method. The onion spectrum is exactly as easy to compute as the k-cores: It is based on the stages at which each vertex gets removed from a graph in the standard algorithm for computing the k-cores. Yet, the onion spectrum reveals much more information about a network, and at multiple scales; for example, it can be used to quantify node heterogeneity, degree correlations, centrality, and tree- or lattice-likeness. Furthermore, unlike the k-core decomposition, the combined degree-onion spectrum immediately gives a clear local picture of the network around each node which allows the detection of interesting subgraphs whose topological structure differs from the global network organization. This local description can also be leveraged to easily generate samples from the ensemble of networks with a given joint degree-onion distribution. We demonstrate the utility of the onion spectrum for understanding both static and dynamic properties on several standard graph models and on many real-world networks. PMID:27535466
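
    Because the onion layers are just the removal rounds of the usual k-core peeling, the decomposition fits in a few lines. The following is an illustrative Python reconstruction of that peeling process as described in the abstract (the small graph is invented; this is not the authors' code):

```python
# Onion decomposition: run k-core peeling, recording for each vertex the
# round ("layer") in which it is removed along with its coreness.
def onion_decomposition(adjacency):
    adj = {v: set(nbrs) for v, nbrs in adjacency.items()}
    layer_of, core_of = {}, {}
    core, layer = 0, 0
    while adj:
        core = max(core, min(len(nbrs) for nbrs in adj.values()))
        batch = [v for v, nbrs in adj.items() if len(nbrs) <= core]  # one layer
        layer += 1
        for v in batch:
            layer_of[v], core_of[v] = layer, core
            for u in adj[v]:
                if u in adj:
                    adj[u].discard(v)
            del adj[v]
    return layer_of, core_of

graph = {1: {2, 3}, 2: {1, 3}, 3: {1, 2, 4}, 4: {3}}  # triangle plus a pendant
print(onion_decomposition(graph))  # pendant peels in layer 1, triangle in layer 2
```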

  19. Multi-scale structure and topological anomaly detection via a new network statistic: The onion decomposition

    NASA Astrophysics Data System (ADS)

    Hébert-Dufresne, Laurent; Grochow, Joshua A.; Allard, Antoine

    2016-08-01

    We introduce a network statistic that measures structural properties at the micro-, meso-, and macroscopic scales, while still being easy to compute and interpretable at a glance. Our statistic, the onion spectrum, is based on the onion decomposition, which refines the k-core decomposition, a standard network fingerprinting method. The onion spectrum is exactly as easy to compute as the k-cores: It is based on the stages at which each vertex gets removed from a graph in the standard algorithm for computing the k-cores. Yet, the onion spectrum reveals much more information about a network, and at multiple scales; for example, it can be used to quantify node heterogeneity, degree correlations, centrality, and tree- or lattice-likeness. Furthermore, unlike the k-core decomposition, the combined degree-onion spectrum immediately gives a clear local picture of the network around each node which allows the detection of interesting subgraphs whose topological structure differs from the global network organization. This local description can also be leveraged to easily generate samples from the ensemble of networks with a given joint degree-onion distribution. We demonstrate the utility of the onion spectrum for understanding both static and dynamic properties on several standard graph models and on many real-world networks.

  20. Measuring statistical heterogeneity: The Pietra index

    NASA Astrophysics Data System (ADS)

    Eliazar, Iddo I.; Sokolov, Igor M.

    2010-01-01

    There are various ways of quantifying the statistical heterogeneity of a given probability law: Statistics uses variance, which measures the law's dispersion around its mean; Physics and Information Theory use entropy, which measures the law's randomness; Economics uses the Gini index, which measures the law's egalitarianism. In this research we explore an alternative to the Gini index, the Pietra index, which is a counterpart of the Kolmogorov-Smirnov statistic. The Pietra index is shown to be a natural and elemental measure of statistical heterogeneity, which is especially useful in the case of asymmetric and skewed probability laws, and in the case of asymptotically Paretian laws with finite mean and infinite variance. Moreover, the Pietra index is shown to have immediate and fundamental interpretations within the following applications: renewal processes and continuous time random walks; infinite-server queueing systems and shot noise processes; and financial derivatives. The interpretation of the Pietra index within the context of financial derivatives implies that derivative markets, in effect, use the Pietra index as their benchmark measure of statistical heterogeneity.
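
    For a nonnegative sample with mean μ, the Pietra (Hoover) index can be computed as the mean absolute deviation about the mean divided by twice the mean. A short sketch with simulated data; for an exponential law the index should come out near 1/e ≈ 0.368:

```python
# Pietra (Hoover) index: E|X - mu| / (2 mu); 0 means perfectly egalitarian.
import numpy as np

def pietra_index(x):
    x = np.asarray(x, dtype=float)
    mu = x.mean()
    return np.abs(x - mu).mean() / (2.0 * mu)

rng = np.random.default_rng(1)
print(pietra_index(rng.exponential(scale=2.0, size=100_000)))  # ~0.368
```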

  1. Optical rogue wave statistics in laser filamentation.

    PubMed

    Kasparian, Jérôme; Béjot, Pierre; Wolf, Jean-Pierre; Dudley, John M

    2009-07-06

    We experimentally observed optical rogue wave statistics during high power femtosecond pulse filamentation in air. We characterized wavelength-dependent intensity fluctuations across 300 nm broadband filament spectra generated by pulses with several times the critical power for filamentation. We show how the statistics vary from a near-Gaussian distribution in the vicinity of the pump to a long tailed "L-shaped" distribution at the short wavelength and long wavelength edges. The results are interpreted in terms of pump noise transfer via self-phase modulation.

  2. Intelligent Collection Environment for an Interpretation System

    SciTech Connect

    Maurer, W J

    2001-07-19

    An Intelligent Collection Environment for a data interpretation system is described. The environment accepts two inputs: a data model and a number between 0.0 and 1.0. The data model may be as simple as a single word or as complex as a multi-level/multidimensional model. The number between 0.0 and 1.0 is a control knob indicating the user's desire to allow loose matching of the data (things are ambiguous and unknown) versus strict matching of the data (things are precise and known). The environment produces a set of possible interpretations, a set of requirements to further strengthen or to differentiate a particular subset of the possible interpretations from the others, a set of inconsistencies, and a logic map that graphically shows the lines of reasoning used to derive the above output. The environment comprises a knowledge editor, model explorer, expertise server, and the World Wide Web. The Knowledge Editor is used by a subject matter expert to define Linguistic Types, Term Sets, detailed explanations, and dynamically created URIs, and to create rule bases using a straightforward hyper-matrix representation. The Model Explorer allows rapid construction and browsing of multi-level models. A multi-level model is a model whose elements may also be models themselves. The Expertise Server is an inference engine used to interpret the data submitted. It incorporates a semantic network knowledge representation, an assumption-based truth maintenance system, and a fuzzy logic calculus. It can be extended by employing any classifier (e.g., statistical/neural networks) of complex data types. The World Wide Web is an unstructured data space accessed by the URIs supplied as part of the output of the environment. By recognizing the input data model as a query, the environment serves as a deductive search engine. Applications include (but are not limited to) interpretation of geophysical phenomena, a navigation aid for very large web sites, monitoring of computer or

  3. Interpretational Confounding or Confounded Interpretations of Causal Indicators?

    PubMed Central

    Bainter, Sierra A.; Bollen, Kenneth A.

    2014-01-01

    In measurement theory causal indicators are controversial and little-understood. Methodological disagreement concerning causal indicators has centered on the question of whether causal indicators are inherently sensitive to interpretational confounding, which occurs when the empirical meaning of a latent construct departs from the meaning intended by a researcher. This article questions the validity of evidence used to claim that causal indicators are inherently susceptible to interpretational confounding. Further, a simulation study demonstrates that causal indicator coefficients are stable across correctly-specified models. Determining the suitability of causal indicators has implications for the way we conceptualize measurement and build and evaluate measurement models. PMID:25530730

  4. Heroin: Statistics and Trends

    MedlinePlus


  5. Statistical distribution sampling

    NASA Technical Reports Server (NTRS)

    Johnson, E. S.

    1975-01-01

    Determining the distribution of statistics by sampling was investigated. Characteristic functions, the quadratic regression problem, and the differential equations for the characteristic functions are analyzed.

  6. Teaching Business Statistics with Real Data to Undergraduates and the Use of Technology in the Class Room

    ERIC Educational Resources Information Center

    Singamsetti, Rao

    2007-01-01

    In this paper an attempt is made to highlight some issues of interpretation of statistical concepts and interpretation of results as taught in undergraduate Business statistics courses. The use of modern technology in the class room is shown to have increased the efficiency and the ease of learning and teaching in statistics. The importance of…

  7. The standard map: From Boltzmann-Gibbs statistics to Tsallis statistics

    NASA Astrophysics Data System (ADS)

    Tirnakli, Ugur; Borges, Ernesto P.

    2016-03-01

    As is well known, Boltzmann-Gibbs statistics is the correct way of thermostatistically approaching ergodic systems. On the other hand, nontrivial ergodicity breakdown and strong correlations typically drag the system into out-of-equilibrium states where Boltzmann-Gibbs statistics fails. For a wide class of such systems, it has been shown in recent years that the correct approach is to use Tsallis statistics instead. Here we show how the dynamics of the paradigmatic conservative (area-preserving) standard map exhibits, in an exceptionally clear manner, the crossing from one statistics to the other. Our results unambiguously illustrate the domains of validity of both Boltzmann-Gibbs and Tsallis statistical distributions. Since various important physical systems, from particle confinement in magnetic traps to autoionization of molecular Rydberg states, through particle dynamics in accelerators and comet dynamics, can be reduced to the standard map, our results are expected to enlighten and enable an improved interpretation of diverse experimental and observational results.
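
    The map in question, the Chirikov standard map, takes only two lines to iterate; a minimal sketch (the kick strength K and the initial condition are arbitrary choices here):

```python
# Chirikov standard map: p' = p + K*sin(theta) (mod 2*pi); theta' = theta + p' (mod 2*pi).
import math

def standard_map(theta, p, K, n_steps):
    trajectory = []
    for _ in range(n_steps):
        p = (p + K * math.sin(theta)) % (2 * math.pi)
        theta = (theta + p) % (2 * math.pi)
        trajectory.append((theta, p))
    return trajectory

for theta, p in standard_map(theta=2.0, p=0.0, K=0.6, n_steps=5):
    print(f"theta = {theta:.4f}, p = {p:.4f}")
```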

  8. The standard map: From Boltzmann-Gibbs statistics to Tsallis statistics.

    PubMed

    Tirnakli, Ugur; Borges, Ernesto P

    2016-03-23

    As is well known, Boltzmann-Gibbs statistics is the correct way of thermostatistically approaching ergodic systems. On the other hand, nontrivial ergodicity breakdown and strong correlations typically drag the system into out-of-equilibrium states where Boltzmann-Gibbs statistics fails. For a wide class of such systems, it has been shown in recent years that the correct approach is to use Tsallis statistics instead. Here we show how the dynamics of the paradigmatic conservative (area-preserving) standard map exhibits, in an exceptionally clear manner, the crossing from one statistics to the other. Our results unambiguously illustrate the domains of validity of both Boltzmann-Gibbs and Tsallis statistical distributions. Since various important physical systems, from particle confinement in magnetic traps to autoionization of molecular Rydberg states, through particle dynamics in accelerators and comet dynamics, can be reduced to the standard map, our results are expected to enlighten and enable an improved interpretation of diverse experimental and observational results.

  9. 25 CFR 81.16 - Interpreters.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 25 Indians 1 2010-04-01 2010-04-01 false Interpreters. 81.16 Section 81.16 Indians BUREAU OF... STATUTE § 81.16 Interpreters. Interpreters, where needed, may be provided to explain the manner of voting... that the interpreter does not influence the voter in casting the ballot. The interpreter may...

  10. 25 CFR 81.16 - Interpreters.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 25 Indians 1 2012-04-01 2011-04-01 true Interpreters. 81.16 Section 81.16 Indians BUREAU OF INDIAN....16 Interpreters. Interpreters, where needed, may be provided to explain the manner of voting to any... interpreter does not influence the voter in casting the ballot. The interpreter may accompany the voter...

  11. 25 CFR 81.16 - Interpreters.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 25 Indians 1 2014-04-01 2014-04-01 false Interpreters. 81.16 Section 81.16 Indians BUREAU OF... STATUTE § 81.16 Interpreters. Interpreters, where needed, may be provided to explain the manner of voting... that the interpreter does not influence the voter in casting the ballot. The interpreter may...

  12. 25 CFR 81.16 - Interpreters.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 25 Indians 1 2013-04-01 2013-04-01 false Interpreters. 81.16 Section 81.16 Indians BUREAU OF... STATUTE § 81.16 Interpreters. Interpreters, where needed, may be provided to explain the manner of voting... that the interpreter does not influence the voter in casting the ballot. The interpreter may...

  13. 25 CFR 81.16 - Interpreters.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 25 Indians 1 2011-04-01 2011-04-01 false Interpreters. 81.16 Section 81.16 Indians BUREAU OF... STATUTE § 81.16 Interpreters. Interpreters, where needed, may be provided to explain the manner of voting... that the interpreter does not influence the voter in casting the ballot. The interpreter may...

  14. How do I interpret a confidence interval?

    PubMed

    O'Brien, Sheila F; Yi, Qi Long

    2016-07-01

    A 95% confidence interval (CI) of the mean is a range with an upper and lower number calculated from a sample. Because the true population mean is unknown, this range describes possible values that the mean could be. If multiple samples were drawn from the same population and a 95% CI calculated for each sample, we would expect the population mean to be found within 95% of these CIs. CIs are sensitive to variability in the population (spread of values) and sample size. When used to compare the means of two or more treatment groups, a CI shows the magnitude of a difference between groups. This is helpful in understanding both the statistical significance and the clinical significance of a treatment. In this article we describe the basic principles of CIs and their interpretation.
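
    A quick numerical illustration of a t-based 95% CI for a mean, using SciPy on an invented sample:

```python
# 95% confidence interval for the mean: mean +/- t * (s / sqrt(n)).
import numpy as np
from scipy import stats

sample = np.array([5.1, 4.9, 5.6, 5.3, 4.7, 5.0, 5.4, 5.2])
mean = sample.mean()
sem = stats.sem(sample)  # standard error of the mean, s / sqrt(n)
lo, hi = stats.t.interval(0.95, sample.size - 1, loc=mean, scale=sem)
print(f"mean = {mean:.2f}, 95% CI = ({lo:.2f}, {hi:.2f})")
```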

  15. A Road More Easily Traveled

    ERIC Educational Resources Information Center

    Stanly, Pat

    2009-01-01

    Rough patches occur at both ends of the education pipeline, as students enter community colleges and move on to work or enrollment in four-year institutions. Career pathways--sequences of coherent, articulated, and rigorous career and academic courses that lead to an industry-recognized certificate or a college degree--are a promising approach to…

  16. An Interpretation of Banded Magnetospheric Radio Emissions

    NASA Technical Reports Server (NTRS)

    Benson, Robert F.; Osherovich, V. A.; Fainberg, J.; Vinas, A. F.; Ruppert, D. R.; Vondrak, Richard R. (Technical Monitor)

    2000-01-01

    Recently published Active Magnetospheric Particle Tracer Explorer/Isothermal Remanent Magnetization (AMPTE/IRM) banded magnetospheric emissions, commonly referred to as '(n + 1/2)f(sub ce)' emissions where f(sub ce) is the electron gyrofrequency, are analyzed by treating them as analogous to sounder-stimulated ionospheric emissions. We show that both individual AMPTE/IRM spectra of magnetospheric banded emissions, and a statistically derived spectrum observed over the two-year lifetime of the mission, can be interpreted in a self-consistent manner. The analysis, which predicts all spectral peaks within 4% of the observed peaks, interprets the higher-frequency emissions as due to low group-velocity Bernstein-mode waves and the lower-frequency emissions as eigenmodes of cylindrical electromagnetic plasma oscillations. The demarcation between these two classes of emissions is the electron plasma frequency f(sub pe), where an emission is often observed. This f(sub pe) emission is not necessarily the strongest. None of the observed banded emissions were attributed to the upper-hybrid frequency. We present Alouette-2 and ISIS-1 plasma-resonance data, and model electron temperature (T(sub e)) values, to support the argument that the frequency spectrum of ionospheric sounder-stimulated emissions is not strongly temperature dependent, and thus that the interpretation of these emissions in the ionosphere is relevant to other plasmas (such as the magnetosphere) where N(sub e) and T(sub e) can be quite different but where the ratio f(sub pe)/f(sub ce) is identical.

  17. Narrative pedagogy and art interpretation.

    PubMed

    Ewing, Bonnie; Hayden-Miles, Marie

    2011-04-01

    Contemporary practices in nursing education call for changes that will assist students in understanding a complex, rapidly changing world. Narrative pedagogy is an approach that offers teachers a way to actively engage students in the process of teaching and learning. The narrative approach provides ways to think critically, make connections, and ask questions to gain understanding through dialogue. The hermeneutic circle of understanding offers a way to interpret stories and discover meaning. Narratives exist in art forms that can be interpreted to evoke discussions and thinking that relate to nursing practice. Art interpretation is a way to gain access to others and acquire a deeper appreciation for multiple perspectives in the teaching-learning process.

  18. Default Sarcastic Interpretations: On the Priority of Nonsalient Interpretations

    ERIC Educational Resources Information Center

    Giora, Rachel; Drucker, Ari; Fein, Ofer; Mendelson, Itamar

    2015-01-01

    Findings from five experiments support the view that negation generates sarcastic utterance-interpretations by default. When presented in isolation, novel negative constructions ("Punctuality is not his forte," "Thoroughness is not her most distinctive feature"), free of semantic anomaly or internal incongruity, were…

  19. The Interpretive Approach to Religious Education: Challenging Thompson's Interpretation

    ERIC Educational Resources Information Center

    Jackson, Robert

    2012-01-01

    In a recent book chapter, Matthew Thompson makes some criticisms of my work, including the interpretive approach to religious education and the research and activity of Warwick Religions and Education Research Unit. Against the background of a discussion of religious education in the public sphere, my response challenges Thompson's account,…

  20. Science Interpretive Program--Spermaceti Cove Interpretive Center.

    ERIC Educational Resources Information Center

    Cole, Richard C.

    Described is the outdoor education program for the Middletown, New Jersey elementary schools at the Spermaceti Cove Interpretive Center in Sandy Hook State Park. The program is funded under PL 89-10 of the Elementary and Secondary Education Act (ESEA). Phase 1 (March 1966-June 1966) involved the selection of nine public and three parochial fourth…

  1. Interpretational Confounding or Confounded Interpretations of Causal Indicators?

    ERIC Educational Resources Information Center

    Bainter, Sierra A.; Bollen, Kenneth A.

    2014-01-01

    In measurement theory, causal indicators are controversial and little understood. Methodological disagreement concerning causal indicators has centered on the question of whether causal indicators are inherently sensitive to interpretational confounding, which occurs when the empirical meaning of a latent construct departs from the meaning…

  2. Automatic interpretation of Schlumberger soundings

    SciTech Connect

    Ushijima, K.

    1980-09-01

    The automatic interpretation of apparent resistivity curves from horizontally layered earth models is carried out by the curve-fitting method in three steps: (1) the observed VES data are interpolated at equidistant points of electrode separation on the logarithmic scale by using the cubic spline function, (2) the layer parameters (resistivities and depths) are predicted from the sampled apparent resistivity values by the SALS system program, and (3) the theoretical VES curves for the models are calculated by Ghosh's linear filter method using Zhody's computer program. Two soundings taken over the Takenoyu geothermal area were chosen to test the procedures of the automatic interpretation.
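
    Step (1) of this workflow, interpolating the observed curve at points equally spaced in the logarithm of electrode separation, is easy to sketch with a cubic spline; the sounding values below are invented for illustration:

```python
# Resample observed apparent resistivities at equidistant points in
# log10(electrode separation), interpolating log-resistivity with a cubic spline.
import numpy as np
from scipy.interpolate import CubicSpline

ab2 = np.array([1.0, 1.5, 2.5, 4.0, 6.0, 10.0, 16.0, 25.0])         # AB/2 (m)
rho_a = np.array([42.0, 40.0, 35.0, 28.0, 24.0, 26.0, 33.0, 45.0])  # ohm-m

spline = CubicSpline(np.log10(ab2), np.log10(rho_a))
log_s = np.linspace(np.log10(ab2[0]), np.log10(ab2[-1]), 20)  # equidistant in log
print(np.round(10 ** spline(log_s), 1))  # resampled apparent resistivities
```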

  3. Explorations in Statistics: Power

    ERIC Educational Resources Information Center

    Curran-Everett, Douglas

    2010-01-01

    Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This fifth installment of "Explorations in Statistics" revisits power, a concept fundamental to the test of a null hypothesis. Power is the probability that we reject the null hypothesis when it is false. Four…

  4. Teaching Statistics without Sadistics.

    ERIC Educational Resources Information Center

    Forte, James A.

    1995-01-01

    Five steps designed to take anxiety out of statistics for social work students are outlined. First, statistics anxiety is identified as an educational problem. Second, instructional objectives and procedures to achieve them are presented and methods and tools for evaluating the course are explored. Strategies for, and obstacles to, making…

  5. STATSIM: Exercises in Statistics.

    ERIC Educational Resources Information Center

    Thomas, David B.; And Others

    A computer-based learning simulation was developed at Florida State University which allows for high interactive responding via a time-sharing terminal for the purpose of demonstrating descriptive and inferential statistics. The statistical simulation (STATSIM) is comprised of four modules--chi square, t, z, and F distribution--and elucidates the…

  6. Understanding Undergraduate Statistical Anxiety

    ERIC Educational Resources Information Center

    McKim, Courtney

    2014-01-01

    The purpose of this study was to understand undergraduate students' views of statistics. Results reveal that students with less anxiety have a higher interest in statistics and also believe in their ability to perform well in the course. Also students who have a more positive attitude about the class tend to have a higher belief in their…

  7. Water Quality Statistics

    ERIC Educational Resources Information Center

    Hodgson, Ted; Andersen, Lyle; Robison-Cox, Jim; Jones, Clain

    2004-01-01

    Water quality experiments, especially the use of macroinvertebrates as indicators of water quality, offer an ideal context for connecting statistics and science. In the STAR program for secondary students and teachers, water quality experiments were also used as a context for teaching statistics. In this article, we trace one activity that uses…

  8. Towards Statistically Undetectable Steganography

    DTIC Science & Technology

    2011-06-30

    Contract FA9550-08-1-0084. Author: Prof. Jessica… Approved for public release; distribution is unlimited. Abstract (fragment): Fundamental asymptotic laws for imperfect steganography …formats. Subject terms: steganography, covert communication, statistical detectability, asymptotic performance, secure payload, minimum…

  9. Explorations in Statistics: Regression

    ERIC Educational Resources Information Center

    Curran-Everett, Douglas

    2011-01-01

    Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This seventh installment of "Explorations in Statistics" explores regression, a technique that estimates the nature of the relationship between two things for which we may only surmise a mechanistic or predictive…

  10. Option Y, Statistics.

    ERIC Educational Resources Information Center

    Singer, Arlene

    This guide outlines a one semester Option Y course, which has seven learner objectives. The course is designed to provide students with an introduction to the concerns and methods of statistics, and to equip them to deal with the many statistical matters of importance to society. Topics covered include graphs and charts, collection and…

  11. On Statistical Testing.

    ERIC Educational Resources Information Center

    Huberty, Carl J.

    An approach to statistical testing, which combines Neyman-Pearson hypothesis testing and Fisher significance testing, is recommended. The use of P-values in this approach is discussed in some detail. The author also discusses some problems which are often found in introductory statistics textbooks. The problems involve the definitions of…

  12. Statistics and Measurements

    PubMed Central

    Croarkin, M. Carroll

    2001-01-01

    For more than 50 years, the Statistical Engineering Division (SED) has been instrumental in the success of a broad spectrum of metrology projects at NBS/NIST. This paper highlights fundamental contributions of NBS/NIST statisticians to statistics and to measurement science and technology. Published methods developed by SED staff, especially during the early years, endure as cornerstones of statistics not only in metrology and standards applications, but as data-analytic resources used across all disciplines. The history of statistics at NBS/NIST began with the formation of what is now the SED. Examples from the first five decades of the SED illustrate the critical role of the division in the successful resolution of a few of the highly visible, and sometimes controversial, statistical studies of national importance. A review of the history of major early publications of the division on statistical methods, design of experiments, and error analysis and uncertainty is followed by a survey of several thematic areas. The accompanying examples illustrate the importance of SED in the history of statistics, measurements and standards: calibration and measurement assurance, interlaboratory tests, development of measurement methods, Standard Reference Materials, statistical computing, and dissemination of measurement technology. A brief look forward sketches the expanding opportunity and demand for SED statisticians created by current trends in research and development at NIST. PMID:27500023

  13. [Statistics quantum satis].

    PubMed

    Pestana, Dinis

    2013-01-01

    Statistics is a privileged tool in building knowledge from information, since its purpose is to extend conclusions drawn from limited sample information to the whole population. The pervasive use of statistical software (which always provides an answer, whether or not the question is adequate) and the use of statistics merely to confer a scientific flavour on so much bad science have had a pernicious effect, breeding some disbelief in statistical research. Were Lord Rutherford alive today, it is almost certain that he would not condemn the use of statistics in research, as he did at the dawn of the 20th century. But he would indeed urge everyone to use statistics quantum satis, since using bad data, too many data, or statistics to enquire into irrelevant questions is a source of bad science, not least because with too many data we can establish the statistical significance of irrelevant results. This is an important point that addicts of evidence-based medicine should be aware of, since the meta-analysis of too many data will inevitably establish senseless results.

  14. Reform in Statistical Education

    ERIC Educational Resources Information Center

    Huck, Schuyler W.

    2007-01-01

    Two questions are considered in this article: (a) What should professionals in school psychology do in an effort to stay current with developments in applied statistics? (b) What should they do with their existing knowledge to move from surface understanding of statistics to deep understanding? Written for school psychologists who have completed…

  15. Deconstructing Statistical Analysis

    ERIC Educational Resources Information Center

    Snell, Joel

    2014-01-01

    Using a very complex statistical analysis and research method merely to enhance the prestige of an article, or to make a new product or service appear legitimate, needs to be monitored and questioned for accuracy. 1) The more complicated the statistical analysis and research, the fewer learned readers can understand it. This adds a…

  16. Applied Statistics with SPSS

    ERIC Educational Resources Information Center

    Huizingh, Eelko K. R. E.

    2007-01-01

    Accessibly written and easy to use, "Applied Statistics Using SPSS" is an all-in-one self-study guide to SPSS and do-it-yourself guide to statistics. What is unique about Eelko Huizingh's approach is that this book is based around the needs of undergraduate students embarking on their own research project, and its self-help style is designed to…

  17. Overhead Image Statistics

    SciTech Connect

    Vijayaraj, Veeraraghavan; Cheriyadat, Anil M; Bhaduri, Budhendra L; Vatsavai, Raju; Bright, Eddie A

    2008-01-01

    Statistical properties of high-resolution overhead images representing different land use categories are analyzed using various local and global statistical image properties based on the shape of the power spectrum, image gradient distributions, edge co-occurrence, and inter-scale wavelet coefficient distributions. The analysis was performed on a database of high-resolution (1 meter) overhead images representing a multitude of different downtown, suburban, commercial, agricultural and wooded exemplars. Various statistical properties relating to these image categories and their relationships are discussed. The categorical variations in power spectrum contour shapes, the unique gradient distribution characteristics of wooded categories, the similarity in edge co-occurrence statistics for overhead and natural images, and the unique edge co-occurrence statistics of downtown categories are presented in this work. Though previous work on natural image statistics has shown some of the unique characteristics of different categories, the relationships for overhead images are not well understood. The statistical properties of natural images were used in previous studies to develop prior image models, to predict and index objects in a scene, and to improve computer vision models. The results from our research findings can be used to augment and adapt computer vision algorithms that rely on prior image statistics to process overhead images, calibrate the performance of overhead image analysis algorithms, and derive features for better discrimination of overhead image categories.
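
    As a concrete illustration, two of the statistics named above (the shape of the power spectrum and the image gradient distribution) can be computed for any grayscale overhead tile in a few lines. This is a minimal NumPy sketch of the measurements, not the authors' pipeline; the 64-bin histogram is our choice:

        import numpy as np

        def overhead_image_stats(img):
            # img: 2-D grayscale array, e.g. a 1-meter-resolution overhead tile
            f = np.fft.fftshift(np.fft.fft2(img))
            power_spectrum = np.abs(f) ** 2              # shape of the power spectrum
            gy, gx = np.gradient(img.astype(float))      # image gradient field
            grad_hist, bin_edges = np.histogram(np.hypot(gx, gy), bins=64)
            return power_spectrum, grad_hist, bin_edges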

  18. Calibrated Peer Review for Interpreting Linear Regression Parameters: Results from a Graduate Course

    ERIC Educational Resources Information Center

    Enders, Felicity B.; Jenkins, Sarah; Hoverman, Verna

    2010-01-01

    Biostatistics is traditionally a difficult subject for students to learn. While the mathematical aspects are challenging, it can also be demanding for students to learn the exact language to use to correctly interpret statistical results. In particular, correctly interpreting the parameters from linear regression is both a vital tool and a…

  19. Pointwise probability reinforcements for robust statistical inference.

    PubMed

    Frénay, Benoît; Verleysen, Michel

    2014-02-01

    Statistical inference using machine learning techniques may be difficult with small datasets because of abnormally frequent data (AFDs). AFDs are observations that are much more frequent in the training sample than they should be, with respect to their theoretical probability, and include, e.g., outliers. Estimates of parameters tend to be biased towards models which support such data. This paper proposes to introduce pointwise probability reinforcements (PPRs): the probability of each observation is reinforced by a PPR, and a regularisation allows controlling the amount of reinforcement which compensates for AFDs. The proposed solution is very generic, since it can be used to robustify any statistical inference method which can be formulated as a likelihood maximisation. Experiments show that PPRs can be easily used to tackle regression, classification and projection: models are freed from the influence of outliers. Moreover, outliers can be filtered manually since an abnormality degree is obtained for each observation.
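
    To make the mechanism concrete, the sketch below applies the PPR idea to a robust location estimate. It is a minimal, hypothetical instantiation (Gaussian likelihood, L1 penalty lam on the total reinforcement, alternating closed-form updates), not the authors' exact formulation:

        import numpy as np

        def ppr_robust_mean(x, sigma=1.0, lam=4.0, n_iter=50):
            # Maximize sum(log(p_i + r_i)) - lam * sum(r_i) over mu and r_i >= 0.
            mu = np.median(x)  # robust starting point
            for _ in range(n_iter):
                p = np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
                r = np.maximum(0.0, 1.0 / lam - p)  # closed-form reinforcement; r_i
                                                    # doubles as an abnormality degree
                w = p / (p + r)                     # inliers keep weight 1, outliers < 1
                mu = np.sum(w * x) / np.sum(w)      # reweighted mean update
            return mu

        data = np.concatenate([np.random.default_rng(0).normal(0, 1, 95),
                               np.full(5, 12.0)])   # five gross outliers
        print(ppr_robust_mean(data))                # close to 0 despite the outliers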

  20. Interpretation of FTIR spectra of polymers and Raman spectra of car paints by means of likelihood ratio approach supported by wavelet transform for reducing data dimensionality.

    PubMed

    Martyna, Agnieszka; Michalska, Aleksandra; Zadora, Grzegorz

    2015-05-01

    The problem of interpreting the common provenance of samples within an infrared spectra database of polypropylene samples from car body parts and plastic containers, as well as Raman spectra databases of blue solid and metallic automotive paints, was under investigation. The research involved statistical tools such as the likelihood ratio (LR) approach for expressing the evidential value of observed similarities and differences in the recorded spectra. Since LR models can easily be proposed for databases described by a few variables, the research focused on reducing the dimensionality of spectra characterised by more than a thousand variables. The objective of the studies was to combine chemometric tools that deal easily with multidimensionality with the LR approach. The final variables used for constructing the LR models were derived from the discrete wavelet transform (DWT) as a data dimensionality reduction technique, supported by methods for variance analysis, and corresponded to chemical information, i.e. typical absorption bands for polypropylene and peaks associated with pigments present in the car paints. Univariate and multivariate LR models were proposed, aiming at obtaining more information about the chemical structure of the samples. Their performance was controlled by estimating the levels of false positive and false negative answers and by using the empirical cross entropy approach. The results for most of the LR models were satisfactory and enabled solving the stated comparison problems. The results prove that the variables generated from the DWT preserve the signal characteristics, providing a sparse representation of the original signal that keeps its shape and relevant chemical information.
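
    A minimal sketch of the two stages, assuming PyWavelets for the DWT and normal within-/between-source models for a univariate LR; the wavelet family ('db4'), decomposition level, and model parameters are illustrative, not the values fitted in the study:

        import numpy as np
        import pywt  # PyWavelets

        def dwt_features(spectrum, wavelet="db4", level=5):
            # Compress a ~1000-point spectrum into a few dozen coarse
            # approximation coefficients that keep the spectral shape.
            coeffs = pywt.wavedec(spectrum, wavelet, level=level)
            return coeffs[0]

        def lr_value(y, mu_s, var_s, mu_d, var_d):
            # Univariate likelihood ratio for a score y: density under the
            # same-source model over density under the different-source model.
            def pdf(y, mu, var):
                return np.exp(-0.5 * (y - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
            return pdf(y, mu_s, var_s) / pdf(y, mu_d, var_d)

        spectrum = np.sin(np.linspace(0.0, 20.0, 1024))  # stand-in for a real spectrum
        print(dwt_features(spectrum).shape)              # ~38 coefficients, not 1024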

  1. Sign Language Interpreter Needs Assessment.

    ERIC Educational Resources Information Center

    Oakland Community Coll., Farmington, MI. Office of Institutional Planning and Analysis.

    In 1991, a study was conducted by Oakland Community College (OCC) in order to evaluate the need for a proposed Sign Language Interpreter program. OCC's study focused on validating and updating findings from a similar research project begun in fall 1989 by Macomb Community College (MCC) in Warren, Michigan. Federal and state legislation, data from…

  2. Design Document. EKG Interpretation Program.

    ERIC Educational Resources Information Center

    Webb, Sandra M.

    This teaching plan is designed to assist nursing instructors assigned to advanced medical surgical nursing courses in acquainting students with the basic skills needed to perform electrocardiographic (ECG or EKG) interpretations. The first part of the teaching plan contains a statement of purpose; audience recommendations; a flow chart detailing…

  3. Interpreting Literature for Young Children.

    ERIC Educational Resources Information Center

    Post, Robert M.

    1983-01-01

    Looks at the performance of children's literature by college students with respect to the value of reading to young children, selecting and performing the literature, purposes of a unit or course in the oral interpretation of this genre of literature, and some practical performance considerations. (PD)

  4. Making Sense of Multiple Interpretations

    ERIC Educational Resources Information Center

    Dougherty, Jack

    2004-01-01

    Some teaching innovations arise from a combination of good intentions, last-minute planning, and incredible luck. In this article, the author discusses students' differing interpretations of Constance Curry's 'Silver Rights' and David Cecelski's 'Along Freedom Road,' the two books he assigns to his class in the history of education…

  5. Failure to use an interpreter.

    PubMed

    Bird, Sara

    2010-04-01

    Case studies are based on actual medical negligence claims or medicolegal referrals; however, certain facts have been omitted or changed by the author to ensure the anonymity of the parties involved. This article discusses a Medical Board complaint involving an allegation of failure to use an interpreter, resulting in the death of a patient aged 35 years.

  6. Reference for radiographic film interpreters

    NASA Technical Reports Server (NTRS)

    Austin, D. L.

    1970-01-01

    Reference of X-ray film images provides examples of weld defects, film quality, stainless steel welded tubing, and acceptable weld conditions. A summary sheet details the discrepancies shown on the film strip. This reference aids in interpreting and evaluating radiographic film of weldments.

  7. Interpreting Shock Tube Ignition Data

    DTIC Science & Technology

    2003-10-01

    Paper 03F-61. D. F. Davidson and R. K. Hanson, Mechanical Engineering Department, Stanford University, Stanford CA 94305. Abstract: Chemical kinetic modelers make extensive use of shock tube ignition data… times only for high concentrations (of order 1% fuel or greater). The requirements of engine (IC, HCCI, CI and SI) modelers also present a different…

  8. Smartberries: Interpreting Erdrich's Love Medicine

    ERIC Educational Resources Information Center

    Treuer, David

    2005-01-01

    The structure of "Love Medicine" is interpreted by Hertha D. Sweet Wong, who claims that the book's "multiple narrators confound conventional Western expectations of an autonomous protagonist, a dominant narrative voice, and a consistently chronological narrative". "Love Medicine" is a brilliant use of the Western literary tactics that create the…

  9. Conflicting Interpretations of Scientific Pedagogy

    ERIC Educational Resources Information Center

    Galamba, Arthur

    2016-01-01

    Not surprisingly, historical studies have suggested that there is a distance between concepts of teaching methods, their interpretations, and their actual use in the classroom. This issue, however, is not always pitched at the personal level in historical studies, which may provide an alternative insight into how teachers conceptualise and engage with…

  10. Studies in Interpretation. Volume II.

    ERIC Educational Resources Information Center

    Doyle, Esther M., Ed.; Floyd, Virginia Hastings, Ed.

    The purpose of this second book of 21 self-contained essays is the same as that of the first volume published in 1972: to bring together the scholarly theory and current research regarding oral interpretation. One third of the essays are centered on literature itself: prose fiction, poetry, and the drama. These essays discuss topics such as point…

  11. Interpreting Data: The Hybrid Mind

    ERIC Educational Resources Information Center

    Heisterkamp, Kimberly; Talanquer, Vicente

    2015-01-01

    The central goal of this study was to characterize major patterns of reasoning exhibited by college chemistry students when analyzing and interpreting chemical data. Using a case study approach, we investigated how a representative student used chemical models to explain patterns in the data based on structure-property relationships. Our results…

  12. Interpretive Reproduction in Children's Play

    ERIC Educational Resources Information Center

    Corsaro, William A.

    2012-01-01

    The author looks at children's play from the perspective of interpretive reproduction, emphasizing the way children create their own unique peer cultures, which he defines as a set of routines, artifacts, values, and concerns that children engage in with their playmates. The article focuses on two types of routines in the peer culture of preschool…

  13. Interpretation and the Aesthetic Dimension

    ERIC Educational Resources Information Center

    Mortensen, Charles O.

    1976-01-01

    The author, utilizing a synthesis of philosophic comments on aesthetics, provides a discourse on the aesthetic dimension and offers examples of how interpreters can nurture the innate sense of beauty in man. Poetic forms, such as haiku, are used to relate the aesthetic relationship between man and the environment. (BT)

  14. Directionality effects in simultaneous language interpreting: the case of sign language interpreters in The Netherlands.

    PubMed

    Van Dijk, Rick; Boers, Eveline; Christoffels, Ingrid; Hermans, Daan

    2011-01-01

    The quality of interpretations produced by sign language interpreters was investigated. Twenty-five experienced interpreters were instructed to interpret narratives from (a) spoken Dutch to Sign Language of The Netherlands (SLN), (b) spoken Dutch to Sign Supported Dutch (SSD), and (c) SLN to spoken Dutch. The quality of the interpreted narratives was assessed by 5 certified sign language interpreters who did not participate in the study. Two measures were used to assess interpreting quality: the propositional accuracy of the interpreters' interpretations and a subjective quality measure. The results showed that the interpreted narratives in the SLN-to-Dutch interpreting direction were of lower quality (on both measures) than the interpreted narratives in the Dutch-to-SLN and Dutch-to-SSD directions. Furthermore, interpreters who had begun acquiring SLN when they entered the interpreter training program performed as well in all 3 interpreting directions as interpreters who had acquired SLN from birth.

  15. Business conversion. Catholic Healthcare West ends formal ties to church and hopes as Dignity Health that it can more easily add non-Catholic hospitals.

    PubMed

    Selvam, Ashok

    2012-01-30

    Catholic Healthcare West is now rechristened Dignity Health. Freed from its formal ties with the Roman Catholic Church, it's seeking to expand east by more easily adding hospitals that may have previously been apprehensive about adopting Catholic ethical directives. "I would say our vision has not changed and neither has our mission as being a voice for the voiceless," says Lloyd Dean, left, the system's president and CEO.

  16. Statistics at a glance.

    PubMed

    Ector, Hugo

    2010-12-01

    I still remember my first book on statistics: "Elementary statistics with applications in medicine and the biological sciences" by Frederick E. Croxton. For me, it was the start of a pursuit of understanding statistics in daily life and in medical practice. It was the first volume in a long row of books. In his introduction, Croxton claims that "nearly everyone involved in any aspect of medicine needs to have some knowledge of statistics". The reality is that for many clinicians, statistics are limited to "P < 0.05 = ok". I do not blame my colleagues who omit the paragraph on statistical methods. They have never had the opportunity to learn concise and clear descriptions of the key features. I have experienced how some authors can describe difficult methods in well understandable language. Others fail completely. As a teacher, I tell my students that life is impossible without a basic knowledge of statistics. This feeling has resulted in an annual seminar of 90 minutes. This tutorial is the summary of that seminar. It is a summary and a transcription of the best pages I have detected.

  17. "Just Another Statistic"

    PubMed

    Machtay; Glatstein

    1998-01-01

    have shown overall survivals superior to age-matched controls). It is fallacious and illogical to compare nonrandomized series of observation to those of aggressive therapy. In addition to the above problem, the use of DSS introduces another potential issue which we will call the bias of cause-of-death interpretation. All statistical endpoints (e.g., response rates, local-regional control, freedom from brain metastases), except OS, are known to depend heavily on the methods used to define the endpoint and are often subject to significant interobserver variability. There is no reason to believe that this problem does not occasionally occur with respect to defining a death as due to the index cancer or to intercurrent disease, even though this issue has been poorly studied. In many oncologic situations, for example metastatic lung cancer, this form of bias does not exist. In some situations, such as head and neck cancer, this could be an intermediate problem (Was that lethal chest tumor a second primary or a metastasis? Would the fatal aspiration pneumonia have occurred if he still had a tongue? And what about Mr. B., described above?). In some situations, particularly relatively "good prognosis" neoplasms, this could be a substantial problem, particularly if the adjudication of whether or not a death is cancer-related is performed solely by researchers who have an "interest" in demonstrating a good DSS. What we are most concerned about with this form of bias relates to recent series on observation, such as in early prostate cancer. It is interesting to note that although only 10% of the "observed" patients die from prostate cancer, many develop distant metastases by 10 years (approximately 40% among patients with intermediate grade tumors). Thus, it is implied that prostate cancer metastases are usually not of themselves lethal, which is a misconception to anyone experienced in taking care of prostate cancer patients. This is inconsistent with U.S. studies of

  18. Different effects of verapamil and low calcium on repetitive contractile activity of frog fatigue-resistant and easily-fatigued muscle fibres.

    PubMed

    Lipská, E; Radzyukevich, T

    1999-06-01

    The effects of low calcium and verapamil on the contractility of two muscle fibre types (m. iliofibularis, Rana temporaria) under different stimulation protocols were compared. Verapamil (0.02 mmol/l) induced temporary excitation-contraction coupling failure during single tetanic stimulation and enhanced the decline of tetanic force during 30 s repetitive tetanic stimulation in both fatigue-resistant fibres and easily-fatigued fibres. In contrast to verapamil, low extracellular calcium (0.02 mmol/l) enhanced the decline of tetanic force only in fatigue-resistant fibres during repetitive tetanic stimulation and had no effect on easily-fatigued fibres. The effect of verapamil on the decline of tetanic force in fatigue-resistant fibres was more profound in low calcium conditions. Both verapamil and low calcium eliminated the twitch facilitation that appeared after prolonged contractile activity in fatigue-resistant fibres. 4 mmol/l Ni2+, used as a calcium channel antagonist, had effects similar to low calcium medium. It can be concluded that (i) extracellular Ca2+ requirements for excitation-contraction coupling are different in fatigue-resistant and easily-fatigued fibres; (ii) the effects of verapamil on force performance are not entirely dependent upon calcium channel blockade.

  19. Informal Statistics Help Desk

    NASA Technical Reports Server (NTRS)

    Young, M.; Koslovsky, M.; Schaefer, Caroline M.; Feiveson, A. H.

    2017-01-01

    Back by popular demand, the JSC Biostatistics Laboratory and LSAH statisticians are offering an opportunity to discuss your statistical challenges and needs. Take the opportunity to meet the individuals offering expert statistical support to the JSC community. Join us for an informal conversation about any questions you may have encountered with issues of experimental design, analysis, or data visualization. Get answers to common questions about sample size, repeated measures, statistical assumptions, missing data, multiple testing, time-to-event data, and when to trust the results of your analyses.

  20. Commentary: statistics for biomarkers.

    PubMed

    Lovell, David P

    2012-05-01

    This short commentary discusses Biomarkers' requirements for the reporting of statistical analyses in submitted papers. It is expected that submitters will follow the general instructions of the journal, the more detailed guidance given by the International Committee of Medical Journal Editors, the specific guidelines developed by the EQUATOR network, and those of various specialist groups. Biomarkers expects that the study design and subsequent statistical analyses are clearly reported and that the data reported can be made available for independent assessment. The journal recognizes that there is continuing debate about different approaches to statistical science. Biomarkers appreciates that the field continues to develop rapidly and encourages the use of new methodologies.

  1. How implicit is visual statistical learning?

    PubMed

    Bertels, Julie; Franco, Ana; Destrebecqz, Arnaud

    2012-09-01

    In visual statistical learning, participants learn the statistical regularities present in a sequence of visual shapes. A recent study (Kim, Seitz, Feenstra, & Shams, 2009) suggests that visual statistical learning occurs implicitly, as it is not accompanied by conscious awareness of these regularities. However, that interpretation of the data depends on 2 unwarranted assumptions concerning the nature and sensitivity of the tasks used to measure learning. In a replication of this study, we used a 4-choice completion task as a direct measure of learning, in addition to an indirect measure consisting of a rapid serial visual presentation task. Moreover, binary confidence judgments were recorded after each completion trial. This way, we measured systematically the extent to which sequence knowledge was available to consciousness. Supporting the notion that the role of unconscious knowledge was overestimated in Kim et al.'s study, our results reveal that participants' performance cannot be exclusively accounted for by implicit knowledge.

  2. Interpreting Sky-Averaged 21-cm Measurements

    NASA Astrophysics Data System (ADS)

    Mirocha, Jordan

    2015-01-01

    Within the first ~billion years after the Big Bang, the intergalactic medium (IGM) underwent a remarkable transformation, from a uniform sea of cold neutral hydrogen gas to a fully ionized, metal-enriched plasma. Three milestones during this epoch of reionization -- the emergence of the first stars, black holes (BHs), and full-fledged galaxies -- are expected to manifest themselves as extrema in sky-averaged ("global") measurements of the redshifted 21-cm background. However, interpreting these measurements will be complicated by the presence of strong foregrounds and non-trivialities in the radiative transfer (RT) modeling required to make robust predictions. I have developed numerical models that efficiently solve the frequency-dependent radiative transfer equation, which has led to two advances in studies of the global 21-cm signal. First, frequency-dependent solutions facilitate studies of how the global 21-cm signal may be used to constrain the detailed spectral properties of the first stars, BHs, and galaxies, rather than just the timing of their formation. Second, the speed of these calculations allows one to search vast expanses of a currently unconstrained parameter space, while simultaneously characterizing the degeneracies between parameters of interest. I find principally that (1) physical properties of the IGM, such as its temperature and ionization state, can be constrained robustly from observations of the global 21-cm signal without invoking models for the astrophysical sources themselves; (2) translating IGM properties to galaxy properties is challenging, in large part due to frequency-dependent effects. For instance, evolution in the characteristic spectrum of accreting BHs can modify the 21-cm absorption signal at levels accessible to first-generation instruments, but could easily be confused with evolution in the X-ray luminosity star-formation rate relation. Finally, (3) the independent constraints most likely to aid in the interpretation

  3. Playing at Statistical Mechanics

    ERIC Educational Resources Information Center

    Clark, Paul M.; And Others

    1974-01-01

    Discussed are the applications of counting techniques of a sorting game to distributions and concepts in statistical mechanics. Included are the following distributions: Fermi-Dirac, Bose-Einstein, and most probable. (RH)

  4. Hemophilia Data and Statistics

    MedlinePlus

    … at a very young age. Based on CDC data, the median age at diagnosis is 36 months …

  5. Cooperative Learning in Statistics.

    ERIC Educational Resources Information Center

    Keeler, Carolyn M.; And Others

    1994-01-01

    Formal use of cooperative learning techniques proved effective in improving student performance and retention in a freshman level statistics course. Lectures interspersed with group activities proved effective in increasing conceptual understanding and overall class performance. (11 references) (Author)

  6. Statistics of the sagas

    NASA Astrophysics Data System (ADS)

    Richfield, Jon; bookfeller

    2016-07-01

    In reply to Ralph Kenna and Pádraig Mac Carron's feature article “Maths meets myths” in which they describe how they are using techniques from statistical physics to characterize the societies depicted in ancient Icelandic sagas.

  7. Elements of Statistics

    NASA Astrophysics Data System (ADS)

    Grégoire, G.

    2016-05-01

    This chapter is devoted to two objectives. The first is to answer the request, expressed by attendees of the first Astrostatistics School (Annecy, October 2013), to be provided with an elementary vademecum of statistics that would facilitate understanding of the courses given. In this spirit we recall very basic notions, that is, definitions and properties that we think sufficient to benefit from the courses given in the Astrostatistics School. Thus we briefly give definitions and elementary properties of random variables and vectors, distributions, estimation and tests, and maximum likelihood methodology. We intend to present basic ideas in a hopefully comprehensible way. We do not try to give a rigorous presentation and, given the space devoted to this chapter, can cover only a rather limited field of statistics. The second aim is to focus on some statistical tools that are useful in classification: a basic introduction to Bayesian statistics, maximum likelihood methodology, Gaussian vectors, and Gaussian mixture models.
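
    For instance, the maximum likelihood methodology mentioned above reduces, for a Gaussian sample, to two closed-form estimates. A minimal sketch (our example, not the chapter's code):

        import numpy as np

        x = np.random.default_rng(0).normal(loc=2.0, scale=3.0, size=1000)
        mu_hat = x.mean()                        # ML estimate of the mean
        var_hat = ((x - mu_hat) ** 2).mean()     # ML variance (divisor n, biased)
        print(mu_hat, np.sqrt(var_hat))          # close to 2.0 and 3.0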

  8. Plague Maps and Statistics

    MedlinePlus

    Plague in the United States: Plague was first introduced … per year in the United States: 1900-2012. Plague Worldwide: Plague epidemics have occurred in Africa, Asia, …

  9. Understanding Solar Flare Statistics

    NASA Astrophysics Data System (ADS)

    Wheatland, M. S.

    2005-12-01

    A review is presented of work aimed at understanding solar flare statistics, with emphasis on the well known flare power-law size distribution. Although avalanche models are perhaps the favoured model to describe flare statistics, their physical basis is unclear, and they are divorced from developing ideas in large-scale reconnection theory. An alternative model, aimed at reconciling large-scale reconnection models with solar flare statistics, is revisited. The solar flare waiting-time distribution has also attracted recent attention. Observed waiting-time distributions are described, together with what they might tell us about the flare phenomenon. Finally, a practical application of flare statistics to flare prediction is described in detail, including the results of a year of automated (web-based) predictions from the method.
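
    For readers who want to experiment with the power-law size distribution, sizes with density p(s) ∝ s^(-alpha) above a lower cutoff can be drawn by inverse-transform sampling; the exponent below is illustrative, not a measured solar value:

        import numpy as np

        def sample_flare_sizes(alpha=1.8, s_min=1.0, n=10000, seed=0):
            # Inverse CDF of p(s) = (alpha-1) * s_min**(alpha-1) * s**(-alpha),
            # valid for alpha > 1 and s >= s_min.
            u = np.random.default_rng(seed).random(n)
            return s_min * (1.0 - u) ** (-1.0 / (alpha - 1.0))

        sizes = sample_flare_sizes()
        print(sizes.min(), np.median(sizes))  # heavy right tail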

  10. Titanic: A Statistical Exploration.

    ERIC Educational Resources Information Center

    Takis, Sandra L.

    1999-01-01

    Uses the available data about the Titanic's passengers to interest students in exploring categorical data and the chi-square distribution. Describes activities incorporated into a statistics class and gives additional resources for collecting information about the Titanic. (ASK)

  11. Purposeful Statistical Investigations

    ERIC Educational Resources Information Center

    Day, Lorraine

    2014-01-01

    Lorraine Day provides us with a great range of statistical investigations using various resources such as maths300 and TinkerPlots. Each of the investigations link mathematics to students' lives and provide engaging and meaningful contexts for mathematical inquiry.

  12. Boosted Statistical Mechanics

    NASA Astrophysics Data System (ADS)

    Testa, Massimo

    2015-08-01

    Starting with the basic principles of Relativistic Quantum Mechanics, we give a rigorous, but completely elementary proof of the relation between fundamental observables of a statistical system, when measured within two inertial reference frames, related by a Lorentz transformation.

  13. How Statistics "Excel" Online.

    ERIC Educational Resources Information Center

    Chao, Faith; Davis, James

    2000-01-01

    Discusses the use of Microsoft Excel software and provides examples of its use in an online statistics course at Golden Gate University in the areas of randomness and probability, sampling distributions, confidence intervals, and regression analysis. (LRW)

  14. Which statistics should tropical biologists learn?

    PubMed

    Loaiza Velásquez, Natalia; González Lutz, María Isabel; Monge-Nájera, Julián

    2011-09-01

    Tropical biologists study the richest and most endangered biodiversity on the planet, and in these times of climate change and mega-extinctions, the need for efficient, good quality research is more pressing than in the past. However, the statistical component in research published by tropical authors sometimes suffers from poor quality in data collection, mediocre or bad experimental design, and a rigid and outdated view of data analysis. To suggest improvements in their statistical education, we listed all the statistical tests and other quantitative analyses used in two leading tropical journals, the Revista de Biología Tropical and Biotropica, during a year. The 12 most frequent tests in the articles were: Analysis of Variance (ANOVA), Chi-Square Test, Student's T Test, Linear Regression, Pearson's Correlation Coefficient, Mann-Whitney U Test, Kruskal-Wallis Test, Shannon's Diversity Index, Tukey's Test, Cluster Analysis, Spearman's Rank Correlation Test and Principal Component Analysis. We conclude that statistical education for tropical biologists must abandon the old syllabus based on the mathematical side of statistics and concentrate on the correct selection of these and other procedures and tests, on their biological interpretation, and on the use of reliable and friendly freeware. We think that their time will be better spent understanding and protecting tropical ecosystems than trying to learn the mathematical foundations of statistics: in most cases, a well designed one-semester course should be enough for their basic requirements.
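
    Most of the 12 tests listed are indeed available in friendly free software; in Python's SciPy, for example, each is a single call. A sketch with synthetic data, shown only to locate the procedures:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        a, b, c = (rng.normal(loc=m, size=30) for m in (0.0, 0.3, 0.6))

        print(stats.f_oneway(a, b, c))                      # one-way ANOVA
        print(stats.ttest_ind(a, b))                        # Student's t test
        print(stats.linregress(a, b))                       # linear regression
        print(stats.pearsonr(a, b))                         # Pearson correlation
        print(stats.mannwhitneyu(a, b))                     # Mann-Whitney U
        print(stats.kruskal(a, b, c))                       # Kruskal-Wallis
        print(stats.spearmanr(a, b))                        # Spearman rank correlation
        print(stats.chi2_contingency([[10, 20], [30, 25]])) # chi-square test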

  15. [Significance of medical statistics in insurance medicine].

    PubMed

    Becher, J

    2001-03-01

    Knowledge of medical statistics is of great benefit to every insurance medical officer, as it facilitates communication with actuaries, allows officers to make their own calculations, and forms the basis for correctly interpreting medical journals. Only about 20% of original work in medicine today is published without statistics or with descriptive statistics only, and this share is falling. The reader of medical publications should be in a position to make a critical analysis of the methodology and content, since one cannot always rely on the conclusions drawn by the authors: statistical errors appear very frequently in medical publications. Due to the specific methodological features involved, the assessment of meta-analyses demands special attention. The number of published meta-analyses has risen 40-fold over the last ten years. Important examples of the practical use of statistical methods in insurance medicine include estimating extra mortality from published survival analyses and evaluating diagnostic test results. The purpose of this article is to highlight statistical problems and issues of relevance to insurance medicine and to establish the bases for understanding them.

  16. Interpretation of fluorescence correlation spectra of biopolymer solutions.

    PubMed

    Phillies, George D J

    2016-05-01

    Fluorescence correlation spectroscopy (FCS) is regularly used to study diffusion in non-dilute "crowded" biopolymer solutions, including the interior of living cells. For fluorophores in dilute solution, the relationship between the FCS spectrum G(t) and the diffusion coefficient D is well-established. However, the dilute-solution relationship between G(t) and D has sometimes been used to interpret FCS spectra of fluorophores in non-dilute solutions. Unfortunately, the relationship used to interpret FCS spectra in dilute solutions relies on an assumption that is not always correct in non-dilute solutions. This paper obtains the correct form for interpreting FCS spectra of non-dilute solutions, writing G(t) in terms of the statistical properties of the fluorophore motions. Approaches for applying this form are discussed.
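
    For reference, the well-established dilute-solution relationship alluded to above is commonly written, for a 3-D Gaussian detection volume with lateral radius ω_xy and axial-to-lateral aspect ratio κ (standard textbook notation, not the paper's):

        G(t) = \frac{1}{\langle N \rangle}
               \left(1 + \frac{t}{\tau_D}\right)^{-1}
               \left(1 + \frac{t}{\kappa^{2}\tau_D}\right)^{-1/2},
        \qquad \tau_D = \frac{\omega_{xy}^{2}}{4D},

    so a fit of G(t) yields D through the decay time τ_D; it is this mapping that breaks down in non-dilute, crowded solutions.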

  17. Transportation Statistics Annual Report 1997

    SciTech Connect

    Fenn, M.

    1997-01-01

    This document is the fourth Transportation Statistics Annual Report (TSAR) prepared by the Bureau of Transportation Statistics (BTS) for the President and Congress. As in previous years, it reports on the state of U.S. transportation system at two levels. First, in Part I, it provides a statistical and interpretive survey of the system—its physical characteristics, its economic attributes, aspects of its use and performance, and the scale and severity of unintended consequences of transportation, such as fatalities and injuries, oil import dependency, and environment impacts. Part I also explores the state of transportation statistics, and new needs of the rapidly changing world of transportation. Second, Part II of the report, as in prior years, explores in detail the performance of the U.S. transportation system from the perspective of desired social outcomes or strategic goals. This year, the performance aspect of transportation chosen for thematic treatment is “Mobility and Access,” which complements past TSAR theme sections on “The Economic Performance of Transportation” (1995) and “Transportation and the Environment” (1996). Mobility and access are at the heart of the transportation system’s performance from the user’s perspective. In what ways and to what extent does the geographic freedom provided by transportation enhance personal fulfillment of the nation’s residents and contribute to economic advancement of people and businesses? This broad question underlies many of the topics examined in Part II: What is the current level of personal mobility in the United States, and how does it vary by sex, age, income level, urban or rural location, and over time? What factors explain variations? Has transportation helped improve people’s access to work, shopping, recreational facilities, and medical services, and in what ways and in what locations? How have barriers, such as age, disabilities, or lack of an automobile, affected these

  18. A new statistical tool for NOAA local climate studies

    NASA Astrophysics Data System (ADS)

    Timofeyeva, M. M.; Meyers, J. C.; Hollingshead, A.

    2011-12-01

    The National Weather Service's (NWS) Local Climate Analysis Tool (LCAT) is evolving out of a need to support and enhance the National Oceanic and Atmospheric Administration (NOAA) National Weather Service field offices' ability to efficiently access, manipulate, and interpret local climate data and characterize climate variability and change impacts. LCAT will enable NOAA's staff to conduct regional and local climate studies using state-of-the-art station and reanalysis gridded data and various statistical techniques for climate analysis. The analysis results will be used for climate services to guide local decision makers in weather- and climate-sensitive actions and to deliver information to the general public. LCAT will augment current climate reference materials with information pertinent to the local and regional levels as it applies to diverse variables appropriate to each locality. The main emphasis of LCAT is to enable studies of extreme meteorological and hydrological events such as tornadoes, floods, droughts, severe storms, etc. LCAT will close a very critical gap in NWS local climate services because it will allow addressing climate variables beyond average temperature and total precipitation. NWS external partners and government agencies will benefit from the LCAT outputs, which could easily be incorporated into their own analysis and/or delivery systems. Presently, we have identified five requirements for local climate studies: (1) local impacts of climate change; (2) local impacts of climate variability; (3) drought studies; (4) attribution of severe meteorological and hydrological events; and (5) climate studies for water resources. The methodologies for the first three requirements will be included in the LCAT first-phase implementation. The local rate of climate change is defined as the slope of the mean trend estimated from an ensemble of three trend techniques: (1) hinge, (2) Optimal Climate Normals (running mean over optimal time periods), (3) exponentially
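
    Of the three trend techniques listed, the Optimal Climate Normals estimator is essentially a running mean over a suitably chosen window. A minimal sketch (the 10-year window and NumPy usage are our assumptions, not LCAT's implementation):

        import numpy as np

        def optimal_climate_normal(series, window=10):
            # Running mean of an annual series over a `window`-year period;
            # LCAT would select the window length optimally for each record.
            kernel = np.ones(window) / window
            return np.convolve(series, kernel, mode="valid")

        temps = 10.0 + 0.02 * np.arange(60) + np.random.default_rng(0).normal(0, 0.5, 60)
        print(optimal_climate_normal(temps)[:3])  # smoothed normals reveal the trend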

  19. Clinical Interpretation of Genomic Variations

    PubMed Central

    Sayitoğlu, Müge

    2016-01-01

    Novel high-throughput sequencing technologies generate large-scale genomic data and are used extensively for disease mapping of monogenic and/or complex disorders, personalized treatment, and pharmacogenomics. Next-generation sequencing is rapidly becoming a routine tool for diagnosis and molecular monitoring of patients to evaluate therapeutic efficiency. The next-generation sequencing platforms generate huge amounts of genetic variation data, and it remains a challenge to interpret the variations that are identified. Such data interpretation needs close collaboration among bioinformaticians, clinicians, and geneticists. There are several problems that must be addressed, such as the generation of new algorithms for mapping and annotation, harmonization of the terminology, correct use of nomenclature, reference genomes for different populations, rare disease variant databases, and clinical reports. PMID:27507302

  20. Clinical Interpretation of Genomic Variations.

    PubMed

    Sayitoğlu, Müge

    2016-09-05

    Novel high-throughput sequencing technologies generate large-scale genomic data and are used extensively for disease mapping of monogenic and/or complex disorders, personalized treatment, and pharmacogenomics. Next-generation sequencing is rapidly becoming a routine tool for diagnosis and molecular monitoring of patients to evaluate therapeutic efficiency. The next-generation sequencing platforms generate huge amounts of genetic variation data, and it remains a challenge to interpret the variations that are identified. Such data interpretation needs close collaboration among bioinformaticians, clinicians, and geneticists. There are several problems that must be addressed, such as the generation of new algorithms for mapping and annotation, harmonization of the terminology, correct use of nomenclature, reference genomes for different populations, rare disease variant databases, and clinical reports.

  1. Phonological Interpretation into Preordered Algebras

    NASA Astrophysics Data System (ADS)

    Kubota, Yusuke; Pollard, Carl

    We propose a novel architecture for categorial grammar that clarifies the relationship between semantically relevant combinatoric reasoning and semantically inert reasoning that only affects surface-oriented phonological form. To this end, we employ a level of structured phonology that mediates between syntax (abstract combinatorics) and phonology proper (strings). To notate structured phonologies, we employ a lambda calculus analogous to the φ-terms of [8]. However, unlike Oehrle's purely equational φ-calculus, our phonological calculus is inequational, in a way that is strongly analogous to the functional programming language LCF [10]. Like LCF, our phonological terms are interpreted into a Henkin frame of posets, with degree of definedness ('height' in the preorder that interprets the base type) corresponding to degree of pronounceability; only maximal elements are actual strings and therefore fully pronounceable. We illustrate with an analysis (also new) of some complex constituent-order phenomena in Japanese.

  2. Inuit interpretations of sleep paralysis.

    PubMed

    Law, Samuel; Kirmayer, Laurence J

    2005-03-01

    Traditional and contemporary Inuit concepts of sleep paralysis were investigated through interviews with elders and young people in Iqaluit, Baffin Island. Sleep paralysis was readily recognized by most respondents and termed uqumangirniq (in the Baffin region) or aqtuqsinniq (Kivalliq region). Traditional interpretations of uqumangirniq referred to a shamanistic cosmology in which the individual's soul was vulnerable during sleep and dreaming. Sleep paralysis could result from attack by shamans or malevolent spirits. Understanding the experience as a manifestation of supernatural power, beyond one's control, served to reinforce the experiential reality and presence of the spirit world. For contemporary youth, sleep paralysis was interpreted in terms of multiple frameworks that incorporated personal, medical, mystical, traditional/shamanistic, and Christian views, reflecting the dynamic social changes taking place in this region.

  3. Primer of statistics in dental research: part I.

    PubMed

    Shintani, Ayumi

    2014-01-01

    Statistics play essential roles in evidence-based dentistry (EBD) practice and research, ranging widely from formulating scientific questions, designing studies, and collecting and analyzing data to interpreting, reporting, and presenting study findings. Mastering statistical concepts appears to be an unreachable goal for many dental researchers, in part because statistical authorities have difficulty explaining statistical principles to health researchers without elaborating complex mathematical concepts. This series of 2 articles aims to introduce dental researchers to 9 essential topics in statistics for conducting EBD, with intuitive examples. Part I of the series covers the first 5 topics: (1) statistical graphs, (2) how to deal with outliers, (3) p-values and confidence intervals, (4) testing equivalence, and (5) multiplicity adjustment. Part II will follow to cover the remaining topics: (6) selecting the proper statistical tests, (7) repeated measures analysis, (8) epidemiological considerations for causal association, and (9) analysis of agreement.

  4. Understanding AOP through the Study of Interpreters

    NASA Technical Reports Server (NTRS)

    Filman, Robert E.

    2004-01-01

    I return to the question of what distinguishes AOP languages by considering how the interpreters of AOP languages differ from conventional interpreters. Key elements for static transformation are seen to be redefinition of the set and lookup operators in the interpretation of the language. This analysis also yields a definition of crosscutting in terms of interlacing of interpreter actions.
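
    A toy illustration of that observation: when an interpreter routes every variable access through explicit set and lookup operators, redefining those two operators is enough to weave in a crosscutting concern such as tracing. The Python sketch below is ours, hypothetical, and far simpler than a real AOP interpreter:

        # Environment whose set/lookup operators carry the crosscutting advice.
        class Env(dict):
            def lookup(self, name):
                print(f"advice: lookup {name}")      # interlaced interpreter action
                return self[name]

            def set(self, name, value):
                print(f"advice: set {name} = {value}")
                self[name] = value

        env = Env()
        env.set("x", 41)
        print(env.lookup("x") + 1)  # traced accesses, unchanged base semantics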

  5. 32 CFR 1605.81 - Interpreters.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    § 1605.81 Interpreters. (a) The local board, district appeal board and the National Selective Service Appeal Board are authorized to use interpreters when necessary. (b) The following...

  6. 32 CFR 1605.81 - Interpreters.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    § 1605.81 Interpreters. (a) The local board, district appeal board and the National Selective Service Appeal Board are authorized to use interpreters when necessary. (b) The following...

  7. Interpreting Inexplicit Language during Courtroom Examination

    ERIC Educational Resources Information Center

    Lee, Jieun

    2009-01-01

    Court interpreters are required to provide accurate renditions of witnesses' utterances during courtroom examinations, but the accuracy of interpreting may be compromised for a number of reasons, among which is the effect on interpretation of the limited contextual information available to court interpreters. Based on the analysis of the discourse…

  8. 32 CFR 1605.81 - Interpreters.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    § 1605.81 Interpreters. (a) The local board, district appeal board and the National Selective Service Appeal Board are authorized to use interpreters when necessary. (b) The following...

  9. 32 CFR 1605.81 - Interpreters.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    § 1605.81 Interpreters. (a) The local board, district appeal board and the National Selective Service Appeal Board are authorized to use interpreters when necessary. (b) The following...

  10. 32 CFR 1605.81 - Interpreters.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    § 1605.81 Interpreters. (a) The local board, district appeal board and the National Selective Service Appeal Board are authorized to use interpreters when necessary. (b) The following...

  11. Educational Interpreting: Understanding the Rural Experience.

    ERIC Educational Resources Information Center

    Yarger, Carmel Collum

    2001-01-01

    A survey of 63 educational interpreters employed in two rural states found that only 10 interpreters had completed interpreter preparation programs, with 5 of these having had no course work related to education. The mean score of the 43 interpreters assessed using the Educational Interpreter Performance Assessment was 2.6, below the "coherent" level. (Contains…

  12. What Does It Mean to Teach "Interpretively"?

    ERIC Educational Resources Information Center

    Dodge, Jennifer; Holtzman, Richard; van Hulst, Merlijn; Yanow, Dvora

    2016-01-01

    The "interpretive turn" has gained traction as a research approach in recent decades in the empirical social sciences. While the contributions of interpretive research and interpretive research methods are clear, we wonder: Does an interpretive perspective lend itself to--or even demand--a particular style of teaching? This question was…

  13. Interpreting past religious discrimination today.

    PubMed

    Schumm, Walter R

    2003-10-01

    Much of modern western law now presupposes opposition to discrimination based on race, religion, sex, national origin, and other factors. However, ancient religious Scriptures may have sanctioned certain types of discrimination. Whether those who are inclined to accept literal interpretations of their Scriptures will condone certain forms of discrimination could be evaluated to contrast the effects of modernization versus religious indoctrination on various kinds of prejudice.

  14. College Students' Interpretation of Research Reports on Group Differences: The Tall-Tale Effect

    ERIC Educational Resources Information Center

    Hogan, Thomas P.; Zaboski, Brian A.; Perry, Tiffany R.

    2015-01-01

    How does the student untrained in advanced statistics interpret results of research that reports a group difference? In two studies, statistically untrained college students were presented with abstracts or professional associations' reports and asked for estimates of scores obtained by the original participants in the studies. These estimates…

  15. Calculation and Interpretation of XANES

    NASA Astrophysics Data System (ADS)

    Ravel, B.; Rehr, J. J.

    1997-03-01

    A real space multiple-scattering (MS) approach for ab initio calculations and for the interpretation of x-ray absorption near edge structure (XANES) is presented. The method is based on full-MS calculations of the electron density matrix ρ(E). Our approach uses the exact Rehr-Albers [Phys. Rev. B 41, 8139 (1990)] separable representation of the free propagator G together with atomic scattering t-matrices from FEFF7 [Phys. Rev. B 52, 2995 (1995)]. This method yields a parallel treatment both of XANES and local electronic structure, including local densities of states (LDOS) and charge transfer. With this method XANES for large clusters can be calculated efficiently. A scattering-theoretic interpretation is presented using the separation of both XANES and LDOS into central-site and scattering parts, i.e., μ(E) = μ_c(E)[1 + χ(E)] and ρ(E) = ρ_c(E)[1 + χ(E)], where χ(E) is the XAFS function and both μ_c and ρ_c are smooth backgrounds. Charge transfer is interpreted in terms of the scattering part χ, and hence is related to features in XANES. Calculations for several materials are presented and compared with LMTO band-structure calculations and with experiment.

  16. Consistent interpretations of quantum mechanics

    SciTech Connect

    Omnes, R. )

    1992-04-01

    Within the last decade, significant progress has been made towards a consistent and complete reformulation of the Copenhagen interpretation (an interpretation consisting of a formulation of the experimental aspects of physics in terms of the basic formalism; it is consistent if free from internal contradiction and complete if it provides precise predictions for all experiments). The main steps involved decoherence (the transition from linear superpositions of macroscopic states to a mixture), Griffiths histories describing the evolution of quantum properties, a convenient logical structure for dealing with histories, and also some progress in semiclassical physics, which was made possible by new methods. The main outcome is a theory of phenomena, viz., the classically meaningful properties of a macroscopic system. It shows in particular how and when determinism is valid. This theory can be used to give a deductive form to measurement theory, which now covers some cases that were initially devised as counterexamples against the Copenhagen interpretation. These theories are described, together with their applications to some key experiments and some of their consequences concerning epistemology.

  17. The Sport Students’ Ability of Literacy and Statistical Reasoning

    NASA Astrophysics Data System (ADS)

    Hidayah, N.

    2017-03-01

    The ability of literacy and statistical reasoning is very important for students at sport education colleges, because the materials for statistical learning can be drawn from their many activities, such as sport competitions, the results of tests and measurements, predicting achievement based on training, finding connections among variables, and others. This research describes sport education college students' ability of literacy and statistical reasoning related to the identification of data types, probability, table interpretation, description and explanation using bar or pie graphs, explanation of variability, and the calculation and explanation of mean, median, and mode, through an instrument. The instrument was administered to 50 college students majoring in sport; only 26% of the students scored above 30%, while the others remained below 30%. Across all subjects, 56% of students were able to identify data classifications; 49% were able to read, display, and interpret tables through graphs; 27% had the ability in probability; 33% were able to describe variability; and 16.32% were able to read, calculate, and describe mean, median, and mode. The results show that the sport students' ability of literacy and statistical reasoning is not yet adequate and that their statistical study has not reached concept comprehension, literacy training, and statistical reasoning, so it is critical to increase the sport students' ability of literacy and statistical reasoning.
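
    The mean/median/mode items in such an instrument reduce to one-line computations, e.g. with Python's standard library (hypothetical scores, not the study's data):

        import statistics as st

        scores = [12, 15, 15, 18, 20, 22, 35]
        print(st.mean(scores))    # 19.57...
        print(st.median(scores))  # 18
        print(st.mode(scores))    # 15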

  18. Statistical Physics of Fracture

    SciTech Connect

    Alava, Mikko; Nukala, Phani K; Zapperi, Stefano

    2006-05-01

    Disorder and long-range interactions are two of the key components that make material failure an interesting playing field for the application of statistical mechanics. The cornerstone in this respect has been lattice models of fracture, in which a network of elastic beams, bonds, or electrical fuses with random failure thresholds is subject to an increasing external load. These models describe on a qualitative level the failure processes of real, brittle, or quasi-brittle materials. This has been particularly important in solving the classical engineering problems of material strength: the size dependence of maximum stress and its sample-to-sample statistical fluctuations. At the same time, lattice models pose many new fundamental questions in statistical physics, such as the relation between fracture and phase transitions. Experimental results point to the existence of an intriguing crackling noise in the acoustic emission and of self-affine fractals in the crack surface morphology. Recent advances in computer power have enabled considerable progress in the understanding of such models. Among these partly still controversial issues are the scaling and size-effects in material strength and accumulated damage, the statistics of avalanches or bursts of microfailures, and the morphology of the crack surface. Here we present an overview of the results obtained with lattice models for fracture, highlighting the relations with statistical physics theories and more conventional fracture mechanics approaches.

  19. Cosmic inflation and big bang interpreted as explosions

    NASA Astrophysics Data System (ADS)

    Rebhan, E.

    2012-12-01

    It has become common understanding that the recession of galaxies and the corresponding redshift of light received from them can only be explained by an expansion of the space between them and us. In this paper, for the presently favored case of a universe without spatial curvature, it is shown that this interpretation is restricted to comoving coordinates. It is proven by construction that within the framework of general relativity other coordinates exist in relation to which these phenomena can be explained by a motion of the cosmic substrate across space, caused by an explosion-like big bang or by inflation preceding an almost big bang. At the place of an observer, this motion occurs without any spatial expansion. It is shown that in these "explosion coordinates" the usual redshift comes about by a Doppler shift and a subsequent gravitational shift. Making use of this interpretation, it can easily be understood why in comoving coordinates light rays of short spatial extension expand and thus constitute an exception to the rule that small objects up to the size of the solar system or even galaxies do not participate in the expansion of the universe. It is also discussed how the two interpretations can be reconciled with each other.
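
    In the explosion-coordinate picture the observed shift is the composition of the two effects; for sequential shifts the factors multiply (standard notation, our summary of the mechanism described above):

        1 + z_{\mathrm{obs}} = \bigl(1 + z_{\mathrm{Doppler}}\bigr)\bigl(1 + z_{\mathrm{grav}}\bigr)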

  20. Discrepancies between film and digital mammography interpretations

    NASA Astrophysics Data System (ADS)

    Malhotra, Poonam; Kallergi, Maria; Alexander, Dominik; Berman, Claudia G.; Gardner, Mary; Hersh, Marla R.; Hooper, Lisa; Kim, Jihai J.; Venugopal, Priya

    2002-04-01

    The purpose of this study was to evaluate the frequency of, and reasons for, disagreement between film and full-field digital mammography (FFDM) interpretations observed in a prospective clinical trial performed with the GE Senographe 2000D system. The data from 643 mammography examinations comprising both digital and film mammograms were analyzed for this purpose. Reports indicated that 455 findings were identified on the digital softcopy reading and 457 findings on the standard film mammography, with 408 discrepancies. Findings with discrepancies were matched and analyzed. A reason was identified and a relative conspicuity score of 0 to 10 was assigned to each finding at the time of resolution; 0 corresponded to a finding highly conspicuous on digital, 10 to a finding highly conspicuous on film, and 5 denoted equal visibility on both. After review, agreement was established between the two modalities in 73.3% of the findings; 13.5% of findings were seen better on digital and 13.2% of the findings were seen better on film. Approximately 63% of the discrepancies occurred due to variability in the reporting style of the radiologists and/or unavailability of prior films for comparison. Three cancer cases were identified in this study; two were seen on both modalities and one only on film. In conclusion, no statistically significant differences were observed between digital and film mammography, a result that despite the small size of our dataset is in agreement with previous reports. Inter-observer variability, display differences, and presentation disagreements are the main reasons for interpretation differences, which arise primarily in the classification and BIRADS assignment.

  1. Suite versus composite statistics

    USGS Publications Warehouse

    Balsillie, J.H.; Tanner, W.F.

    1999-01-01

    Suite and composite methodologies, two statistically valid approaches for producing statistical descriptive measures, are investigated for sample groups representing a probability distribution where, in addition, each sample is itself a probability distribution. Suite and composite means (first moment measures) are always equivalent. Composite standard deviations (second moment measures) are always larger than suite standard deviations. Suite and composite values for higher moment measures have more complex relationships. Very seldom, however, are they equivalent, and they normally yield statistically significantly different results. Multiple samples are preferable to single samples (including composites) because they permit the investigator to examine sample-to-sample variability. These and other relationships for suite and composite probability distribution analyses are investigated and reported using granulometric data.
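
    As a quick illustration of this distinction (a minimal NumPy sketch with synthetic samples, not the paper's granulometric data):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Ten "samples", each itself a distribution: 200 observations apiece,
    # with sample-to-sample variation in the centre.
    samples = [rng.normal(loc=rng.uniform(-1, 1), scale=1.0, size=200)
               for _ in range(10)]

    # Suite statistics: compute the moment within each sample, then
    # summarize across samples.
    suite_mean = np.mean([s.mean() for s in samples])
    suite_std = np.mean([s.std(ddof=1) for s in samples])

    # Composite statistics: pool every observation, then compute moments.
    pooled = np.concatenate(samples)
    print(f"suite mean {suite_mean:.3f}  composite mean {pooled.mean():.3f}")
    print(f"suite std  {suite_std:.3f}  composite std  {pooled.std(ddof=1):.3f}")
    ```

    With equal-sized samples the two means coincide, while the composite standard deviation also absorbs the between-sample spread and so comes out larger, matching the relationships stated above.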

  2. Statistical Data Analyses of Trace Chemical, Biochemical, and Physical Analytical Signatures

    SciTech Connect

    Udey, Ruth Norma

    2013-01-01

    Analytical and bioanalytical chemistry measurement results are most meaningful when interpreted using rigorous statistical treatments of the data. The same data set may provide many dimensions of information depending on the questions asked through the applied statistical methods. Three principal projects illustrated the wealth of information gained through the application of statistical data analyses to diverse problems.

  3. Surveys Assessing Students' Attitudes toward Statistics: A Systematic Review of Validity and Reliability

    ERIC Educational Resources Information Center

    Nolan, Meaghan M.; Beran, Tanya; Hecker, Kent G.

    2012-01-01

    Students with positive attitudes toward statistics are likely to show strong academic performance in statistics courses. Multiple surveys measuring students' attitudes toward statistics exist; however, a comparison of the validity and reliability of interpretations based on their scores is needed. A systematic review of relevant electronic…

  4. The Power of Statistical Tests for Moderators in Meta-Analysis

    ERIC Educational Resources Information Center

    Hedges, Larry V.; Pigott, Therese D.

    2004-01-01

    Calculation of the statistical power of statistical tests is important in planning and interpreting the results of research studies, including meta-analyses. It is particularly important in moderator analyses in meta-analysis, which are often used as sensitivity analyses to rule out moderator effects but also may have low statistical power. This…

  5. Statistical Controversies in Reporting of Clinical Trials: Part 2 of a 4-Part Series on Statistics for Clinical Trials.

    PubMed

    Pocock, Stuart J; McMurray, John J V; Collier, Tim J

    2015-12-15

    This paper tackles several statistical controversies that are commonly faced when reporting a major clinical trial. Topics covered include: multiplicity of data, interpreting secondary endpoints and composite endpoints, the value of covariate adjustment, the traumas of subgroup analysis, assessing individual benefits and risks, alternatives to analysis by intention to treat, interpreting surprise findings (good and bad), and the overall quality of clinical trial reports. All is put in the context of topical cardiology trial examples and is geared to help trialists steer a wise course in their statistical reporting, thereby giving readers a balanced account of trial findings.

  6. FIR statistics of paired galaxies

    NASA Technical Reports Server (NTRS)

    Sulentic, Jack W.

    1990-01-01

    Much progress has been made in understanding the effects of interaction on galaxies (see reviews in this volume by Heckman and Kennicutt). Evidence for enhanced emission from galaxies in pairs first emerged in the radio (Sulentic 1976) and optical (Larson and Tinsley 1978) domains. Results in the far infrared (FIR) lagged behind until the advent of the Infrared Astronomical Satellite (IRAS). The last five years have seen numerous FIR studies of optical and IR selected samples of interacting galaxies (e.g., Cutri and McAlary 1985; Joseph and Wright 1985; Kennicutt et al. 1987; Haynes and Herter 1988). Despite all of this work, there are still contradictory ideas about the level and, even, the reality of an FIR enhancement in interacting galaxies. Much of the confusion originates in differences between the galaxy samples that were studied (i.e., optical morphology and redshift coverage). Here, the authors report on a study of the FIR detection properties for a large sample of interacting galaxies and a matching control sample. They focus on the distance independent detection fraction (DF) statistics of the sample. The results prove useful in interpreting the previously published work. A clarification of the phenomenology provides valuable clues about the physics of the FIR enhancement in galaxies.

  7. Statistical model with a standard Gamma distribution.

    PubMed

    Patriarca, Marco; Chakraborti, Anirban; Kaski, Kimmo

    2004-01-01

    We study a statistical model consisting of N basic units which interact with each other by exchanging a physical entity, according to a given microscopic random law, depending on a parameter lambda. We focus on the equilibrium or stationary distribution of the entity exchanged and verify through numerical fitting of the simulation data that the final form of the equilibrium distribution is that of a standard Gamma distribution. The model can be interpreted as a simple closed economy in which economic agents trade money and a saving criterion is fixed by the saving propensity lambda. Alternatively, from the nature of the equilibrium distribution, we show that the model can also be interpreted as a perfect gas at an effective temperature T(lambda), where particles exchange energy in a space with an effective dimension D(lambda).
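
    A minimal simulation sketch of such an exchange model, assuming the pairwise update rule commonly cited for this class of models (x_i' = lambda*x_i + eps*(1-lambda)*(x_i+x_j), with the complementary share going to agent j and eps uniform on (0,1)); the shape-parameter formula used as a check, n = 1 + 3*lambda/(1-lambda), is the one reported in the kinetic-exchange literature:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    N, lam, trades = 1000, 0.5, 500_000  # agents, saving propensity, trades
    x = np.ones(N)                       # one unit of "money" per agent

    for _ in range(trades):
        i, j = rng.choice(N, size=2, replace=False)
        eps = rng.random()
        pool = (1 - lam) * (x[i] + x[j])       # the amount put up for trade
        x[i], x[j] = lam * x[i] + eps * pool, lam * x[j] + (1 - eps) * pool

    n_theory = 1 + 3 * lam / (1 - lam)     # Gamma shape from the literature
    n_moment = x.mean() ** 2 / x.var()     # for Gamma(k, theta): mean^2/var = k
    print(f"shape: theory {n_theory:.2f} vs simulated moments {n_moment:.2f}")
    ```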

  8. Combining natural background levels (NBLs) assessment with indicator kriging analysis to improve groundwater quality data interpretation and management.

    PubMed

    Ducci, Daniela; de Melo, M Teresa Condesso; Preziosi, Elisabetta; Sellerino, Mariangela; Parrone, Daniele; Ribeiro, Luis

    2016-11-01

    The natural background level (NBL) concept is revisited and combined with the indicator kriging method to analyze the spatial distribution of groundwater quality within a groundwater body (GWB). The aim is to provide a methodology to easily identify areas with the same probability of exceeding a given threshold (which may be a groundwater quality criterion, standard, or recommended limit for selected properties and constituents). Three case studies with different hydrogeological settings and located in two countries (Portugal and Italy) are used to derive NBLs using the preselection method and to validate the proposed methodology, illustrating its main advantages over conventional statistical water quality analysis. Indicator kriging analysis was used to create probability maps of the three potential groundwater contaminants. The results clearly indicate the areas within a groundwater body that are potentially contaminated because the concentrations exceed the drinking water standards or even the local NBL and cannot be justified by a geogenic origin. The combined methodology facilitates the management of groundwater quality because it allows for the spatial interpretation of NBL values.
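
    A sketch of the indicator-kriging step under stated assumptions: the well coordinates, concentrations, and threshold below are all hypothetical, and the kriging call assumes the pykrige package's OrdinaryKriging interface (this is not the authors' code):

    ```python
    import numpy as np
    from pykrige.ok import OrdinaryKriging  # assumes pykrige is installed

    rng = np.random.default_rng(2)

    # Hypothetical monitoring wells: coordinates (km) and a concentration.
    x, y = rng.uniform(0, 10, 60), rng.uniform(0, 10, 60)
    conc = rng.lognormal(mean=2.0, sigma=0.8, size=60)

    threshold = 11.3  # stand-in for an NBL or a drinking water standard
    indicator = (conc > threshold).astype(float)  # 1 above, 0 below

    # Kriging the 0/1 indicator estimates, at each grid node, the
    # probability that the threshold is exceeded there.
    ok = OrdinaryKriging(x, y, indicator, variogram_model="spherical")
    grid = np.linspace(0, 10, 50)
    prob, _ = ok.execute("grid", grid, grid)
    print("max exceedance probability on grid:", float(prob.max()))
    ```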

  9. Statistical origin of gravity

    SciTech Connect

    Banerjee, Rabin; Majhi, Bibhas Ranjan

    2010-06-15

    Starting from the definition of entropy used in statistical mechanics we show that it is proportional to the gravity action. For a stationary black hole this entropy is expressed as S=E/2T, where T is the Hawking temperature and E is shown to be the Komar energy. This relation is also compatible with the generalized Smarr formula for mass.

  10. Statistical Reasoning over Lunch

    ERIC Educational Resources Information Center

    Selmer, Sarah J.; Bolyard, Johnna J.; Rye, James A.

    2011-01-01

    Students in the 21st century are exposed daily to a staggering amount of numerically infused media. In this era of abundant numeric data, students must be able to engage in sound statistical reasoning when making life decisions after exposure to varied information. The context of nutrition can be used to engage upper elementary and middle school…

  11. Learning Statistical Concepts

    ERIC Educational Resources Information Center

    Akram, Muhammad; Siddiqui, Asim Jamal; Yasmeen, Farah

    2004-01-01

    In order to learn the concept of statistical techniques one needs to run real experiments that generate reliable data. In practice, the data from some well-defined process or system is very costly and time consuming. It is difficult to run real experiments during the teaching period in the university. To overcome these difficulties, statisticians…

  12. Analogies for Understanding Statistics

    ERIC Educational Resources Information Center

    Hocquette, Jean-Francois

    2004-01-01

    This article describes a simple way to explain the limitations of statistics to scientists and students to avoid the publication of misleading conclusions. Biologists examine their results extremely critically and carefully choose the appropriate analytic methods depending on their scientific objectives. However, no such close attention is usually…

  13. Statistical Significance Testing.

    ERIC Educational Resources Information Center

    McLean, James E., Ed.; Kaufman, Alan S., Ed.

    1998-01-01

    The controversy about the use or misuse of statistical significance testing has become the major methodological issue in educational research. This special issue contains three articles that explore the controversy, three commentaries on these articles, an overall response, and three rejoinders by the first three authors. They are: (1)…

  14. Structurally Sound Statistics Instruction

    ERIC Educational Resources Information Center

    Casey, Stephanie A.; Bostic, Jonathan D.

    2016-01-01

    The Common Core's Standards for Mathematical Practice (SMP) call for all K-grade 12 students to develop expertise in the processes and proficiencies of doing mathematics. However, the Common Core State Standards for Mathematics (CCSSM) (CCSSI 2010) as a whole addresses students' learning of not only mathematics but also statistics. This situation…

  15. General Aviation Avionics Statistics.

    DTIC Science & Technology

    1980-12-01

    [OCR fragment of the DTIC report documentation page. Recoverable information: report no. FAA-MS-80-7, "General Aviation Avionics Statistics," December 1980, and part of an avionics/instrument checklist: altimeter, compass, tachometer, oil temperature, emergency locator, fuel gage, landing gear, belts, and special equipment for over-water operation.]

  16. NACME Statistical Report 1986.

    ERIC Educational Resources Information Center

    Miranda, Luis A.; Ruiz, Esther

    This statistical report summarizes data on enrollment and graduation of minority students in engineering degree programs from 1974 to 1985. First, an introduction identifies major trends and briefly describes the Incentive Grants Program (IGP), the nation's largest privately supported source of scholarship funds available to minority engineering…

  17. Probability and Statistics.

    ERIC Educational Resources Information Center

    Barnes, Bernis, Ed.; And Others

    This teacher's guide to probability and statistics contains three major sections. The first section on elementary combinatorial principles includes activities, student problems, and suggested teaching procedures for the multiplication principle, permutations, and combinations. Section two develops an intuitive approach to probability through…

  18. Selected Manpower Statistics.

    ERIC Educational Resources Information Center

    Office of the Assistant Secretary of Defense -- Comptroller (DOD), Washington, DC.

    This document contains summaries of basic manpower statistical data for the Department of Defense, with the Army, Navy, Marine Corps, and Air Force totals shown separately and collectively. Included are figures for active duty military personnel, civilian personnel, reserve components, and retired military personnel. Some of the data show…

  19. Statistics of mass production

    NASA Astrophysics Data System (ADS)

    Williams, R. L.; Gateley, Wilson Y.

    1993-05-01

    This paper summarizes the statistical quality control methods and procedures that can be employed in mass producing electronic parts (integrated circuits, buffers, capacitors, connectors) to reduce variability and ensure performance to specified radiation, current, voltage, temperature, shock, and vibration levels. Producing such quality parts reduces uncertainties in performance and will aid materially in validating the survivability of components, subsystems, and systems to specified threats.

  20. Statistics for Learning Genetics

    ERIC Educational Resources Information Center

    Charles, Abigail Sheena

    2012-01-01

    This study investigated the knowledge and skills that biology students may need to help them understand statistics/mathematics as it applies to genetics. The data are based on analyses of current representative genetics texts, practicing genetics professors' perspectives, and more directly, students' perceptions of, and performance in, doing…

  1. Education Statistics Quarterly, 2003.

    ERIC Educational Resources Information Center

    Marenus, Barbara; Burns, Shelley; Fowler, William; Greene, Wilma; Knepper, Paula; Kolstad, Andrew; McMillen Seastrom, Marilyn; Scott, Leslie

    2003-01-01

    This publication provides a comprehensive overview of work done across all parts of the National Center for Education Statistics (NCES). Each issue contains short publications, summaries, and descriptions that cover all NCES publications and data products released in a 3-month period. Each issue also contains a message from the NCES on a timely…

  2. Whither Statistics Education Research?

    ERIC Educational Resources Information Center

    Watson, Jane

    2016-01-01

    This year marks the 25th anniversary of the publication of a "National Statement on Mathematics for Australian Schools", which was the first curriculum statement this country had including "Chance and Data" as a significant component. It is hence an opportune time to survey the history of the related statistics education…

  3. Quartiles in Elementary Statistics

    ERIC Educational Resources Information Center

    Langford, Eric

    2006-01-01

    The calculation of the upper and lower quartile values of a data set in an elementary statistics course is done in at least a dozen different ways, depending on the text or computer/calculator package being used (such as SAS, JMP, MINITAB, "Excel," and the TI-83 Plus). In this paper, we examine the various methods and offer a suggestion for a new…
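
    The multiplicity of definitions is easy to demonstrate; the sketch below (not tied to any particular textbook's recommendation) lets the Python standard library's two conventions and several of NumPy's quantile methods disagree on the same data:

    ```python
    import numpy as np
    from statistics import quantiles

    data = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11]

    # The two conventions in the Python standard library:
    print(quantiles(data, n=4, method="exclusive"))  # [3.0, 6.0, 9.0]
    print(quantiles(data, n=4, method="inclusive"))  # [3.5, 6.0, 8.5]

    # NumPy (>= 1.22) exposes several classical definitions directly:
    for m in ("linear", "lower", "midpoint", "hazen", "weibull"):
        q1, q3 = np.percentile(data, [25, 75], method=m)
        print(f"{m:8s}  Q1 = {q1:5.2f}   Q3 = {q3:5.2f}")
    ```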

  4. Mental Illness Statistics

    MedlinePlus


  5. Statistical Energy Analysis Program

    NASA Technical Reports Server (NTRS)

    Ferebee, R. C.; Trudell, R. W.; Yano, L. I.; Nygaard, S. I.

    1985-01-01

    Statistical Energy Analysis (SEA) is a powerful tool for estimating the high-frequency vibration spectra of complex structural systems; it has been incorporated into a computer program. The basic SEA analysis procedure is divided into three steps: idealization, parameter generation, and problem solution. The SEA computer program is written in FORTRAN V for batch execution.

  6. Library Research and Statistics.

    ERIC Educational Resources Information Center

    Lynch, Mary Jo; St. Lifer, Evan; Halstead, Kent; Fox, Bette-Lee; Miller, Marilyn L.; Shontz, Marilyn L.

    2001-01-01

    These nine articles discuss research and statistics on libraries and librarianship, including libraries in the United States, Canada, and Mexico; acquisition expenditures in public, academic, special, and government libraries; price indexes; state rankings of public library data; library buildings; expenditures in school library media centers; and…

  7. Directionality Effects in Simultaneous Language Interpreting: The Case of Sign Language Interpreters in the Netherlands

    ERIC Educational Resources Information Center

    van Dijk, Rick; Boers, Eveline; Christoffels, Ingrid; Hermans, Daan

    2011-01-01

    The quality of interpretations produced by sign language interpreters was investigated. Twenty-five experienced interpreters were instructed to interpret narratives from (a) spoken Dutch to Sign Language of the Netherlands (SLN), (b) spoken Dutch to Sign Supported Dutch (SSD), and (c) SLN to spoken Dutch. The quality of the interpreted narratives…

  8. Statistics for Learning Genetics

    NASA Astrophysics Data System (ADS)

    Charles, Abigail Sheena

    This study investigated the knowledge and skills that biology students may need to help them understand statistics/mathematics as it applies to genetics. The data are based on analyses of current representative genetics texts, practicing genetics professors' perspectives, and more directly, students' perceptions of, and performance in, doing statistically-based genetics problems. This issue is at the emerging edge of modern college-level genetics instruction, and this study attempts to identify key theoretical components for creating a specialized biological statistics curriculum. The goal of this curriculum will be to prepare biology students with the skills for assimilating quantitatively-based genetic processes, increasingly at the forefront of modern genetics. To fulfill this, two college level classes at two universities were surveyed. One university was located in the northeastern US and the other in the West Indies. There was a sample size of 42 students and a supplementary interview was administered to a select 9 students. Interviews were also administered to professors in the field in order to gain insight into the teaching of statistics in genetics. Key findings indicated that students had very little to no background in statistics (55%). Although students did perform well on exams, with 60% of the population receiving an A or B grade, 77% of them did not offer good explanations on a probability question associated with the normal distribution provided in the survey. The scope and presentation of the applicable statistics/mathematics in some of the most used textbooks in genetics teaching, as well as the genetics syllabi used by instructors, do not help the issue. It was found that the textbooks often either did not give effective explanations for students or completely left out certain topics. The omission of certain statistical/mathematical oriented topics was seen to be also true with the genetics syllabi reviewed for this study. Nonetheless

  9. Functional programming interpreter. M. S. thesis

    SciTech Connect

    Robison, A.D.

    1987-03-01

    Functional Programming (FP) [BAC87] is an alternative to conventional imperative programming languages. This thesis describes an FP interpreter implementation. Superficially, FP appears to be a simple but very inefficient language. Its simplicity, however, allows it to be interpreted quickly. Much of the inefficiency can be removed by simple interpreter techniques. This thesis describes the Illinois Functional Programming (IFP) interpreter, an interactive functional programming implementation which runs under both MS-DOS and UNIX. The IFP interpreter allows functions to be created, executed, and debugged in an environment very similar to UNIX. IFP's speed is competitive with other interpreted languages such as BASIC.
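
    To give a flavor of the function-level style that such an interpreter executes, here is a tiny Python sketch of FP-like combinators built around Backus's classic inner-product example; the combinator names are illustrative and are not IFP's actual syntax:

    ```python
    from functools import reduce

    # FP programs are built by composing functions, never by naming data.
    compose = lambda *fs: lambda x: reduce(lambda v, f: f(v), reversed(fs), x)
    alpha = lambda f: lambda xs: [f(x) for x in xs]      # apply-to-all
    insert = lambda f: lambda xs: reduce(f, xs)          # / (reduce)
    trans = lambda rows: list(map(list, zip(*rows)))     # transpose

    mul = lambda ab: ab[0] * ab[1]

    # inner product = (/ +) o (alpha x) o trans
    inner_product = compose(insert(lambda a, b: a + b), alpha(mul), trans)
    print(inner_product([[1, 2, 3], [4, 5, 6]]))  # 32
    ```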

  10. Weighted order statistic classifiers with large rank-order margin.

    SciTech Connect

    Porter, R. B.; Hush, D. R.; Theiler, J. P.; Gokhale, M.

    2003-01-01

    We describe how Stack Filters and Weighted Order Statistic function classes can be used for classification problems. This leads to a new design criterion for linear classifiers when inputs are binary-valued and weights are positive. We present a rank-based measure of margin that can be directly optimized as a standard linear program and investigate its effect on generalization error experimentally. Our approach can robustly combine large numbers of base hypotheses and easily implement known priors through regularization.
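
    A minimal sketch (illustrative data and names) of a weighted order statistic function, and of why, on binary inputs with positive integer weights, it reduces to the linear threshold test that motivates the design criterion above:

    ```python
    import numpy as np

    def weighted_order_statistic(x, weights, rank):
        """rank-th largest value of the multiset in which each x[i]
        appears weights[i] times (weights are positive integers)."""
        expanded = np.repeat(x, weights)
        return np.sort(expanded)[::-1][rank - 1]

    x = np.array([1, 0, 1, 1, 0])   # binary inputs
    w = np.array([3, 1, 2, 1, 2])   # positive integer weights

    # The rank-th largest replicate is 1 exactly when the total weight on
    # the ones, dot(w, x), is at least the rank: a threshold classifier.
    print(weighted_order_statistic(x, w, rank=4))  # 1
    print(int(np.dot(w, x) >= 4))                  # 1, same decision
    ```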

  11. Proteny: discovering and visualizing statistically significant syntenic clusters at the proteome level

    PubMed Central

    Gehrmann, Thies; Reinders, Marcel J.T.

    2015-01-01

    Background: With more and more genomes being sequenced, detecting synteny between genomes becomes more and more important. However, for microorganisms the genomic divergence quickly becomes large, resulting in different codon usage and shuffling of gene order and gene elements such as exons. Results: We present Proteny, a methodology to detect synteny between diverged genomes. It operates on the amino acid sequence level to be insensitive to codon usage adaptations and clusters groups of exons disregarding order to handle diversity in genomic ordering between genomes. Furthermore, Proteny assigns significance levels to the syntenic clusters such that they can be selected on statistical grounds. Finally, Proteny provides novel ways to visualize results at different scales, facilitating the exploration and interpretation of syntenic regions. We test the performance of Proteny on a standard ground truth dataset, and we illustrate the use of Proteny on two closely related genomes (two different strains of Aspergillus niger) and on two distant genomes (two species of Basidiomycota). In comparison to other tools, we find that Proteny finds clusters with more true homologies in fewer clusters that contain more genes, i.e. Proteny is able to identify a more consistent synteny. Further, we show how genome rearrangements, assembly errors, gene duplications and the conservation of specific genes can be easily studied with Proteny. Availability and implementation: Proteny is freely available at the Delft Bioinformatics Lab website http://bioinformatics.tudelft.nl/dbl/software. Contact: t.gehrmann@tudelft.nl Supplementary information: Supplementary data are available at Bioinformatics online. PMID:26116928

  12. Biostratigraphy: Interpretations of Oppel's zones

    NASA Astrophysics Data System (ADS)

    Scott, G. H.

    2013-11-01

    Zones like those of Oppel and Hedberg's Oppel-Zone are commonly interpreted as rock units delimited temporally. A more restricted view is that they are rock units empirically defined by bioevents that occur in the same order in all sections. Methods used by Oppel and definitions proposed by Hedberg are reviewed to assess their adequacy for definition of biostratigraphic units and their ability to support temporal inferences. Although they are usually interpreted as chronostratigraphic units, Oppel defined his zones in stratigraphic space, without temporal reference. In contrast, Hedberg required that bioevents for his Oppel-Zone should be approximately isochronous across their distribution but provided no operational way to identify such bioevents. Neither author clearly indicated how boundaries should be defined. Recourse to a principle of biosynchroneity to support inferences that stratigraphically ordered bioevents are temporal markers conflicts with knowledge of the biogeographies of modern taxa. Evolutionary theory explains why some bioevents occur in the same stratigraphic order but does not support the inference that they are isochronous events. Since its inception biostratigraphy has focused on ordered classifications, like those of Oppel. Stratigraphic codes should allow for a complementary category of biofacies zones that reflect depositional environments and are not constrained to occur in a particular order.

  13. Conflicting Interpretations of Scientific Pedagogy

    NASA Astrophysics Data System (ADS)

    Galamba, Arthur

    2016-05-01

    Not surprisingly historical studies have suggested that there is a distance between concepts of teaching methods, their interpretations and their actual use in the classroom. This issue, however, is not always pitched to the personal level in historical studies, which may provide an alternative insight on how teachers conceptualise and engage with concepts of teaching methods. This article provides a case study on this level of conceptualisation by telling the story of Rómulo de Carvalho, an educator from mid-twentieth century Portugal, who for over 40 years engaged with the heuristic and Socratic methods. The overall argument is that concepts of teaching methods are open to different interpretations and are conceptualised within the melting pot of external social pressures and personal teaching preferences. The practice and thoughts of Carvalho about teaching methods are scrutinised to unveil his conflicting stances: Carvalho was a man able to question the tenets of heurism, but who publicly praised the heurism-like "discovery learning" method years later. The first part of the article contextualises the arrival of heurism in Portugal and how Carvalho attacked its philosophical tenets. In the second part, it dwells on his conflicting positions in relation to pupil-centred approaches. The article concludes with an appreciation of the embedded conflicting nature of the appropriation of concepts of teaching methods, and of Carvalho's contribution to the development of the philosophy of practical work in school science.

  14. Smart Interpretation - Application of Machine Learning in Geological Interpretation of AEM Data

    NASA Astrophysics Data System (ADS)

    Bach, T.; Gulbrandsen, M. L.; Jacobsen, R.; Pallesen, T. M.; Jørgensen, F.; Høyer, A. S.; Hansen, T. M.

    2015-12-01

    When using airborne geophysical measurements in, e.g., groundwater mapping, an overwhelming amount of data is collected. Increasingly large survey areas, denser data collection, and limited resources combine into a growing problem: building geological models that use all the available data in a manner consistent with the geologist's knowledge of the survey area. In the ERGO project, funded by The Danish National Advanced Technology Foundation, we address this problem by developing new, usable tools that enable geologists to apply their geological knowledge directly in the interpretation of the AEM data and thereby handle the large amount of data. In the project we have developed the mathematical basis for capturing geological expertise in a statistical model. Based on this, we have implemented new algorithms that have been operationalized and embedded in user-friendly software. In this software, the machine learning algorithm, Smart Interpretation, enables the geologist to use the system as an assistant in the geological modelling process. As the software 'learns' the geology from the geologist, the system suggests new modelling features in the data. In this presentation we demonstrate the application of the results from the ERGO project, including the proposed modelling workflow, on a variety of data examples.

  15. Sign language vocabulary development practices and internet use among educational interpreters.

    PubMed

    Storey, Brian C; Jamieson, Janet R

    2004-01-01

    Sign language interpreters working in schools often face isolation in terms of their sign language vocabulary development opportunities. The purposes of this study were to determine the key demographic characteristics of educational interpreters in British Columbia, to identify the resources they use to learn new vocabulary, and to shed light on their Internet use and access levels, with a view to exploring the viability of this resource as a tool for vocabulary development for interpreters working in educational settings. Key demographics associated with interpreters' access to time and materials in advance of a lesson were job title and graduation from an interpreter training program. Interpreters with job titles that reflected their status as interpreters had more preparatory time each week than interpreters who had job titles focused on their roles as educational assistants. Interpreters overwhelmingly expressed the need for continuing professional development with respect to vocabulary development. In terms of the resources currently used, human resources (colleagues, deaf adults) were used significantly more often than nonhuman (books, videotapes, Internet). The resource use results showed that convenience was more important than quality. Books were used more often than videotapes, CD-ROMs, and the Internet, although the latter three had higher percentages of very satisfied users than did books. The design and content of online vocabulary resources and limited interpreter preparation time were identified as current issues keeping the Internet from reaching its potential as an easily accessible visual resource. Recommendations aimed at enhancing the viability of the Internet as a vocabulary development tool for educational interpreters are discussed.

  16. Programmable Applications: Interpreter Meets Interface

    DTIC Science & Technology

    1991-10-01

    beautifully packaged paint programs, music programs, statistics programs, CAD systems, games, databases, word processors. Many of these programs are...applications software, there is often a muted note of frustration. Somehow it seems that the software never does quite enough. The competitor's music ... music-composition program advertises that it allows crescendi and decrescendi to be notated directly on the screen: the composer selects one bar as

  17. Verbal framing of statistical evidence drives children's preference inferences.

    PubMed

    Garvin, Laura E; Woodward, Amanda L

    2015-05-01

    Although research has shown that statistical information can support children's inferences about specific psychological causes of others' behavior, previous work leaves open the question of how children interpret statistical information in more ambiguous situations. The current studies investigated the effect of specific verbal framing information on children's ability to infer mental states from statistical regularities in behavior. We found that preschool children inferred others' preferences from their statistically non-random choices only when they were provided with verbal information placing the person's behavior in a specifically preference-related context, not when the behavior was presented in a non-mentalistic action context or an intentional choice context. Furthermore, verbal framing information showed some evidence of supporting children's mental state inferences even from more ambiguous statistical data. These results highlight the role that specific, relevant framing information can play in supporting children's ability to derive novel insights from statistical information.

  18. The Statistical Drake Equation

    NASA Astrophysics Data System (ADS)

    Maccone, Claudio

    2010-12-01

    We provide the statistical generalization of the Drake equation. From a simple product of seven positive numbers, the Drake equation is now turned into the product of seven positive random variables. We call this "the Statistical Drake Equation". The mathematical consequences of this transformation are then derived. The proof of our results is based on the Central Limit Theorem (CLT) of Statistics. In loose terms, the CLT states that the sum of any number of independent random variables, each of which may be ARBITRARILY distributed, approaches a Gaussian (i.e. normal) random variable. This is called the Lyapunov Form of the CLT, or the Lindeberg Form of the CLT, depending on the mathematical constraints assumed on the third moments of the various probability distributions. In conclusion, we show that: The new random variable N, yielding the number of communicating civilizations in the Galaxy, follows the LOGNORMAL distribution. Then, as a consequence, the mean value of this lognormal distribution is the ordinary N in the Drake equation. The standard deviation, mode, and all the moments of this lognormal N are also found. The seven factors in the ordinary Drake equation now become seven positive random variables. The probability distribution of each random variable may be ARBITRARY. The CLT in the so-called Lyapunov or Lindeberg forms (that both do not assume the factors to be identically distributed) allows for that. In other words, the CLT "translates" into our statistical Drake equation by allowing an arbitrary probability distribution for each factor. This is both physically realistic and practically very useful, of course. An application of our statistical Drake equation then follows. The (average) DISTANCE between any two neighboring and communicating civilizations in the Galaxy may be shown to be inversely proportional to the cubic root of N. Then, in our approach, this distance becomes a new random variable. We derive the relevant probability density
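
    A Monte Carlo sketch of the core argument; the seven factor distributions below are deliberately arbitrary stand-ins (not the paper's choices), since the CLT argument only needs independent positive factors:

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    M = 100_000  # Monte Carlo draws

    factors = [
        rng.uniform(1.0, 20.0, M),    # e.g., star formation rate
        rng.beta(2, 2, M),            # fraction of stars with planets
        rng.gamma(2.0, 1.0, M),       # habitable planets per system
        rng.beta(2, 5, M),            # fraction developing life
        rng.beta(2, 8, M),            # fraction developing intelligence
        rng.beta(2, 8, M),            # fraction that communicate
        rng.lognormal(8.0, 1.0, M),   # longevity factor
    ]
    N = np.prod(factors, axis=0)

    # log N is a sum of independent terms, so the CLT pushes it toward a
    # Gaussian; N itself is then approximately lognormal.
    logN = np.log(N)
    skew = ((logN - logN.mean()) ** 3).mean() / logN.std() ** 3
    print(f"skewness of log N: {skew:+.3f}  (near 0 -> N approx lognormal)")
    ```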

  19. The Statistical Fermi Paradox

    NASA Astrophysics Data System (ADS)

    Maccone, C.

    This paper provides the statistical generalization of the Fermi paradox. The statistics of habitable planets may be based on a set of ten (and possibly more) astrobiological requirements first pointed out by Stephen H. Dole in his book Habitable planets for man (1964). The statistical generalization of the original, and by now too simplistic, Dole equation is provided by replacing a product of ten positive numbers by the product of ten positive random variables. This is denoted the SEH, an acronym standing for “Statistical Equation for Habitables”. The proof in this paper is based on the Central Limit Theorem (CLT) of Statistics, stating that the sum of any number of independent random variables, each of which may be ARBITRARILY distributed, approaches a Gaussian (i.e. normal) random variable (Lyapunov form of the CLT). It is then shown that: 1. The new random variable NHab, yielding the number of habitables (i.e. habitable planets) in the Galaxy, follows the log-normal distribution. By construction, the mean value of this log-normal distribution is the total number of habitable planets as given by the statistical Dole equation. 2. The ten (or more) astrobiological factors are now positive random variables. The probability distribution of each random variable may be arbitrary. The CLT in the so-called Lyapunov or Lindeberg forms (that both do not assume the factors to be identically distributed) allows for that. In other words, the CLT "translates" into the SEH by allowing an arbitrary probability distribution for each factor. This is both astrobiologically realistic and useful for any further investigations. 3. By applying the SEH it is shown that the (average) distance between any two nearby habitable planets in the Galaxy is inversely proportional to the cubic root of NHab. This distance is denoted by the new random variable D. The relevant probability density function is derived, which was named the "Maccone distribution" by Paul Davies in

  20. Statistical aspects of food safety sampling.

    PubMed

    Jongenburger, I; den Besten, H M W; Zwietering, M H

    2015-01-01

    In food safety management, sampling is an important tool for verifying control. Sampling by nature is a stochastic process. However, uncertainty regarding results is made even greater by the uneven distribution of microorganisms in a batch of food. This article reviews statistical aspects of sampling and describes the impact of distributions on the sampling results. Five different batch contamination scenarios are illustrated: a homogeneous batch, a heterogeneous batch with high- or low-level contamination, and a batch with localized high- or low-level contamination. These batch contamination scenarios showed that sampling results have to be interpreted carefully, especially when heterogeneous and localized contamination in food products is expected.
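
    A stylized simulation of why the scenarios matter; the numbers are illustrative only. With the same per-unit sampling scheme, localized contamination (a small contaminated fraction of the batch) is far more likely to be missed:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    def detection_probability(contaminated_fraction, n_samples, trials=100_000):
        """P(at least one of n random sample units is contaminated)."""
        hits = rng.random((trials, n_samples)) < contaminated_fraction
        return hits.any(axis=1).mean()

    for frac, label in [(0.10, "widespread contamination"),
                        (0.01, "localized contamination")]:
        p = detection_probability(frac, n_samples=10)
        print(f"{label:25s} detection probability, 10 samples: {p:.3f}")
    # Analytically this is 1 - (1 - frac)**n: about 0.651 vs 0.096 here.
    ```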

  1. Statistical foundations of liquid-crystal theory

    PubMed Central

    Seguin, Brian; Fried, Eliot

    2013-01-01

    Working on a state space determined by considering a discrete system of rigid rods, we use nonequilibrium statistical mechanics to derive macroscopic balance laws for liquid crystals. A probability function that satisfies the Liouville equation serves as the starting point for deriving each macroscopic balance. The terms appearing in the derived balances are interpreted as expected values and explicit formulas for these terms are obtained. Among the list of derived balances appear two, the tensor moment of inertia balance and the mesofluctuation balance, that are not standard in previously proposed macroscopic theories for liquid crystals but which have precedents in other theories for structured media. PMID:23554513

  2. The Extended Statistical Analysis of Toxicity Tests Using Standardised Effect Sizes (SESs): A Comparison of Nine Published Papers

    PubMed Central

    Festing, Michael F. W.

    2014-01-01

    The safety of chemicals, drugs, novel foods and genetically modified crops is often tested using repeat-dose sub-acute toxicity tests in rats or mice. It is important to avoid misinterpretations of the results as these tests are used to help determine safe exposure levels in humans. Treated and control groups are compared for a range of haematological, biochemical and other biomarkers which may indicate tissue damage or other adverse effects. However, the statistical analysis and presentation of such data poses problems due to the large number of statistical tests which are involved. Often, it is not clear whether a “statistically significant” effect is real or a false positive (type I error) due to sampling variation. The authors' conclusions appear to be reached somewhat subjectively from the pattern of statistical significances, discounting those which they judge to be type I errors and ignoring any biomarker where the p-value is greater than p = 0.05. However, by using standardised effect sizes (SESs) a range of graphical methods and an overall assessment of the mean absolute response can be made. The approach is an extension, not a replacement, of existing methods. It is intended to assist toxicologists and regulators in the interpretation of the results. Here, the SES analysis has been applied to data from nine published sub-acute toxicity tests in order to compare the findings with those of the authors. Line plots, box plots and bar plots show the pattern of response. Dose-response relationships are easily seen. A “bootstrap” test compares the mean absolute differences across dose groups. In four out of seven papers where the no observed adverse effect level (NOAEL) was estimated by the authors, it was set too high according to the bootstrap test, suggesting that possible toxicity is under-estimated. PMID:25426843
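
    A compact sketch of the SES computation together with a resampling test of the mean absolute effect. All data are synthetic, and a label-permutation null is used here in place of the paper's bootstrap:

    ```python
    import numpy as np

    rng = np.random.default_rng(4)

    def ses(control, treated):
        """One standardized effect size per biomarker:
        (treated mean - control mean) / pooled SD."""
        nc, nt = len(control), len(treated)
        sp = np.sqrt(((nc - 1) * control.var(axis=0, ddof=1) +
                      (nt - 1) * treated.var(axis=0, ddof=1)) / (nc + nt - 2))
        return (treated.mean(axis=0) - control.mean(axis=0)) / sp

    # Hypothetical study: 10 animals per group, 20 biomarkers, weak effect.
    control = rng.normal(0.0, 1.0, size=(10, 20))
    treated = rng.normal(0.3, 1.0, size=(10, 20))
    observed = np.abs(ses(control, treated)).mean()

    # Null distribution of the mean absolute SES by permuting group labels.
    pooled = np.vstack([control, treated])
    null = []
    for _ in range(2000):
        perm = rng.permutation(20)
        null.append(np.abs(ses(pooled[perm[:10]], pooled[perm[10:]])).mean())
    null = np.array(null)
    p_value = (np.sum(null >= observed) + 1) / (null.size + 1)
    print(f"mean |SES| = {observed:.2f}, resampling p = {p_value:.3f}")
    ```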

  3. Biostratinomic utility of Archimedes in environmental interpretation

    SciTech Connect

    Wulff, J.I. )

    1990-04-01

    Biostratinomic information from the bryozoan Archimedes can be used to infer paleocurrent senses when other more traditional sedimentary structures are lacking. As with other elongate particles, Archimedes zoaria become oriented in the current and, upon settling, preserve a sense of the flow direction. Orientations and lengths were measured on over 200 individuals from bedding plane exposures in the Upper Mississippian Union Limestone (Greenbrier Group) of West Virginia. These were separated into long and short populations and plotted on rose diagrams. The results show that long and short segments become preferentially oriented in the current and the bimodally distributed long segments can be used to infer the current sense. The current sense is defined by the line which bisects the obtuse angle created by the two maxima in the rose diagram for long segments. Statistical evaluation of the long and short populations indicates that they are significant at the 99.9 percent level. Elongate fossils such as Archimedes can be used in paleocurrent evaluations and can add more detail to the interpretation of paleodepositional conditions.

  4. The wetland continuum: a conceptual framework for interpreting biological studies

    USGS Publications Warehouse

    Euliss, N.H.; LaBaugh, J.W.; Fredrickson, L.H.; Mushet, D.M.; Swanson, G.A.; Winter, T.C.; Rosenberry, D.O.; Nelson, R.D.

    2004-01-01

    We describe a conceptual model, the wetland continuum, which allows wetland managers, scientists, and ecologists to consider simultaneously the influence of climate and hydrologic setting on wetland biological communities. Although multidimensional, the wetland continuum is most easily represented as a two-dimensional gradient, with ground water and atmospheric water constituting the horizontal and vertical axes, respectively. By locating the position of a wetland on both axes of the continuum, the potential biological expression of the wetland can be predicted at any point in time. The model provides a framework useful in the organization and interpretation of biological data from wetlands by incorporating the dynamic changes these systems undergo as a result of normal climatic variation rather than placing them into static categories common to many wetland classification systems. While we developed this model from the literature available for depressional wetlands in the prairie pothole region of North America, we believe the concept has application to wetlands in many other geographic locations.

  5. Geographical features of the distribution and renewal of easily decomposable organic matter in virgin and arable zonal soils of European Russia

    NASA Astrophysics Data System (ADS)

    Borisov, B. A.; Ganzhara, N. F.

    2008-09-01

    A decrease in the depth of organic surface horizons (forest litters and steppe mats), the reserves of organic matter in them, and an increase in their renewal rate were noted for virgin and fallow soils when going from the southern taiga to the dry steppe zone. Zonal changes in the content and reserve of easily decomposable soil organic matter showed a similar tendency: these parameters regularly decreased from soddy-podzolic soils of the southern taiga to chestnut and light chestnut soils of the dry steppe. An exception to this series is provided by fallow chernozems of the steppe zone, which are noted for the lowest content and reserve of labile organic matter among the soils studied. Similar, although less pronounced, tendencies were observed for the arable soils.

  6. MS1, MS2, and SQT-three unified, compact, and easily parsed file formats for the storage of shotgun proteomic spectra and identifications.

    PubMed

    McDonald, W Hayes; Tabb, David L; Sadygov, Rovshan G; MacCoss, Michael J; Venable, John; Graumann, Johannes; Johnson, Jeff R; Cociorva, Daniel; Yates, John R

    2004-01-01

    As the speed with which proteomic labs generate data increases along with the scale of projects they are undertaking, the resulting data storage and data processing problems will continue to challenge computational resources. This is especially true for shotgun proteomic techniques that can generate tens of thousands of spectra per instrument each day. One design factor leading to many of these problems is caused by storing spectra and the database identifications for a given spectrum as individual files. While these problems can be addressed by storing all of the spectra and search results in large relational databases, the infrastructure to implement such a strategy can be beyond the means of academic labs. We report here a series of unified text file formats for storing spectral data (MS1 and MS2) and search results (SQT) that are compact, easily parsed by both machine and humans, and yet flexible enough to be coupled with new algorithms and data-mining strategies.
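
    A minimal reader for this kind of line-oriented layout. It assumes the record types named for the MS2 format (H header lines, S scan lines, Z charge lines, then m/z-intensity pairs); the exact field positions below are a best-effort assumption, not a normative parser:

    ```python
    def parse_ms2(path):
        """Yield one spectrum dict per 'S' record of an MS2-style file."""
        spectrum = None
        with open(path) as fh:
            for line in fh:
                parts = line.split()
                if not parts or parts[0] in ("H", "I"):  # header / info lines
                    continue
                if parts[0] == "S":                      # new spectrum
                    if spectrum is not None:
                        yield spectrum
                    spectrum = {"scan": int(parts[1]),
                                "precursor_mz": float(parts[-1]),
                                "charges": [], "peaks": []}
                elif parts[0] == "Z":                    # charge hypothesis
                    spectrum["charges"].append(int(parts[1]))
                else:                                    # m/z intensity pair
                    spectrum["peaks"].append((float(parts[0]),
                                              float(parts[1])))
        if spectrum is not None:
            yield spectrum
    ```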

  7. Data interpretation in breath biomarker research: pitfalls and directions.

    PubMed

    Miekisch, Wolfram; Herbig, Jens; Schubert, Jochen K

    2012-09-01

    Most--if not all--potential diagnostic applications in breath research involve different marker concentrations rather than unique breath markers which only occur in the diseased state. Hence, data interpretation is a crucial step in breath analysis. To avoid artificial significance in breath testing, every effort should be made to implement method validation, data cross-testing and statistical validation along this process. The most common data-analysis-related problems can be classified into three groups: confounding variables (CVs), which have a real correlation with both the diseased state and a breath marker but lead to the erroneous conclusion that disease and breath are in a causal relationship; voodoo correlations (VCs), which can be understood as statistically true correlations that arise coincidentally in the vast number of measured variables; and statistical misconceptions in the study design (SMSD). CV: Typical confounding variables are environmental and medical history, host factors such as gender, age, weight, etc., and parameters that could affect the quality of breath data such as subject breathing mode, effects of breath sampling and effects of the analytical technique itself. VC: The number of measured variables quickly overwhelms the number of samples that can feasibly be taken. As a consequence, the chances of finding coincidental 'voodoo' correlations grow proportionally. VCs can typically be expected in the following scenarios: insufficient number of patients, (too) many measurement variables, the use of advanced statistical data mining methods, and non-independent data for validation. SMSD: Non-prospective, non-blinded and non-randomized trials, a priori biased study populations or group selection with unrealistically high disease prevalence typically represent misconceptions of study design. In this paper important data interpretation issues are discussed, common pitfalls are addressed and directions for sound data processing and interpretation
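
    The "voodoo correlation" pitfall is easy to reproduce numerically. In this sketch (pure noise, arbitrary labels), roughly 5% of candidate markers clear p < 0.05 by chance alone:

    ```python
    import numpy as np
    from scipy.stats import pearsonr

    rng = np.random.default_rng(9)

    n, p = 20, 500  # 20 subjects, 500 candidate "breath markers"
    X = rng.standard_normal((n, p))          # pure noise measurements
    label = rng.integers(0, 2, n)            # arbitrary disease labels

    pvals = np.array([pearsonr(X[:, j], label)[1] for j in range(p)])
    print(f"{(pvals < 0.05).sum()} of {p} noise markers reach p < 0.05")
    ```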

  8. CONTEMPORARY ENVIRONMENTAL APPLICATIONS OF PHOTOGRAPHIC INTERPRETATION

    EPA Science Inventory

    Aerial Photographic Interpretation is a timed-tested technique for extracting landscape- level information from aerial photographs and other types of remotely sensed images. The U.S. Environmental Protection Agency's Environmental Photographic Interpretation Center (EPIC) has a 2...

  9. Data analysis using the Gnu R system for statistical computation

    SciTech Connect

    Simone, James; /Fermilab

    2011-07-01

    R is a language system for statistical computation. It is widely used in statistics, bioinformatics, machine learning, data mining, quantitative finance, and the analysis of clinical drug trials. Among the advantages of R are: it has become the standard language for developing statistical techniques, it is being actively developed by a large and growing global user community, it is open source software, it is highly portable (Linux, OS-X and Windows), it has a built-in documentation system, it produces high quality graphics and it is easily extensible with over four thousand extension library packages available covering statistics and applications. This report gives a very brief introduction to R with some examples using lattice QCD simulation results. It then discusses the development of R packages designed for chi-square minimization fits for lattice n-pt correlation functions.
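
    The report's examples are in R and are not reproduced here; as a language-neutral illustration of the kind of chi-square minimization fit it targets, a minimal SciPy sketch on synthetic correlator-like data (uncorrelated errors assumed):

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    rng = np.random.default_rng(5)

    # Synthetic two-point-correlator-like data: C(t) = A * exp(-m t) + noise.
    t = np.arange(1, 16)
    A_true, m_true = 1.3, 0.45
    sigma = 0.03 * A_true * np.exp(-m_true * t)
    C = A_true * np.exp(-m_true * t) + sigma * rng.standard_normal(t.size)

    model = lambda t, A, m: A * np.exp(-m * t)
    popt, pcov = curve_fit(model, t, C, p0=(1.0, 0.5),
                           sigma=sigma, absolute_sigma=True)
    chi2 = np.sum(((C - model(t, *popt)) / sigma) ** 2)
    print(f"A = {popt[0]:.3f}, m = {popt[1]:.3f}, "
          f"chi2/dof = {chi2 / (t.size - 2):.2f}")
    ```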

  10. Power spectra as a diagnostic tool in probing statistical/nonstatistical behavior in unimolecular reactions

    NASA Astrophysics Data System (ADS)

    Chang, Xiaoyen Y.; Sewell, Thomas D.; Raff, Lionel M.; Thompson, Donald L.

    1992-11-01

    The possibility of utilizing different types of power spectra obtained from classical trajectories as a diagnostic tool to identify the presence of nonstatistical dynamics is explored by using the unimolecular bond-fission reactions of 1,2-difluoroethane and the 2-chloroethyl radical as test cases. In previous studies, the reaction rates for these systems were calculated by using a variational transition-state theory and classical trajectory methods. A comparison of the results showed that 1,2-difluoroethane is a nonstatistical system, while the 2-chloroethyl radical behaves statistically. Power spectra for these two systems have been generated under various conditions. The characteristics of these spectra are as follows: (1) The spectra for the 2-chloroethyl radical are always broader and more coupled to other modes than is the case for 1,2-difluoroethane. This is true even at very low levels of excitation. (2) When an internal energy near or above the dissociation threshold is initially partitioned into a local C-H stretching mode, the power spectra for 1,2-difluoroethane broaden somewhat, but discrete and somewhat isolated bands are still clearly evident. In contrast, the analogous power spectra for the 2-chloroethyl radical exhibit a near complete absence of isolated bands. The general appearance of the spectrum suggests a very high level of mode-to-mode coupling, large intramolecular vibrational energy redistribution (IVR) rates, and global statistical behavior. (3) The appearance of the power spectrum for the 2-chloroethyl radical is unaltered regardless of whether the initial C-H excitation is in the CH2 or the CH2Cl group. This result also suggests statistical behavior. These results are interpreted to mean that power spectra may be used as a diagnostic tool to assess the statistical character of a system. The presence of a diffuse spectrum exhibiting a nearly complete loss of isolated structures indicates that the dissociation dynamics of the molecule will
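
    A sketch of the diagnostic itself: a periodogram computed from a trajectory-like time series. Both signals below are synthetic caricatures, one with isolated bands and one whose phase noise mimics strong mode coupling:

    ```python
    import numpy as np

    def power_spectrum(signal, dt):
        """One-sided power spectrum of a time series via the FFT."""
        freqs = np.fft.rfftfreq(signal.size, dt)
        power = np.abs(np.fft.rfft(signal - signal.mean())) ** 2
        return freqs, power

    dt = 0.01
    t = np.arange(0, 100, dt)
    rng = np.random.default_rng(6)
    regular = np.cos(2 * np.pi * 1.0 * t) + 0.2 * np.cos(2 * np.pi * 2.3 * t)
    coupled = np.cos(2 * np.pi * 1.0 * t
                     + np.cumsum(rng.normal(0, 0.1, t.size)))  # phase noise

    for name, s in [("isolated bands ", regular), ("diffuse spectrum", coupled)]:
        f, p = power_spectrum(s, dt)
        occupancy = np.mean(p > 0.01 * p.max())  # crude band-broadening gauge
        print(f"{name}: fraction of bins above 1% of peak = {occupancy:.3f}")
    ```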

  11. Statistical evaluation of forecasts.

    PubMed

    Mader, Malenka; Mader, Wolfgang; Gluckman, Bruce J; Timmer, Jens; Schelter, Björn

    2014-08-01

    Reliable forecasts of extreme but rare events, such as earthquakes, financial crashes, and epileptic seizures, would render interventions and precautions possible. Therefore, forecasting methods have been developed which intend to raise an alarm if an extreme event is about to occur. In order to statistically validate the performance of a prediction system, it must be compared to the performance of a random predictor, which raises alarms independent of the events. Such a random predictor can be obtained by bootstrapping or analytically. We propose an analytic statistical framework which, in contrast to conventional methods, allows for validating independently the sensitivity and specificity of a forecasting method. Moreover, our method accounts for the periods during which an event has to remain absent or occur after a respective forecast.
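
    A sketch of the analytic random-predictor comparison under simplifying assumptions (independent events and alarms covering a fixed fraction of the recording; all counts below are hypothetical):

    ```python
    from scipy.stats import binom

    # Hypothetical evaluation: 40 events, 28 forecast correctly, with the
    # method's alarms active during 30% of the total recording time.
    n_events, hits, alarm_fraction = 40, 28, 0.30

    # A random predictor whose alarms cover the same fraction of time hits
    # each event independently with probability ~ alarm_fraction, so its
    # hit count is Binomial(n_events, alarm_fraction).
    p_value = binom.sf(hits - 1, n_events, alarm_fraction)
    print(f"P(random predictor scores >= {hits} hits) = {p_value:.2e}")
    ```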

  12. Pain: A Statistical Account

    PubMed Central

    Thacker, Michael A.; Moseley, G. Lorimer

    2017-01-01

    Perception is seen as a process that utilises partial and noisy information to construct a coherent understanding of the world. Here we argue that the experience of pain is no different; it is based on incomplete, multimodal information, which is used to estimate potential bodily threat. We outline a Bayesian inference model, incorporating the key components of cue combination, causal inference, and temporal integration, which highlights the statistical problems in everyday perception. It is from this platform that we are able to review the pain literature, providing evidence from experimental, acute, and persistent phenomena to demonstrate the advantages of adopting a statistical account in pain. Our probabilistic conceptualisation suggests a principles-based view of pain, explaining a broad range of experimental and clinical findings and making testable predictions. PMID:28081134
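
    For two Gaussian cues, the cue-combination component of such an account rests on the standard precision-weighted rule (a general Bayesian result, not a formula quoted from the paper): given cue estimates \mu_1, \mu_2 with variances \sigma_1^2, \sigma_2^2,

    ```latex
    \hat{s} \;=\; \frac{\sigma_2^2}{\sigma_1^2 + \sigma_2^2}\,\mu_1
            \;+\; \frac{\sigma_1^2}{\sigma_1^2 + \sigma_2^2}\,\mu_2 ,
    \qquad
    \sigma_{\mathrm{comb}}^2 \;=\; \frac{\sigma_1^2\,\sigma_2^2}{\sigma_1^2 + \sigma_2^2}
    \;<\; \min\!\left(\sigma_1^2, \sigma_2^2\right).
    ```

    Each cue is weighted by its reliability, and the combined estimate is more precise than either cue alone, which is what lets noisy, multimodal bodily information still yield a definite threat estimate.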

  13. Statistical evaluation of forecasts

    NASA Astrophysics Data System (ADS)

    Mader, Malenka; Mader, Wolfgang; Gluckman, Bruce J.; Timmer, Jens; Schelter, Björn

    2014-08-01

    Reliable forecasts of extreme but rare events, such as earthquakes, financial crashes, and epileptic seizures, would render interventions and precautions possible. Therefore, forecasting methods have been developed which intend to raise an alarm if an extreme event is about to occur. In order to statistically validate the performance of a prediction system, it must be compared to the performance of a random predictor, which raises alarms independent of the events. Such a random predictor can be obtained by bootstrapping or analytically. We propose an analytic statistical framework which, in contrast to conventional methods, allows for validating independently the sensitivity and specificity of a forecasting method. Moreover, our method accounts for the periods during which an event has to remain absent or occur after a respective forecast.

  14. Statistical prediction with Kanerva's sparse distributed memory

    NASA Technical Reports Server (NTRS)

    Rogers, David

    1989-01-01

    A new viewpoint of the processing performed by Kanerva's sparse distributed memory (SDM) is presented. In conditions of near- or over-capacity, where the associative-memory behavior of the model breaks down, the processing performed by the model can be interpreted as that of a statistical predictor. Mathematical results are presented which serve as the framework for a new statistical viewpoint of sparse distributed memory and for which the standard formulation of SDM is a special case. This viewpoint suggests possible enhancements to the SDM model, including a procedure for improving the predictiveness of the system based on Holland's work with genetic algorithms, and a method for improving the capacity of SDM even when used as an associative memory.

  15. Bayesian statistical studies of the Ramachandran distribution.

    PubMed

    Pertsemlidis, Alexander; Zelinka, Jan; Fondon, John W; Henderson, R Keith; Otwinowski, Zbyszek

    2005-01-01

    We describe a method for the generation of knowledge-based potentials and apply it to the observed torsional angles of known protein structures. The potential is derived using Bayesian reasoning, and is useful as a prior for further such reasoning in the presence of additional data. The potential takes the form of a probability density function, which is described by a small number of coefficients with the number of necessary coefficients determined by tests based on statistical significance and entropy. We demonstrate the methods in deriving one such potential corresponding to two dimensions, the Ramachandran plot. In contrast to traditional histogram-based methods, the function is continuous and differentiable. These properties allow us to use the function as a force term in the energy minimization of appropriately described structures. The method can easily be extended to other observable angles and higher dimensions, or to include sequence dependence and should find applications in structure determination and validation.
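
    A sketch of one way to realize such a continuous, differentiable density from angle samples: a truncated two-dimensional Fourier (orthogonal series) estimate, with the distribution described by a small grid of coefficients. The construction is generic, not the authors' exact estimator:

    ```python
    import numpy as np

    def fourier_density(phi, psi, order=3):
        """Smooth periodic density on the torus, from empirical Fourier
        coefficients E[exp(-i(m*phi + n*psi))] for |m|, |n| <= order."""
        m = np.arange(-order, order + 1)
        C = np.array([[np.mean(np.exp(-1j * (mm * phi + nn * psi)))
                       for nn in m] for mm in m])

        def density(p, q):
            phase = np.exp(1j * (m[:, None] * p + m[None, :] * q))
            return float(np.real((C * phase).sum())) / (2 * np.pi) ** 2

        return density

    rng = np.random.default_rng(8)
    phi = rng.vonmises(-1.0, 2.0, 5000)  # synthetic torsion angles
    psi = rng.vonmises(2.3, 2.0, 5000)
    f = fourier_density(phi, psi)
    print(f"density near mode {f(-1.0, 2.3):.4f}, far away {f(2.0, -2.0):.4f}")
    ```

    Unlike a histogram, the estimate is continuous and differentiable, though a truncated series can dip slightly below zero far from the data.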

  16. Systematic identification of statistically significant network measures

    NASA Astrophysics Data System (ADS)

    Ziv, Etay; Koytcheff, Robin; Middendorf, Manuel; Wiggins, Chris

    2005-01-01

    We present a graph embedding space (i.e., a set of measures on graphs) for performing statistical analyses of networks. Key improvements over existing approaches include discovery of “motif hubs” (multiple overlapping significant subgraphs), computational efficiency relative to subgraph census, and flexibility (the method is easily generalizable to weighted and signed graphs). The embedding space is based on scalars, functionals of the adjacency matrix representing the network. Scalars are global, involving all nodes; although they can be related to subgraph enumeration, there is not a one-to-one mapping between scalars and subgraphs. Improvements in network randomization and significance testing—we learn the distribution rather than assuming Gaussianity—are also presented. The resulting algorithm establishes a systematic approach to the identification of the most significant scalars and suggests machine-learning techniques for network classification.
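
    As a toy illustration of scalars and learned null distributions: the scalars chosen here are traces of powers of the adjacency matrix, and the null model is simplified to edge-count-matched random graphs rather than the authors' randomization scheme.

        import numpy as np

        rng = np.random.default_rng(0)

        def scalars(A):
            """Global scalar functionals of the adjacency matrix: edge count,
            closed walks of length 3 (tr A^3, proportional to 3-cycle counts),
            and closed walks of length 4. No subgraph census is needed."""
            A3 = A @ A @ A
            return np.array([A.sum(), np.trace(A3), np.trace(A3 @ A)])

        def significance(A, n_null=2000):
            """Empirical p-values against a learned null distribution of random
            graphs with the same node and edge counts (self-loops allowed for
            simplicity), rather than assuming Gaussianity."""
            n, m = A.shape[0], int(A.sum())
            obs = scalars(A)
            null = np.empty((n_null, obs.size))
            for i in range(n_null):
                R = np.zeros((n, n))
                R.flat[rng.choice(n * n, size=m, replace=False)] = 1
                null[i] = scalars(R)
            dev = np.abs(null - null.mean(axis=0))
            return (dev >= np.abs(obs - null.mean(axis=0))).mean(axis=0)

        A = (rng.random((30, 30)) < 0.1).astype(float)  # toy directed graph
        print(significance(A))  # two-sided empirical p-value per scalar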

  17. Statistical Aspects in Proteomic Biomarker Discovery.

    PubMed

    Jung, Klaus

    2016-01-01

    In the pursuit of personalized medicine, i.e., treatment tailored to the individual patient, many medical decisions would benefit from the support of biomarkers that can help to make a diagnosis, prediction, or prognosis. Proteomic biomarkers are of special interest because they can be detected not only in tissue samples but often also, and easily, in diverse body fluids. Statistical methods play an important role in the discovery and validation of proteomic biomarkers: they are necessary in the planning of experiments, in the processing of raw signals, and in the final data analysis. This review provides an overview of the most frequent experimental settings, including sample size considerations, and focuses on exploratory data analysis and classifier development.
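
    By way of illustration, classifier development in this setting often pairs penalized regression with cross-validation. A minimal sketch using scikit-learn follows; the protein-intensity matrix, labels, and parameter choices are all hypothetical.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import cross_val_score
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(0)
        X = rng.normal(size=(80, 500))  # 80 samples x 500 protein features (toy)
        y = rng.integers(0, 2, 80)      # case/control labels (toy)

        # The L1 penalty performs implicit feature (candidate biomarker)
        # selection; cross-validated AUC estimates out-of-sample performance.
        clf = make_pipeline(StandardScaler(),
                            LogisticRegression(penalty="l1", solver="liblinear", C=0.1))
        auc = cross_val_score(clf, X, y, cv=5, scoring="roc_auc")
        print(auc.mean())  # ~0.5 here, since these toy labels are pure noise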

  18. 1979 DOE statistical symposium

    SciTech Connect

    Gardiner, D.A.; Truett, T.

    1980-09-01

    The 1979 DOE Statistical Symposium was the fifth in the series of annual symposia designed to bring together statisticians and other interested parties who are actively engaged in helping to solve the nation's energy problems. The program included presentations of technical papers centered around exploration and disposal of nuclear fuel, general energy-related topics, and health-related issues, and workshops on model evaluation, risk analysis, analysis of large data sets, and resource estimation.

  19. Relativistic statistical arbitrage

    NASA Astrophysics Data System (ADS)

    Wissner-Gross, A. D.; Freer, C. E.

    2010-11-01

    Recent advances in high-frequency financial trading have made light propagation delays between geographically separated exchanges relevant. Here we show that there exist optimal locations from which to coordinate the statistical arbitrage of pairs of spacelike separated securities, and calculate a representative map of such locations on Earth. Furthermore, trading local securities along chains of such intermediate locations results in a novel econophysical effect, in which the relativistic propagation of tradable information is effectively slowed or stopped by arbitrage.
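
    The scale of the effect is easy to check: at light speed, geographically separated exchanges are causally disconnected over many milliseconds, which is long on high-frequency trading timescales. A back-of-the-envelope sketch, assuming idealized vacuum propagation along the surface and approximate coordinates:

        import math

        R_EARTH = 6371.0  # mean Earth radius, km
        C = 299792.458    # speed of light in vacuum, km/s

        def great_circle_km(lat1, lon1, lat2, lon2):
            """Haversine distance between two points on Earth's surface."""
            p1, p2 = math.radians(lat1), math.radians(lat2)
            dphi = p2 - p1
            dlam = math.radians(lon2 - lon1)
            a = (math.sin(dphi / 2) ** 2
                 + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2)
            return 2 * R_EARTH * math.asin(math.sqrt(a))

        # New York vs. London: even at c, the one-way delay is ~18.6 ms, so
        # prices at the two exchanges are spacelike separated on that window.
        d = great_circle_km(40.71, -74.01, 51.51, -0.13)
        print(d, 1000 * d / C)  # distance in km, one-way light time in ms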

  20. Statistical Methods in Cosmology

    NASA Astrophysics Data System (ADS)

    Verde, L.

    2010-03-01

    The advent of large data sets in cosmology has meant that in the past 10 or 20 years our knowledge and understanding of the Universe have changed not only quantitatively but also, and most importantly, qualitatively. Cosmologists rely on data in which a host of useful information is enclosed, but encoded in a non-trivial way. The challenges in extracting this information must be overcome to make the most of a large experimental effort. Even after having converged on a standard cosmological model (the LCDM model), we should keep in mind that this model is described by 10 or more physical parameters, and if we want to study deviations from it, the number of parameters grows even larger. Dealing with such a high-dimensional parameter space and finding parameter constraints is a challenge in itself. Cosmologists want to be able to compare and combine different data sets, both to test for possible disagreements (which could indicate new physics) and to improve parameter determinations. Finally, cosmologists in many cases want to find out, before actually doing an experiment, how much they would be able to learn from it. For all these reasons, sophisticated statistical techniques are being employed in cosmology, and it has become crucial to know some statistical background to understand recent literature in the field. I will introduce some statistical tools that any cosmologist should know about in order to understand recently published results from the analysis of cosmological data sets. I will not present a complete and rigorous introduction to statistics, as there are several good books on the subject, listed in the references; the reader should refer to those.
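
    As a taste of one such tool: parameter constraints in high-dimensional spaces are routinely obtained with Markov chain Monte Carlo rather than grids. Below is a minimal Metropolis sampler on a toy two-parameter Gaussian likelihood, standing in for a real cosmological likelihood; the model, data, and tuning choices are illustrative assumptions.

        import numpy as np

        rng = np.random.default_rng(42)

        def log_like(theta, x, y, sigma):
            """Gaussian log-likelihood for a toy model y = theta0 + theta1 * x."""
            return -0.5 * np.sum(((y - (theta[0] + theta[1] * x)) / sigma) ** 2)

        x = np.linspace(0, 1, 30)
        y = 0.7 + 0.3 * x + rng.normal(0, 0.05, x.size)  # synthetic 'data'

        theta = np.array([0.0, 0.0])
        ll = log_like(theta, x, y, 0.05)
        chain = []
        for _ in range(20000):
            prop = theta + rng.normal(0, 0.02, 2)    # symmetric random-walk step
            ll_prop = log_like(prop, x, y, 0.05)
            if np.log(rng.uniform()) < ll_prop - ll:  # Metropolis acceptance rule
                theta, ll = prop, ll_prop
            chain.append(theta)
        chain = np.array(chain)[5000:]  # discard burn-in
        print(chain.mean(axis=0), chain.std(axis=0))  # constraints on theta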