Science.gov

Sample records for easily interpretable statistics

  1. An interpretation of cloud overlap statistics

    NASA Astrophysics Data System (ADS)

    Tompkins, Adrian; Di Giuseppe, Francesca

    2015-04-01

    Previous studies using ground-based and satellite observations show that the total cloud cover of cloudy layers separated by clear sky is close to, but can statistically exceed, that given by the random overlap assumption, suggesting a tendency towards minimum overlap. In addition, vertically continuous clouds, which are maximally overlapped in adjacent layers, decorrelate as the separation distance increases, with the resulting decorrelation length-scale found to be sensitive to the horizontal scale of the cloud scenes used to conduct the analysis. No satisfactory explanation has been given for the minimal overlap and scene-scale sensitivity of the cloud statistics. Using simple heuristic arguments, it is suggested that both these phenomena can be expected due to the statistical truncation that results from the omission of overcast cloudy layers from the analysis, which occurs more frequently as the scene length falls progressively below the typical cloud system scale. We first validate this claim using an easily interpreted system of repeating cyclic clouds sampled at various lengthscales, which reproduces both of the above phenomena. This analysis is then repeated with realistic fractal clouds from a cloud generator, which demonstrates that the degree of minimal overlap diagnosed in previous studies for discontinuous clouds would result from sampling randomly overlapped clouds at spatial scales that are 30% to 80% of the cloud system scale. Based on this, a simple filter is suggested for cloudy scenes which removes the diagnosis of minimal overlap for discontinuous clouds, and results in a scene-length invariant calculation of the cloud overlap decorrelation for continuous clouds. Using CloudSat-CALIPSO data for 6 months, a scale-invariant decorrelation lengthscale of 3.7 km is found. Using this filter we analyse a special application. By processing more than eight million cloud scenes from CloudSat observations in conjunction with co-located ECMWF analysis data we identify an empirical relationship between cloud overlap and wind shear that can be applied to global models with confidence. The analysis confirms that clouds separated by clear sky gaps are randomly overlapped while continuous cloud layers decorrelate from maximum towards random overlap as the separation distance increases. There is a clear and systematic impact of wind shear on the decorrelation length-scale, with cloud decorrelating over smaller distances as wind shear increases, as expected. A simple empirical linear-fit parametrisation is suggested that is straightforward to add to existing radiation schemes.
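
    The overlap behaviour described above can be illustrated with a short sketch. It assumes the widely used exponential blend between maximum and random overlap, with the blending weight set by a decorrelation length; the linear shear dependence shown is only a placeholder illustrating the kind of linear-fit parametrisation the abstract mentions, with made-up coefficients.

```python
import numpy as np

def combined_cover(c1, c2, dz_km, decorr_km):
    """Total cover of two cloudy layers under an exponential blend of
    maximum and random overlap: alpha = 1 is maximum, alpha = 0 is random."""
    alpha = np.exp(-dz_km / decorr_km)
    c_max = np.maximum(c1, c2)
    c_rand = c1 + c2 - c1 * c2
    return alpha * c_max + (1.0 - alpha) * c_rand

def decorr_length_km(shear_per_s, l0_km=3.7, slope=200.0, l_min_km=0.5):
    """Hypothetical linear decrease of the decorrelation length with wind shear
    (placeholder coefficients, for illustration only)."""
    return max(l0_km - slope * shear_per_s, l_min_km)

if __name__ == "__main__":
    length = decorr_length_km(shear_per_s=5e-3)   # 5 m/s of shear per km
    print(combined_cover(0.4, 0.5, dz_km=1.0, decorr_km=length))
```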

  2. Interpreting statistics of small lunar craters

    NASA Technical Reports Server (NTRS)

    Schultz, P. H.; Gault, D.; Greeley, R.

    1977-01-01

    Some of the wide variations in the crater-size distributions in lunar photography and in the resulting statistics were interpreted as different degradation rates on different surfaces, different scaling laws in different targets, and a possible population of endogenic craters. These possibilities are reexamined for statistics of 26 different regions. In contrast to most other studies, crater diameters as small as 5 m were measured from enlarged Lunar Orbiter framelets. According to the results of the reported analysis, the different crater distribution types appear to be most consistent with the hypotheses of differential degradation and a superposed crater population. Differential degradation can account for the low level of equilibrium in incompetent materials such as ejecta deposits, mantle deposits, and deep regoliths where scaling law changes and catastrophic processes introduce contradictions with other observations.

  3. The Statistical Interpretation of Entropy: An Activity

    ERIC Educational Resources Information Center

    Timberlake, Todd

    2010-01-01

    The second law of thermodynamics, which states that the entropy of an isolated macroscopic system can increase but will not decrease, is a cornerstone of modern physics. Ludwig Boltzmann argued that the second law arises from the motion of the atoms that compose the system. Boltzmann's statistical mechanics provides deep insight into the…

  4. The Statistical Interpretation of Entropy: An Activity

    ERIC Educational Resources Information Center

    Timberlake, Todd

    2010-01-01

    The second law of thermodynamics, which states that the entropy of an isolated macroscopic system can increase but will not decrease, is a cornerstone of modern physics. Ludwig Boltzmann argued that the second law arises from the motion of the atoms that compose the system. Boltzmann's statistical mechanics provides deep insight into the…

  5. The Statistical Interpretation of Classical Thermodynamic Heating and Expansion Processes

    ERIC Educational Resources Information Center

    Cartier, Stephen F.

    2011-01-01

    A statistical model has been developed and applied to interpret thermodynamic processes typically presented from the macroscopic, classical perspective. Through this model, students learn and apply the concepts of statistical mechanics, quantum mechanics, and classical thermodynamics in the analysis of the (i) constant volume heating, (ii)…

  6. The Statistical Interpretation of Classical Thermodynamic Heating and Expansion Processes

    ERIC Educational Resources Information Center

    Cartier, Stephen F.

    2011-01-01

    A statistical model has been developed and applied to interpret thermodynamic processes typically presented from the macroscopic, classical perspective. Through this model, students learn and apply the concepts of statistical mechanics, quantum mechanics, and classical thermodynamics in the analysis of the (i) constant volume heating, (ii)…

  7. On Interpreting Test Scores as Social Indicators: Statistical Considerations.

    ERIC Educational Resources Information Center

    Spencer, Bruce D.

    1983-01-01

    Because test scores are ordinal, not cardinal, attributes, the average test score often is a misleading way to summarize the scores of a group of individuals. Similarly, correlation coefficients may be misleading summary measures of association between test scores. Proper, readily interpretable, summary statistics are developed from a theory of…

  8. Integrating statistical rock physics and sedimentology for quantitative seismic interpretation

    NASA Astrophysics Data System (ADS)

    Avseth, Per; Mukerji, Tapan; Mavko, Gary; Gonzalez, Ezequiel

    This paper presents an integrated approach for seismic reservoir characterization that can be applied both in petroleum exploration and in hydrological subsurface analysis. We integrate fundamental concepts and models of rock physics, sedimentology, statistical pattern recognition, and information theory, with seismic inversions and geostatistics. Rock physics models enable us to link seismic amplitudes to geological facies and reservoir properties. Seismic imaging brings indirect, noninvasive, but nevertheless spatially exhaustive information about the reservoir properties that are not available from well data alone. Classification and estimation methods based on computational statistical techniques such as nonparametric Bayesian classification, Monte Carlo simulations and bootstrap, help to quantitatively measure the interpretation uncertainty and the mis-classification risk at each spatial location. Geostatistical stochastic simulations incorporate the spatial correlation and the small scale variability which is hard to capture with only seismic information because of the limits of resolution. Combining deterministic physical models with statistical techniques has provided us with a successful way of performing quantitative interpretation and estimation of reservoir properties from seismic data. These formulations identify not only the most likely interpretation but also the uncertainty of the interpretation, and serve as a guide for quantitative decision analysis. The methodology shown in this article is applied successfully to map petroleum reservoirs, and the examples are from relatively deeply buried oil fields. However, we suggest that this approach can also be carried out for improved characterization of shallow hydrologic aquifers using shallow seismic or GPR data.

  9. Comparing survival curves using an easy to interpret statistic.

    PubMed

    Hess, Kenneth R

    2010-10-15

    Here, I describe a statistic for comparing two survival curves that has a clear and obvious meaning and has a long history in biostatistics. Suppose we are comparing survival times associated with two treatments A and B. The statistic operates in such a way that if it takes on the value 0.95, then the interpretation is that a randomly chosen patient treated with A has a 95% chance of surviving longer than a randomly chosen patient treated with B. This statistic was first described in the 1950s, and was generalized in the 1960s to work with right-censored survival times. It is a useful and convenient measure for assessing differences between survival curves. Software for computing the statistic is readily available on the Internet. PMID:20732962
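
    A minimal sketch of the statistic described here, for uncensored survival times only (essentially a Mann-Whitney-type probability estimate; the right-censored generalisation mentioned in the abstract is not shown):

```python
import numpy as np

def prob_longer_survival(times_a, times_b):
    """Estimate P(T_A > T_B) from two samples of uncensored survival times.
    Ties count as 1/2 (the usual Mann-Whitney convention)."""
    a = np.asarray(times_a, dtype=float)[:, None]
    b = np.asarray(times_b, dtype=float)[None, :]
    wins = (a > b).sum() + 0.5 * (a == b).sum()
    return wins / (a.size * b.size)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    t_a = rng.exponential(scale=12.0, size=200)   # treatment A survival times
    t_b = rng.exponential(scale=8.0, size=200)    # treatment B survival times
    print(round(prob_longer_survival(t_a, t_b), 3))
```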

  10. Adapting internal statistical models for interpreting visual cues to depth

    PubMed Central

    Seydell, Anna; Knill, David C.; Trommershäuser, Julia

    2010-01-01

    The informativeness of sensory cues depends critically on statistical regularities in the environment. However, statistical regularities vary between different object categories and environments. We asked whether and how the brain changes the prior assumptions about scene statistics used to interpret visual depth cues when stimulus statistics change. Subjects judged the slants of stereoscopically presented figures by adjusting a virtual probe perpendicular to the surface. In addition to stereoscopic disparities, the aspect ratio of the stimulus in the image provided a “figural compression” cue to slant, whose reliability depends on the distribution of aspect ratios in the world. As we manipulated this distribution from regular to random and back again, subjects’ reliance on the compression cue relative to stereoscopic cues changed accordingly. When we randomly interleaved stimuli from shape categories (ellipses and diamonds) with different statistics, subjects gave less weight to the compression cue for figures from the category with more random aspect ratios. Our results demonstrate that relative cue weights vary rapidly as a function of recently experienced stimulus statistics, and that the brain can use different statistical models for different object categories. We show that subjects’ behavior is consistent with that of a broad class of Bayesian learning models. PMID:20465321
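
    The reliability-weighted cue combination predicted by this kind of Bayesian account reduces to inverse-variance weighting; the sketch below uses illustrative slant estimates and variances, not values from the study.

```python
import numpy as np

def combine_cues(estimates, variances):
    """Inverse-variance (reliability) weighted combination of independent cues.
    A cue from a more regular (lower-variance) category receives more weight."""
    estimates = np.asarray(estimates, dtype=float)
    reliabilities = 1.0 / np.asarray(variances, dtype=float)
    weights = reliabilities / reliabilities.sum()
    combined = float(np.dot(weights, estimates))
    combined_var = 1.0 / reliabilities.sum()
    return combined, weights, combined_var

if __name__ == "__main__":
    # Slant estimates (deg) from stereo and figural-compression cues; making the
    # compression cue noisier (as with random aspect ratios) shifts weight to stereo.
    slant, w, var = combine_cues([32.0, 40.0], [4.0, 16.0])
    print(slant, w, var)
```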

  11. Workplace statistical literacy for teachers: interpreting box plots

    NASA Astrophysics Data System (ADS)

    Pierce, Robyn; Chick, Helen

    2013-06-01

    As a consequence of the increased use of data in workplace environments, there is a need to understand the demands that are placed on users to make sense of such data. In education, teachers are being increasingly expected to interpret and apply complex data about student and school performance, and, yet it is not clear that they always have the appropriate knowledge and experience to interpret the graphs, tables and other data that they receive. This study examined the statistical literacy demands placed on teachers, with a particular focus on box plot representations. Although box plots summarise the data in a way that makes visual comparisons possible across sets of data, this study showed that teachers do not always have the necessary fluency with the representation to describe correctly how the data are distributed in the representation. In particular, a significant number perceived the size of the regions of the box plot to be depicting frequencies rather than density, and there were misconceptions associated with outlying data that were not displayed on the plot. As well, teachers' perceptions of box plots were found to relate to three themes: attitudes, perceived value and misconceptions.

  12. Statistical Interpretation of Natural and Technological Hazards in China

    NASA Astrophysics Data System (ADS)

    Borthwick, Alistair, Prof.; Ni, Jinren, Prof.

    2010-05-01

    China is prone to catastrophic natural hazards from floods, droughts, earthquakes, storms, cyclones, landslides, epidemics, extreme temperatures, forest fires, avalanches, and even tsunamis. This paper will list statistics related to the six worst natural disasters in China over the past 100 or so years, ranked according to number of fatalities. The corresponding data for the six worst natural disasters in China over the past decade will also be considered. [The data are abstracted from the International Disaster Database, Centre for Research on the Epidemiology of Disasters (CRED), Université Catholique de Louvain, Brussels, Belgium, http://www.cred.be/ where a disaster is defined as occurring if one of the following criteria is fulfilled: 10 or more people reported killed; 100 or more people reported affected; a call for international assistance; or declaration of a state of emergency.] The statistics include the number of occurrences of each type of natural disaster, the number of deaths, the number of people affected, and the cost in billions of US dollars. Over the past hundred years, the largest disasters may be related to the overabundance or scarcity of water, and to earthquake damage. However, there has been a substantial relative reduction in fatalities due to water-related disasters over the past decade, even though the overall numbers of people affected remain huge, as does the economic damage. This change is largely due to the efforts of China's water authorities to establish effective early warning systems, construct engineering countermeasures for flood protection, and implement water pricing and other measures for reducing excessive consumption during times of drought. It should be noted that the dreadful death toll due to the Sichuan Earthquake dominates recent data. Joint research has been undertaken between the Department of Environmental Engineering at Peking University and the Department of Engineering Science at Oxford University on the production of zonation maps of certain natural hazards in China. Data at city and county level have been interpreted using a hierarchical system of indices, which are then ranked according to severity. Zonation maps will be presented for debris flows, landslide and rockfall hazards, flood risk in mainland China, and for soil erosion processes in the Yellow River basin. The worst debris flow hazards are to be found in southwest China as the land begins to become mountainous. Just over 20% of the land area is at high or very high risk of landslide and rockfall hazards, especially in Yunnan, Sichuan, Gansu and Shaanxi provinces. Flood risk is concentrated towards the eastern part of China, where the major rivers meet the sea. The paper will also consider data on technological disasters in China from 1900 to 2010, using data supplied by CRED. In terms of fatalities, industrial accidents appear to be dominated by explosion events. However, gas leaks have affected the largest number of people. Transport accidents are ranked in terms of fatalities as follows: water - road - rail - air. Fire is a major cause of loss of life, whereas chemical spills and poisoning seem to lead to fewer deaths.

  13. A Critique of Divorce Statistics and Their Interpretation.

    ERIC Educational Resources Information Center

    Crosby, John F.

    1980-01-01

    Increasingly, appeals to the divorce statistic are employed to substantiate claims that the family is in a state of breakdown and marriage is passé. This article contains a consideration of reasons why the divorce statistics are invalid and/or unreliable as indicators of the present state of marriage and family. (Author)

  14. Workplace Statistical Literacy for Teachers: Interpreting Box Plots

    ERIC Educational Resources Information Center

    Pierce, Robyn; Chick, Helen

    2013-01-01

    As a consequence of the increased use of data in workplace environments, there is a need to understand the demands that are placed on users to make sense of such data. In education, teachers are being increasingly expected to interpret and apply complex data about student and school performance, and, yet it is not clear that they always have the…

  15. A novel statistical analysis and interpretation of flow cytometry data

    PubMed Central

    Banks, H.T.; Kapraun, D.F.; Thompson, W. Clayton; Peligero, Cristina; Argilaguet, Jordi; Meyerhans, Andreas

    2013-01-01

    A recently developed class of models incorporating the cyton model of population generation structure into a conservation-based model of intracellular label dynamics is reviewed. Statistical aspects of the data collection process are quantified and incorporated into a parameter estimation scheme. This scheme is then applied to experimental data for PHA-stimulated CD4+ T and CD8+ T cells collected from two healthy donors. This novel mathematical and statistical framework is shown to form the basis for accurate, meaningful analysis of cellular behaviour for a population of cells labelled with the dye carboxyfluorescein succinimidyl ester and stimulated to divide. PMID:23826744

  16. Statistical characteristics of MST radar echoes and its interpretation

    NASA Technical Reports Server (NTRS)

    Woodman, Ronald F.

    1989-01-01

    Two concepts of fundamental importance are reviewed: the autocorrelation function and the frequency power spectrum. In addition, some turbulence concepts, the relationship between radar signals and atmospheric medium statistics, partial reflection, and the characteristics of noise and clutter interference are discussed.

  17. Interpretation of Statistical Significance Testing: A Matter of Perspective.

    ERIC Educational Resources Information Center

    McClure, John; Suen, Hoi K.

    1994-01-01

    This article compares three models that have been the foundation for approaches to the analysis of statistical significance in early childhood research--the Fisherian and the Neyman-Pearson models (both considered "classical" approaches), and the Bayesian model. The article concludes that all three models have a place in the analysis of research…

  18. Statistical Interpretation of the Local Field Inside Dielectrics.

    ERIC Educational Resources Information Center

    Barrera, Ruben G.; Mello, P. A.

    1982-01-01

    Compares several derivations of the Clausius-Mossotti relation to analyze consistently the nature of approximations used and their range of applicability. Also presents a statistical-mechanical calculation of the local field for a classical system of harmonic oscillators interacting via the Coulomb potential. (Author/SK)

  19. Interpreting the flock algorithm from a statistical perspective.

    PubMed

    Anderson, Eric C; Barry, Patrick D

    2015-09-01

    We show that the algorithm in the program flock (Duchesne & Turgeon 2009) can be interpreted as an estimation procedure based on a model essentially identical to the structure (Pritchard et al. 2000) model with no admixture and without correlated allele frequency priors. Rather than using MCMC, the flock algorithm searches for the maximum a posteriori estimate of this structure model via a simulated annealing algorithm with a rapid cooling schedule (namely, the exponent on the objective function →∞). We demonstrate the similarities between the two programs in a two-step approach. First, to enable rapid batch processing of many simulated data sets, we modified the source code of structure to use the flock algorithm, producing the program flockture. With simulated data, we confirmed that results obtained with flock and flockture are very similar (though flockture is some 200 times faster). Second, we simulated multiple large data sets under varying levels of population differentiation for both microsatellite and SNP genotypes. We analysed them with flockture and structure and assessed each program on its ability to cluster individuals to their correct subpopulation. We show that flockture yields results similar to structure albeit with greater variability from run to run. flockture did perform better than structure when genotypes were composed of SNPs and differentiation was moderate (FST= 0.022-0.032). When differentiation was low, structure outperformed flockture for both marker types. On large data sets like those we simulated, it appears that flock's reliance on inference rules regarding its 'plateau record' is not helpful. Interpreting flock's algorithm as a special case of the model in structure should aid in understanding the program's output and behaviour. PMID:25913195

  20. Impact of Equity Models and Statistical Measures on Interpretations of Educational Reform

    ERIC Educational Resources Information Center

    Rodriguez, Idaykis; Brewe, Eric; Sawtelle, Vashti; Kramer, Laird H.

    2012-01-01

    We present three models of equity and show how these, along with the statistical measures used to evaluate results, impact interpretation of equity in education reform. Equity can be defined and interpreted in many ways. Most equity education reform research strives to achieve equity by closing achievement gaps between groups. An example is given…

  1. Statistical Significance Testing from Three Perspectives and Interpreting Statistical Significance and Nonsignificance and the Role of Statistics in Research.

    ERIC Educational Resources Information Center

    Levin, Joel R.; And Others

    1993-01-01

    Journal editors respond to criticisms of reliance on statistical significance in research reporting. Joel R. Levin ("Journal of Educational Psychology") defends its use, whereas William D. Schafer ("Measurement and Evaluation in Counseling and Development") emphasizes the distinction between statistically significant and important. William Asher…

  2. On the physical interpretation of statistical data from black-box systems

    NASA Astrophysics Data System (ADS)

    Eliazar, Iddo I.; Cohen, Morrel H.

    2013-07-01

    In this paper we explore the physical interpretation of statistical data collected from complex black-box systems. Given the output statistics of a black-box system, and considering a class of relevant Markov dynamics which are physically meaningful, we reverse-engineer the Markov dynamics to obtain an equilibrium distribution that coincides with the output statistics observed. This reverse-engineering scheme provides us with a conceptual physical interpretation of the black-box system investigated. Five specific reverse-engineering methodologies are developed, based on the following dynamics: Langevin, geometric Langevin, diffusion, growth-collapse, and decay-surge. In turn, these methodologies yield physical interpretations of the black-box system in terms of conceptual intrinsic forces, temperatures, and instabilities. The application of these methodologies is exemplified in the context of the distribution of wealth and income in human societies, which are outputs of the complex black-box system called “the economy”.

  3. Interpreting Statistical Significance Test Results: A Proposed New "What If" Method.

    ERIC Educational Resources Information Center

    Kieffer, Kevin M.; Thompson, Bruce

    As the 1994 publication manual of the American Psychological Association emphasized, "p" values are affected by sample size. As a result, it can be helpful to interpret the results of statistical significant tests in a sample size context by conducting so-called "what if" analyses. However, these methods can be inaccurate unless "corrected" effect…

  4. Report: New analytical and statistical approaches for interpreting the relationships among environmental stressors and biomarkers

    EPA Science Inventory

    The broad topic of biomarker research has an often-overlooked component: the documentation and interpretation of the surrounding chemical environment and other meta-data, especially from visualization, analytical, and statistical perspectives (Pleil et al. 2014; Sobus et al. 2011...

  5. New physicochemical interpretations for the adsorption of food dyes on chitosan films using statistical physics treatment.

    PubMed

    Dotto, G L; Pinto, L A A; Hachicha, M A; Knani, S

    2015-03-15

    In this work, a statistical physics treatment was employed to study the adsorption of food dyes onto chitosan films, in order to obtain new physicochemical interpretations at the molecular level. Experimental equilibrium curves were obtained for the adsorption of four dyes (FD&C red 2, FD&C yellow 5, FD&C blue 2, Acid Red 51) at different temperatures (298, 313 and 328 K). A statistical physics formula was used to interpret these curves, and parameters such as the number of adsorbed dye molecules per site (n), anchorage number (n'), receptor sites density (NM), adsorbed quantity at saturation (Nasat), steric hindrance, concentration at half saturation (c1/2) and molar adsorption energy (ΔEa) were estimated. The relation of the above-mentioned parameters with the chemical structure of the dyes and temperature was evaluated and interpreted. PMID:25308634
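
    The abstract does not reproduce the formula itself. As an illustration only, the sketch below fits a generic single-layer statistical-physics isotherm of the assumed form Q(c) = n·NM / (1 + (c1/2/c)^n) to placeholder data and reports the fitted n, NM and c1/2; the functional form and the data are assumptions, not taken from the paper.

```python
import numpy as np
from scipy.optimize import curve_fit

def isotherm(c, n, n_m, c_half):
    """Assumed single-layer form: Q(c) = n * N_M / (1 + (c_1/2 / c)**n)."""
    return n * n_m / (1.0 + (c_half / c) ** n)

# Placeholder equilibrium data (concentration in mg/L, uptake in mg/g).
c_obs = np.array([5.0, 10.0, 20.0, 40.0, 80.0, 160.0, 320.0])
q_obs = np.array([12.0, 22.0, 38.0, 57.0, 74.0, 85.0, 91.0])

params, _ = curve_fit(isotherm, c_obs, q_obs, p0=[1.0, 100.0, 50.0],
                      bounds=(0.0, np.inf))
n_fit, n_m_fit, c_half_fit = params
print(f"n = {n_fit:.2f}, N_M = {n_m_fit:.1f} mg/g, c_1/2 = {c_half_fit:.1f} mg/L")
print(f"saturation uptake n*N_M = {n_fit * n_m_fit:.1f} mg/g")
```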

  6. Impact of equity models and statistical measures on interpretations of educational reform

    NASA Astrophysics Data System (ADS)

    Rodriguez, Idaykis; Brewe, Eric; Sawtelle, Vashti; Kramer, Laird H.

    2012-12-01

    We present three models of equity and show how these, along with the statistical measures used to evaluate results, impact interpretation of equity in education reform. Equity can be defined and interpreted in many ways. Most equity education reform research strives to achieve equity by closing achievement gaps between groups. An example is given by the study by Lorenzo et al. that shows that interactive engagement methods lead to increased gender equity. In this paper, we reexamine the results of Lorenzo et al. through three models of equity. We find that interpretation of the results strongly depends on the model of equity chosen. Further, we argue that researchers must explicitly state their model of equity as well as use effect size measurements to promote clarity in education reform.

  7. Menzerath-Altmann Law: Statistical Mechanical Interpretation as Applied to a Linguistic Organization

    NASA Astrophysics Data System (ADS)

    Eroglu, Sertac

    2014-10-01

    The distribution behavior described by the empirical Menzerath-Altmann law is frequently encountered during the self-organization of linguistic and non-linguistic natural organizations at various structural levels. This study presents a statistical mechanical derivation of the law based on the analogy between the classical particles of a statistical mechanical organization and the distinct words of a textual organization. The derived model, a transformed (generalized) form of the Menzerath-Altmann model, was termed the statistical mechanical Menzerath-Altmann model. The derived model allows interpreting the model parameters in terms of physical concepts. We also propose that many organizations presenting the Menzerath-Altmann law behavior, whether linguistic or not, can be methodically examined by the transformed distribution model through the properly defined structure-dependent parameter and the energy-associated states.

  8. Two Easily Made Astronomical Telescopes.

    ERIC Educational Resources Information Center

    Hill, M.; Jacobs, D. J.

    1991-01-01

    The directions and diagrams for making a reflecting telescope and a refracting telescope are presented. These telescopes can be made by students out of plumbing parts and easily obtainable, inexpensive, optical components. (KR)

  9. An Easily Constructed Dodecahedron Model.

    ERIC Educational Resources Information Center

    Yamana, Shukichi

    1984-01-01

    A model of a dodecahedron which is necessary for teaching stereochemistry (for example, that of dodecahedrane) can be made easily by using a sealed, empty envelope. The steps necessary for accomplishing this task are presented. (JN)

  10. An Easily Constructed Cube Model.

    ERIC Educational Resources Information Center

    Yamana, Shukichi; Kawaguchi, Makoto

    1984-01-01

    A model of a cube which is necessary for teaching stereochemistry (especially of inorganic compounds) can be made easily, by using a sealed, empty envelope. The steps necessary to accomplish this task are presented. (JN)

  11. Application of machine learning and expert systems to Statistical Process Control (SPC) chart interpretation

    NASA Technical Reports Server (NTRS)

    Shewhart, Mark

    1991-01-01

    Statistical Process Control (SPC) charts are one of several tools used in quality control. Other tools include flow charts, histograms, cause and effect diagrams, check sheets, Pareto diagrams, graphs, and scatter diagrams. A control chart is simply a graph which indicates process variation over time. The purpose of drawing a control chart is to detect any changes in the process signalled by abnormal points or patterns on the graph. The Artificial Intelligence Support Center (AISC) of the Acquisition Logistics Division has developed a hybrid machine learning expert system prototype which automates the process of constructing and interpreting control charts.
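
    A rule-based interpreter of the kind described starts from the control limits themselves. The minimal sketch below builds an individuals chart with the usual ±3σ limits (sigma estimated here from the sample standard deviation for simplicity, rather than the moving range) and flags out-of-limit points; an expert-system layer would add pattern rules such as runs and trends on top of this.

```python
import numpy as np

def individuals_chart(x):
    """Centre line and 3-sigma control limits for an individuals chart,
    plus the indices of points falling outside the limits."""
    x = np.asarray(x, dtype=float)
    centre = x.mean()
    sigma = x.std(ddof=1)
    ucl, lcl = centre + 3 * sigma, centre - 3 * sigma
    out_of_control = np.where((x > ucl) | (x < lcl))[0]
    return centre, lcl, ucl, out_of_control

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    data = rng.normal(10.0, 0.5, size=50)
    data[30] += 4.0                      # inject an abnormal point
    centre, lcl, ucl, flags = individuals_chart(data)
    print(f"CL={centre:.2f}  LCL={lcl:.2f}  UCL={ucl:.2f}  flagged={flags}")
```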

  12. Editorial: new analytical and statistical approaches for interpreting the relationships among environmental stressors and biomarkers.

    PubMed

    Bean, Heather D; Pleil, Joachim D; Hill, Jane E

    2015-02-01

    The broad topic of biomarker research has an often-overlooked component: the documentation and interpretation of the surrounding chemical environment and other meta-data, especially from visualization, analytical and statistical perspectives. A second concern is how the environment interacts with human systems biology, what the variability is in "normal" subjects, and how such biological observations might be reconstructed to infer external stressors. In this article, we report on recent research presentations from a symposium at the 248th American Chemical Society meeting held in San Francisco, 10-14 August 2014, that focused on providing some insight into these important issues. PMID:25444302

  13. Misuse of statistics in the interpretation of data on low-level radiation

    SciTech Connect

    Hamilton, L.D.

    1982-01-01

    Four misuses of statistics in the interpretation of data of low-level radiation are reviewed: (1) post-hoc analysis and aggregation of data leading to faulty conclusions in the reanalysis of genetic effects of the atomic bomb, and premature conclusions on the Portsmouth Naval Shipyard data; (2) inappropriate adjustment for age and ignoring differences between urban and rural areas leading to potentially spurious increase in incidence of cancer at Rocky Flats; (3) hazard of summary statistics based on ill-conditioned individual rates leading to spurious association between childhood leukemia and fallout in Utah; and (4) the danger of prematurely published preliminary work with inadequate consideration of epidemiological problems - censored data - leading to inappropriate conclusions, needless alarm at the Portsmouth Naval Shipyard, and diversion of scarce research funds.

  14. Interpretations

    NASA Astrophysics Data System (ADS)

    Bellac, Michel Le

    2014-11-01

    Although nobody can question the practical efficiency of quantum mechanics, there remains the serious question of its interpretation. As Valerio Scarani puts it, "We do not feel at ease with the indistinguishability principle (that is, the superposition principle) and some of its consequences." Indeed, this principle which pervades the quantum world is in stark contradiction with our everyday experience. From the very beginning of quantum mechanics, a number of physicists--but not the majority of them!--have asked the question of its "interpretation". One may simply deny that there is a problem: according to proponents of the minimalist interpretation, quantum mechanics is self-sufficient and needs no interpretation. The point of view held by a majority of physicists, that of the Copenhagen interpretation, will be examined in Section 10.1. The crux of the problem lies in the status of the state vector introduced in the preceding chapter to describe a quantum system, which is no more than a symbolic representation for the Copenhagen school of thought. Conversely, one may try to attribute some "external reality" to this state vector, that is, a correspondence between the mathematical description and the physical reality. In this latter case, it is the measurement problem which is brought to the fore. In 1932, von Neumann was first to propose a global approach, in an attempt to build a purely quantum theory of measurement examined in Section 10.2. This theory still underlies modern approaches, among them those grounded on decoherence theory, or on the macroscopic character of the measuring apparatus: see Section 10.3. Finally, there are non-standard interpretations such as Everett's many worlds theory or the hidden variables theory of de Broglie and Bohm (Section 10.4). Note, however, that this variety of interpretations has no bearing whatsoever on the practical use of quantum mechanics. There is no controversy on the way we should use quantum mechanics!

  15. Data Analysis and Statistical Methods for the Assessment and Interpretation of Geochronologic Data

    NASA Astrophysics Data System (ADS)

    Reno, B. L.; Brown, M.; Piccoli, P. M.

    2007-12-01

    Ages are traditionally reported as a weighted mean with an uncertainty based on least squares analysis of analytical error on individual dates. This method does not take into account geological uncertainties, and cannot accommodate asymmetries in the data. In most instances, this method will understate uncertainty on a given age, which may lead to over-interpretation of age data. Geologic uncertainty is difficult to quantify, but is typically greater than analytical uncertainty. These factors make traditional statistical approaches inadequate to fully evaluate geochronologic data. We propose a protocol to assess populations within multi-event datasets and to calculate age and uncertainty from each population of dates interpreted to represent a single geologic event using robust and resistant statistical methods. To assess whether populations thought to represent different events are statistically separate, exploratory data analysis is undertaken using a box plot, where the range of the data is represented by a 'box' of length given by the interquartile range, divided at the median of the data, with 'whiskers' that extend to the furthest datapoint that lies within 1.5 times the interquartile range beyond the box. If the boxes representing the populations do not overlap, they are interpreted to represent statistically different sets of dates. Ages are calculated from statistically distinct populations using a robust tool such as the tanh method of Kelsey et al. (2003, CMP, 146, 326-340), which is insensitive to any assumptions about the underlying probability distribution from which the data are drawn. Therefore, this method takes into account the full range of data, and is not drastically affected by outliers. The interquartile range of each population of dates gives a first pass at expressing uncertainty, which accommodates asymmetry in the dataset; outliers have a minor effect on the uncertainty. To better quantify the uncertainty, a resistant tool that is insensitive to local misbehavior of data is preferred, such as the normalized median absolute deviations proposed by Powell et al. (2002, Chem Geol, 185, 191-204). We illustrate the method using a dataset of 152 monazite dates determined using EPMA chemical data from a single sample from the Neoproterozoic Brasília Belt, Brazil. Results are compared with ages and uncertainties calculated using traditional methods to demonstrate the differences. The dataset was manually culled into three populations representing discrete compositional domains within chemically-zoned monazite grains. The weighted mean ages and least squares uncertainties for these populations are 633±6 (2σ) Ma for a core domain, 614±5 (2σ) Ma for an intermediate domain and 595±6 (2σ) Ma for a rim domain. Probability distribution plots indicate asymmetric distributions of all populations, which cannot be accounted for with traditional statistical tools. These three domains record distinct ages outside the interquartile range for each population of dates, with the core domain lying in the subrange 642-624 Ma, the intermediate domain 617-609 Ma and the rim domain 606-589 Ma. The tanh estimator yields ages of 631±7 (2σ) Ma for the core domain, 616±7 (2σ) Ma for the intermediate domain and 601±8 (2σ) Ma for the rim domain. Although the uncertainties derived using a resistant statistical tool are larger than those derived from traditional statistical tools, the method yields more realistic uncertainties that better address the spread in the dataset and account for asymmetry in the data.
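
    The two robust summaries used above, box-plot overlap between date populations and the normalized median absolute deviation of Powell et al. (2002), reduce to a short calculation; the monazite dates below are synthetic stand-ins, not the study's data.

```python
import numpy as np

def five_number_box(dates):
    """Quartiles and box-plot whisker limits (1.5 * IQR rule) for one population."""
    q1, med, q3 = np.percentile(dates, [25, 50, 75])
    iqr = q3 - q1
    return q1, med, q3, q1 - 1.5 * iqr, q3 + 1.5 * iqr

def nmad(dates):
    """Normalized median absolute deviation (1.4826 scales MAD to a Gaussian sigma),
    the resistant spread estimate proposed by Powell et al. (2002)."""
    dates = np.asarray(dates, dtype=float)
    return 1.4826 * np.median(np.abs(dates - np.median(dates)))

def boxes_overlap(a, b):
    """True if the interquartile boxes of two date populations overlap."""
    a1, _, a3, _, _ = five_number_box(a)
    b1, _, b3, _, _ = five_number_box(b)
    return max(a1, b1) <= min(a3, b3)

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    core = rng.normal(633, 4, 60)    # illustrative core-domain dates (Ma)
    rim = rng.normal(601, 4, 60)     # illustrative rim-domain dates (Ma)
    print("boxes overlap:", boxes_overlap(core, rim))
    print("rim median +/- nMAD:", round(np.median(rim), 1), "+/-", round(nmad(rim), 1))
```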

  16. Soil VisNIR chemometric performance statistics should be interpreted as random variables

    NASA Astrophysics Data System (ADS)

    Brown, David J.; Gasch, Caley K.; Poggio, Matteo; Morgan, Cristine L. S.

    2015-04-01

    Chemometric models are normally evaluated using performance statistics such as the Standard Error of Prediction (SEP) or the Root Mean Squared Error of Prediction (RMSEP). These statistics are used to evaluate the quality of chemometric models relative to other published work on a specific soil property or to compare the results from different processing and modeling techniques (e.g. Partial Least Squares Regression or PLSR and random forest algorithms). Claims are commonly made about the overall success of an application or the relative performance of different modeling approaches assuming that these performance statistics are fixed population parameters. While most researchers would acknowledge that small differences in performance statistics are not important, rarely are performance statistics treated as random variables. Given that we are usually comparing modeling approaches for general application, and given that the intent of VisNIR soil spectroscopy is to apply chemometric calibrations to larger populations than are included in our soil-spectral datasets, it is more appropriate to think of performance statistics as random variables with variation introduced through the selection of samples for inclusion in a given study and through the division of samples into calibration and validation sets (including spiking approaches). Here we look at the variation in VisNIR performance statistics for the following soil-spectra datasets: (1) a diverse US Soil Survey soil-spectral library with 3768 samples from all 50 states and 36 different countries; (2) 389 surface and subsoil samples taken from US Geological Survey continental transects; (3) the Texas Soil Spectral Library (TSSL) with 3000 samples; (4) intact soil core scans of Texas soils with 700 samples; (5) approximately 400 in situ scans from the Pacific Northwest region; and (6) miscellaneous local datasets. We find the variation in performance statistics to be surprisingly large. This has important implications for the interpretation of soil VisNIR model results. Particularly for smaller datasets, the relative success of a given application or modeling approach may well be due in part to chance.
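
    Treating a performance statistic as a random variable amounts to recomputing it over many random calibration/validation splits. The sketch below does this with synthetic 'spectra' and a PLS regression as a stand-in chemometric model; the data and model settings are illustrative only.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(3)
n_samples, n_bands = 300, 150
X = rng.normal(size=(n_samples, n_bands))              # stand-in spectra
beta = rng.normal(size=n_bands)
y = X @ beta + rng.normal(scale=5.0, size=n_samples)   # stand-in soil property

rmseps = []
for seed in range(100):                                # repeated random cal/val splits
    X_cal, X_val, y_cal, y_val = train_test_split(
        X, y, test_size=0.25, random_state=seed)
    model = PLSRegression(n_components=10).fit(X_cal, y_cal)
    pred = model.predict(X_val).ravel()
    rmseps.append(np.sqrt(mean_squared_error(y_val, pred)))

rmseps = np.array(rmseps)
print(f"RMSEP: mean={rmseps.mean():.2f}, 5th-95th percentile="
      f"{np.percentile(rmseps, 5):.2f}-{np.percentile(rmseps, 95):.2f}")
```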

  17. On Improving the Quality and Interpretation of Environmental Assessments using Statistical Analysis and Geographic Information Systems

    NASA Astrophysics Data System (ADS)

    Karuppiah, R.; Faldi, A.; Laurenzi, I.; Usadi, A.; Venkatesh, A.

    2014-12-01

    An increasing number of studies are focused on assessing the environmental footprint of different products and processes, especially using life cycle assessment (LCA). This work shows how combining statistical methods and Geographic Information Systems (GIS) with environmental analyses can help improve the quality of results and their interpretation. Most environmental assessments in the literature yield single numbers that characterize the environmental impact of a process/product - typically global or country averages, often unchanging in time. In this work, we show how statistical analysis and GIS can help address these limitations. For example, we demonstrate a method to separately quantify uncertainty and variability in the result of LCA models using a power generation case study. This is important for rigorous comparisons between the impacts of different processes. Another challenge is a lack of data that can affect the rigor of LCAs. We have developed an approach to estimate environmental impacts of incompletely characterized processes using predictive statistical models. This method is applied to estimate unreported coal power plant emissions in several world regions. There is also a general lack of spatio-temporal characterization of the results in environmental analyses. For instance, studies that focus on water usage do not put in context where and when water is withdrawn. Through the use of hydrological modeling combined with GIS, we quantify water stress on a regional and seasonal basis to understand water supply and demand risks for multiple users. Another example where it is important to consider regional dependency of impacts is when characterizing how agricultural land occupation affects biodiversity in a region. We developed a data-driven methodology used in conjunction with GIS to determine if there is a statistically significant difference between the impacts of growing different crops on different species in various biomes of the world.

  18. Parameter Interpretation and Reduction for a Unified Statistical Mechanical Surface Tension Model.

    PubMed

    Boyer, Hallie; Wexler, Anthony; Dutcher, Cari S

    2015-09-01

    Surface properties of aqueous solutions are important for environments as diverse as atmospheric aerosols and biocellular membranes. Previously, we developed a surface tension model for both electrolyte and nonelectrolyte aqueous solutions across the entire solute concentration range (Wexler and Dutcher, J. Phys. Chem. Lett. 2013, 4, 1723-1726). The model differentiated between adsorption of solute molecules in the bulk and surface of solution using the statistical mechanics of multilayer sorption solution model of Dutcher et al. (J. Phys. Chem. A 2013, 117, 3198-3213). The parameters in the model had physicochemical interpretations, but remained largely empirical. In the current work, these parameters are related to solute molecular properties in aqueous solutions. For nonelectrolytes, sorption tendencies suggest a strong relation with molecular size and functional group spacing. For electrolytes, surface adsorption of ions follows ion surface-bulk partitioning calculations by Pegram and Record (J. Phys. Chem. B 2007, 111, 5411-5417). PMID:26275040

  19. Crossing statistic: Bayesian interpretation, model selection and resolving dark energy parametrization problem

    SciTech Connect

    Shafieloo, Arman

    2012-05-01

    By introducing Crossing functions and hyper-parameters, I show that the Bayesian interpretation of the Crossing Statistics [1] can be used trivially for the purpose of model selection among cosmological models. In this approach, to falsify a cosmological model there is no need to compare it with other models or assume any particular form of parametrization for cosmological quantities like the luminosity distance, Hubble parameter or equation of state of dark energy. Instead, hyper-parameters of the Crossing functions perform as discriminators between correct and wrong models. Using this approach one can falsify any assumed cosmological model without putting priors on the underlying actual model of the universe and its parameters, hence the issue of dark energy parametrization is resolved. It will also be shown that the sensitivity of the method to the intrinsic dispersion of the data is small, which is another important characteristic of the method when testing cosmological models against data with high uncertainties.

  20. Easily Constructed Microscale Spectroelectrochemical Cell

    PubMed Central

    Strickland, Jordan C.

    2013-01-01

    The design and performance of an easily constructed cell for microscale spectroelectrochemical analysis is described. A cation exchange polymer film, Nafion, was used as a salt bridge to provide ionic contact between a small sample well containing a coiled wire working electrode and separate, larger wells housing reference and auxiliary electrodes. The cell was evaluated using aqueous ferri/ferrocyanide as a test system and shown to be capable of relatively sensitive visible absorption measurements (path lengths on the order of millimeters) and reasonably rapid bulk electrolysis (~ 5 min) of samples in the 1 to 5 μL volume range. Minor alterations to the cell design are cited that could allow for analysis of sub-microliter volumes, rapid multi-sample analysis, and measurements in the ultraviolet spectral region. PMID:24058214

  1. Differences in paleomagnetic interpretations due to the choice of statistical, demagnetization and correction techniques: Kapuskasing Structural Zone, northern Ontario, Canada

    NASA Astrophysics Data System (ADS)

    Borradaile, Graham J.; Werner, Tomasz; Lagroix, France

    2003-02-01

    The Kapuskasing Structural Zone (KSZ) reveals a section through the Archean lower crustal granoblastic gneisses. Our new paleomagnetic data largely agree with previous work but we show that interpretations vary according to the choices of statistical, demagnetization and field-correction techniques. First, where the orientation distribution of characteristic remanence directions on the sphere is not symmetrically circular, the commonly used statistical model is invalid [Fisher, R.A., Proc. R. Soc. A217 (1953) 295]. Any tendency to form an elliptical distribution indicates that the sample is drawn from a Bingham-type population [Bingham, C., 1964. Distributions on the sphere and on the projective plane. PhD thesis, Yale University]. Fisher and Bingham statistics produce different confidence estimates from the same data and the traditionally defined mean vector may differ from the maximum eigenvector of an orthorhombic Bingham distribution. It seems prudent to apply both models wherever a non-Fisher population is suspected and that may be appropriate in any tectonized rocks. Non-Fisher populations require larger sample sizes so that focussing on individual sites may not be the most effective policy in tectonized rocks. More dispersed sampling across tectonic structures may be more productive. Second, from the same specimens, mean vectors isolated by thermal and alternating field (AF) demagnetization differ. Which treatment gives more meaningful results is difficult to decipher, especially in metamorphic rocks where the history of the magnetic minerals is not easily related to the ages of tectonic and petrological events. In this study, thermal demagnetization gave lower inclinations for paleomagnetic vectors and thus more distant paleopoles. Third, of more parochial significance, tilt corrections may be unnecessary in the KSZ because magnetic fabrics and thrust ramp are constant in orientation to the depth at which they level off, at approximately 15-km depth. With Archean geothermal gradients, primary remanences were blocked after the foliation was tilted to rise on the thrust ramp. Therefore, the rocks were probably magnetized in their present orientation; tilting largely or entirely predates magnetization.
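
    For reference, the Fisher (1953) statistics that the study contrasts with Bingham statistics reduce to a short calculation; the directions below are illustrative, and an elliptical (Bingham-type) distribution would need the eigenvector-based treatment instead.

```python
import numpy as np

def fisher_mean(decs_deg, incs_deg):
    """Fisher (1953) mean direction, precision parameter k and alpha95 from
    declination/inclination pairs (unit vectors on the sphere)."""
    d = np.radians(np.asarray(decs_deg, dtype=float))
    i = np.radians(np.asarray(incs_deg, dtype=float))
    xyz = np.column_stack([np.cos(i) * np.cos(d), np.cos(i) * np.sin(d), np.sin(i)])
    n = len(xyz)
    resultant = xyz.sum(axis=0)
    r = np.linalg.norm(resultant)
    mx, my, mz = resultant / r
    mean_dec = np.degrees(np.arctan2(my, mx)) % 360.0
    mean_inc = np.degrees(np.arcsin(mz))
    k = (n - 1) / (n - r)
    cos_a95 = 1.0 - ((n - r) / r) * ((1.0 / 0.05) ** (1.0 / (n - 1)) - 1.0)
    alpha95 = np.degrees(np.arccos(np.clip(cos_a95, -1.0, 1.0)))
    return mean_dec, mean_inc, k, alpha95

if __name__ == "__main__":
    decs = [12, 18, 9, 15, 21, 11, 14, 17]   # illustrative site directions
    incs = [55, 61, 58, 52, 57, 60, 54, 59]
    print(fisher_mean(decs, incs))
```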

  2. A Statistical Framework for the Interpretation of mtDNA Mixtures: Forensic and Medical Applications

    PubMed Central

    Egeland, Thore; Salas, Antonio

    2011-01-01

    Background Mitochondrial DNA (mtDNA) variation is commonly analyzed in a wide range of different biomedical applications. Cases where more than one individual contribute to a stain genotyped from some biological material give rise to a mixture. Most forensic mixture cases are analyzed using autosomal markers. In rape cases, Y-chromosome markers typically add useful information. However, there are important cases where autosomal and Y-chromosome markers fail to provide useful profiles. In some instances, usually involving small amounts or degraded DNA, mtDNA may be the only useful genetic evidence available. Mitochondrial DNA mixtures also arise in studies dealing with the role of mtDNA variation in tumorigenesis. Such mixtures may be generated by the tumor, but they could also originate in vitro due to inadvertent contamination or a sample mix-up. Methods/Principal Findings We present the statistical methods needed for mixture interpretation and emphasize the modifications required for the more well-known methods based on conventional markers to generalize to mtDNA mixtures. Two scenarios are considered. Firstly, only categorical mtDNA data is assumed available, that is, the variants contributing to the mixture. Secondly, quantitative data (peak heights or areas) on the allelic variants are also accessible. In cases where quantitative information is available in addition to allele designation, it is possible to extract more precise information by using regression models. More precisely, using quantitative information may lead to a unique solution in cases where the qualitative approach points to several possibilities. Importantly, these methods also apply to clinical cases where contamination is a potential alternative explanation for the data. Conclusions/Significance We argue that clinical and forensic scientists should give greater consideration to mtDNA for mixture interpretation. The results and examples show that the analysis of mtDNA mixtures contributes substantially to forensic casework and may also clarify erroneous claims made in clinical genetics regarding tumorigenesis. PMID:22053205

  3. Uses and Misuses of Student Evaluations of Teaching: The Interpretation of Differences in Teaching Evaluation Means Irrespective of Statistical Information

    ERIC Educational Resources Information Center

    Boysen, Guy A.

    2015-01-01

    Student evaluations of teaching are among the most accepted and important indicators of college teachers' performance. However, faculty and administrators can overinterpret small variations in mean teaching evaluations. The current research examined the effect of including statistical information on the interpretation of teaching evaluations.…

  4. Dose impact in radiographic lung injury following lung SBRT: Statistical analysis and geometric interpretation

    SciTech Connect

    Yu, Victoria; Kishan, Amar U.; Cao, Minsong; Low, Daniel; Lee, Percy; Ruan, Dan

    2014-03-15

    Purpose: To demonstrate a new method of evaluating the dose response of treatment-induced lung radiographic injury post-SBRT (stereotactic body radiotherapy) treatment and the discovery of bimodal dose behavior within clinically identified injury volumes. Methods: Follow-up CT scans at 3, 6, and 12 months were acquired from 24 patients treated with SBRT for stage-1 primary lung cancers or oligometastatic lesions. Injury regions in these scans were propagated to the planning CT coordinates by performing deformable registration of the follow-ups to the planning CTs. A bimodal behavior was repeatedly observed in the probability distribution of dose values within the deformed injury regions. Based on a mixture-Gaussian assumption, an Expectation-Maximization (EM) algorithm was used to obtain characteristic parameters for this distribution. Geometric analysis was performed to interpret these parameters and infer the critical dose level that is potentially inductive of post-SBRT lung injury. Results: The Gaussian mixture obtained from the EM algorithm closely approximates the empirical dose histogram within the injury volume with good consistency. The average Kullback-Leibler divergence values between the empirical differential dose volume histogram and the EM-obtained Gaussian mixture distribution were calculated to be 0.069, 0.063, and 0.092 for the 3, 6, and 12 month follow-up groups, respectively. The lower Gaussian component was located at approximately 70% of the prescription dose (35 Gy) for all three follow-up time points. The higher Gaussian component, contributed by the dose received by the planning target volume, was located at around 107% of the prescription dose. Geometrical analysis suggests the mean of the lower Gaussian component, located at 35 Gy, as a possible indicator for a critical dose that induces lung injury after SBRT. Conclusions: An innovative and improved method for analyzing the correspondence between lung radiographic injury and SBRT treatment dose has been demonstrated. Bimodal behavior was observed in the dose distribution of lung injury after SBRT. Novel statistical and geometrical analysis has shown that the systematically quantified low-dose peak at approximately 35 Gy, or 70% of the prescription dose, is a good indication of a critical dose for injury. The determined critical dose of 35 Gy resembles the critical dose volume limit of 30 Gy for the ipsilateral bronchus in RTOG 0618 and results from previous studies. The authors seek to further extend this improved analysis method to a larger cohort to better understand the interpatient variation in radiographic lung injury dose response post-SBRT.
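
    The EM step described in the Methods amounts to fitting a two-component Gaussian mixture to the one-dimensional dose values inside the injury region. A minimal sketch on synthetic doses (not the study's data; the prescription dose and mode locations are placeholders chosen to mimic the reported 70%/107% behaviour):

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(4)
prescription = 50.0                                   # Gy, illustrative
# Synthetic stand-in for dose values sampled inside a deformed injury region:
# a low-dose mode near ~70% and a high-dose mode near ~107% of prescription.
doses = np.concatenate([
    rng.normal(0.70 * prescription, 4.0, 1500),
    rng.normal(1.07 * prescription, 2.0, 800),
]).reshape(-1, 1)

gmm = GaussianMixture(n_components=2, random_state=0).fit(doses)
means = gmm.means_.ravel()
weights = gmm.weights_
for idx in np.argsort(means):
    print(f"component mean = {means[idx]:.1f} Gy "
          f"({100 * means[idx] / prescription:.0f}% Rx), weight = {weights[idx]:.2f}")
```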

  5. "What If" Analyses: Ways to Interpret Statistical Significance Test Results Using EXCEL or "R"

    ERIC Educational Resources Information Center

    Ozturk, Elif

    2012-01-01

    The present paper aims to review two motivations to conduct "what if" analyses using Excel and "R" to understand the statistical significance tests through the sample size context. "What if" analyses can be used to teach students what statistical significance tests really do and in applied research either prospectively to estimate what sample size…
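
    The mechanics behind such 'what if' analyses are simple to show directly: hold the observed effect size fixed and ask what the significance test would report at other sample sizes. The sketch below does this for a two-sample t test in Python; Excel or R versions follow the same arithmetic.

```python
import numpy as np
from scipy import stats

def p_for_n(cohens_d, n_per_group):
    """Two-sample t-test p-value implied by a fixed effect size (Cohen's d)
    at a hypothetical per-group sample size -- the core of a 'what if' analysis."""
    t = cohens_d * np.sqrt(n_per_group / 2.0)
    df = 2 * n_per_group - 2
    return 2.0 * stats.t.sf(abs(t), df)

if __name__ == "__main__":
    for n in (10, 20, 50, 100, 400):
        print(f"d = 0.30, n per group = {n:4d}  ->  p = {p_for_n(0.30, n):.4f}")
```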

  6. Statistics for the time-dependent failure of Kevlar-49/epoxy composites: micromechanical modeling and data interpretation

    SciTech Connect

    Phoenix, S.L.; Wu, E.M.

    1983-03-01

    This paper presents some new data on the strength and stress-rupture of Kevlar-49 fibers, fiber/epoxy strands and pressure vessels, and consolidated data obtained at LLNL over the past 10 years. These data are interpreted using recent theoretical results from a micromechanical model of the statistical failure process, thereby gaining understanding of the roles of the epoxy matrix and ultraviolet radiation in long-term lifetime.

  7. Boyle temperature as a point of ideal gas in gentile statistics and its economic interpretation

    NASA Astrophysics Data System (ADS)

    Maslov, V. P.; Maslova, T. V.

    2014-07-01

    Boyle temperature is interpreted as the temperature at which the formation of dimers becomes impossible. To Irving Fisher's correspondence principle we assign two more quantities: the number of degrees of freedom, and credit. We determine the danger level of the mass of money M when the mutual trust between economic agents begins to fall.

  8. Feature combination networks for the interpretation of statistical machine learning models: application to Ames mutagenicity

    PubMed Central

    2014-01-01

    Background A new algorithm has been developed to enable the interpretation of black box models. The developed algorithm is agnostic to the learning algorithm and open to all structure-based descriptors such as fragments, keys and hashed fingerprints. The algorithm has provided meaningful interpretation of Ames mutagenicity predictions from both random forest and support vector machine models built on a variety of structural fingerprints. A fragmentation algorithm is utilised to investigate the model’s behaviour on specific substructures present in the query. An output is formulated summarising causes of activation and deactivation. The algorithm is able to identify multiple causes of activation or deactivation, in addition to identifying localised deactivations where the prediction for the query is active overall. No loss in performance is seen as there is no change in the prediction; the interpretation is produced directly from the model’s behaviour for the specific query. Results Models have been built using multiple learning algorithms including support vector machine and random forest. The models were built on public Ames mutagenicity data and a variety of fingerprint descriptors were used. These models produced good performance in both internal and external validation, with accuracies around 82%. The models were used to evaluate the interpretation algorithm. The interpretations produced were found to link closely with understood mechanisms for Ames mutagenicity. Conclusion This methodology allows for a greater utilisation of the predictions made by black box models and can expedite further study based on the output for a (quantitative) structure activity model. Additionally, the algorithm could be utilised for chemical dataset investigation and knowledge extraction/human SAR development. PMID:24661325

  9. Statistical Approaches for Enhancing Causal Interpretation of the M to Y Relation in Mediation Analysis

    PubMed Central

    MacKinnon, David P.; Pirlott, Angela G.

    2016-01-01

    Statistical mediation methods provide valuable information about underlying mediating psychological processes, but the ability to infer that the mediator variable causes the outcome variable is more complex than widely known. Researchers have recently emphasized how violating assumptions about confounder bias severely limits causal inference of the mediator to dependent variable relation. Our article describes and addresses these limitations by drawing on new statistical developments in causal mediation analysis. We first review the assumptions underlying causal inference and discuss three ways to examine the effects of confounder bias when assumptions are violated. We then describe four approaches to address the influence of confounding variables and enhance causal inference, including comprehensive structural equation models, instrumental variable methods, principal stratification, and inverse probability weighting. Our goal is to further the adoption of statistical methods to enhance causal inference in mediation studies. PMID:25063043

  10. Statistics Translated: A Step-by-Step Guide to Analyzing and Interpreting Data

    ERIC Educational Resources Information Center

    Terrell, Steven R.

    2012-01-01

    Written in a humorous and encouraging style, this text shows how the most common statistical tools can be used to answer interesting real-world questions, presented as mysteries to be solved. Engaging research examples lead the reader through a series of six steps, from identifying a researchable problem to stating a hypothesis, identifying…

  11. US Geological Survey nutrient preservation experiment : experimental design, statistical analysis, and interpretation of analytical results

    USGS Publications Warehouse

    Patton, Charles J.; Gilroy, Edward J.

    1999-01-01

    Data on which this report is based, including nutrient concentrations in synthetic reference samples determined concurrently with those in real samples, are extensive (greater than 20,000 determinations) and have been published separately. In addition to confirming the well-documented instability of nitrite in acidified samples, this study also demonstrates that when biota are removed from samples at collection sites by 0.45-micrometer membrane filtration, subsequent preservation with sulfuric acid or mercury (II) provides no statistically significant improvement in nutrient concentration stability during storage at 4 degrees Celsius for 30 days. Biocide preservation had no statistically significant effect on the 30-day stability of phosphorus concentrations in whole-water splits from any of the 15 stations, but did stabilize Kjeldahl nitrogen concentrations in whole-water splits from three data-collection stations where ammonium accounted for at least half of the measured Kjeldahl nitrogen.

  12. Statistical Model for the Interpretation of Evidence for Bio-Signatures Simulated in virtual Mars Samples.

    NASA Astrophysics Data System (ADS)

    Mani, Peter; Heuer, Markus; Hofmann, Beda A.; Milliken, Kitty L.; West, Julia M.

    This paper evaluates a mathematical model of bio-signature search processes on Mars samples returned to Earth and studied inside a Mars Sample Return Facility (MSRF). A simple porosity model for a returned Mars sample, based on initial observations on Mars meteorites, has been stochastically simulated and the data analysed in a computer study. The resulting false positive, true negative and false negative values - as a typical output of the simulations - were statistically analysed. The results were used in Bayes' statistics to correct the a priori probability of the presence of a bio-signature, and the resulting a posteriori probability was used in turn to improve the initial assumption about the presence of extra-terrestrial life forms in Mars material. Such an iterative algorithm can lead to a better estimate of the positive predictive value for life on Mars and therefore, together with Poisson statistics for a null result, it should be possible to place an upper bound on the probability of the presence of extra-terrestrial bio-signatures.
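
    The Bayesian updating step described in the abstract can be sketched in a few lines; the sensitivity, false-positive rate, and prior below are placeholders for illustration, not values from the study.

        # Hedged sketch of the Bayesian step: update an assumed prior probability
        # that the sample carries a bio-signature, given detection characteristics
        # (sensitivity and false-positive rate) of the search process. All numbers
        # are placeholders, not values from the study.
        def posterior_given_positive(prior, sensitivity, false_positive_rate):
            """P(bio-signature present | positive result) via Bayes' theorem."""
            p_positive = sensitivity * prior + false_positive_rate * (1.0 - prior)
            return sensitivity * prior / p_positive

        prior = 1e-3               # assumed a priori probability of a bio-signature
        sensitivity = 0.80         # P(detect | present), e.g. from simulated true positives
        false_positive_rate = 0.05

        for step in range(3):      # iterative updating as further positive evidence arrives
            prior = posterior_given_positive(prior, sensitivity, false_positive_rate)
            print(f"after positive result {step + 1}: P(present) = {prior:.4f}")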

  13. Statistical and Methodological Considerations for the Interpretation of Intranasal Oxytocin Studies.

    PubMed

    Walum, Hasse; Waldman, Irwin D; Young, Larry J

    2016-02-01

    Over the last decade, oxytocin (OT) has received focus in numerous studies associating intranasal administration of this peptide with various aspects of human social behavior. These studies in humans are inspired by animal research, especially in rodents, showing that central manipulations of the OT system affect behavioral phenotypes related to social cognition, including parental behavior, social bonding, and individual recognition. Taken together, these studies in humans appear to provide compelling, but sometimes bewildering, evidence for the role of OT in influencing a vast array of complex social cognitive processes in humans. In this article, we investigate to what extent the human intranasal OT literature lends support to the hypothesis that intranasal OT consistently influences a wide spectrum of social behavior in humans. We do this by considering statistical features of studies within this field, including factors like statistical power, prestudy odds, and bias. Our conclusion is that intranasal OT studies are generally underpowered and that there is a high probability that most of the published intranasal OT findings do not represent true effects. Thus, the remarkable reports that intranasal OT influences a large number of human social behaviors should be viewed with healthy skepticism, and we make recommendations to improve the reliability of human OT studies in the future. PMID:26210057
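
    The power and prestudy-odds argument can be illustrated with a short, hedged calculation of the positive predictive value of a "significant" finding (an Ioannidis-style estimate, ignoring bias); the numbers below are illustrative only, not taken from the article.

        # Positive predictive value of a "significant" finding, given statistical
        # power and prestudy odds (bias ignored for simplicity); illustrative numbers.
        def positive_predictive_value(power, prestudy_odds, alpha=0.05):
            return (power * prestudy_odds) / (power * prestudy_odds + alpha)

        for power in (0.8, 0.2):          # well-powered vs. typically underpowered
            for odds in (1.0, 0.1):       # optimistic vs. sceptical prestudy odds
                ppv = positive_predictive_value(power, odds)
                print(f"power={power:.1f}  prestudy odds={odds:4.1f}  PPV={ppv:.2f}")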

  14. Statistical interpretation of transient current power-law decay in colloidal quantum dot arrays

    NASA Astrophysics Data System (ADS)

    Sibatov, R. T.

    2011-08-01

    A new statistical model of the charge transport in colloidal quantum dot arrays is proposed. It takes into account Coulomb blockade forbidding multiple occupancy of nanocrystals and the influence of energetic disorder of interdot space. The model explains power-law current transients and the presence of the memory effect. The fractional differential analogue of the Ohm law is found phenomenologically for nanocrystal arrays. The model combines ideas that were considered as conflicting by other authors: the Scher-Montroll idea about the power-law distribution of waiting times in localized states for disordered semiconductors is applied taking into account Coulomb blockade; Novikov's condition about the asymptotic power-law distribution of time intervals between successful current pulses in conduction channels is fulfilled; and the carrier injection blocking predicted by Ginger and Greenham (2000 J. Appl. Phys. 87 1361) takes place.

  15. A statistical approach to the interpretation of aliphatic hydrocarbon distributions in marine sediments

    USGS Publications Warehouse

    Rapp, J.B.

    1991-01-01

    Q-mode factor analysis was used to quantitate the distribution of the major aliphatic hydrocarbon (n-alkanes, pristane, phytane) systems in sediments from a variety of marine environments. The compositions of the pure end members of the systems were obtained from factor scores and the distribution of the systems within each sample was obtained from factor loadings. All the data, from the diverse environments sampled (estuarine (San Francisco Bay), fresh-water (San Francisco Peninsula), polar-marine (Antarctica) and geothermal-marine (Gorda Ridge) sediments), were reduced to three major systems: a terrestrial system (mostly high molecular weight aliphatics with odd-numbered-carbon predominance), a mature system (mostly low molecular weight aliphatics without predominance) and a system containing mostly high molecular weight aliphatics with even-numbered-carbon predominance. With this statistical approach, it is possible to assign the percentage contribution from various sources to the observed distribution of aliphatic hydrocarbons in each sediment sample. © 1991.

  16. Double precision errors in the logistic map: Statistical study and dynamical interpretation

    NASA Astrophysics Data System (ADS)

    Oteo, J. A.; Ros, J.

    2007-09-01

    The nature of the round-off errors that occur in the usual double precision computation of the logistic map is studied in detail. Different iterative regimes from the whole panoply of behaviors exhibited in the bifurcation diagram are examined, histograms of errors along trajectories are given, and for the case of fully developed chaos an explicit formula is found. It is shown that the statistics of the largest double precision error as a function of the map parameter is characterized by jumps whose locations are determined by certain boundary crossings in the bifurcation diagram. Both jumps and locations seem to present geometric convergence characterized by the first two Feigenbaum constants. Finally, a comparison with Benford's law for the distribution of the leading digit of compilations of numbers is discussed.
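
    A sketch of the kind of numerical experiment described, assuming the mpmath library is available for a high-precision reference: measure the round-off committed by one double-precision map step against a 60-digit evaluation of the same step (parameters and iteration counts are illustrative only).

        # Round-off committed by a single double-precision step of the logistic map,
        # measured against a 60-digit evaluation of the same step (illustrative only).
        import numpy as np
        import mpmath

        mpmath.mp.dps = 60          # 60 significant digits for the reference arithmetic

        def per_step_roundoff(r, x0=0.1, n_transient=100, n_iter=2000):
            x = x0
            errors = []
            for i in range(n_transient + n_iter):
                x_next = r * x * (1.0 - x)                                 # double precision
                reference = float(r * mpmath.mpf(x) * (1 - mpmath.mpf(x))) # same step, high precision
                if i >= n_transient:
                    errors.append(abs(x_next - reference))
                x = x_next
            return np.array(errors)

        errors = per_step_roundoff(r=4.0)   # fully developed chaos
        print("max error:", errors.max(), " mean error:", errors.mean())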

  17. The statistical analysis and interpretation of imperfectly-fitted Rb-Sr isochrons from polymetamorphic terrains

    NASA Astrophysics Data System (ADS)

    Cameron, M.; Collerson, K. D.; Compston, W.; Morton, R.

    1981-07-01

    Rb-Sr isotopic data for large, relatively homogeneous, whole-rock samples of Uivak 1 gneiss from the Saglek-Hebron area of northern Labrador exhibit a scatter which exceeds that predicted by experimental error. Isotopic analyses of adjacent compositionally-different layers of Uivak gneiss, 1-2 cm in width, define secondary isochrons, with slopes corresponding to an age of ca. 1800 Ma. As field evidence combined with previous isotopic dating demonstrates that the compositional layering did not form at this time, the secondary isochrons are interpreted as resulting from localized Sr-isotopic homogenization along 87Sr abundance gradients generated by ageing in the previously-layered gneisses. The geological scatter in the larger gneiss specimens is therefore attributed to the same phenomenon on a reduced scale, viz. Sr isotopic equilibration at 1800 Ma between adjacent volumes of gneiss. However, regional differences in mean 87Sr/86Sr and mean 87Rb/86Sr are assumed to be unchanged. This interpretation has led us to the development of a weighted least squares regression technique that utilizes the geologically-induced error structure in the whole-rock Rb-Sr data. The method encompasses three models. In the first, the 'Local Isotopic Equilibrium' or 'Free-line' model, it is assumed that the error structure in the whole-rock samples is the same as that in the layered gneisses except for an unknown scaling factor, common to both 87Rb/86Sr and 87Sr/86Sr. In the other two models, the initial strontium isotopic composition is constrained to values greater than primordial Sr and corresponding to that expected for the contemporary 'Bulk Earth', by forcing the least squares fit for the data to pass through a fixed point that corresponds to the present-day 87Rb/86Sr and 87Sr/86Sr composition of the Bulk Earth, viz. approximately 0.085 and 0.7047 respectively. In the first of these models, 'Bulk Earth Model 1', no assumptions are made about the error structure of the primary data. An estimate of the age can be obtained but no estimate of its uncertainty. In the second, 'Bulk Earth Model 2', the first two methods are combined and estimates for both the age and its 95% confidence limits may be found. In developing the method, nineteen whole-rock samples plus the means of the slabbed gneisses yielded the following results for the age and initial 87Sr/86Sr, respectively, of the Uivak 1 gneisses: (1) Free-line Model: 3621 +686/-410 Ma, 0.70006 +354/-565; (2) Bulk Earth Model 1: 3606 Ma, 0.70020; (3) Bulk Earth Model 2: 3606 +213/-175 Ma, 0.70020 +22/-27.
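
    For orientation only, the sketch below shows a plain weighted least-squares isochron fit and the slope-to-age conversion; the paper's models use a more elaborate, geologically motivated error structure, and the data and decay constant here are assumptions.

        # Plain weighted least-squares isochron fit (invented data, assumed errors).
        import numpy as np

        LAMBDA_RB87 = 1.42e-11     # 87Rb decay constant in 1/yr (commonly used value)

        rb_sr = np.array([0.2, 0.8, 1.5, 2.3, 3.1])            # 87Rb/86Sr (invented)
        sr_sr = np.array([0.712, 0.742, 0.778, 0.818, 0.859])  # 87Sr/86Sr (invented)
        sigma = np.full_like(sr_sr, 0.002)                      # assumed 1-sigma uncertainties

        slope, intercept = np.polyfit(rb_sr, sr_sr, 1, w=1.0 / sigma)
        age_ma = np.log(1.0 + slope) / LAMBDA_RB87 / 1e6        # t = ln(1 + slope) / lambda
        print(f"initial 87Sr/86Sr = {intercept:.5f}, age = {age_ma:.0f} Ma")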

  18. A COMPREHENSIVE STATISTICALLY-BASED METHOD TO INTERPRET REAL-TIME FLOWING MEASUREMENTS

    SciTech Connect

    Pinan Dawkrajai; Analis A. Romero; Keita Yoshioka; Ding Zhu; A.D. Hill; Larry W. Lake

    2004-10-01

    In this project, we are developing new methods for interpreting measurements in complex wells (horizontal, multilateral and multi-branching wells) to determine the profiles of oil, gas, and water entry. These methods are needed to take full advantage of "smart" well instrumentation, a technology that is rapidly evolving to provide the ability to continuously and permanently monitor downhole temperature, pressure, volumetric flow rate, and perhaps other fluid flow properties at many locations along a wellbore; and hence, to control and optimize well performance. In this first year, we have made considerable progress in the development of the forward model of temperature and pressure behavior in complex wells. In this period, we have progressed on three major parts of the forward problem of predicting the temperature and pressure behavior in complex wells. These three parts are the temperature and pressure behaviors in the reservoir near the wellbore, in the wellbore or laterals in the producing intervals, and in the build sections connecting the laterals, respectively. Many models exist to predict pressure behavior in reservoirs and wells, but these are almost always isothermal models. To predict temperature behavior we derived general mass, momentum, and energy balance equations for these parts of the complex well system. Analytical solutions for the reservoir and wellbore parts for certain special conditions show the magnitude of thermal effects that could occur. Our preliminary sensitivity analyses show that thermal effects caused by near-wellbore reservoir flow can cause temperature changes that are measurable with smart well technology. This is encouraging for the further development of the inverse model.

  19. A Comprehensive Statistically-Based Method to Interpret Real-Time Flowing Measurements

    SciTech Connect

    Pinan Dawkrajai; Keita Yoshioka; Analis A. Romero; Ding Zhu; A.D. Hill; Larry W. Lake

    2005-10-01

    This project is motivated by the increasing use of distributed temperature sensors for real-time monitoring of complex wells (horizontal, multilateral and multi-branching wells) to infer the profiles of oil, gas, and water entry. Measured information can be used to interpret flow profiles along the wellbore, including the junction and build section. In this second project year, we have completed a forward model to predict temperature and pressure profiles in complex wells. As a comprehensive temperature model, we have developed an analytical reservoir flow model which takes into account Joule-Thomson effects in the near-well vicinity, together with a multiphase non-isothermal producing-wellbore model, and we couple those models accounting for mass and heat transfer between them. For further inferences such as water coning or gas evaporation, we will need a numerical non-isothermal reservoir simulator, and unlike existing (thermal recovery, geothermal) simulators, it should capture the subtle temperature changes occurring during normal production. We will show the results from the analytical coupled model (analytical reservoir solution coupled with a numerical multi-segment well model) to infer anomalous temperature or pressure profiles under various conditions, and the preliminary results from the numerical coupled reservoir model which solves the full matrix including wellbore grids. We applied Ramey's model to the build section and used an enthalpy balance to infer the temperature profile at the junction. The multilateral wellbore temperature model was applied to a wide range of cases varying fluid thermal properties, absolute values of temperature and pressure, geothermal gradients, flow rates from each lateral, and the trajectories of each build section.

  20. A Comprehensive Statistically-Based Method to Interpret Real-Time Flowing Measurements

    SciTech Connect

    Keita Yoshioka; Pinan Dawkrajai; Analis A. Romero; Ding Zhu; A. D. Hill; Larry W. Lake

    2007-01-15

    With the recent development of temperature measurement systems, continuous temperature profiles can be obtained with high precision. Small temperature changes can be detected by modern temperature measuring instruments such as fiber optic distributed temperature sensors (DTS) in intelligent completions and will potentially aid the diagnosis of downhole flow conditions. In vertical wells, since elevational geothermal changes make the wellbore temperature sensitive to the amount and the type of fluids produced, temperature logs can be used successfully to diagnose downhole flow conditions. However, because geothermal temperature changes along the wellbore are small in horizontal wells, interpretation of a temperature log becomes difficult. The primary temperature differences for each phase (oil, water, and gas) are caused by frictional effects. Therefore, in developing a thermal model for a horizontal wellbore, subtle temperature changes must be accounted for. In this project, we have rigorously derived governing equations for a producing horizontal wellbore and developed a prediction model of the temperature and pressure by coupling the wellbore and reservoir equations. Also, we applied Ramey's model (1962) to the build section and used an energy balance to infer the temperature profile at the junction. The multilateral wellbore temperature model was applied to a wide range of cases with varying fluid thermal properties, absolute values of temperature and pressure, geothermal gradients, flow rates from each lateral, and the trajectories of each build section. With the prediction models developed, we present inversion studies of synthetic and field examples. These results are essential to identify water or gas entry, to guide flow control devices in intelligent completions, and to decide if reservoir stimulation is needed in particular horizontal sections. This study will complete and validate these inversion studies.

  1. Adsorption of ethanol onto activated carbon: Modeling and consequent interpretations based on statistical physics treatment

    NASA Astrophysics Data System (ADS)

    Bouzid, Mohamed; Sellaoui, Lotfi; Khalfaoui, Mohamed; Belmabrouk, Hafedh; Lamine, Abdelmottaleb Ben

    2016-02-01

    In this work, we studied the adsorption of ethanol on three types of activated carbon, namely parent Maxsorb III and two chemically modified activated carbons (H2-Maxsorb III and KOH-H2-Maxsorb III). This investigation has been conducted on the basis of the grand canonical formalism in statistical physics and on simplified assumptions. This led to three-parameter equations describing the adsorption of ethanol onto the three types of activated carbon. There was a good correlation between the experimental data and the results obtained by the newly proposed equation. The parameters characterizing the adsorption isotherm were the number of adsorbed molecules per site n, the density of receptor sites per unit mass of the adsorbent Nm, and the energetic parameter p1/2. They were estimated for the studied systems by a nonlinear least-squares regression. The results show that the ethanol molecules were adsorbed in a perpendicular (or at least non-parallel) position to the adsorbent surface. The magnitude of the calculated adsorption energies reveals that ethanol is physisorbed onto activated carbon. Both van der Waals and hydrogen-bonding interactions were involved in the adsorption process. The calculated values of the specific surface area AS proved that the three types of activated carbon have a highly microporous surface.
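
    A hedged illustration of the fitting step: a three-parameter monolayer isotherm of the general form used in such statistical-physics treatments, Q(P) = n·Nm / (1 + (P1/2/P)^n), fitted by nonlinear least squares. The functional form and the data below are assumptions for demonstration, not the paper's exact equation or measurements.

        # Three-parameter isotherm Q(P) = n*Nm / (1 + (P_half/P)**n) fitted by
        # nonlinear least squares; functional form and data are assumptions.
        import numpy as np
        from scipy.optimize import curve_fit

        def isotherm(p, n, nm, p_half):
            return n * nm / (1.0 + (p_half / p) ** n)

        p_kpa = np.array([0.5, 1, 2, 4, 8, 16, 30])              # pressures (invented)
        q_uptake = np.array([0.9, 1.6, 2.7, 4.0, 5.1, 5.8, 6.1]) # uptake (invented)

        (n_fit, nm_fit, p_half_fit), _ = curve_fit(isotherm, p_kpa, q_uptake, p0=[1.0, 6.0, 3.0])
        print(f"n = {n_fit:.2f}, Nm = {nm_fit:.2f}, P_1/2 = {p_half_fit:.2f}")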

  2. Hysteresis model and statistical interpretation of energy losses in non-oriented steels

    NASA Astrophysics Data System (ADS)

    Mănescu (Păltânea), Veronica; Păltânea, Gheorghe; Gavrilă, Horia

    2016-04-01

    In this paper the hysteresis energy losses in two non-oriented industrial steels (M400-65A and M800-65A) were determined by means of an efficient classical Preisach model, which is based on the Pescetti-Biorci method for the identification of the Preisach density. The excess and the total energy losses were also determined, using a statistical framework based on magnetic object theory. The hysteresis energy losses in a non-oriented steel alloy depend on the peak magnetic polarization, and they can be computed using a Preisach model, due to the fact that in these materials there is a direct link between the elementary rectangular loops and the discontinuous character of the magnetization process (Barkhausen jumps). To determine the Preisach density it was necessary to measure the normal magnetization curve and the saturation hysteresis cycle. A system of equations was deduced and the Preisach density was calculated for a magnetic polarization of 1.5 T; then the hysteresis cycle was reconstructed. Using the same pattern for the Preisach distribution, the hysteresis cycle was computed for 1 T. The classical losses were calculated using a well-known formula, and the excess energy losses were determined by means of the magnetic object theory. The total energy losses were mathematically reconstructed and compared with those measured experimentally.

  3. Statistical interpretation of the impact of forest growth on streamflow of the Sameura basin, Japan.

    PubMed

    Yue, Sheng; Hashino, Michio

    2005-05-01

    A forested mountainous basin, the Sameura basin, located in Shikoku Island of Japan, experienced increased forest growth in the period from 1953 to 1994, as occurred across the country. The impact of the forest growth on streamflow of the basin was assessed using statistical trend analysis. Annual maximum daily flow, annual minimum 5-day flow, and annual total runoff decreased by 55.8, 75.8, and 39.6%, respectively, over the period. However, the annual maximum 6-day, annual minimum 41-day, and annual total precipitation, respectively associated with annual maximum daily flow, annual minimum 5-day streamflow, and annual total runoff, did not decrease. Annual and monthly temperatures, to which evapotranspiration is positively related, did not increase except in January. This demonstrates that the forest growth is responsible for the decrease in all three flow regimes. The increase in evapotranspiration due to the forest growth resulted in the decrease in both total runoff and low flow. Thus, it seems that forests can hardly both reduce flood peaks during flood periods and increase water supply during drought periods. PMID:15931997
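
    A minimal sketch of a monotonic trend assessment in the same spirit, using Kendall's tau of an annual series against time; the numbers are synthetic, not the Sameura record.

        # Kendall's tau of an annual series against time as a simple monotonic
        # trend test (synthetic data, not the Sameura record).
        import numpy as np
        from scipy import stats

        years = np.arange(1953, 1995)
        rng = np.random.default_rng(0)
        annual_min_flow = 60.0 - 0.5 * (years - years[0]) + rng.normal(0.0, 4.0, years.size)

        tau, p_value = stats.kendalltau(years, annual_min_flow)
        trend = "decreasing" if tau < 0 else "increasing"
        print(f"Kendall tau = {tau:.2f}, p = {p_value:.3g} ({trend} trend)")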

  4. Chemical and statistical interpretation of sized aerosol particles collected at an urban site in Thessaloniki, Greece.

    PubMed

    Tsitouridou, Roxani; Papazova, Petia; Simeonova, Pavlina; Simeonov, Vasil

    2013-01-01

    The size distribution of aerosol particles (PM0.015-PM18) in relation to their soluble inorganic species and total water soluble organic compounds (WSOC) was investigated at an urban site of Thessaloniki, Northern Greece. The sampling period was from February to July 2007. The determined compounds were compared with mass concentrations of the PM fractions for nano (N: 0.015 < Dp < 0.06), ultrafine (UFP: 0.015 < Dp < 0.125), fine (FP: 0.015 < Dp < 2.0) and coarse particles (CP: 2.0 < Dp < 8.0) in order to perform mass closure of the water soluble content for the respective fractions. Electrolytes were the dominant species in all fractions (24-27%), followed by WSOC (16-23%). The water soluble inorganic and organic content was found to account for 53% of the nanoparticle, 48% of the ultrafine particle, 45% of the fine particle and 44% of the coarse particle mass. Correlations between the analyzed species were performed, and the effect of local and long-range transported emissions was examined using wind direction and backward air mass trajectories. Multivariate statistical analysis (cluster analysis and principal components analysis) of the collected data was performed in order to reveal the specific data structure. Possible sources of air pollution were identified, and an attempt was made to find patterns of similarity between the different sized aerosols and the seasons of monitoring. It was shown that several major latent factors are responsible for the data structure regardless of aerosol size - mineral (soil) dust, sea sprays, secondary emissions, combustion sources and industrial impact. The seasonal separation proved to be not very specific. PMID:24007436

  5. An Easily Constructed Trigonal Prism Model.

    ERIC Educational Resources Information Center

    Yamana, Shukichi

    1984-01-01

    A model of a trigonal prism which is useful for teaching stereochemistry (especially of the neodymium enneahydrate ion), can be made easily by using a sealed, empty envelope. The steps necessary to accomplish this task are presented. (JN)

  6. Weighted feature significance: a simple, interpretable model of compound toxicity based on the statistical enrichment of structural features.

    PubMed

    Huang, Ruili; Southall, Noel; Xia, Menghang; Cho, Ming-Hsuang; Jadhav, Ajit; Nguyen, Dac-Trung; Inglese, James; Tice, Raymond R; Austin, Christopher P

    2009-12-01

    In support of the U.S. Tox21 program, we have developed a simple and chemically intuitive model we call weighted feature significance (WFS) to predict the toxicological activity of compounds, based on the statistical enrichment of structural features in toxic compounds. We trained and tested the model on the following: (1) data from quantitative high-throughput screening cytotoxicity and caspase activation assays conducted at the National Institutes of Health Chemical Genomics Center, (2) data from Salmonella typhimurium reverse mutagenicity assays conducted by the U.S. National Toxicology Program, and (3) hepatotoxicity data published in the Registry of Toxic Effects of Chemical Substances. Enrichments of structural features in toxic compounds are evaluated for their statistical significance and compiled into a simple additive model of toxicity and then used to score new compounds for potential toxicity. The predictive power of the model for cytotoxicity was validated using an independent set of compounds from the U.S. Environmental Protection Agency tested also at the National Institutes of Health Chemical Genomics Center. We compared the performance of our WFS approach with classical classification methods such as Naive Bayesian clustering and support vector machines. In most test cases, WFS showed similar or slightly better predictive power, especially in the prediction of hepatotoxic compounds, where WFS appeared to have the best performance among the three methods. The new algorithm has the important advantages of simplicity, power, interpretability, and ease of implementation. PMID:19805409
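
    The enrichment-and-additive-scoring idea can be sketched schematically as below; this is a toy reimplementation inferred from the abstract, with invented feature tallies, not the published WFS code.

        # Toy feature-enrichment scoring: Fisher's exact test per structural feature,
        # then a simple additive score for a query compound (invented tallies).
        import math
        from scipy.stats import fisher_exact

        # feature -> (occurrences in toxic set, occurrences in non-toxic set)
        feature_counts = {"nitro": (40, 5), "epoxide": (25, 10), "phenol": (30, 60)}
        n_toxic, n_nontoxic = 200, 400

        weights = {}
        for feature, (tox, nontox) in feature_counts.items():
            table = [[tox, n_toxic - tox], [nontox, n_nontoxic - nontox]]
            _, p = fisher_exact(table, alternative="greater")  # enrichment in toxic set
            weights[feature] = max(0.0, -math.log10(p))        # evidence-based weight

        query_features = {"nitro", "phenol"}                   # features found in a new compound
        score = sum(weights.get(f, 0.0) for f in query_features)
        print({k: round(v, 2) for k, v in weights.items()}, "score:", round(score, 2))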

  7. Statistics

    Cancer.gov

    Links to sources of cancer-related statistics, including the Surveillance, Epidemiology and End Results (SEER) Program, SEER-Medicare datasets, cancer survivor prevalence data, and the Cancer Trends Progress Report.

  8. Acquire CoOmmodities Easily Card

    Energy Science and Technology Software Center (ESTSC)

    1998-05-29

    Acquire Commodities Easily Card (AceCard) provides an automated end-user method to distribute company credit card charges to internal charge numbers. AceCard will allow cardholders to record card purchases in an on-line order log, enter multiple account distributions per order that can be posted to the General Ledger, track orders, and receipt information, and provide a variety of cardholder and administrative reports. Please note: Customers must contact Ed Soler (423)-576-6151, Lockheed Martin Energy Systems, for help with the installation of the package. The fee for this installation help will be coordinated by the customer and Lockheed Martin and is in addition to cost of the package from ESTSC. Customers should contact Sandy Presley (423)-576-4708 for user help.

  9. Acquire CoOmmodities Easily Card

    SciTech Connect

    Soler, E. E.

    1998-05-29

    Acquire Commodities Easily Card (AceCard) provides an automated end-user method to distribute company credit card charges to internal charge numbers. AceCard will allow cardholders to record card purchases in an on-line order log, enter multiple account distributions per order that can be posted to the General Ledger, track orders, and receipt information, and provide a variety of cardholder and administrative reports. Please note: Customers must contact Ed Soler (423)-576-6151, Lockheed Martin Energy Systems, for help with the installation of the package. The fee for this installation help will be coordinated by the customer and Lockheed Martin and is in addition to cost of the package from ESTSC. Customers should contact Sandy Presley (423)-576-4708 for user help.

  10. ACECARD. Acquire CoOmmodities Easily Card

    SciTech Connect

    Soler, E.E.

    1996-09-01

    Acquire Commodities Easily Card (AceCard) provides an automated end-user method to distribute company credit card charges to internal charge numbers. AceCard will allow cardholders to record card purchases in an on-line order log, enter multiple account distributions per order that can be posted to the General Ledger, track orders, and receipt information, and provide a variety of cardholder and administrative reports. Please note: Customers must contact Ed Soler (423)-576-6151, Lockheed Martin Energy Systems, for help with the installation of the package. The fee for this installation help will be coordinated by the customer and Lockheed Martin and is in addition to cost of the package from ESTSC. Customers should contact Sandy Presley (423)-576-4708 for user help.

  11. Compositionality and Statistics in Adjective Acquisition: 4-Year-Olds Interpret "Tall" and "Short" Based on the Size Distributions of Novel Noun Referents

    ERIC Educational Resources Information Center

    Barner, David; Snedeker, Jesse

    2008-01-01

    Four experiments investigated 4-year-olds' understanding of adjective-noun compositionality and their sensitivity to statistics when interpreting scalar adjectives. In Experiments 1 and 2, children selected "tall" and "short" items from 9 novel objects called "pimwits" (1-9 in. in height) or from this array plus 4 taller or shorter distractor…

  12. Easily available enzymes as natural retting agents.

    PubMed

    Antonov, Viktor; Marek, Jan; Bjelkova, Marie; Smirous, Prokop; Fischer, Holger

    2007-03-01

    Easily available commercial enzymes currently have great potential in bast fibre processing and can be modified for different end uses. There are several new technologies using enzymes that are able to modify fibre parameters, achieve requested properties, improve processing results and are more beneficial to the ecology in the area of bast fibre processing and fabrics finishing. Enzymatic methods for retting of flax, "cottonisation" of bast fibres, hemp separation, and processing of flax rovings before wet spinning, etc., fall into this group of new technologies. Such enzymatic biotechnologies can provide benefits in textile, composite, reinforced plastic and other technical applications. Laboratory, pilot and industrial scale results and experiences have demonstrated the ability of selected enzymes to decompose interfibre-bonding layers based on pectin, lignin and hemicelluloses. Texazym SER spray is able to increase flax long fibre yields by more than 40%. Other enzymes in combination with mild mechanical treatment can replace aggressive and energy-intensive processing like Laroche "cottonisation". Texazym SCW and DLG pretreatments of flax rovings are presented. PMID:17309044

  13. Quantum of area ΔA = 8πl_P² and a statistical interpretation of black hole entropy

    SciTech Connect

    Ropotenko, Kostiantyn

    2010-08-15

    In contrast to alternative values, the quantum of area ΔA = 8πl_P² does not follow from the usual statistical interpretation of black hole entropy; on the contrary, a statistical interpretation follows from it. This interpretation is based on two concepts: nonadditivity of black hole entropy and Landau quantization. Using nonadditivity a microcanonical distribution for a black hole is found and it is shown that the statistical weight of a black hole should be proportional to its area. By analogy with conventional Landau quantization, it is shown that quantization of a black hole is nothing but Landau quantization. The Landau levels of a black hole and their degeneracy are found. The degree of degeneracy is equal to the number of ways to distribute a patch of area 8πl_P² over the horizon. Taking these results into account, it is argued that the black hole entropy should be of the form S_bh = 2π·ΔΓ, where the number of microstates is ΔΓ = A/(8πl_P²). The nature of the degrees of freedom responsible for black hole entropy is elucidated. The applications of the new interpretation are presented. The effect of noncommuting coordinates is discussed.

  14. A Note on the Calculation and Interpretation of the Delta-p Statistic for Categorical Independent Variables

    ERIC Educational Resources Information Center

    Cruce, Ty M.

    2009-01-01

    This methodological note illustrates how a commonly used calculation of the Delta-p statistic is inappropriate for categorical independent variables, and this note provides users of logistic regression with a revised calculation of the Delta-p statistic that is more meaningful when studying the differences in the predicted probability of an…
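
    A hedged sketch of the underlying idea for a 0/1 predictor: rather than perturbing around the variable's mean, compare predicted probabilities with the dummy set to 0 and to 1 while other predictors are held at reference values. The coefficients below are invented for illustration, not taken from the note.

        # Delta-p for a 0/1 predictor: difference in predicted probabilities with the
        # dummy at 0 and at 1, other predictors at reference values (invented coefficients).
        import math

        def logistic(z):
            return 1.0 / (1.0 + math.exp(-z))

        intercept = -1.2
        beta_dummy = 0.8     # coefficient on the categorical predictor of interest
        beta_gpa = 0.5       # another predictor, held at a reference value
        gpa_ref = 3.0

        p0 = logistic(intercept + beta_dummy * 0 + beta_gpa * gpa_ref)
        p1 = logistic(intercept + beta_dummy * 1 + beta_gpa * gpa_ref)
        print(f"P(y=1 | dummy=0) = {p0:.3f}, P(y=1 | dummy=1) = {p1:.3f}, delta-p = {p1 - p0:.3f}")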

  15. Statistical factor analysis technique for characterizing basalt through interpreting nuclear and electrical well logging data (case study from Southern Syria).

    PubMed

    Asfahani, Jamal

    2014-02-01

    A factor analysis technique is proposed in this research for interpreting the combination of nuclear well logging (natural gamma ray, density and neutron-porosity) and electrical well logging (long and short normal), in order to characterize the large extended basaltic areas in southern Syria. Kodana well logging data are used for testing and applying the proposed technique. The four resulting score logs enable the lithological score cross-section of the studied well to be established. The established cross-section clearly shows the distribution and identification of four kinds of basalt: hard massive basalt, hard basalt, pyroclastic basalt and the alteration basalt product, clay. The factor analysis technique is successfully applied to the Kodana well logging data in southern Syria, and can be used efficiently when several wells and large well logging data sets with a high number of variables need to be interpreted. PMID:24296157

  16. Collegiate Enrollments in the U.S., 1979-80. Statistics, Interpretations, and Trends in 4-Year and Related Institutions.

    ERIC Educational Resources Information Center

    Mickler, J. Ernest

    This 60th annual report on collegiate enrollments in the United States is based on data received from 1,635 four-year institutions in the U.S., Puerto Rico, and the U.S. Territories. General notes, survey methodology notes, and a summary of findings are presented. Detailed statistical charts present institutional data on men and women students and…

  17. Analysis of the procedures used to evaluate suicide crime scenes in Brazil: a statistical approach to interpret reports.

    PubMed

    Bruni, Aline Thaís; Velho, Jesus Antonio; Ferreira, Arthur Serra Lopes; Tasso, Maria Júlia; Ferrari, Raíssa Santos; Yoshida, Ricardo Luís; Dias, Marcos Salvador; Leite, Vitor Barbanti Pereira

    2014-08-01

    This study uses statistical techniques to evaluate reports on suicide scenes; it utilizes 80 reports from different locations in Brazil, randomly collected from both federal and state jurisdictions. We aimed to assess a heterogeneous group of cases in order to obtain an overall perspective of the problem. We evaluated variables regarding the characteristics of the crime scene, such as the detected traces (blood, instruments and clothes) that were found, and we addressed the methodology employed by the experts. A qualitative approach using basic statistics revealed a wide distribution in how the issue was addressed in the documents. We examined a quantitative approach involving an empirical equation, and we used multivariate procedures to validate the quantitative methodology proposed for this empirical equation. The methodology successfully identified the main differences in the information presented in the reports, showing that there is no standardized method of analyzing evidence. PMID:25066170

  18. Skew-laplace and cell-size distribution in microbial axenic cultures: statistical assessment and biological interpretation.

    PubMed

    Julià, Olga; Vidal-Mas, Jaume; Panikov, Nicolai S; Vives-Rego, Josep

    2010-01-01

    We report a skew-Laplace statistical analysis of both flow cytometry scatters and cell size from microbial strains primarily grown in batch cultures, others in chemostat cultures and bacterial aquatic populations. Cytometry scatters best fit the skew-Laplace distribution while cell size as assessed by an electronic particle analyzer exhibited a moderate fitting. Unlike the cultures, the aquatic bacterial communities clearly do not fit to a skew-Laplace distribution. Due to its versatile nature, the skew-Laplace distribution approach offers an easy, efficient, and powerful tool for distribution of frequency analysis in tandem with the flow cytometric cell sorting. PMID:20592754

  19. Skew-Laplace and Cell-Size Distribution in Microbial Axenic Cultures: Statistical Assessment and Biological Interpretation

    PubMed Central

    Julià, Olga; Vidal-Mas, Jaume; Panikov, Nicolai S.; Vives-Rego, Josep

    2010-01-01

    We report a skew-Laplace statistical analysis of both flow cytometry scatters and cell size from microbial strains primarily grown in batch cultures, others in chemostat cultures and bacterial aquatic populations. Cytometry scatters best fit the skew-Laplace distribution while cell size as assessed by an electronic particle analyzer exhibited a moderate fitting. Unlike the cultures, the aquatic bacterial communities clearly do not fit to a skew-Laplace distribution. Due to its versatile nature, the skew-Laplace distribution approach offers an easy, efficient, and powerful tool for distribution of frequency analysis in tandem with the flow cytometric cell sorting. PMID:20592754
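
    For readers who want to try a similar fit, SciPy's asymmetric Laplace distribution can stand in for the skew-Laplace form (an assumption of this sketch); the data are synthetic, and the KS p-value is optimistic because the parameters are fitted to the same sample.

        # Fit an asymmetric (skewed) Laplace distribution to scatter-like values and
        # check the fit; synthetic data, and SciPy's laplace_asymmetric stands in for
        # the paper's skew-Laplace form.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        data = stats.laplace_asymmetric.rvs(kappa=0.7, loc=2.0, scale=0.5,
                                            size=2000, random_state=rng)

        kappa, loc, scale = stats.laplace_asymmetric.fit(data)
        ks_stat, ks_p = stats.kstest(data, "laplace_asymmetric", args=(kappa, loc, scale))
        print(f"kappa = {kappa:.2f}, loc = {loc:.2f}, scale = {scale:.2f}, KS p = {ks_p:.3f}")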

  20. Chemical data and statistical interpretations for rocks and ores from the Ranger uranium mine, Northern Territory, Australia

    USGS Publications Warehouse

    Nash, J. Thomas; Frishman, David

    1983-01-01

    Analytical results for 61 elements in 370 samples from the Ranger Mine area are reported. Most of the rocks come from drill core in the Ranger No. 1 and Ranger No. 3 deposits, but 20 samples are from unmineralized drill core more than 1 km from ore. Statistical tests show that the elements Mg, Fe, F, Be, Co, Li, Ni, Pb, Sc, Th, Ti, V, Cl, As, Br, Au, Ce, Dy, La, Sc, Eu, Tb, and Yb have positive association with uranium, and Si, Ca, Na, K, Sr, Ba, Ce, and Cs have negative association. For most lithologic subsets Mg, Fe, Li, Cr, Ni, Pb, V, Y, Sm, Sc, Eu, and Yb are significantly enriched in ore-bearing rocks, whereas Ca, Na, K, Sr, Ba, Mn, Ce, and Cs are significantly depleted. These results are consistent with petrographic observations on altered rocks. Lithogeochemistry can aid exploration, but for these rocks requires methods that are expensive and not amenable to routine use.

  1. Hydrochemical and multivariate statistical interpretations of spatial controls of nitrate concentrations in a shallow alluvial aquifer around oxbow lakes (Osong area, central Korea)

    NASA Astrophysics Data System (ADS)

    Kim, Kyoung-Ho; Yun, Seong-Taek; Choi, Byoung-Young; Chae, Gi-Tak; Joo, Yongsung; Kim, Kangjoo; Kim, Hyoung-Soo

    2009-07-01

    Hydrochemical and multivariate statistical interpretations of 16 physicochemical parameters of 45 groundwater samples from a riverside alluvial aquifer underneath an agricultural area in Osong, central Korea, were performed in this study to understand the spatial controls of nitrate concentrations in terms of biogeochemical processes occurring near oxbow lakes within a fluvial plain. Nitrate concentrations in groundwater showed a large variability from 0.1 to 190.6 mg/L (mean = 35.0 mg/L) with significantly lower values near oxbow lakes. The evaluation of hydrochemical data indicated that the groundwater chemistry (especially, degree of nitrate contamination) is mainly controlled by two competing processes: 1) agricultural contamination and 2) redox processes. In addition, results of factorial kriging, consisting of two steps (i.e., co-regionalization and factor analysis), reliably showed a spatial control of the concentrations of nitrate and other redox-sensitive species; in particular, significant denitrification was observed restrictedly near oxbow lakes. The results of this study indicate that sub-oxic conditions in an alluvial groundwater system are developed geologically and geochemically in and near oxbow lakes, which can effectively enhance the natural attenuation of nitrate before the groundwater discharges to nearby streams. This study also demonstrates the usefulness of multivariate statistical analysis in groundwater study as a supplementary tool for interpretation of complex hydrochemical data sets.

  2. Hydrochemical and multivariate statistical interpretations of spatial controls of nitrate concentrations in a shallow alluvial aquifer around oxbow lakes (Osong area, central Korea).

    PubMed

    Kim, Kyoung-Ho; Yun, Seong-Taek; Choi, Byoung-Young; Chae, Gi-Tak; Joo, Yongsung; Kim, Kangjoo; Kim, Hyoung-Soo

    2009-07-21

    Hydrochemical and multivariate statistical interpretations of 16 physicochemical parameters of 45 groundwater samples from a riverside alluvial aquifer underneath an agricultural area in Osong, central Korea, were performed in this study to understand the spatial controls of nitrate concentrations in terms of biogeochemical processes occurring near oxbow lakes within a fluvial plain. Nitrate concentrations in groundwater showed a large variability from 0.1 to 190.6 mg/L (mean=35.0 mg/L) with significantly lower values near oxbow lakes. The evaluation of hydrochemical data indicated that the groundwater chemistry (especially, degree of nitrate contamination) is mainly controlled by two competing processes: 1) agricultural contamination and 2) redox processes. In addition, results of factorial kriging, consisting of two steps (i.e., co-regionalization and factor analysis), reliably showed a spatial control of the concentrations of nitrate and other redox-sensitive species; in particular, significant denitrification was observed restrictedly near oxbow lakes. The results of this study indicate that sub-oxic conditions in an alluvial groundwater system are developed geologically and geochemically in and near oxbow lakes, which can effectively enhance the natural attenuation of nitrate before the groundwater discharges to nearby streams. This study also demonstrates the usefulness of multivariate statistical analysis in groundwater study as a supplementary tool for interpretation of complex hydrochemical data sets. PMID:19524319

  3. Proper interpretation of chronic toxicity studies and their statistics: A critique of "Which level of evidence does the US National Toxicology Program provide? Statistical considerations using the Technical Report 578 on Ginkgo biloba as an example".

    PubMed

    Kissling, Grace E; Haseman, Joseph K; Zeiger, Errol

    2015-09-01

    A recent article by Gaus (2014) demonstrates a serious misunderstanding of the NTP's statistical analysis and interpretation of rodent carcinogenicity data as reported in Technical Report 578 (Ginkgo biloba) (NTP, 2013), as well as a failure to acknowledge the abundant literature on false positive rates in rodent carcinogenicity studies. The NTP reported Ginkgo biloba extract to be carcinogenic in mice and rats. Gaus claims that, in this study, 4800 statistical comparisons were possible, and that 209 of them were statistically significant (p<0.05) compared with 240 (4800×0.05) expected by chance alone; thus, the carcinogenicity of Ginkgo biloba extract cannot be definitively established. However, his assumptions and calculations are flawed since he incorrectly assumes that the NTP uses no correction for multiple comparisons, and that significance tests for discrete data operate at exactly the nominal level. He also misrepresents the NTP's decision making process, overstates the number of statistical comparisons made, and ignores the fact that the mouse liver tumor effects were so striking (e.g., p<0.0000000000001) that it is virtually impossible that they could be false positive outcomes. Gaus' conclusion that such obvious responses merely "generate a hypothesis" rather than demonstrate a real carcinogenic effect has no scientific credibility. Moreover, his claims regarding the high frequency of false positive outcomes in carcinogenicity studies are misleading because of his methodological misconceptions and errors. PMID:25261588
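
    A generic illustration of the multiplicity point (not a re-analysis of the NTP data, and deliberately using the simplistic independence and nominal-level assumptions that the critique cautions against): many nominal 0.05-level tests do produce chance hits, but a single p-value of order 1e-13 survives even a harsh Bonferroni correction.

        # With many nominal 0.05-level comparisons, chance hits are expected, but a
        # p-value of order 1e-13 is far below even a Bonferroni-adjusted threshold.
        # (Independence and exact nominal levels are assumed purely for illustration.)
        from scipy import stats

        n_tests, alpha = 4800, 0.05
        expected_false_positives = n_tests * alpha
        p_at_least_one = 1.0 - stats.binom.cdf(0, n_tests, alpha)

        bonferroni_threshold = alpha / n_tests
        observed_p = 1e-13     # order of magnitude quoted for the mouse liver tumours
        print(f"expected chance hits ~ {expected_false_positives:.0f}")
        print(f"P(at least one nominal hit) = {p_at_least_one:.3f}")
        print(f"Bonferroni threshold = {bonferroni_threshold:.2e}; "
              f"observed p = {observed_p:.0e} is below it: {observed_p < bonferroni_threshold}")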

  4. Two and three photon dissociation of SbBr3 and a statistical interpretation of the fragmentation

    NASA Astrophysics Data System (ADS)

    Haunert, G.; Tiemann, E.

    1995-12-01

    UV two- and three-photon dissociation of SbBr3 in the gas phase is studied by monitoring the emission spectra of the resulting excited atomic fragment Sb by means of an optical multichannel analyzer (OMA). The relative fluorescence intensities arising from different atomic states allow us to calculate the population of Sb* states produced upon photodissociation by a frequency-doubled tunable pulsed dye laser. For dissociation wavelengths in the range 219-249 nm, the analysis shows a statistical distribution of the population of excited Sb states (43000-58000 cm⁻¹) which can be described by only one parameter called "temperature". The dependence of the derived temperature on the excitation wavelength and laser flux is discussed. The temperature does not increase continuously with the photon energy of dissociation. A sudden drop in the temperature versus photon-energy diagram can be related to a change from a three-photon to a two-photon dissociation process in SbBr3.
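
    The one-parameter "temperature" description corresponds to a Boltzmann plot; the sketch below, with invented level energies, degeneracies, and populations, shows how such a temperature would be extracted from the slope of ln(N_i/g_i) versus E_i.

        # Boltzmann plot: ln(N_i/g_i) vs. E_i is linear with slope -1/(kB*T).
        # Energies, degeneracies and populations are invented for illustration.
        import numpy as np

        KB_CM = 0.695   # Boltzmann constant in cm^-1 per kelvin

        energies = np.array([43000.0, 47000.0, 51000.0, 55000.0, 58000.0])  # cm^-1
        degeneracy = np.array([4.0, 2.0, 6.0, 4.0, 2.0])
        populations = degeneracy * np.exp(-energies / (KB_CM * 12000.0))    # fake data at ~12000 K

        slope, intercept = np.polyfit(energies, np.log(populations / degeneracy), 1)
        temperature = -1.0 / (KB_CM * slope)
        print(f"fitted temperature ~ {temperature:.0f} K")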

  5. Multivariate Statistical Analysis as a Supplementary Tool for Interpretation of Variations in Salivary Cortisol Level in Women with Major Depressive Disorder

    PubMed Central

    Dziurkowska, Ewelina; Wesolowski, Marek

    2015-01-01

    Multivariate statistical analysis is widely used in medical studies as a valuable tool facilitating the diagnosis of diseases such as cancer, allergy, pneumonia, or Alzheimer's and psychiatric diseases. Taking this into consideration, the aim of this study was to use two multivariate techniques, hierarchical cluster analysis (HCA) and principal component analysis (PCA), to disclose the relationship between the drugs used in the therapy of major depressive disorder, the salivary cortisol level, and the period of hospitalization. The cortisol contents in the saliva of depressed women were quantified by HPLC with UV detection day-to-day during the whole period of hospitalization. A data set with 16 variables (e.g., the patients' age, multiplicity and period of hospitalization, initial and final cortisol level, highest and lowest hormone level, mean contents, and medians) characterizing 97 subjects was used for the HCA and PCA calculations. Multivariate statistical analysis reveals that the various groups of antidepressants affect the salivary cortisol level to varying degrees. The SSRIs, SNRIs, and polypragmasy reduce the hormone secretion most effectively. Thus, both unsupervised pattern recognition methods, HCA and PCA, can be used as complementary tools for interpretation of the results obtained by laboratory diagnostic methods. PMID:26380376

  6. Pulsar statistics and their interpretations

    NASA Technical Reports Server (NTRS)

    Arnett, W. D.; Lerche, I.

    1981-01-01

    It is shown that a lack of knowledge concerning interstellar electron density, the true spatial distribution of pulsars, the radio luminosity source distribution of pulsars, the real ages and real aging rates of pulsars, the beaming factor (and other unknown factors causing the known sample of about 350 pulsars to be incomplete to an unknown degree) is sufficient to cause a minimum uncertainty of a factor of 20 in any attempt to determine pulsar birth or death rates in the Galaxy. It is suggested that this uncertainty must impact on suggestions that the pulsar rates can be used to constrain possible scenarios for neutron star formation and stellar evolution in general.

  7. Easily constructed mini-sextant demonstrates optical principles

    NASA Astrophysics Data System (ADS)

    Nenninger, Garet G.

    2000-04-01

    An easily constructed optical instrument for measuring the angle between the Sun and the horizon is described. The miniature sextant relies on multiple reflections to produce multiple images of the sun at fixed angles away from the true Sun.

  8. An Easily Constructed Model of a Square Antiprism.

    ERIC Educational Resources Information Center

    Yamana, Shukichi

    1984-01-01

    A model of a square antiprism which is necessary for teaching stereochemistry (for example, of the octafluorotantalate ion) can be made easily by using a sealed, empty envelope. The steps necessary to accomplish this task are presented. (JN)

  9. iCFD: Interpreted Computational Fluid Dynamics - Degeneration of CFD to one-dimensional advection-dispersion models using statistical experimental design - The secondary clarifier.

    PubMed

    Guyonvarch, Estelle; Ramin, Elham; Kulahci, Murat; Plósz, Benedek Gy

    2015-10-15

    The present study aims at using statistically designed computational fluid dynamics (CFD) simulations as numerical experiments for the identification of one-dimensional (1-D) advection-dispersion models - computationally light tools, used e.g., as sub-models in systems analysis. The objective is to develop a new 1-D framework, referred to as interpreted CFD (iCFD) models, in which statistical meta-models are used to calculate the pseudo-dispersion coefficient (D) as a function of design and flow boundary conditions. The method - presented in a straightforward and transparent way - is illustrated using the example of a circular secondary settling tank (SST). First, the significant design and flow factors are screened out by applying the statistical method of two-level fractional factorial design of experiments. Second, based on the number of significant factors identified through the factor screening study and system understanding, 50 different sets of design and flow conditions are selected using Latin Hypercube Sampling (LHS). The boundary condition sets are imposed on a 2-D axi-symmetrical CFD simulation model of the SST. In the framework, to degenerate the 2-D model structure, CFD model outputs are approximated by the 1-D model through the calibration of three different model structures for D. Correlation equations for the D parameter then are identified as a function of the selected design and flow boundary conditions (meta-models), and their accuracy is evaluated against D values estimated in each numerical experiment. The evaluation and validation of the iCFD model structure is carried out using scenario simulation results obtained with parameters sampled from the corners of the LHS experimental region. For the studied SST, additional iCFD model development was carried out in terms of (i) assessing different density current sub-models; (ii) implementation of a combined flocculation, hindered, transient and compression settling velocity function; and (iii) assessment of modelling the onset of transient and compression settling. Furthermore, the optimal level of model discretization both in 2-D and 1-D was undertaken. Results suggest that the iCFD model developed for the SST through the proposed methodology is able to predict solid distribution with high accuracy - taking a reasonable computational effort - when compared to multi-dimensional numerical experiments, under a wide range of flow and design conditions. iCFD tools could play a crucial role in reliably predicting systems' performance under normal and shock events. PMID:26248321
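
    The sampling-plus-meta-model step can be sketched as follows; the two boundary-condition factors, their ranges, and the placeholder response standing in for the CFD-calibrated dispersion coefficient D are assumptions for illustration only.

        # Latin Hypercube Sampling of two assumed boundary-condition factors, a
        # placeholder response standing in for the CFD-calibrated dispersion
        # coefficient D, and a simple least-squares meta-model for D.
        import numpy as np
        from scipy.stats import qmc

        sampler = qmc.LatinHypercube(d=2, seed=42)
        unit = sampler.random(n=50)                          # 50 boundary-condition sets
        # assumed ranges: overflow rate [0.5, 2.5] m/h, inlet solids [1, 6] kg/m3
        x = qmc.scale(unit, l_bounds=[0.5, 1.0], u_bounds=[2.5, 6.0])

        def pseudo_dispersion(q, x_ss):                      # placeholder, not a CFD result
            return 0.02 + 0.01 * q + 0.004 * q * x_ss

        d_values = pseudo_dispersion(x[:, 0], x[:, 1])

        # meta-model D ~ a0 + a1*q + a2*xss + a3*q*xss, fitted by least squares
        design = np.column_stack([np.ones(len(x)), x[:, 0], x[:, 1], x[:, 0] * x[:, 1]])
        coefficients, *_ = np.linalg.lstsq(design, d_values, rcond=None)
        print("meta-model coefficients:", np.round(coefficients, 4))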

  10. Easily disassembled electrical connector for high voltage, high frequency connections

    DOEpatents

    Milner, J.R.

    1994-05-10

    An easily accessible electrical connector capable of rapid assembly and disassembly is described wherein a wide metal conductor sheet may be evenly contacted over the entire width of the conductor sheet by opposing surfaces on the connector which provide an even clamping pressure against opposite surfaces of the metal conductor sheet using a single threaded actuating screw. 13 figures.

  11. Easily disassembled electrical connector for high voltage, high frequency connections

    DOEpatents

    Milner, Joseph R.

    1994-01-01

    An easily accessible electrical connector capable of rapid assembly and disassembly wherein a wide metal conductor sheet may be evenly contacted over the entire width of the conductor sheet by opposing surfaces on the connector which provide an even clamping pressure against opposite surfaces of the metal conductor sheet using a single threaded actuating screw.

  12. An assessment of approximating aspheres with more easily manufactured surfaces.

    PubMed

    Howells, M R; Anspach, J; Bender, J

    1998-05-01

    In designing optical systems for synchrotron radiation, one is often led to conclude that optimal performance can be obtained from optical surfaces described by conic sections of revolution, usually paraboloids and ellipsoids. The resulting design can lead to prescriptions for three-dimensional optical surfaces that are difficult to fabricate accurately. Under some circumstances satisfactory system performance can be achieved through the use of more easily manufactured surfaces such as cylinders, cones, bent cones, toroids and elliptical cylinders. These surfaces often have the additional benefits of scalability to large aperture, lower surface roughness and improved surface figure accuracy. In this paper we explore some of the conditions under which these more easily manufactured surfaces can be utilized without sacrificing performance. PMID:15263662

  13. Combining data visualization and statistical approaches for interpreting measurements and meta-data: Integrating heatmaps, variable clustering, and mixed regression models

    EPA Science Inventory

    The advent of new higher throughput analytical instrumentation has put a strain on interpreting and explaining the results from complex studies. Contemporary human, environmental, and biomonitoring data sets are comprised of tens or hundreds of analytes, multiple repeat measures...

  14. Between Teacher & Parent: Helping the Child Who Cries Easily

    ERIC Educational Resources Information Center

    Brodkin, Adele M.

    2004-01-01

    Parents need to remember that crying is the first method of communication for children younger than 5 or 6. It is their way of getting attention. While it isn't easy for new parents to interpret their baby's cries, most learn to distinguish the "I am hungry--feed me" cry from the "My tummy hurts" or the "I am just fussy and bored" cry. This…

  16. Plasmonic Films Can Easily Be Better: Rules and Recipes

    PubMed Central

    2015-01-01

    High-quality materials are critical for advances in plasmonics, especially as researchers now investigate quantum effects at the limit of single surface plasmons or exploit ultraviolet- or CMOS-compatible metals such as aluminum or copper. Unfortunately, due to inexperience with deposition methods, many plasmonics researchers deposit metals under the wrong conditions, severely limiting performance unnecessarily. This is then compounded as others follow their published procedures. In this perspective, we describe simple rules collected from the surface-science literature that allow high-quality plasmonic films of aluminum, copper, gold, and silver to be easily deposited with commonly available equipment (a thermal evaporator). Recipes are also provided so that films with optimal optical properties can be routinely obtained. PMID:25950012

  17. Triazolophthalazines: Easily Accessible Compounds with Potent Antitubercular Activity.

    PubMed

    Veau, Damien; Krykun, Serhii; Mori, Giorgia; Orena, Beatrice S; Pasca, Maria R; Frongia, Céline; Lobjois, Valérie; Chassaing, Stefan; Lherbet, Christian; Baltas, Michel

    2016-05-19

    Tuberculosis (TB) remains one of the major causes of death worldwide, in particular because of the emergence of multidrug-resistant TB. Herein we explored the potential of an alternative class of molecules as anti-TB agents. Thus, a series of novel 3-substituted triazolophthalazines was quickly and easily prepared from commercial hydralazine hydrochloride as starting material and further evaluated for antimycobacterial activity and cytotoxicity. Four of the synthesized compounds were found to effectively inhibit the Mycobacterium tuberculosis (M.tb) H37Rv strain with minimum inhibitory concentration (MIC) values <10 μg mL⁻¹, whereas no compounds displayed cytotoxicity against HCT116 human cell lines (IC50 >100 μM). More remarkably, the most potent compounds proved to be active to a similar extent against various multidrug-resistant M.tb strains, thus uncovering a mode of action distinct from that of standard antitubercular agents. Overall, their ease of preparation, combined with their attractive antimycobacterial activities, makes such triazolophthalazine-based derivatives promising leads for further development. PMID:27097919

  18. An easily constructed, very inexpensive, solar cell transmissometer (SCT)

    SciTech Connect

    Knowles, S.C.; Wells, J.T.

    1998-01-01

    Suspended sediment concentration (SSC), one of the standard measures of water quality in aquatic systems, is of widespread interest because of its role in particulate flux and light attenuation. The conventional method of sampling and filtration to determine SSC is rather slow and cumbersome, and does not allow resolution of high-frequency variations. In contrast, the beam transmissometer has been used as a convenient way to obtain time-series estimates of SSC in environments ranging from the deep ocean to the continental shelf and estuaries. In situ observation of suspended particles in fluvial, estuarine, lagoonal, and inner-shelf environments has in fact revealed that most of the volume of material in suspension exists as aggregates (>50-100 µm diameter) of smaller components and that aggregate properties may change on time scales of only a few minutes and length scales of less than one meter. Here, the authors describe a very inexpensive, easily constructed solar cell transmissometer (SCT) that has been developed for use with an in situ suspended-sediment photography system. Conventional optical SSC sensors can provide excellent predictive capability when deployed during conditions of relatively uniform aggregate characteristics. However, this system may provide better results than commercially available systems because it disaggregates suspended material prior to measurement of light attenuation, thereby reducing the effects of aggregation and changes in aggregate characteristics, which can occur over very short temporal and spatial scales. This paper describes the design and construction of the solar cell transmissometer and gives results of laboratory calibration and field testing alongside an optical backscatter sensor (OBS).
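
    A minimal sketch of how such an instrument is typically calibrated, assuming a Beer-Lambert-type relation between solar-cell voltage and suspended sediment concentration; the voltages and concentrations below are hypothetical, not the authors' calibration data.

    ```python
    # Hedged sketch: fit -ln(V/V0) = a*SSC + b to calibration suspensions of known
    # concentration, then invert to estimate SSC from a measured voltage.
    import numpy as np

    V0 = 2.50                                                # clear-water voltage (hypothetical)
    ssc_known = np.array([0.0, 20.0, 50.0, 100.0, 200.0])    # mg/L (hypothetical)
    v_meas    = np.array([2.50, 2.28, 1.98, 1.55, 0.96])     # measured voltages (hypothetical)

    a, b = np.polyfit(ssc_known, -np.log(v_meas / V0), 1)    # linear calibration

    def ssc_from_voltage(v):
        """Estimate SSC (mg/L) from a transmissometer voltage reading."""
        return (-np.log(v / V0) - b) / a

    print(f"estimated SSC at 1.80 V: {ssc_from_voltage(1.80):.0f} mg/L")
    ```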

  19. Metview and VAPOR: Exploring ECMWF forecasts easily in four dimensions

    NASA Astrophysics Data System (ADS)

    Siemen, Stephan; Kertesz, Sandor; Carver, Glenn

    2014-05-01

    The European Centre for Medium-Range Weather Forecasts (ECMWF) is an international organisation providing its member states and co-operating states with forecasts in the medium time range of up to 15 days as well as other forecasts and analyses. As part of its mission, ECMWF generates an increasing number of forecast data products for its users. To support the work of forecasters and researchers and to let them make best use of ECMWF forecasts, the Centre also provides tools and interfaces to visualise their products. This allows users to make use of and explore forecasts without having to transfer large amounts of raw data. This is especially true for products based on ECMWF's 50-member ensemble forecast. Users can choose to explore ECMWF's forecasts from the web or through visualisation tools installed locally or at ECMWF. ECMWF also develops, in co-operation with INPE, Brazil, the Metview meteorological workstation and batch system. Metview enables users to easily analyse and visualise forecasts, and is routinely used by scientists and forecasters at ECMWF and other institutions. While Metview offers high-quality visualisation in two-dimensional plots and animations, it uses external tools to visualise data in four dimensions. VAPOR is the Visualization and Analysis Platform for Ocean, Atmosphere, and Solar Researchers. VAPOR provides an interactive 3D visualisation environment that runs on most UNIX and Windows systems equipped with modern 3D graphics cards. VAPOR development is led by the National Center for Atmospheric Research's Scientific Computing Division in collaboration with U.C. Davis and Ohio State University. In this paper we will give an overview of how users, with Metview and access to ECMWF's archive, can visualise forecast data in four dimensions within VAPOR. The process of preparing the data in Metview is the key step and is described in detail. The benefits to researchers are highlighted with a case study analysing a given weather scenario.

  20. Making large amounts of meteorological plots easily accessible to users

    NASA Astrophysics Data System (ADS)

    Lamy-Thepaut, Sylvie; Siemen, Stephan; Sahin, Cihan; Raoult, Baudouin

    2015-04-01

    The European Centre for Medium-Range Weather Forecasts (ECMWF) is an international organisation providing its member organisations with forecasts in the medium time range of 3 to 15 days, and some longer-range forecasts for up to a year ahead, with varying degrees of detail. As part of its mission, ECMWF generates an increasing number of forecast data products for its users. To support the work of forecasters and researchers and to let them make best use of ECMWF forecasts, the Centre also provides tools and interfaces to visualise their products. This allows users to make use of and explore forecasts without having to transfer large amounts of raw data. This is especially true for products based on ECMWF's 50-member ensemble forecast, where some specific processing and visualisation are applied to extract information. Every day, thousands of raw data fields are pushed to ECMWF's interactive web charts application, ecCharts, and thousands of products are processed and pushed to ECMWF's institutional web site. ecCharts provides a highly interactive application to display and manipulate recent numerical forecasts for forecasters in national weather services and ECMWF's commercial customers. With ecCharts, forecasters are able to explore ECMWF's medium-range forecasts in far greater detail than has previously been possible on the web, as soon as the forecast becomes available. All ecCharts products are also available through a machine-to-machine web map service based on the OGC Web Map Service (WMS) standard. ECMWF's institutional web site provides access to a large number of graphical products. It was entirely redesigned last year; it now shares the same infrastructure as ecCharts and benefits from some ecCharts functionality, for example the dashboard. The dashboard, initially developed for ecCharts, allows users to organise their own collection of products depending on their workflow, and is being further developed. In its first implementation, it presents the user's products in a single interface with fast access to the original products and the possibility of synchronous animation between them. Its functionality is being extended to give users the freedom to collect not only ecCharts 2D maps and graphs, but also other ECMWF web products such as monthly and seasonal products, scores, and observation monitoring. The dashboard will play a key role in helping users interpret the large amount of information that ECMWF provides. This talk will present examples of how the new user interface can organise complex meteorological maps and graphs, and will show the new possibilities users have gained by using the web as a medium.

  1. Statistics Clinic

    NASA Technical Reports Server (NTRS)

    Feiveson, Alan H.; Foy, Millennia; Ploutz-Snyder, Robert; Fiedler, James

    2014-01-01

    Do you have elevated p-values? Is the data analysis process getting you down? Do you experience anxiety when you need to respond to criticism of statistical methods in your manuscript? You may be suffering from Insufficient Statistical Support Syndrome (ISSS). For symptomatic relief of ISSS, come for a free consultation with JSC biostatisticians at our help desk during the poster sessions at the HRP Investigators Workshop. Get answers to common questions about sample size, missing data, multiple testing, when to trust the results of your analyses and more. Side effects may include sudden loss of statistics anxiety, improved interpretation of your data, and increased confidence in your results.

  2. Enhancing the interpretation of statistical P values in toxicology studies: implementation of linear mixed models (LMMs) and standardized effect sizes (SESs).

    PubMed

    Schmidt, Kerstin; Schmidtke, Jörg; Kohl, Christian; Wilhelm, Ralf; Schiemann, Joachim; van der Voet, Hilko; Steinberg, Pablo

    2016-03-01

    In this paper, we compare the traditional ANOVA approach to analysing data from 90-day toxicity studies with a more modern LMM approach, and we investigate the use of standardized effect sizes. The LMM approach is used to analyse weight or feed consumption data. When compared to the week-by-week ANOVA with multiple test results per week, this approach results in only one statement on differences in weight development between groups. Standardized effect sizes are calculated for the endpoints: weight, relative organ weights, haematology and clinical biochemistry. The endpoints are standardized, allowing different endpoints of the same study to be compared and providing an overall picture of group differences at a glance. Furthermore, in terms of standardized effect sizes, statistical significance and biological relevance are displayed simultaneously in a graph. PMID:25724152
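
    The two ideas in the abstract, a single linear mixed model for repeated body-weight data and a standardized effect size per endpoint, can be sketched as follows. This is an illustrative outline, not the authors' analysis; the data frame, column names and group sizes are invented.

    ```python
    # Hedged sketch: one LMM instead of week-by-week ANOVAs, plus a Cohen's-d-style
    # standardized effect size (SES). Data are synthetic placeholders.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    df = pd.DataFrame({
        "weight":    rng.normal(300, 20, 80),            # hypothetical body weights
        "week":      np.tile(np.arange(1, 11), 8),       # 10 weekly measurements
        "group":     np.repeat(["control", "treated"], 40),
        "animal_id": np.repeat(np.arange(8), 10),        # 8 animals, repeated measures
    })

    # Random intercept per animal; the week-by-group interaction summarizes
    # differences in weight development in a single model.
    lmm = smf.mixedlm("weight ~ week * group", df, groups=df["animal_id"]).fit()
    print(lmm.summary())

    def standardized_effect_size(x_treat, x_ctrl):
        """Group mean difference scaled by the pooled standard deviation."""
        pooled_sd = np.sqrt((x_treat.var(ddof=1) + x_ctrl.var(ddof=1)) / 2)
        return (x_treat.mean() - x_ctrl.mean()) / pooled_sd
    ```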

  3. Easily Installable Wireless Behavioral Monitoring System with Electric Field Sensor for Ordinary Houses

    PubMed Central

    Tsukamoto, S; Hoshino, H; Tamura, T

    2008-01-01

    This paper describes an indoor behavioral monitoring system for improving the quality of life in ordinary houses. It employs a device that uses weak radio waves for transmitting the obtained data and it is designed such that it can be installed by a user without requiring any technical knowledge or extra constructions. This study focuses on determining the usage statistics of home electric appliances by using an electromagnetic field sensor as a detection device. The usage of the home appliances is determined by measuring the electromagnetic field that can be observed in an area near the appliance. It is assumed that these usage statistics could provide information regarding the indoor behavior of a subject. Since the sensor is not direction sensitive and does not require precise positioning and wiring, it can be easily installed in ordinary houses by the end users. For evaluating the practicability of the sensor unit, several simple tests have been performed. The results indicate that the proposed system could be useful for collecting the usage statistics of home appliances. PMID:19415135

  4. Statistical treatment and preliminary interpretation of chemical data from a uranium deposit in the northeast part of the Church Rock area, Gallup mining district, New Mexico

    USGS Publications Warehouse

    Spirakis, C.S.; Pierson, C.T.; Santos, E.S.; Fishman, N.S.

    1983-01-01

    Statistical treatment of analytical data from 106 samples of uranium-mineralized and unmineralized or weakly mineralized rocks of the Morrison Formation from the northeastern part of the Church Rock area of the Grants uranium region indicates that along with uranium, the deposits in the northeast Church Rock area are enriched in barium, sulfur, sodium, vanadium and equivalent uranium. Selenium and molybdenum are sporadically enriched in the deposits and calcium, manganese, strontium, and yttrium are depleted. Unlike the primary deposits of the San Juan Basin, the deposits in the northeast part of the Church Rock area contain little organic carbon and several elements that are characteristically enriched in the primary deposits are not enriched or are enriched to a much lesser degree in the Church Rock deposits. The suite of elements associated with the deposits in the northeast part of the Church Rock area is also different from the suite of elements associated with the redistributed deposits in the Ambrosia Lake district. This suggests that the genesis of the Church Rock deposits is different, at least in part, from the genesis of the primary deposits of the San Juan Basin or the redistributed deposits at Ambrosia Lake.

  5. Palaeomagnetic analysis on pottery as indicator of the pyroclastic flow deposits temperature: new data and statistical interpretation from the Minoan eruption of Santorini, Greece

    NASA Astrophysics Data System (ADS)

    Tema, E.; Zanella, E.; Pavón-Carrasco, F. J.; Kondopoulou, D.; Pavlides, S.

    2015-10-01

    We present the results of palaeomagnetic analysis on Late Bronze Age pottery from Santorini carried out in order to estimate the thermal effect of the Minoan eruption on the pre-Minoan habitation level. A total of 170 specimens from 108 ceramic fragments have been studied. The ceramics were collected from the surface of the pre-Minoan palaeosol at six different sites, including also samples from the Akrotiri archaeological site. The deposition temperatures of the first pyroclastic products have been estimated from the maximum overlap of the re-heating temperature intervals given by the individual fragments at site level. A new statistical elaboration of the temperature data has also been proposed, calculating the re-heating temperatures at each site at the 95 per cent probability level. The obtained results show that the precursor tephra layer and the first pumice fall of the eruption were hot enough to re-heat the underlying ceramics to temperatures of 160-230 °C in the non-inhabited sites, while the temperatures recorded inside the Akrotiri village are slightly lower, varying from 130 to 200 °C. The decrease of the temperatures registered in the human settlements suggests that there was some interaction between the buildings and the pumice fallout deposits, while the building debris layer produced by the preceding and syn-eruptive earthquakes probably also contributed to the lower recorded re-heating temperatures.

  6. Statistical Analysis and Interpretation of Building Characterization, Indoor Environmental Quality Monitoring and Energy Usage Data from Office Buildings and Classrooms in the United States

    SciTech Connect

    Linda Stetzenbach; Lauren Nemnich; Davor Novosel

    2009-08-31

    Three independent tasks were performed (Stetzenbach 2008, Stetzenbach 2008b, Stetzenbach 2009) to measure a variety of parameters in normative buildings across the United States. For each of these tasks 10 buildings were selected as normative indoor environments. Task 1 focused on office buildings, Task 13 focused on public schools, and Task 0606 focused on high-performance buildings. To perform this task it was necessary to restructure the database for the Indoor Environmental Quality (IEQ) data and the sound measurements, as several issues were identified and resolved prior to and during the transfer of these data sets into SPSS. During overview discussions with the statistician engaged for this task it was determined that, because indoor zones (1-6) were selected independently within each task, zones were not related by location across tasks. Therefore, no comparison across zones would be valid for the 30 buildings, so the by-location (zone) data were limited to three analysis sets, one for the buildings within each task. In addition, different collection procedures for lighting were used in Task 0606 as compared with Tasks 1 and 13, to improve sample collection. Therefore, these data sets could not be merged and compared, so by-day analyses were run separately for Task 0606 and only Task 1 and 13 data were merged. Results of the statistical analysis of the IEQ parameters show that statistically significant differences were found among days and zones for all tasks, although no by-day differences were found for Draft Rate data from Task 0606 (p>0.05). Thursday measurements of IEQ parameters were significantly different from Tuesday and most Wednesday measurements for all variables of Tasks 1 and 13. Data for all three days appeared to vary for Operative Temperature, whereas only Tuesday and Thursday differed for Draft Rate 1m. Although no Draft Rate measures within Task 0606 were found to differ significantly by day, Temperature measurements for Tuesday and Thursday showed variation. Moreover, Wednesday measurements of Relative Humidity within Task 0606 varied significantly from either Tuesday or Thursday. The majority of differences in IEQ measurements by zone were highly significant (p<0.001), with the exception of Relative Humidity in some buildings. When all task data were combined (30 buildings), neither the airborne culturable fungi nor the airborne non-culturable spore data differed in the concentrations found at any indoor location in terms of day of collection. However, the concentrations of surface-associated fungi varied with the day of collection. Specifically, there was a lower concentration of mold on Tuesday than on Wednesday, for all tasks combined. As expected, variation was found in the concentrations of both airborne culturable fungi and airborne non-culturable fungal spores between indoor zones (1-6) and the outdoor zone (zone 0). No variation was found among the indoor zones of office buildings for Task 1 in the concentrations of airborne culturable fungi. However, airborne non-culturable spores did vary among zones in one building in Task 1, and variation was noted between zones in surface-associated fungi. Due to the lack of multiple lighting measurements for Tasks 13 and 0606, by-day comparisons were only performed for Task 1. No statistical differences were observed in lighting with respect to the day of collection. There was a wide range of variability by zone among seven of the office buildings.
Although few differences were found for the brightest illumination of the worksurface (IllumWkSfcBrtst) and the darkest illumination of the worksurface (IllumWkSfcDrkst) in Task 1, there was considerable variation for these variables in Task 13 and Task 0606 (p < 0.001). Other variables that differed by-zone in Task 13 include CombCCT and AmbCCT1 for S03, S07, and S08. Additionally, AmbChromX1, CombChromY, and CombChromX varied by-zone for school buildings S02, S04, and S05, respectively. Although all tasks demonstrated significant differences in sound measurements by zone, some of the buildings within each task did not appear to differ in sound quality. Hence, post-hoc tests were not appropriate and individual zones were not compared for these buildings. It is interesting to note that sound measurements in some buildings were widely varied with most zone comparisons and other buildings varied between only a few zones.

  7. Landslides triggered by the 12 January 2010 Port-au-Prince, Haiti, Mw = 7.0 earthquake: visual interpretation, inventory compiling, and spatial distribution statistical analysis

    NASA Astrophysics Data System (ADS)

    Xu, C.; Shyu, J. B. H.; Xu, X.

    2014-07-01

    The 12 January 2010 Port-au-Prince, Haiti, earthquake (Mw= 7.0) triggered tens of thousands of landslides. The purpose of this study is to investigate the correlations of the occurrence of landslides and the thicknesses of their erosion with topographic, geologic, and seismic parameters. A total of 30 828 landslides triggered by the earthquake covered a total area of 15.736 km2, distributed in an area more than 3000 km2, and the volume of landslide accumulation materials is estimated to be about 29 700 000 m3. These landslides are of various types, mostly belonging to shallow disrupted landslides and rock falls, but also include coherent deep-seated landslides and rock slides. These landslides were delineated using pre- and post-earthquake high-resolution satellite images. Spatial distribution maps and contour maps of landslide number density, landslide area percentage, and landslide erosion thickness were constructed in order to analyze the spatial distribution patterns of co-seismic landslides. Statistics of size distribution and morphometric parameters of co-seismic landslides were carried out and were compared with other earthquake events in the world. Four proxies of co-seismic landslide abundance, including landslides centroid number density (LCND), landslide top number density (LTND), landslide area percentage (LAP), and landslide erosion thickness (LET) were used to correlate co-seismic landslides with various environmental parameters. These parameters include elevation, slope angle, slope aspect, slope curvature, topographic position, distance from drainages, lithology, distance from the epicenter, distance from the Enriquillo-Plantain Garden fault, distance along the fault, and peak ground acceleration (PGA). A comparison of these impact parameters on co-seismic landslides shows that slope angle is the strongest impact parameter on co-seismic landslide occurrence. Our co-seismic landslide inventory is much more detailed than other inventories in several previous publications. Therefore, we carried out comparisons of inventories of landslides triggered by the Haiti earthquake with other published results and proposed possible reasons for any differences. We suggest that the empirical functions between earthquake magnitude and co-seismic landslides need to be updated on the basis of the abundant and more complete co-seismic landslide inventories recently available.

  8. Landslides triggered by the 12 January 2010 Mw 7.0 Port-au-Prince, Haiti, earthquake: visual interpretation, inventory compiling and spatial distribution statistical analysis

    NASA Astrophysics Data System (ADS)

    Xu, C.; Shyu, J. B. H.; Xu, X.-W.

    2014-02-01

    The 12 January 2010 Port-au-Prince, Haiti, earthquake (Mw 7.0) triggered tens of thousands of landslides. The purpose of this study is to investigate the correlations of the occurrence of landslides and their erosion thicknesses with topographic factors, seismic parameters, and their distance from roads. A total of 30 828 landslides triggered by the earthquake covered a total area of 15.736 km2, distributed in an area more than 3000 km2, and the volume of landslide accumulation materials is estimated to be about 29 700 000 m3. These landslides are of various types, mostly belonging to shallow disrupted landslides and rock falls, but also include coherent deep-seated landslides and rock slides. These landslides were delineated using pre- and post-earthquake high-resolution satellite images. Spatial distribution maps and contour maps of landslide number density, landslide area percentage, and landslide erosion thickness were constructed in order to analyze the spatial distribution patterns of co-seismic landslides. Statistics of size distribution and morphometric parameters of co-seismic landslides were carried out and were compared with other earthquake events in the world. Four proxies of co-seismic landslide abundance, including landslides centroid number density (LCND), landslide top number density (LTND), landslide area percentage (LAP), and landslide erosion thickness (LET), were used to correlate co-seismic landslides with various landslide controlling parameters. These controlling parameters include elevation, slope angle, slope aspect, slope curvature, topographic position, distance from drainages, lithology, distance from the epicenter, distance from the Enriquillo-Plantain Garden fault, distance along the fault, and peak ground acceleration (PGA). A comparison of these impact parameters on co-seismic landslides shows that slope angle is the strongest impact parameter on co-seismic landslide occurrence. Our co-seismic landslide inventory is much more detailed than other inventories in several previous publications. Therefore, we carried out comparisons of inventories of landslides triggered by the Haiti earthquake with other published results and proposed possible reasons for any differences. We suggest that the empirical functions between earthquake magnitude and co-seismic landslides need to be updated on the basis of the abundant and more complete co-seismic landslide inventories recently available.

  9. Long-Term Amorphous Drug Stability Predictions Using Easily Calculated, Predicted, and Measured Parameters.

    PubMed

    Nurzyńska, Katarzyna; Booth, Jonathan; Roberts, Clive J; McCabe, James; Dryden, Ian; Fischer, Peter M

    2015-09-01

    The purpose of this study was to develop a predictive model of the amorphous stability of drugs with particular relevance for poorly water-soluble compounds. Twenty-five representative neutral poorly soluble compounds with a diverse range of physicochemical properties and chemical structures were systematically selected from an extensive library of marketed drug products. The physical stability of the amorphous form, measured over a 6 month period by the onset of crystallization of amorphous films prepared by melting and quench-cooling, was assessed using polarized light microscopy. The data were used as a response variable in a statistical model with calculated/predicted or measured molecular, thermodynamic, and kinetic parameters as explanatory variables. Several multiple linear regression models were derived, with varying balance between calculated/predicted and measured parameters. It was shown that inclusion of measured parameters significantly improves the predictive ability of the model. The best model demonstrated a prediction accuracy of 82% and included the following as parameters: melting and glass transition temperatures, enthalpy of fusion, configurational free energy, relaxation time, number of hydrogen bond donors, lipophilicity, and the ratio of carbon to heteroatoms. Good predictions were also obtained with a simpler model, which was comprised of easily acquired quantities: molecular weight and enthalpy of fusion. Statistical models are proposed to predict long-term amorphous drug stability. The models include readily accessible parameters, which are potentially the key factors influencing amorphous stability. The derived models can support faster decision making in drug formulation development. PMID:26236939
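
    The simpler of the reported models, a regression on molecular weight and enthalpy of fusion, can be sketched as an ordinary multiple linear regression; the response scale and all numbers below are invented placeholders, not the study's data.

    ```python
    # Hedged sketch: multiple linear regression of an amorphous-stability measure
    # (e.g. log onset time of crystallization) on molecular weight and enthalpy of
    # fusion. Values are illustrative only.
    import numpy as np
    import statsmodels.api as sm

    mol_weight = np.array([250.0, 310.0, 420.0, 180.0, 505.0, 365.0])   # g/mol (hypothetical)
    dh_fusion  = np.array([28.0, 35.0, 22.0, 40.0, 18.0, 30.0])         # kJ/mol (hypothetical)
    log_onset  = np.array([1.2, 1.8, 0.6, 2.3, 0.3, 1.5])               # log10(days) (hypothetical)

    X = sm.add_constant(np.column_stack([mol_weight, dh_fusion]))
    model = sm.OLS(log_onset, X).fit()
    print(model.params)      # intercept and the two coefficients
    print(model.rsquared)    # in-sample fit; predictive accuracy would be assessed separately
    ```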

  10. Localized Smart-Interpretation

    NASA Astrophysics Data System (ADS)

    Lundh Gulbrandsen, Mats; Mejer Hansen, Thomas; Bach, Torben; Pallesen, Tom

    2014-05-01

    The complex task of setting up a geological model consists not only of combining available geological information into a conceptually plausible model, but also requires consistency with available data, e.g. geophysical data. However, in many cases the direct geological information, e.g. borehole samples, is very sparse, so in order to create a geological model the geologist needs to rely on the geophysical data. The problem, however, is that the amount of geophysical data is in many cases so vast that it is practically impossible to integrate all of it in the manual interpretation process. This means that a lot of the information available from the geophysical surveys is unexploited, which is a problem because the resulting geological model does not fulfil its full potential and is hence less trustworthy. We suggest an approach to geological modeling that 1. allows all geophysical data to be considered when building the geological model, 2. is fast, and 3. allows quantification of geological modeling. The method is constructed to build a statistical model, f(d,m), describing the relation between what the geologist interprets, d, and what the geologist knows, m. The parameter m reflects any available information that can be quantified, such as geophysical data, the result of a geophysical inversion, elevation maps, etc. The parameter d reflects an actual interpretation, such as, for example, the depth to the base of a groundwater reservoir. First we infer a statistical model f(d,m) by examining sets of actual interpretations made by a geological expert, [d1, d2, ...], and the information used to perform the interpretation, [m1, m2, ...]. This makes it possible to quantify how the geological expert performs interpretation through f(d,m). As the geological expert proceeds with interpreting, the number of interpreted data points from which the statistical model is inferred increases, and therefore the accuracy of the statistical model increases. When a model f(d,m) has successfully been inferred, we are able to simulate how the geological expert would perform an interpretation given some external information m, through f(d|m). We will demonstrate this method applied to geological interpretation and densely sampled airborne electromagnetic data. In short, our goal is to build a statistical model describing how a geological expert performs geological interpretation given some geophysical data. We then wish to use this statistical model to perform semi-automatic interpretation, everywhere such geophysical data exist, in a manner consistent with the choices made by a geological expert. Benefits of such a statistical model are that 1. it provides a quantification of how a geological expert performs interpretation based on available diverse data, 2. all available geophysical information can be used, and 3. it allows much faster interpretation of large data sets.
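
    A minimal sketch of the idea, assuming a generic regression stands in for the statistical model f(d,m) and a point prediction stands in for the conditional simulation f(d|m) described above; the auxiliary variables and depths are synthetic.

    ```python
    # Hedged sketch: learn a relation between expert picks d (e.g. depth to a reservoir
    # base) and co-located auxiliary information m, then propose "expert-like"
    # interpretations where only m is available. Not the authors' implementation.
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    rng = np.random.default_rng(0)

    # m: rows = interpreted locations, cols = [log10 resistivity, elevation] (synthetic)
    m_interpreted = rng.normal(size=(40, 2))
    # d: the expert's picked depths at those locations (synthetic)
    d_interpreted = 30 + 8 * m_interpreted[:, 0] - 3 * m_interpreted[:, 1] + rng.normal(0, 1, 40)

    f_dm = RandomForestRegressor(n_estimators=200, random_state=0)
    f_dm.fit(m_interpreted, d_interpreted)     # infer f(d,m) from the expert's picks

    m_new = rng.normal(size=(5, 2))            # locations the expert has not yet interpreted
    print(f_dm.predict(m_new))                 # proposed interpretations given m
    ```

    In the workflow described above the model would be re-fitted as the expert adds picks, so its accuracy grows with the number of interpreted data points.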

  11. Interpretive Experiments

    ERIC Educational Resources Information Center

    DeHaan, Frank, Ed.

    1977-01-01

    Describes an interpretative experiment involving the application of symmetry and temperature-dependent proton and fluorine nmr spectroscopy to the solution of structural and kinetic problems in coordination chemistry. (MLH)

  12. An Easily Accessible Web-Based Minimization Random Allocation System for Clinical Trials

    PubMed Central

    Xiao, Lan; Huang, Qiwen; Yank, Veronica

    2013-01-01

    Background Minimization as an adaptive allocation technique has been recommended in the literature for use in randomized clinical trials. However, it remains uncommonly used due in part to a lack of easily accessible implementation tools. Objective To provide clinical trialists with a robust, flexible, and readily accessible tool for implementing covariate-adaptive biased-coin randomization. Methods We developed a Web-based random allocation system, MinimRan, that applies Pocock–Simon (for trials with 2 or more arms) and 2-way (currently limited to 2-arm trials) minimization methods for trials using only categorical prognostic factors or the symmetric Kullback–Leibler divergence minimization method for trials (currently limited to 2-arm trials) using continuous prognostic factors with or without categorical factors, in covariate-adaptive biased-coin randomization. Results In this paper, we describe the system’s essential statistical and computer programming features and provide as an example the randomization results generated by it in a recently completed trial. The system can be used in single- and double-blind trials as well as single-center and multicenter trials. Conclusions We expect the system to facilitate the translation of the 3 validated random allocation methods into broad, efficient clinical research practice. PMID:23872035
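
    The Pocock-Simon method mentioned above can be illustrated with a stripped-down two-arm version: the incoming participant is assigned, with high probability, to the arm that minimizes total marginal imbalance across the categorical prognostic factors. This is a generic textbook sketch, not the MinimRan code; the factor names and assignment probability are assumptions.

    ```python
    # Hedged sketch of Pocock-Simon minimization for a 2-arm trial (illustrative only).
    import random
    from collections import defaultdict

    arms = ["A", "B"]
    # counts[factor][level][arm] = number of participants already assigned
    counts = defaultdict(lambda: defaultdict(lambda: {a: 0 for a in arms}))

    def assign(participant, p_best=0.8):
        """participant: dict of factor -> level, e.g. {'sex': 'F', 'age': '<65'}."""
        imbalance = {}
        for arm in arms:
            total = 0
            for factor, level in participant.items():
                c = dict(counts[factor][level])
                c[arm] += 1                                   # pretend assignment to this arm
                total += max(c.values()) - min(c.values())    # range as the imbalance measure
            imbalance[arm] = total
        best = min(imbalance, key=imbalance.get)
        other = [a for a in arms if a != best][0]
        if imbalance[best] == imbalance[other]:
            chosen = random.choice(arms)                      # tie: fair coin
        else:
            chosen = best if random.random() < p_best else other
        for factor, level in participant.items():
            counts[factor][level][chosen] += 1
        return chosen

    print(assign({"sex": "F", "age": "<65"}))
    ```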

  13. Interpreting Metonymy.

    ERIC Educational Resources Information Center

    Pankhurst, Anne

    1994-01-01

    This paper examines some of the problems associated with interpreting metonymy, a figure of speech in which an attribute or commonly associated feature is used to name or designate something. After defining metonymy and outlining the principles of metonymy, the paper explains the differences between metonymy, synecdoche, and metaphor. It is…

  14. Performing Interpretation

    ERIC Educational Resources Information Center

    Kothe, Elsa Lenz; Berard, Marie-France

    2013-01-01

    Utilizing a/r/tographic methodology to interrogate interpretive acts in museums, multiple areas of inquiry are raised in this paper, including: which knowledge is assigned the greatest value when preparing a gallery talk; what lies outside of disciplinary knowledge; how invitations to participate invite and disinvite in the same gesture; and what

  15. CAinterprTools: An R package to help interpreting Correspondence Analysis' results

    NASA Astrophysics Data System (ADS)

    Alberti, Gianmarco

    2015-09-01

    Correspondence Analysis (CA) is a statistical exploratory technique frequently used in many research fields to graphically visualize the structure of contingency tables. Many programs, both commercial and free, perform CA, but none of them as yet provides a visual aid to the interpretation of the results. The 'CAinterprTools' package, designed to be used in the free R statistical environment, aims at filling that gap. A novice-to-intermediate R user has been considered as the target. Fifteen commands make it easy to obtain charts that help (and are relevant to) the interpretation of the CA results, freeing the user from the need to inspect and scrutinize tabular CA outputs and to look up the values and statistics on which further calculations would be necessary. The package also implements tests to assess the significance of the input table's total inertia and individual dimensions.

  16. Interpretations of Entanglement

    NASA Astrophysics Data System (ADS)

    Jones, Martin

    2002-04-01

    The peculiar statistical correlations between spatially separated systems which arise in quantum mechanics, and which the Einstein-Podolsky-Rosen paper of 1935 thrust into the limelight, have been the focus of much interpretive speculation and disagreement in the years since then. Amongst the questions raised along the way have been questions about the possibility of superluminal causation, the limits of quantum mechanics and its relation to relativity theory, the nature of and need for causal explanation, realism, determinism, and the presence of holism in quantum mechanics. This talk will provide an historically structured overview of these debates including discussion of the Bohm theory, the many worlds interpretation, and more recent developments and will suggest a way of dividing many of the interpretations of entanglement into clusters of like-minded views.

  17. Time Resolved Thermal Diffusivity of Seasonal Snow Determined from Inexpensive, Easily-Implemented Temperature Measurements

    NASA Astrophysics Data System (ADS)

    Oldroyd, H. J.; Higgins, C. W.; Huwald, H.; Selker, J. S.; Parlange, M. B.

    2011-12-01

    Thermal diffusivity of snow is an important physical property associated with key hydrological phenomena such as snow melt and heat and water vapor exchange with the atmosphere. These phenomena have broad implications in studies of climate and heat and water budgets on many scales. However, direct measurements of snow thermal diffusivity require coupled point measurements of thermal conductivity and density, which are nonstationary due to snow metamorphism. Furthermore, thermal conductivity measurements are typically obtained with specialized heating probes or plates, and snow density measurements require digging snow pits. Therefore, direct measurements are difficult to obtain with high enough temporal resolution for direct comparisons with atmospheric conditions. This study uses highly resolved (7.5 to 10 cm in depth and 1 min in time) temperature measurements from the Plaine Morte glacier in Switzerland as initial and boundary conditions to numerically solve the 1D heat equation and iteratively optimize for thermal diffusivity. The method uses flux boundary conditions to constrain thermal diffusivity such that spuriously high values of thermal diffusivity are eliminated. Additionally, a t-test ensuring statistical significance between solutions of varied thermal diffusivity results in further constraints on thermal diffusivity that eliminate spuriously low values. The results show that time-resolved (1 min) thermal diffusivity can be determined from easily implemented and inexpensive temperature measurements of seasonal snow, with good agreement with widely used parameterizations based on snow density. This high time resolution further affords the ability to explore possible turbulence-induced enhancements to heat and mass transfer in the snow.
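
    The inverse approach described above can be sketched with a simplified variant that uses the measured top and bottom temperatures as boundary conditions and searches for the diffusivity that best reproduces the interior temperatures (the study additionally uses flux boundary conditions and a t-test to bound the estimate). The temperature arrays below are synthetic placeholders.

    ```python
    # Hedged sketch: explicit finite-difference solution of the 1D heat equation with
    # measured boundary temperatures, and a 1D search for the best-fitting diffusivity.
    import numpy as np
    from scipy.optimize import minimize_scalar

    dz, dt = 0.10, 60.0            # 10 cm sensor spacing, 1 min sampling
    nz, nt = 5, 600                # 5 depths, 10 h of record (synthetic)

    rng = np.random.default_rng(1)
    T_meas = -5 + rng.normal(0, 0.05, (nt, nz))                   # synthetic "measurements"
    T_meas[:, 0] += 2 * np.sin(np.linspace(0, 4 * np.pi, nt))     # surface forcing

    def simulate(alpha):
        T = T_meas[0].copy()
        out = np.empty_like(T_meas)
        out[0] = T
        for n in range(1, nt):
            Tn = T.copy()
            Tn[1:-1] = T[1:-1] + alpha * dt / dz**2 * (T[2:] - 2 * T[1:-1] + T[:-2])
            Tn[0], Tn[-1] = T_meas[n, 0], T_meas[n, -1]           # measured boundaries
            out[n] = T = Tn
        return out

    def misfit(alpha):
        return np.mean((simulate(alpha)[:, 1:-1] - T_meas[:, 1:-1]) ** 2)

    res = minimize_scalar(misfit, bounds=(1e-8, 5e-6), method="bounded")
    print(f"optimized thermal diffusivity: {res.x:.2e} m^2/s")
    ```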

  18. Interpretive Medicine

    PubMed Central

    Reeve, Joanne

    2010-01-01

    Patient-centredness is a core value of general practice; it is defined as the interpersonal processes that support the holistic care of individuals. To date, efforts to demonstrate their relationship to patient outcomes have been disappointing, whilst some studies suggest values may be more rhetoric than reality. Contextual issues influence the quality of patient-centred consultations, impacting on outcomes. The legitimate use of knowledge, or evidence, is a defining aspect of modern practice, and has implications for patient-centredness. Based on a critical review of the literature, on my own empirical research, and on reflections from my clinical practice, I critique current models of the use of knowledge in supporting individualised care. Evidence-Based Medicine (EBM), and its implementation within health policy as Scientific Bureaucratic Medicine (SBM), define best evidence in terms of an epistemological emphasis on scientific knowledge over clinical experience. It provides objective knowledge of disease, including quantitative estimates of the certainty of that knowledge. Whilst arguably appropriate for secondary care, involving episodic care of selected populations referred in for specialist diagnosis and treatment of disease, application to general practice can be questioned given the complex, dynamic and uncertain nature of much of the illness that is treated. I propose that general practice is better described by a model of Interpretive Medicine (IM): the critical, thoughtful, professional use of an appropriate range of knowledges in the dynamic, shared exploration and interpretation of individual illness experience, in order to support the creative capacity of individuals in maintaining their daily lives. Whilst the generation of interpreted knowledge is an essential part of daily general practice, the profession does not have an adequate framework by which this activity can be externally judged to have been done well. Drawing on theory related to the recognition of quality in interpretation and knowledge generation within the qualitative research field, I propose a framework by which to evaluate the quality of knowledge generated within generalist, interpretive clinical practice. I describe three priorities for research in developing this model further, which will strengthen and preserve core elements of the discipline of general practice, and thus promote and support the health needs of the public. PMID:21805819

  19. SLAR image interpretation keys for geographic analysis

    NASA Technical Reports Server (NTRS)

    Coiner, J. C.

    1972-01-01

    A means for side-looking airborne radar (SLAR) imagery to become a more widely used data source in geoscience and agriculture is suggested by providing interpretation keys as an easily implemented interpretation model. Interpretation problems faced by the researcher wishing to employ SLAR are specifically described, and the use of various types of image interpretation keys to overcome these problems is suggested. With examples drawn from agriculture and vegetation mapping, direct and associate dichotomous image interpretation keys are discussed and methods of constructing keys are outlined. Initial testing of the keys, key-based automated decision rules, and the role of the keys in an information system for agriculture are developed.
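
    A dichotomous key of the kind discussed above is essentially a small decision tree; the sketch below shows one possible data structure, with questions and land-cover classes invented purely for illustration.

    ```python
    # Hedged sketch: a dichotomous image-interpretation key as a nested tuple
    # (question, yes-branch, no-branch); leaves are class labels. Content is invented.
    KEY = ("uniform dark return?",
           ("smooth texture?", "open water", "bare soil"),
           ("linear field pattern?", "row crops", "natural vegetation"))

    def classify(node, answer):
        """answer(question) -> bool; walk the key until a leaf (a string) is reached."""
        if isinstance(node, str):
            return node
        question, yes_branch, no_branch = node
        return classify(yes_branch if answer(question) else no_branch, answer)

    # An interpreter answering the questions for one image region:
    print(classify(KEY, lambda q: q == "uniform dark return?"))   # -> "bare soil"
    ```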

  20. CT Colonography: Pitfalls in Interpretation

    PubMed Central

    Pickhardt, Perry J.; Kim, David H.

    2012-01-01

    Synopsis As with any radiologic imaging test, there are a number of potential interpretive pitfalls at CT colonography (CTC) that need to be recognized and handled appropriately. Perhaps the single most important step in learning to avoid most of these diagnostic traps is simply to be aware of their existence. With a little experience, most of these potential pitfalls will be easily recognized. This review will systematically cover the key pitfalls confronting the radiologist at CTC interpretation, primarily dividing them into those related to technique and those related to underlying anatomy. Tips and pointers for how to effectively handle these potential pitfalls are included. PMID:23182508

  1. An Easily Constructed Model of Twin Octahedrons Having a Common Line.

    ERIC Educational Resources Information Center

    Yamana, Shukichi; Kawaguchi, Makoto

    1984-01-01

    A model of twin octahedrons having a common line which is useful for teaching stereochemistry (especially that of complex ions) can be made easily by using a sealed, empty envelope. The steps necessary to accomplish this task are presented. (JN)

  2. Quantum statistical determinism

    SciTech Connect

    Bitsakis, E.

    1988-03-01

    This paper attempts to analyze the concept of quantum statistical determinism. This is done after we have clarified the epistemic difference between causality and determinism and discussed the content of the classical forms of determinism: mechanical and dynamical. Quantum statistical determinism transcends the classical forms, for it expresses the multiple potentialities of quantum systems. The whole argument is consistent with a statistical interpretation of quantum mechanics.

  3. LED champing: statistically blessed?

    PubMed

    Wang, Zhuo

    2015-06-10

    LED champing (smart mixing of individual LEDs to match the desired color and lumens) and color mixing strategies have been widely used to maintain the color consistency of light engines. Light engines with champed LEDs can easily achieve color consistency within a couple of MacAdam steps, even when the individual LEDs are widely distributed to begin with. From a statistical point of view, the distributions of the color coordinates and the flux after champing are studied. The related statistical parameters are derived, which facilitate process improvements such as Six Sigma and are instrumental to statistical quality control for mass production. PMID:26192863
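
    The statistical intuition can be sketched with a small Monte Carlo experiment: if an engine's chromaticity is roughly the flux-weighted mean of its LEDs, its spread narrows with the number of LEDs mixed, roughly as sigma divided by the square root of N. This is an illustrative simulation, not the paper's derivation, and the spreads used are assumptions.

    ```python
    # Hedged sketch: distribution of the mixed color coordinate for engines built from
    # several LEDs, compared with a single LED. All parameters are illustrative.
    import numpy as np

    rng = np.random.default_rng(2)
    n_engines, n_leds = 10000, 4
    sigma_single = 0.004                     # std of one LED's x-coordinate (assumed)

    x_led = 0.313 + sigma_single * rng.normal(size=(n_engines, n_leds))
    flux  = rng.normal(100.0, 5.0, size=(n_engines, n_leds))

    x_engine = (x_led * flux).sum(axis=1) / flux.sum(axis=1)   # flux-weighted mixing
    print(f"single-LED std: {sigma_single:.4f}")
    print(f"engine std:     {x_engine.std():.4f} (compare sigma/sqrt(N) = "
          f"{sigma_single / np.sqrt(n_leds):.4f})")
    ```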

  4. Making On-Line Science Course Materials Easily Translatable and Accessible Worldwide: Challenges and Solutions

    ERIC Educational Resources Information Center

    Adams, Wendy K.; Alhadlaq, Hisham; Malley, Christopher V.; Perkins, Katherine K.; Olson, Jonathan; Alshaya, Fahad; Alabdulkareem, Saleh; Wieman, Carl E.

    2012-01-01

    The PhET Interactive Simulations Project partnered with the Excellence Research Center of Science and Mathematics Education at King Saud University with the joint goal of making simulations useable worldwide. One of the main challenges of this partnership is to make PhET simulations and the website easily translatable into any language. The PhET…

  5. Gas-Phase Fragmentation of Oligoproline Peptide Ions Lacking Easily Mobilizable Protons

    NASA Astrophysics Data System (ADS)

    Rudowska, Magdalena; Wieczorek, Robert; Kluczyk, Alicja; Stefanowicz, Piotr; Szewczuk, Zbigniew

    2013-06-01

    The fragmentation of peptides containing quaternary ammonium group, but lacking easily mobilizable protons, was examined with the aid of deuterium-labeled analogs and quantum-chemical modeling. The fragmentation of oligoproline containing quaternary ammonium group involves the mobilization of hydrogens localized at α- and γ- or δ-carbon atoms in the pyrrolidine ring of proline. The study of the dissociation pattern highlights the unusual proline residue behavior during MS/MS experiments of peptides.

  6. Lightweight and Easily Foldable MCMB-MWCNTs Composite Paper with Exceptional Electromagnetic Interference Shielding.

    PubMed

    Chaudhary, Anisha; Kumari, Saroj; Kumar, Rajeev; Teotia, Satish; Singh, Bhanu Pratap; Singh, Avanish Pratap; Dhawan, S K; Dhakate, Sanjay R

    2016-04-27

    Lightweight, easily foldable, highly conductive multiwalled carbon nanotube (MWCNT)-based mesocarbon microbead (MCMB) composite paper is prepared using a simple, efficient, and cost-effective strategy. The developed lightweight and conductive composite paper has been reported for the first time as an efficient electromagnetic interference (EMI) shielding material in the X-band frequency region, with a low density of 0.26 g/cm³. The investigation revealed that the composite paper shows excellent absorption-dominated EMI shielding effectiveness (SE) of -31 to -56 dB at thicknesses of 0.15-0.6 mm. A specific EMI-SE as high as -215 dB cm³/g exceeds the best values of metals and other low-density carbon-based composites. Additionally, the lightweight and easily foldable nature of this composite paper helps provide stable EMI shielding even after repeated bending. Such performance opens the way to designing lightweight and easily foldable composite papers as promising EMI shielding materials, especially for next-generation devices and the defense industry. PMID:27035889

  7. An easily regenerable enzyme reactor prepared from polymerized high internal phase emulsions.

    PubMed

    Ruan, Guihua; Wu, Zhenwei; Huang, Yipeng; Wei, Meiping; Su, Rihui; Du, Fuyou

    2016-04-22

    A large-scale, high-efficiency enzyme reactor based on a polymerized high internal phase emulsion monolith (polyHIPE) was prepared. First, a porous cross-linked polyHIPE monolith was prepared by in-situ thermal polymerization of a high internal phase emulsion containing styrene, divinylbenzene and polyglutaraldehyde. The enzyme TPCK-Trypsin was then immobilized on the monolithic polyHIPE. The performance of the resultant enzyme reactor was assessed according to its ability to convert Nα-benzoyl-l-arginine ethyl ester to Nα-benzoyl-l-arginine, and the protein digestibility of bovine serum albumin (BSA) and cytochrome c (Cyt-C). The results showed that the prepared enzyme reactor exhibited high enzyme immobilization efficiency and fast, easily controlled protein digestion. BSA and Cyt-C could be digested in 10 min with sequence coverage of 59% and 78%, respectively. The peptides and residual protein could be easily rinsed out of the reactor, and the reactor could be regenerated with 4 M HCl without any structural damage. Its multiple interconnected chambers with good permeability, fast digestion and easy regeneration indicate that the polyHIPE enzyme reactor is a promising candidate for application in proteomics and catalysis. PMID:26995089

  8. INTERPRETING INDICATORS OF RANGELAND HEALTH, VERSION 4

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Land managers are in need of an assessment tool that provides a preliminary evaluation of rangeland health. Interpreting Indicators of Rangeland Health, Version 4 is the second published version of a protocol that uses 17 easily observed indicators summarized as three rangeland health attributes (s...

  9. Synthesis, Characterization, to application of water soluble and easily removable cationic pressure sensitive adhesives

    SciTech Connect

    Institute of Paper Science Technology

    2004-01-30

    In recent years, the world has expressed an increasing interest in the recycling of waste paper to supplement the use of virgin fiber as a way to protect the environment. Statistics show that major countries are increasing their use of recycled paper. For example, in 1991 to 1996, the U.S. increased its recovered paper utilization rate from 31% to 39%, Germany went from 50% to 60%, the UK went from 60% to 70%, France increased from 46% to 49%, and China went from 32% to 35% [1]. As recycled fiber levels and water system closures both increase, recycled product quality will need to improve in order for recycled products to compete with products made from virgin fiber [2]. The use of recycled fiber has introduced an increasing level of metal, plastic, and adhesive contamination into the papermaking process which has added to the complexity of the already overwhelming task of providing a uniform and clean recycle furnish. The most harmful of these contaminates is a mixture of adhesives and polymeric substances that are commonly known as stickies. Stickies, which enter the mill with the pulp furnish, are not easily removed from the repulper and become more difficult the further down the system they get. This can be detrimental to the final product quality. Stickies are hydrophobic, tacky, polymeric materials that are introduced into the papermaking system from a mixture of recycled fiber sources. Properties of stickies are very similar to the fibers used in papermaking, viz. size, density, hydrophobicity, and electrokinetic charge. This reduces the probability of their removal by conventional separation processes, such as screening and cleaning, which are based on such properties. Also, their physical and chemical structure allows for them to extrude through screens, attach to fibers, process equipment, wires and felts. Stickies can break down and then reagglomerate and appear at seemingly any place in the mill. When subjected to a number of factors including changes in pH, temperature, concentration, charge, and shear forces, stickies can deposit [3]. These deposits can lead to decreased runnability, productivity and expensive downtime. If the stickie remains in the stock, then machine breaks can be common. Finally, if the stickie is not removed or deposited, it will either leave in the final product causing converting and printing problems or recirculate within the mill. It has been estimated that stickies cost the paper industry between $600 and $700 million a year due to the cost of control methods and lost production attributed to stickies [3]. Also, of the seven recycling mills opened in the United States between 1994 and 1997, four have closed citing stickies as the main reason responsible for the closure [4]. Adhesives are widely used throughout the paper and paperboard industry and are subsequently found in the recycled pulp furnish. Hodgson stated that even the best stock preparation process can only remove 99% of the contaminants, of which the remaining 1% is usually adhesives of various types which are usually 10-150 microns in effective diameter [5]. The large particles are removed by mechanical means such as cleaners and screens, and the smaller, colloidal particles can be removed with washing. The stickies that pass through the cleaning and screening processes cause 95% of the problems associated with recycling [6]. 
    The cleaners will remove most of the stickies whose density differs from that of the pulp slurry (~1.0 g/cm3), but will accept stickies with densities ranging from 0.95 to 1.05 g/cm3 [2]. The hydrophobicity of the material is also an important characteristic of the stickie [7]. The hydrophobicity causes the stickies to agglomerate with other hydrophobic materials such as other stickies, lignin, and even pitch. The tacky and viscous nature of stickies contributes to many product and process problems, negatively affecting the practicality of recycled fiber use. The sources of stickies that evade conventional removal techniques are usually synthetic polymers, including acrylates, styrene-butadiene rubber, vinyl acetates, and polypropylene [5,6,8-12]. Sources of these adhesives are usually broken down into categories based on application.

  10. Summary and interpretive synthesis

    SciTech Connect

    1995-05-01

    This chapter summarizes the major advances made through our integrated geological studies of the Lisburne Group in northern Alaska. The depositional history of the Lisburne Group is discussed in a framework of depositional sequence stratigraphy. Although individual parasequences (small-scale carbonate cycles) of the Wahoo Limestone cannot be correlated with certainty, parasequence sets can be interpreted as different systems tracts within the large-scale depositional sequences, providing insights on the paleoenvironments, paleogeography and platform geometry. Conodont biostratigraphy precisely established the position of the Mississippian-Pennsylvanian boundary within an important reference section, where established foraminiferal biostratigraphy is inconsistent with respect to conodont-based time-rock boundaries. However, existing Carboniferous conodont zonations are not readily applicable because most zonal indicators are absent, so a local zonation scheme was developed. Diagenetic studies of the Lisburne Group recognized nineteen subaerial exposure surfaces and developed a cement stratigraphy that includes: early cements associated with subaerial exposure surfaces in the Lisburne Group; cements associated with the sub-Permian unconformity; and later burial cements. Subaerial exposure surfaces in the Alapah Limestone are easily explained, being associated with peritidal environments at the boundaries of Sequence A. The Lisburne exposed in ANWR is generally tightly cemented and supermature, but could still be a good reservoir target in the adjacent subsurface of ANWR given the appropriate diagenetic, deformational and thermal history. Our ongoing research on the Lisburne Group will hopefully provide additional insights in future publications.

  11. Statistical Reform in School Psychology Research: A Synthesis

    ERIC Educational Resources Information Center

    Swaminathan, Hariharan; Rogers, H. Jane

    2007-01-01

    Statistical reform in school psychology research is discussed in terms of research designs, measurement issues, statistical modeling and analysis procedures, interpretation and reporting of statistical results, and finally statistics education.

  12. Revisiting the statistical analysis of pyroclast density and porosity data

    NASA Astrophysics Data System (ADS)

    Bernard, B.; Kueppers, U.; Ortiz, H.

    2015-07-01

    Explosive volcanic eruptions are commonly characterized based on a thorough analysis of the generated deposits. Amongst other characteristics in physical volcanology, density and porosity of juvenile clasts are some of the most frequently used to constrain eruptive dynamics. In this study, we evaluate the sensitivity of density and porosity data to statistical methods and introduce a weighting parameter to correct issues raised by the use of frequency analysis. Results of textural investigation can be biased by clast selection. Using statistical tools as presented here, the meaningfulness of a conclusion can be checked for any data set easily. This is necessary to define whether or not a sample has met the requirements for statistical relevance, i.e. whether a data set is large enough to allow for reproducible results. Graphical statistics are used to describe density and porosity distributions, similar to those used for grain-size analysis. This approach helps with the interpretation of volcanic deposits. To illustrate this methodology, we chose two large data sets: (1) directed blast deposits of the 3640-3510 BC eruption of Chachimbiro volcano (Ecuador) and (2) block-and-ash-flow deposits of the 1990-1995 eruption of Unzen volcano (Japan). We propose the incorporation of this analysis into future investigations to check the objectivity of results achieved by different working groups and guarantee the meaningfulness of the interpretation.
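
    One of the checks advocated above, asking whether a data set is large enough for reproducible results, can be sketched with a simple bootstrap of the median clast density; the densities and the use of the median here are illustrative assumptions, not the authors' exact procedure.

    ```python
    # Hedged sketch: bootstrap confidence interval on the median clast density as a
    # crude check of statistical relevance. Data are synthetic.
    import numpy as np

    rng = np.random.default_rng(3)
    density = rng.normal(1500, 250, size=80)          # hypothetical clast densities, kg/m^3

    def bootstrap_median_ci(x, n_boot=5000, level=0.95):
        meds = np.array([np.median(rng.choice(x, size=x.size, replace=True))
                         for _ in range(n_boot)])
        lo, hi = np.percentile(meds, [(1 - level) / 2 * 100, (1 + level) / 2 * 100])
        return np.median(x), lo, hi

    med, lo, hi = bootstrap_median_ci(density)
    print(f"median density {med:.0f} kg/m^3, 95% CI [{lo:.0f}, {hi:.0f}]")
    # A wide interval suggests the sample has not yet met the requirements for
    # statistical relevance and that more clasts should be measured.
    ```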

  13. Tracking and Feature Extraction of Easily Deformable Object Using Particle Filter and Adaptive Vector Quantization

    NASA Astrophysics Data System (ADS)

    Nishida, Takeshi; Ikoma, Norikazu; Kurogi, Shuichi; Sakamoto, Tetsuzo

    The PF-mCRL method is a rapid and robust information-extraction method for non-Gaussian probability distributions that combines a particle filter (PF) with an adaptive vector quantization algorithm, mCRL (modified Competitive Re-initialization Learning). In this research, a novel method for tracking and shape estimation of easily deformable objects in dynamic scenes using PF-mCRL is proposed. Moreover, several methods for extracting feature values from the PF-mCRL output that are useful for robot handling are proposed. The effectiveness of the proposed method is shown by real-image experiments.

  14. A transportable and easily configurable multi-projector display system for distributed virtual reality applications

    NASA Astrophysics Data System (ADS)

    Grimes, Holly; McMenemy, Karen R.; Ferguson, R. S.

    2008-02-01

    This paper details how simple PC software, a small network of consumer level PCs, some do-it-yourself hardware and four low cost video projectors can be combined to form an easily configurable and transportable projection display with applications in virtual reality training. This paper provides some observations on the practical difficulties of using such a system, its effectiveness in delivering a VE for training and what benefit may be offered through the deployment of a large number of these low cost environments.

  15. Standardization of electrocardiographic interpretive statements: a menu for word processing.

    PubMed Central

    Dower, G. E.; Osborne, J. A.; Machado, H. B.; Stewart, D. E.

    1979-01-01

    Standardization of electrocardiographic interpretive statements is a goal of various coding systems, but word processing has not usually been considered. A simple, easily memorized system for clinical electrocardiography has been developed and used for approximately 60 000 interpretations. It takes the form of a "menu", in which boxes stand for various interpretive statements; the boxes are identified by mnemonics and marked by the interpreter when appropriate. The results provide better standardization, significant decreases in the numbers of descriptive statements and words per interpretation and considerable saving in typing time. Acceptance by the interpreters has been good. Features of the system allow for word processing as part of a polarcardiography computing system. PMID:427688

  16. Safe, Effective and Easily Reproducible Fusion Technique for CV Junction Instability

    PubMed Central

    Sannegowda, Raghavendra Bakki

    2015-01-01

    Introduction: The craniovertebral junction (CVJ) refers to the bony enclosure where the occipital bone surrounds the foramen magnum, the atlas and the axis vertebrae. Because of the complexity of these structures, CVJ instability is associated with diagnostic and therapeutic problems. Posterior CV fusion procedures have evolved considerably over the last couple of decades, and there has been an ongoing search for a surgical procedure that is inherently safe, simple, easily reproducible and biomechanically sound. In our study, we present the author's initial experience with cases of CV junction instrumentation using an O-C1-C2 screw and rod construct. Aims and Objectives: The current study is a descriptive analysis of the cases of CVJ instability treated by us with instrumentation using an O-C1-C2 screw and rod construct fusion technique. Materials and Methods: This is a retrospective, analytical study in which cases of CV junction instability operated on by the author between January 2010 and March 2014 were analysed using various clinical, radiological and outcome parameters. Conclusion: CV junction instrumentation using an O-C1-C2 screw and rod construct fusion technique proved to be a safe, effective, easily reproducible and biomechanically sound technique that can be adopted by surgeons at any stage of their learning curve. PMID:25954660

  17. A Graphical Interpretation of Probit Coefficients.

    ERIC Educational Resources Information Center

    Becker, William E.; Waldman, Donald M.

    1989-01-01

    Contends that, when discrete choice models are taught, particularly the probit model, it is the method rather than the interpretation of the results that is emphasized. This article provides a graphical technique for interpretation of an estimated probit coefficient that will be useful in statistics and econometrics courses. (GG)
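
    The article's specific graphical technique is not described in this abstract; as a minimal stand-in illustration of why probit coefficients need interpretation, the sketch below traces predicted probabilities and marginal effects from hypothetical estimated coefficients, showing that the marginal effect of x varies along the curve rather than equalling the coefficient itself.

```python
import numpy as np
from scipy.stats import norm

# Hypothetical estimated probit model: P(y = 1 | x) = Phi(b0 + b1 * x)
b0, b1 = -1.0, 0.8

x = np.linspace(-3, 3, 7)
prob = norm.cdf(b0 + b1 * x)            # predicted probability at each x
marg = b1 * norm.pdf(b0 + b1 * x)       # marginal effect dP/dx, which varies with x

for xi, p, m in zip(x, prob, marg):
    print(f"x = {xi:5.1f}   P(y=1) = {p:.3f}   dP/dx = {m:.3f}")
```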

  18. The study on development of easily chewable and swallowable foods for elderly

    PubMed Central

    Kim, Soojeong

    2015-01-01

    BACKGROUND/OBJECTIVES When the functions involved in the ingestion of food fail, the result is not only a loss of the enjoyment of eating but also a risk of protein-energy malnutrition. Chewing (dysmasesis) and swallowing difficulties occur in various diseases and are also a major consequence of aging, and the number of elderly people with such difficulties is expected to increase rapidly in an aging society. SUBJECTS/METHODS In this study, we surveyed nutritionists working in elderly care facilities and examined the characteristics of the foods offered to the elderly and the demand for the development of easily chewable and swallowable foods for elderly people who can crush food with their own tongues but sometimes have difficulty drinking water and tea. RESULTS Elderly care facilities were found to provide finely chopped food, or food ground with water in a blender, for residents with chewing difficulties. Satisfaction of the elderly with the foods provided was generally low. When the applicability of foods for the elderly and the willingness to reflect them in menus were investigated, a gelification method from molecular gastronomy received the highest response rate; among the foods most frequently served to the elderly (representative menus of beef, pork, white fish, anchovies and spinach), Korean barbecue beef, hot pepper paste stir-fried pork, pan-fried white fish, stir-fried anchovy and seasoned spinach had the highest offer frequency. CONCLUSIONS This study provides a basis for the development of easily chewable and swallowable foods, through gelification, for the elderly. It also suggests that, for the elderly, gelified food may reduce the risk of aspiration and improve overall food preference. PMID:26244082

  19. Easily processable multimodal spectral converters based on metal oxide/organic—inorganic hybrid nanocomposites

    NASA Astrophysics Data System (ADS)

    Julián-López, Beatriz; Gonell, Francisco; Lima, Patricia P.; Freitas, Vânia T.; André, Paulo S.; Carlos, Luis D.; Ferreira, Rute A. S.

    2015-10-01

    This manuscript reports the synthesis and characterization of the first organic-inorganic hybrid material exhibiting efficient multimodal spectral converting properties. The nanocomposite, made of Er3+, Yb3+ codoped zirconia nanoparticles (NPs) entrapped in a di-ureasil d-U(600) hybrid matrix, is prepared by an easy two-step sol-gel synthesis leading to homogeneous and transparent materials that can be very easily processed as monolith or film. Extensive structural characterization reveals that zirconia nanocrystals of 10-20 nm in size are efficiently dispersed into the hybrid matrix and that the local structure of the di-ureasil is not affected by the presence of the NPs. A significant enhancement in the refractive index of the di-ureasil matrix with the incorporation of the ZrO2 nanocrystals is observed. The optical study demonstrates that luminescent properties of both constituents are perfectly preserved in the final hybrid. Thus, the material displays a white-light photoluminescence from the di-ureasil component upon excitation at UV/visible radiation and also intense green and red emissions from the Er3+- and Yb3+-doped NPs after NIR excitation. The dynamics of the optical processes were also studied as a function of the lanthanide content and the thickness of the films. Our results indicate that these luminescent hybrids represent a low-cost, environmentally friendly, size-controlled, easily processed and chemically stable alternative material to be used in light harvesting devices such as luminescent solar concentrators, optical fibres and sensors. Furthermore, this synthetic approach can be extended to a wide variety of luminescent NPs entrapped in hybrid matrices, thus leading to multifunctional and versatile materials for efficient tuneable nonlinear optical nanodevices.

  20. Easily processable multimodal spectral converters based on metal oxide/organic-inorganic hybrid nanocomposites.

    PubMed

    Julián-López, Beatriz; Gonell, Francisco; Lima, Patricia P; Freitas, Vânia T; André, Paulo S; Carlos, Luis D; Ferreira, Rute A S

    2015-10-01

    This manuscript reports the synthesis and characterization of the first organic-inorganic hybrid material exhibiting efficient multimodal spectral converting properties. The nanocomposite, made of Er(3+), Yb(3+) codoped zirconia nanoparticles (NPs) entrapped in a di-ureasil d-U(600) hybrid matrix, is prepared by an easy two-step sol-gel synthesis leading to homogeneous and transparent materials that can be very easily processed as monolith or film. Extensive structural characterization reveals that zirconia nanocrystals of 10-20 nm in size are efficiently dispersed into the hybrid matrix and that the local structure of the di-ureasil is not affected by the presence of the NPs. A significant enhancement in the refractive index of the di-ureasil matrix with the incorporation of the ZrO2 nanocrystals is observed. The optical study demonstrates that luminescent properties of both constituents are perfectly preserved in the final hybrid. Thus, the material displays a white-light photoluminescence from the di-ureasil component upon excitation at UV/visible radiation and also intense green and red emissions from the Er(3+)- and Yb(3+)-doped NPs after NIR excitation. The dynamics of the optical processes were also studied as a function of the lanthanide content and the thickness of the films. Our results indicate that these luminescent hybrids represent a low-cost, environmentally friendly, size-controlled, easily processed and chemically stable alternative material to be used in light harvesting devices such as luminescent solar concentrators, optical fibres and sensors. Furthermore, this synthetic approach can be extended to a wide variety of luminescent NPs entrapped in hybrid matrices, thus leading to multifunctional and versatile materials for efficient tuneable nonlinear optical nanodevices. PMID:26374133

  1. Interpretation Training Influences Memory for Prior Interpretations

    PubMed Central

    Salemink, Elske; Hertel, Paula; Mackintosh, Bundy

    2010-01-01

    Anxiety is associated with memory biases when the initial interpretation of the event is taken into account. This experiment examined whether modification of interpretive bias retroactively affects memory for prior events and their initial interpretation. Before training, participants imagined themselves in emotionally ambiguous scenarios to which they provided endings that often revealed their interpretations. Then they were trained to resolve the ambiguity in other situations in a consistently positive (n = 37) or negative way (n = 38) before they tried to recall the initial scenarios and endings. Results indicated that memory for the endings was imbued with the emotional tone of the training, whereas memory for the scenarios was unaffected. PMID:21171760

  2. Descriptive statistics.

    PubMed

    Shi, Runhua; McLarty, Jerry W

    2009-10-01

    In this article, we introduced basic concepts of statistics, types of distributions, and descriptive statistics. A few examples were also provided. The basic concepts presented herein are only a fraction of the concepts related to descriptive statistics. Also, there are many commonly used distributions not presented herein, such as Poisson distributions for rare events, exponential distributions, F distributions, and logistic distributions. More information can be found in many statistics books and publications. PMID:19891281
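
    As a small illustration of the descriptive statistics discussed above, the following sketch computes common summary measures for an invented sample; the data and the choice of measures are assumptions, not taken from the article.

```python
import numpy as np
from scipy import stats

sample = np.array([2.1, 3.4, 3.9, 4.2, 4.4, 5.0, 5.3, 6.8, 9.9])  # hypothetical data

print("mean   :", np.mean(sample))
print("median :", np.median(sample))
print("std    :", np.std(sample, ddof=1))                          # sample standard deviation
print("IQR    :", np.percentile(sample, 75) - np.percentile(sample, 25))
print("skew   :", stats.skew(sample))
```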

  3. Statistical Software.

    ERIC Educational Resources Information Center

    Callamaras, Peter

    1983-01-01

    This buyer's guide to seven major types of statistics software packages for microcomputers reviews Edu-Ware Statistics 3.0; Financial Planning; Speed Stat; Statistics with DAISY; Human Systems Dynamics package of Stats Plus, ANOVA II, and REGRESS II; Maxistat; and Moore-Barnes' MBC Test Construction and MBC Correlation. (MBR)

  4. Statistical Diversions

    ERIC Educational Resources Information Center

    Petocz, Peter; Sowey, Eric

    2008-01-01

    As a branch of knowledge, Statistics is ubiquitous and its applications can be found in (almost) every field of human endeavour. In this article, the authors track down the possible source of the link between the "Siren song" and applications of Statistics. Answers to their previous five questions and five new questions on Statistics are presented.

  5. SOCR: Statistics Online Computational Resource

    ERIC Educational Resources Information Center

    Dinov, Ivo D.

    2006-01-01

    The need for hands-on computer laboratory experience in undergraduate and graduate statistics education has been firmly established in the past decade. As a result a number of attempts have been undertaken to develop novel approaches for problem-driven statistical thinking, data analysis and result interpretation. In this paper we describe an…

  6. Motivating Play Using Statistical Reasoning

    ERIC Educational Resources Information Center

    Cross Francis, Dionne I.; Hudson, Rick A.; Lee, Mi Yeon; Rapacki, Lauren; Vesperman, Crystal Marie

    2014-01-01

    Statistical literacy is essential in everyone's personal lives as consumers, citizens, and professionals. To make informed life and professional decisions, students are required to read, understand, and interpret vast amounts of information, much of which is quantitative. To develop statistical literacy so students are able to make sense of…

  7. Revisiting the statistical analysis of pyroclast density and porosity data

    NASA Astrophysics Data System (ADS)

    Bernard, B.; Kueppers, U.; Ortiz, H.

    2015-03-01

    Explosive volcanic eruptions are commonly characterized based on a thorough analysis of the generated deposits. Amongst other characteristics in physical volcanology, density and porosity of juvenile clasts are some of the most frequently used characteristics to constrain eruptive dynamics. In this study, we evaluate the sensitivity of density and porosity data and introduce a weighting parameter to correct issues raised by the use of frequency analysis. Results of textural investigation can be biased by clast selection. Using statistical tools as presented here, the meaningfulness of a conclusion can be checked for any dataset easily. This is necessary to define whether or not a sample has met the requirements for statistical relevance, i.e. whether a dataset is large enough to allow for reproducible results. Graphical statistics are used to describe density and porosity distributions, similar to those used for grain-size analysis. This approach helps with the interpretation of volcanic deposits. To illustrate this methodology we chose two large datasets: (1) directed blast deposits of the 3640-3510 BC eruption of Chachimbiro volcano (Ecuador) and (2) block-and-ash-flow deposits of the 1990-1995 eruption of Unzen volcano (Japan). We propose adding this analysis to future investigations to check the objectivity of results achieved by different working groups and to guarantee the meaningfulness of the interpretation.

  8. Statistics 101 for Radiologists.

    PubMed

    Anvari, Arash; Halpern, Elkan F; Samir, Anthony E

    2015-10-01

    Diagnostic tests have wide clinical applications, including screening, diagnosis, measuring treatment effect, and determining prognosis. Interpreting diagnostic test results requires an understanding of key statistical concepts used to evaluate test efficacy. This review explains descriptive statistics and discusses probability, including mutually exclusive and independent events and conditional probability. In the inferential statistics section, a statistical perspective on study design is provided, together with an explanation of how to select appropriate statistical tests. Key concepts in recruiting study samples are discussed, including representativeness and random sampling. Variable types are defined, including predictor, outcome, and covariate variables, and the relationship of these variables to one another. In the hypothesis testing section, we explain how to determine if observed differences between groups are likely to be due to chance. We explain type I and II errors, statistical significance, and study power, followed by an explanation of effect sizes and how confidence intervals can be used to generalize observed effect sizes to the larger population. Statistical tests are explained in four categories: t tests and analysis of variance, proportion analysis tests, nonparametric tests, and regression techniques. We discuss sensitivity, specificity, accuracy, receiver operating characteristic analysis, and likelihood ratios. Measures of reliability and agreement, including κ statistics, intraclass correlation coefficients, and Bland-Altman graphs and analysis, are introduced. PMID:26466186
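
    As a toy illustration of some of the diagnostic-test quantities reviewed above (sensitivity, specificity, likelihood ratios and confidence intervals), a minimal sketch with made-up counts and a simple Wald interval; none of the numbers come from the article.

```python
import math

# Hypothetical 2x2 diagnostic-test results
tp, fn = 45, 5      # diseased patients: test positive / test negative
fp, tn = 10, 90     # healthy patients:  test positive / test negative

sens = tp / (tp + fn)                 # sensitivity
spec = tn / (tn + fp)                 # specificity
lr_pos = sens / (1 - spec)            # positive likelihood ratio

def wald_ci(p, n, z=1.96):
    """Simple Wald 95% confidence interval for a proportion."""
    se = math.sqrt(p * (1 - p) / n)
    return p - z * se, p + z * se

lo, hi = wald_ci(sens, tp + fn)
print(f"sensitivity = {sens:.2f} (95% CI {lo:.2f}-{hi:.2f})")
lo, hi = wald_ci(spec, tn + fp)
print(f"specificity = {spec:.2f} (95% CI {lo:.2f}-{hi:.2f})")
print(f"LR+ = {lr_pos:.1f}")
```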

  9. Open Window: When Easily Identifiable Genomes and Traits Are in the Public Domain

    PubMed Central

    Angrist, Misha

    2014-01-01

    “One can't be of an enquiring and experimental nature, and still be very sensible.” - Charles Fort [1] As the costs of personal genetic testing “self-quantification” fall, publicly accessible databases housing people's genotypic and phenotypic information are gradually increasing in number and scope. The latest entrant is openSNP, which allows participants to upload their personal genetic/genomic and self-reported phenotypic data. I believe the emergence of such open repositories of human biological data is a natural reflection of inquisitive and digitally literate people's desires to make genomic and phenotypic information more easily available to a community beyond the research establishment. Such unfettered databases hold the promise of contributing mightily to science, science education and medicine. That said, in an age of increasingly widespread governmental and corporate surveillance, we would do well to be mindful that genomic DNA is uniquely identifying. Participants in open biological databases are engaged in a real-time experiment whose outcome is unknown. PMID:24647311

  10. Shaft seals with an easily removable cylinder holder for low-pressure steam turbines

    NASA Astrophysics Data System (ADS)

    Zakharov, A. E.; Rodionov, D. A.; Pimenov, E. V.; Sobolev, A. S.

    2016-01-01

    The article is devoted to the problems that occur during the operation of the LPC shaft seals (SS) of turbines, particularly their bearings. The problems arising from the deterioration of the oil-protecting rings of the SS and bearings, and the consequences they can lead to, are considered. The existing types of SS housing construction are reviewed and their operational features are described. A new SS construction type with an easily removable holder is presented, and the design of its main elements is described. The sequence of operations for repair personnel when restoring the spacings of the new SS type is proposed. A comparative analysis of the new and the existing SS construction types is carried out. The results of assessing the efficiency, the operational convenience, and the economic effect after installation of the new type of seals are given. Conclusions about the prospects of the proposed construction are drawn from the results of the comparative analysis and the assessment. The main advantage of this design is the possibility of restoring the spacings both in the SS and in the oil-protecting rings during a short-term stop of a turbine, even without cooling it. This construction was successfully tested on an operating K-300-23.5 LMP turbine, and its adaptation for other turbines is quite possible.

  11. Predicting protein interface residues using easily accessible on-line resources.

    PubMed

    Maheshwari, Surabhi; Brylinski, Michal

    2015-11-01

    It has been more than a decade since the completion of the Human Genome Project that provided us with a complete list of human proteins. The next obvious task is to figure out how various parts interact with each other. On that account, we review 10 methods for protein interface prediction, which are freely available as web servers. In addition, we comparatively evaluate their performance on a common data set comprising different quality target structures. We find that using experimental structures and high-quality homology models, structure-based methods outperform those using only protein sequences, with global template-based approaches providing the best performance. For moderate-quality models, sequence-based methods often perform better than those structure-based techniques that rely on fine atomic details. We note that post-processing protocols implemented in several methods quantitatively improve the results only for experimental structures, suggesting that these procedures should be tuned up for computer-generated models. Finally, we anticipate that advanced meta-prediction protocols are likely to enhance interface residue prediction. Notwithstanding further improvements, easily accessible web servers already provide the scientific community with convenient resources for the identification of protein-protein interaction sites. PMID:25797794

  12. Sexual dimorphism in venom chemistry in Tetragnatha spiders is not easily explained by adult niche differences.

    PubMed

    Binford, Greta J; Gillespie, Rosemary G; Maddison, Wayne P

    2016-05-01

    Spider venom composition typically differs between sexes. This pattern is anecdotally thought to reflect differences in adult feeding biology. We used a phylogenetic approach to compare intersexual venom dimorphism between species that differ in adult niche dimorphism. Male and female venoms were compared within and between related species of Hawaiian Tetragnatha, a mainland congener, and outgroups. In some species of Hawaiian Tetragnatha adult females spin orb-webs and adult males capture prey while wandering, while in other species both males and females capture prey by wandering. We predicted that, if venom sexual dimorphism is primarily explained by differences in adult feeding biology, species in which both sexes forage by wandering would have monomorphic venoms or venoms with reduced dimorphism relative to species with different adult feeding biology. However, we found striking sexual dimorphism in venoms of both wandering and orb-weaving Tetragnatha species with males having high molecular weight components in their venoms that were absent in females, and a reduced concentration of low molecular weight components relative to females. Intersexual differences in venom composition within Tetragnatha were significantly larger than in non-Tetragnatha species. Diet composition was not different between sexes. This striking venom dimorphism is not easily explained by differences in feeding ecology or behavior. Rather, we hypothesize that the dimorphism reflects male-specific components that play a role in mating biology possibly in sexual stimulation, nuptial gifts and/or mate recognition. PMID:26908290

  13. Easily Regenerable Solid Adsorbents Based on Polyamines for Carbon Dioxide Capture from the Air

    SciTech Connect

    Goeppert, A; Zhang, H; Czaun, M; May, RB; Prakash, GKS; Olah, GA; Narayanan, SR

    2014-03-18

    Adsorbents prepared easily by impregnation of fumed silica with polyethylenimine (PEI) are promising candidates for the capture of CO2 directly from the air. These inexpensive adsorbents have high CO2 adsorption capacity at ambient temperature and can be regenerated in repeated cycles under mild conditions. Despite the very low CO2 concentration, they are able to scrub efficiently all CO2 out of the air in the initial hours of the experiments. The influence of parameters such as PEI loading, adsorption and desorption temperature, particle size, and PEI molecular weight on the adsorption behavior were investigated. The mild regeneration temperatures required could allow the use of waste heat available in many industrial processes as well as solar heat. CO2 adsorption from the air has a number of applications. Removal of CO2 from a closed environment, such as a submarine or space vehicles, is essential for life support. The supply of CO2-free air is also critical for alkaline fuel cells and batteries. Direct air capture of CO2 could also help mitigate the rising concerns about atmospheric CO2 concentration and associated climatic changes, while, at the same time, provide the first step for an anthropogenic carbon cycle.

  14. An easily reversible structural change underlies mechanisms enabling desert crust cyanobacteria to survive desiccation.

    PubMed

    Bar-Eyal, Leeat; Eisenberg, Ido; Faust, Adam; Raanan, Hagai; Nevo, Reinat; Rappaport, Fabrice; Krieger-Liszkay, Anja; Sétif, Pierre; Thurotte, Adrien; Reich, Ziv; Kaplan, Aaron; Ohad, Itzhak; Paltiel, Yossi; Keren, Nir

    2015-10-01

    Biological desert sand crusts are the foundation of desert ecosystems, stabilizing the sands and allowing colonization by higher order organisms. The first colonizers of the desert sands are cyanobacteria. Facing the harsh conditions of the desert, these organisms must withstand frequent desiccation-hydration cycles, combined with high light intensities. Here, we characterize structural and functional modifications to the photosynthetic apparatus that enable a cyanobacterium, Leptolyngbya sp., to thrive under these conditions. Using multiple in vivo spectroscopic and imaging techniques, we identified two complementary mechanisms for dissipating absorbed energy in the desiccated state. The first mechanism involves the reorganization of the phycobilisome antenna system, increasing excitonic coupling between antenna components. This provides better energy dissipation in the antenna rather than directed exciton transfer to the reaction center. The second mechanism is driven by constriction of the thylakoid lumen which limits diffusion of plastocyanin to P700. The accumulation of P700(+) not only prevents light-induced charge separation but also efficiently quenches excitation energy. These protection mechanisms employ existing components of the photosynthetic apparatus, forming two distinct functional modes. Small changes in the structure of the thylakoid membranes are sufficient for quenching of all absorbed energy in the desiccated state, protecting the photosynthetic apparatus from photoinhibitory damage. These changes can be easily reversed upon rehydration, returning the system to its high photosynthetic quantum efficiency. PMID:26188375

  15. Easily regenerable solid adsorbents based on polyamines for carbon dioxide capture from the air.

    PubMed

    Goeppert, Alain; Zhang, Hang; Czaun, Miklos; May, Robert B; Prakash, G K Surya; Olah, George A; Narayanan, S R

    2014-05-01

    Adsorbents prepared easily by impregnation of fumed silica with polyethylenimine (PEI) are promising candidates for the capture of CO2 directly from the air. These inexpensive adsorbents have high CO2 adsorption capacity at ambient temperature and can be regenerated in repeated cycles under mild conditions. Despite the very low CO2 concentration, they are able to scrub efficiently all CO2 out of the air in the initial hours of the experiments. The influence of parameters such as PEI loading, adsorption and desorption temperature, particle size, and PEI molecular weight on the adsorption behavior were investigated. The mild regeneration temperatures required could allow the use of waste heat available in many industrial processes as well as solar heat. CO2 adsorption from the air has a number of applications. Removal of CO2 from a closed environment, such as a submarine or space vehicles, is essential for life support. The supply of CO2-free air is also critical for alkaline fuel cells and batteries. Direct air capture of CO2 could also help mitigate the rising concerns about atmospheric CO2 concentration and associated climatic changes, while, at the same time, provide the first step for an anthropogenic carbon cycle. PMID:24644023

  16. Efficient transformation of grease to biodiesel using highly active and easily recyclable magnetic nanobiocatalyst aggregates.

    PubMed

    Ngo, Thao P N; Li, Aitao; Tiew, Kang W; Li, Zhi

    2013-10-01

    Green and efficient production of biodiesel (FAME) from waste grease containing a high amount of free fatty acid (FFA) was achieved by using novel magnetic nanobiocatalyst aggregates (MNA). Thermomyces lanuginosus lipase (TLL) and Candida antarctica lipase B (CALB) were covalently immobilized on core-shell structured iron oxide magnetic nanoparticles (80 nm), respectively, followed by freeze-drying to give MNA (13-17 μm) with high yield (80-89%) and high enzyme loading (61 mg TLL or 22 mg CALB per gram MNA). MNA TL showed the best performance among immobilized enzymes known thus far for the production of FAME from grease (17 wt.% FFA) with methanol, giving 99% yield in 12 h (3.3 wt.% catalyst). MNA TL was easily separated under a magnetic field and reused, retaining 88% productivity in the 11th cycle. MNA CA converted >97% of the FFA in grease (17 wt.% FFA) to FAME in 12 h (0.45 wt.% catalyst), making it useful in a two-step transformation of grease to biodiesel. PMID:23298767

  17. Easily separated silver nanoparticle-decorated magnetic graphene oxide: Synthesis and high antibacterial activity.

    PubMed

    Zhang, Huai-Zhi; Zhang, Chang; Zeng, Guang-Ming; Gong, Ji-Lai; Ou, Xiao-Ming; Huan, Shuang-Yan

    2016-06-01

    Silver nanoparticle-decorated magnetic graphene oxide (MGO-Ag) was synthesized by doping silver and Fe3O4 nanoparticles onto the surface of GO and was used as an antibacterial agent. MGO-Ag was characterized by scanning electron microscopy (SEM), transmission electron microscopy (TEM), energy-dispersive X-ray spectroscopy (EDS), X-ray diffraction (XRD), Raman spectroscopy and magnetic property tests. The magnetic iron oxide nanoparticles and nano-Ag were found to be well dispersed on the graphene oxide, and MGO-Ag exhibited excellent antibacterial activity against Escherichia coli and Staphylococcus aureus. Several factors affecting the antibacterial effect of MGO-Ag were investigated, such as temperature, time, pH and bacterial concentration. We also found that MGO-Ag maintained high inactivation rates after being used six times and can be separated easily after the antibacterial process. Moreover, the antibacterial mechanism is discussed, and the synergistic effect of GO, Fe3O4 nanoparticles and nano-Ag accounts for the high inactivation achieved by MGO-Ag. PMID:26994349

  18. Nonlinear statistical coupling

    NASA Astrophysics Data System (ADS)

    Nelson, Kenric P.; Umarov, Sabir

    2010-06-01

    By considering a nonlinear combination of the probabilities of a system, a physical interpretation of Tsallis statistics as representing the nonlinear coupling or decoupling of statistical states is proposed. The escort probability is interpreted as the coupled probability, with Q=1-q defined as the degree of nonlinear coupling between the statistical states. Positive values of Q have coupled statistical states, a larger entropy metric, and a maximum coupled-entropy distribution of compact-support coupled-Gaussians. Negative values of Q have decoupled statistical states and for -2
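
    A minimal sketch of the coupled (escort) probability described above, using the standard Tsallis form P_i = p_i^q / Σ_j p_j^q and the abstract's definition Q = 1 − q as the degree of nonlinear coupling; the example distribution is arbitrary.

```python
import numpy as np

def escort(p, q):
    """Escort (coupled) probabilities of Tsallis statistics: P_i = p_i^q / sum_j p_j^q."""
    p = np.asarray(p, dtype=float)
    w = p ** q
    return w / w.sum()

p = np.array([0.5, 0.3, 0.2])
for Q in (-0.5, 0.0, 0.5):          # degree of nonlinear coupling, Q = 1 - q
    q = 1.0 - Q
    print(f"Q = {Q:+.1f}  ->  escort probabilities {np.round(escort(p, q), 3)}")
```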

  19. Development of a numerical atlas of the easily flooded zones by marine immersions of the sandy littoral of Languedoc Roussillon (France)

    NASA Astrophysics Data System (ADS)

    Christophe, Esposito

    2010-05-01

    The Regional Directorate of Infrastructure (France) commissioned the Technical Studies Center of Infrastructure (CETE Mediterranee) to produce a numerical atlas of the zones of the sandy littoral of Languedoc-Roussillon that are easily flooded by marine immersion. The objective of this paper is to present the methodological results. To map the zones easily flooded by marine immersion (storm surge), we used several digital databases, for example the "BD Topo Pays" and the aerial photography of the National Geographical Institute (IGN), and the geological mapping of the Geological and Mining Research Bureau (BRGM). To complete these data, we carried out a geomorphological interpretation of the littoral from the aerial photography. This naturalist approach identifies the geomorphological objects (beach, sand dune, ...) of the sandy littoral. Our objective was to determine the limit between the coastal plain (flooded by storms) and the alluvial plain (flooded by river overflow), which is not subject to marine flooding. In the first phase of the study, a progressive methodology was used to develop a version of the numerical atlas based on the available geographical data of a geomorphological, historical and topographic nature. During the second phase, we applied this approach to the four French departments concerned (Pyrénées-Orientales, Aude, Hérault and Gard). The result is a map of the zones easily flooded by marine immersion along 230 km of sandy littoral. This mapping defines the geomorphological factors of the littoral and thereby provides a qualitative assessment of the marine immersion hazard. Keywords: Storm, Marine immersions, Atlas of the easily flooded zones, Languedoc-Roussillon, France

  20. OntologyWidget – a reusable, embeddable widget for easily locating ontology terms

    PubMed Central

    Beauheim, Catherine C; Wymore, Farrell; Nitzberg, Michael; Zachariah, Zachariah K; Jin, Heng; Skene, JH Pate; Ball, Catherine A; Sherlock, Gavin

    2007-01-01

    Background Biomedical ontologies are being widely used to annotate biological data in a computer-accessible, consistent and well-defined manner. However, due to their size and complexity, annotating data with appropriate terms from an ontology is often challenging for experts and non-experts alike, because there exist few tools that allow one to quickly find relevant ontology terms to easily populate a web form. Results We have produced a tool, OntologyWidget, which allows users to rapidly search for and browse ontology terms. OntologyWidget can easily be embedded in other web-based applications. OntologyWidget is written using AJAX (Asynchronous JavaScript and XML) and has two related elements. The first is a dynamic auto-complete ontology search feature. As a user enters characters into the search box, the appropriate ontology is queried remotely for terms that match the typed-in text, and the query results populate a drop-down list with all potential matches. Upon selection of a term from the list, the user can locate this term within a generic and dynamic ontology browser, which comprises the second element of the tool. The ontology browser shows the paths from a selected term to the root as well as parent/child tree hierarchies. We have implemented web services at the Stanford Microarray Database (SMD), which provide the OntologyWidget with access to over 40 ontologies from the Open Biological Ontology (OBO) website [1]. Each ontology is updated weekly. Adopters of the OntologyWidget can either use SMD's web services or elect to rely on their own. Deploying the OntologyWidget can be accomplished in three simple steps: (1) install Apache Tomcat [2] on one's web server, (2) download and install the OntologyWidget servlet stub that provides access to the SMD ontology web services, and (3) create an html (HyperText Markup Language) file that refers to the OntologyWidget using a simple, well-defined format. Conclusion We have developed OntologyWidget, an easy-to-use ontology search and display tool that can be used on any web page by creating a simple html description. OntologyWidget provides a rapid auto-complete search function paired with an interactive tree display. We have developed a web service layer that communicates between the web page interface and a database of ontology terms. We currently store 40 of the ontologies from the OBO website [1], as well as several others. These ontologies are automatically updated on a weekly basis. OntologyWidget can be used in any web-based application to take advantage of the ontologies we provide via web services or any other ontology that is provided elsewhere in the correct format. The full source code for the JavaScript and description of the OntologyWidget is available from . PMID:17854506

  1. Easily-handled method to isolate mesenchymal stem cells from coagulated human bone marrow samples

    PubMed Central

    Wang, Heng-Xiang; Li, Zhi-Yong; Guo, Zhi-Kun; Guo, Zi-Kuan

    2015-01-01

    AIM: To establish an easily-handled method to isolate mesenchymal stem cells (MSCs) from coagulated human bone marrow samples. METHODS: Thrombin was added to aliquots of seven heparinized human bone marrow samples to mimic marrow coagulation. The clots were untreated, treated with urokinase or mechanically cut into pieces before culture for MSCs. The un-coagulated samples and the clots were also stored at 4 °C for 8 or 16 h before the treatment. The numbers of colony-forming unit-fibroblast (CFU-F) in the different samples were determined. The adherent cells from different groups were passaged and their surface profile was analyzed with flow cytometry. Their capacities of in vitro osteogenesis and adipogenesis were observed after the cells were exposed to specific inductive agents. RESULTS: The average CFU-F number of urokinase-treated samples (16.85 ± 11.77/106) was comparable to that of un-coagulated control samples (20.22 ± 10.65/106, P = 0.293), which was significantly higher than those of mechanically-cut clots (6.5 ± 5.32/106, P < 0.01) and untreated clots (1.95 ± 1.86/106, P < 0.01). The CFU-F numbers decreased after samples were stored, but those of control and urokinase-treated clots remained higher than the other two groups. Consistently, the numbers of the attached cells at passage 0 were higher in control and urokinase-treated clots than those of mechanically-cut clots and untreated clots. The attached cells were fibroblast-like in morphology and homogenously positive for CD44, CD73 and CD90, and negative for CD31 and CD45. Also, they could be induced to differentiate into osteoblasts and adipocytes in vitro. CONCLUSION: Urokinase pretreatment is an optimal strategy to isolate MSCs from human bone marrow samples that are poorly aspirated and clotted. PMID:26435773

  2. Cholesteryl ester storage disease: an easily missed diagnosis in oligosymptomatic children.

    PubMed

    Freudenberg, F; Bufler, P; Ensenauer, R; Lohse, P; Koletzko, S

    2013-10-01

    Cholesteryl ester storage disease (CESD) is a rare, autosomal recessively inherited disorder resulting from deficient activity of lysosomal acid lipase (LAL). LAL is the key enzyme hydrolyzing cholesteryl esters and triglycerides stored in lysosomes after LDL receptor-mediated endocytosis. Mutations within the LIPA gene locus on chromosome 10q23.2-q23.3 may result either in the always fatal Wolman disease, where no LAL activity is found, or in the more benign disorder CESD with a reduced enzymatic activity, leading to massive accumulation of cholesteryl esters and triglycerides in many body tissues. CESD mostly affects the liver, with a spectrum ranging from isolated hepatomegaly to liver cirrhosis. Chronic diarrhea has been reported in some pediatric cases, while calcifications of the adrenal glands, the hallmark of Wolman disease, are rarely observed. Hypercholesterolemia and premature atherosclerosis are other typical disease manifestations. Hepatomegaly as a key finding has been reported in all 71 pediatric patients and in 134 of 135 adult cases in the literature. We present a 13-year-old boy with mildly elevated liver enzymes in the absence of hepatomegaly, finally diagnosed with CESD. Under pravastatin treatment, the patient has had normal laboratory findings and has been clinically unremarkable over 5 years of follow-up. To our knowledge, this is the first pediatric case of genetically and biopsy-confirmed CESD without hepatomegaly, suggesting that this diagnosis can be easily missed. It further raises the question about the natural course and the therapy required for this oligosymptomatic form. PMID:24122380

  3. The Sclerotic Scatter Limbal Arc Is More Easily Elicited under Mesopic Rather Than Photopic Conditions

    PubMed Central

    Denion, Eric; Lux, Anne-Laure; Mouriaux, Frédéric; Béraud, Guillaume

    2016-01-01

    Introduction We aimed to determine the limbal lighting illuminance thresholds (LLITs) required to trigger perception of sclerotic scatter at the opposite non-illuminated limbus (i.e. perception of a light limbal scleral arc) under different levels of ambient lighting illuminance (ALI). Material and Methods Twenty healthy volunteers were enrolled. The iris shade (light or dark) was graded by retrieving the median value of the pixels of a pre-determined zone of a gray-level iris photograph. Mean keratometry and central corneal pachymetry were recorded. Each subject was asked to lie down, and the ALI at eye level was set to mesopic values (10, 20, 40 lux), then photopic values (60, 80, 100, 150, 200 lux). For each ALI level, a light beam of gradually increasing illuminance was applied to the right temporal limbus until the LLIT was reached, i.e. the level required to produce the faint light arc that is characteristic of sclerotic scatter at the nasal limbus. Results After log-log transformation, a linear relationship between the logarithm of ALI and the logarithm of the LLIT was found (p<0.001), a 10% increase in ALI being associated with an average increase in the LLIT of 28.9%. Higher keratometry values were associated with higher LLIT values (p = 0.008) under low ALI levels, but the coefficient of the interaction was very small, representing a very limited effect. Iris shade and central corneal thickness values were not significantly associated with the LLIT. We also developed a censored linear model for ALI values ≤ 40 lux, showing a linear relationship between ALI and the LLIT, in which the LLIT value was 34.4 times greater than the ALI value. Conclusion Sclerotic scatter is more easily elicited under mesopic conditions than under photopic conditions and requires the LLIT value to be much higher than the ALI value, i.e. it requires extreme contrast. PMID:26964096
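
    A sketch, with invented numbers, of the log-log regression used above to relate ambient lighting illuminance (ALI) to the limbal lighting illuminance threshold (LLIT); the fitted slope can then be converted into the percentage change in LLIT associated with a 10% increase in ALI. The data below are not the study's measurements.

```python
import numpy as np

# Invented ALI (lux) and LLIT (lux) pairs, only to give the fit a shape to work with
ali  = np.array([10, 20, 40, 60, 80, 100, 150, 200], dtype=float)
llit = np.array([3.4e2, 2.1e3, 1.3e4, 3.8e4, 7.5e4, 1.2e5, 3.5e5, 7.4e5])

slope, intercept = np.polyfit(np.log(ali), np.log(llit), 1)   # log-log linear fit
pct = (1.10 ** slope - 1) * 100          # LLIT change implied by a 10% increase in ALI

print(f"log-log slope = {slope:.2f}")
print(f"a 10% increase in ALI is associated with a {pct:.1f}% increase in LLIT")
```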

  4. Clearly written, easily comprehended? The readability of websites providing information on epilepsy.

    PubMed

    Brigo, Francesco; Otte, Willem M; Igwe, Stanley C; Tezzon, Frediano; Nardone, Raffaele

    2015-03-01

    There is a general need for high-quality, easily accessible, and comprehensive health-care information on epilepsy to better inform the general population about this highly stigmatized neurological disorder. The aim of this study was to evaluate the health literacy level of eight popular English-written websites that provide information on epilepsy in quantitative terms of readability. Educational epilepsy material on these websites, including 41 Wikipedia articles, were analyzed for their overall level of readability and the corresponding academic grade level needed to comprehend the published texts on the first reading. The Flesch Reading Ease (FRE) was used to assess ease of comprehension while the Gunning Fog Index, Coleman-Liau Index, Flesch-Kincaid Grade Level, Automated Readability Index, and Simple Measure of Gobbledygook scales estimated the corresponding academic grade level needed for comprehension. The average readability of websites yielded results indicative of a difficult-to-fairly-difficult readability level (FRE results: 44.0±8.2), with text readability corresponding to an 11th academic grade level (11.3±1.9). The average FRE score of the Wikipedia articles was indicative of a difficult readability level (25.6±9.5), with the other readability scales yielding results corresponding to a 14th grade level (14.3±1.7). Popular websites providing information on epilepsy, including Wikipedia, often demonstrate a low level of readability. This can be ameliorated by increasing access to clear and concise online information on epilepsy and health in general. Short "basic" summaries targeted to patients and nonmedical users should be added to articles published in specialist websites and Wikipedia to ease readability. PMID:25601720
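
    For reference, the two most commonly cited of the readability scores used above can be computed directly from word, sentence and syllable counts using their standard published formulas; the counts in the sketch below are invented, not taken from the study.

```python
def flesch_reading_ease(words, sentences, syllables):
    """Flesch Reading Ease: higher scores indicate easier text (standard published formula)."""
    return 206.835 - 1.015 * (words / sentences) - 84.6 * (syllables / words)

def flesch_kincaid_grade(words, sentences, syllables):
    """Flesch-Kincaid Grade Level: approximate U.S. school grade needed to understand the text."""
    return 0.39 * (words / sentences) + 11.8 * (syllables / words) - 15.59

# Hypothetical counts for a short health-information web page
words, sentences, syllables = 480, 22, 840
print(f"FRE  = {flesch_reading_ease(words, sentences, syllables):.1f}")
print(f"FKGL = {flesch_kincaid_grade(words, sentences, syllables):.1f}")
```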

  5. Interpreting. PEPNet Tipsheet

    ERIC Educational Resources Information Center

    Darroch, Kathy; Marshall, Liza

    1998-01-01

    An interpreter's role is to facilitate communication and convey all auditory and signed information so that both hearing and deaf individuals may fully interact. The common types of services provided by interpreters are: (1) American Sign Language (ASL) Interpretation--a visual-gestural language with its own linguistic features; (2) Sign Language…

  6. Interpreting. PEPNet Tipsheet

    ERIC Educational Resources Information Center

    Darroch, Kathleen

    2010-01-01

    An interpreter's role is to facilitate communication and convey all auditory and signed information so that both hearing and deaf individuals may fully interact. The common types of services provided by interpreters are: (1) American Sign Language (ASL) Interpretation--a visual-gestural language with its own linguistic features; (2) Sign Language…

  7. Enhancing Table Interpretation Skills via Training in Table Creation

    ERIC Educational Resources Information Center

    Karazsia, Bryan T.

    2013-01-01

    Quantitative and statistical literacy are core domains in the undergraduate psychology curriculum. An important component of such literacy includes interpretation of visual aids, such as tables containing results from statistical analyses. This article presents a new technique for enhancing student interpretation of American Psychological…

  8. Statistical Diversions

    ERIC Educational Resources Information Center

    Petocz, Peter; Sowey, Eric

    2008-01-01

    In this article, the authors focus on hypothesis testing--that peculiarly statistical way of deciding things. Statistical methods for testing hypotheses were developed in the 1920s and 1930s by some of the most famous statisticians, in particular Ronald Fisher, Jerzy Neyman and Egon Pearson, who laid the foundations of almost all modern methods of…

  9. Statistical Diversions

    ERIC Educational Resources Information Center

    Petocz, Peter; Sowey, Eric

    2012-01-01

    The term "data snooping" refers to the practice of choosing which statistical analyses to apply to a set of data after having first looked at those data. Data snooping contradicts a fundamental precept of applied statistics, that the scheme of analysis is to be planned in advance. In this column, the authors shall elucidate the statistical…

  10. The emergent Copenhagen interpretation of quantum mechanics

    NASA Astrophysics Data System (ADS)

    Hollowood, Timothy J.

    2014-05-01

    We introduce a new and conceptually simple interpretation of quantum mechanics based on reduced density matrices of sub-systems from which the standard Copenhagen interpretation emerges as an effective description of macroscopically large systems. This interpretation describes a world in which definite measurement results are obtained with probabilities that reproduce the Born rule. Wave function collapse is seen to be a useful but fundamentally unnecessary piece of prudent book keeping which is only valid for macro-systems. The new interpretation lies in a class of modal interpretations in that it applies to quantum systems that interact with a much larger environment. However, we show that it does not suffer from the problems that have plagued similar modal interpretations like macroscopic superpositions and rapid flipping between macroscopically distinct states. We describe how the interpretation fits neatly together with fully quantum formulations of statistical mechanics and that a measurement process can be viewed as a process of ergodicity breaking analogous to a phase transition. The key feature of the new interpretation is that joint probabilities for the ergodic subsets of states of disjoint macro-systems only arise as emergent quantities. Finally we give an account of the EPR-Bohm thought experiment and show that the interpretation implies the violation of the Bell inequality characteristic of quantum mechanics but in a way that is rather novel. The final conclusion is that the Copenhagen interpretation gives a completely satisfactory phenomenology of macro-systems interacting with micro-systems.

  11. Interpreting Abstract Interpretations in Membership Equational Logic

    NASA Technical Reports Server (NTRS)

    Fischer, Bernd; Rosu, Grigore

    2001-01-01

    We present a logical framework in which abstract interpretations can be naturally specified and then verified. Our approach is based on membership equational logic which extends equational logics by membership axioms, asserting that a term has a certain sort. We represent an abstract interpretation as a membership equational logic specification, usually as an overloaded order-sorted signature with membership axioms. It turns out that, for any term, its least sort over this specification corresponds to its most concrete abstract value. Maude implements membership equational logic and provides mechanisms to calculate the least sort of a term efficiently. We first show how Maude can be used to get prototyping of abstract interpretations "for free." Building on the meta-logic facilities of Maude, we further develop a tool that automatically checks an abstract interpretation against a set of user-defined properties. This can be used to select an appropriate abstract interpretation, to characterize the specified loss of information during abstraction, and to compare different abstractions with each other.

  12. Preparation and Use of an Easily Constructed, Inexpensive Chamber for Viewing Courtship Behaviors of Fruit Flies, Drosophila sp.

    ERIC Educational Resources Information Center

    Christensen, Timothy J.; Labov, Jay B.

    1997-01-01

    Details the construction of a viewing chamber for fruit flies that connects to a dissecting microscope and features a design that enables students to easily move fruit flies in and out of the chamber. (DDR)

  13. Perception in statistical graphics

    NASA Astrophysics Data System (ADS)

    VanderPlas, Susan Ruth

    There has been quite a bit of research on statistical graphics and visualization, generally focused on new types of graphics, new software to create graphics, interactivity, and usability studies. Our ability to interpret and use statistical graphics hinges on the interface between the graph itself and the brain that perceives and interprets it, and there is substantially less research on the interplay between graph, eye, brain, and mind than is sufficient to understand the nature of these relationships. The goal of the work presented here is to further explore the interplay between a static graph, the translation of that graph from paper to mental representation (the journey from eye to brain), and the mental processes that operate on that graph once it is transferred into memory (mind). Understanding the perception of statistical graphics should allow researchers to create more effective graphs which produce fewer distortions and viewer errors while reducing the cognitive load necessary to understand the information presented in the graph. Taken together, these experiments should lay a foundation for exploring the perception of statistical graphics. There has been considerable research into the accuracy of numerical judgments viewers make from graphs, and these studies are useful, but it is more effective to understand how errors in these judgments occur so that the root cause of the error can be addressed directly. Understanding how visual reasoning relates to the ability to make judgments from graphs allows us to tailor graphics to particular target audiences. In addition, understanding the hierarchy of salient features in statistical graphics allows us to clearly communicate the important message from data or statistical models by constructing graphics which are designed specifically for the perceptual system.

  14. Automatic interpretation of biological tests.

    PubMed

    Boufriche-Boufaïda, Z

    1998-03-01

    In this article, an approach for an Automatic Interpretation of Biological Tests (AIBT) is described. The developed system is much needed in Preventive Medicine Centers (PMCs). It is designed as a self-sufficient system that could be easily used by trained nurses during the routine visit. The results that the system provides are not only useful for giving the PMC physicians a preliminary diagnosis, but also allow them more time to focus on serious cases, making the clinical visit more qualitative. On the other hand, because the use of such a system has been planned for many years, its possibilities for future extensions must be seriously considered. The methodology adopted can be interpreted as a combination of the advantages of two main approaches adopted in current diagnostic systems: the production system approach and the object-oriented system approach. From the rules, the system retains the ability of these approaches to capture the deductive processes of the expert in domains where causal mechanisms are often understood. The object-oriented approach guides the elicitation and the engineering of knowledge in such a way that abstractions, categorizations and classifications are encouraged whilst individual instances of objects of any type are recognized as separate, independent entities. PMID:9684093
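
    The paper's own rule base and object model are not given in this abstract; the following is only a minimal sketch of the general pattern it describes, combining an object-oriented representation of test results with simple production rules. All class names, tests, reference ranges and interpretive statements are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class TestResult:                      # object-oriented representation of one biological test
    name: str
    value: float
    low: float
    high: float

    def flag(self):
        if self.value < self.low:
            return "low"
        if self.value > self.high:
            return "high"
        return "normal"

# Production rules: (condition over the results, interpretive statement) -- illustrative only
RULES = [
    (lambda r: r["glucose"].flag() == "high",    "possible hyperglycaemia; suggest fasting glucose"),
    (lambda r: r["haemoglobin"].flag() == "low", "possible anaemia; suggest iron studies"),
]

results = {
    "glucose":     TestResult("glucose", 7.8, 3.9, 6.1),        # mmol/L, hypothetical limits
    "haemoglobin": TestResult("haemoglobin", 11.0, 12.0, 16.0), # g/dL, hypothetical limits
}

for condition, statement in RULES:
    if condition(results):
        print(statement)
```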

  15. SEER Statistics

    Cancer.gov

    The Surveillance, Epidemiology, and End Results (SEER) Program of the National Cancer Institute works to provide information on cancer statistics in an effort to reduce the burden of cancer among the U.S. population.

  16. Quick Statistics

    MedlinePlus

    ... population, or about 25 million Americans, has experienced tinnitus lasting at least five minutes in the past ... by NIDCD Epidemiology and Statistics Program staff: (1) tinnitus prevalence was obtained from the 2008 National Health ...

  17. Interpretation domestic and foreign.

    PubMed

    Vega, Jason A Wheeler

    2012-10-01

    Verbal and nonverbal behavior are on all fours when it comes to interpretation. This idea runs counter to an intuition that, to borrow a phrase, speech is cooked but action is raw. The author discusses some of the most compelling psychoanalytic work on the interpretation of action and presents empirical and philosophical findings about understanding speech. These concepts generate reciprocal implications about the possibility of interpreting the exotics of action and the necessity of interpreting the domestics of speech, treating both as equally dignified aspects of human behavior. The author presents a number of clinical examples to further illustrate these ideas. PMID:23326999

  18. Crying without a cause and being easily upset in two-year-olds: heritability and predictive power of behavioral problems.

    PubMed

    Groen-Blokhuis, Maria M; Middeldorp, Christel M; M van Beijsterveldt, Catharina E; Boomsma, Dorret I

    2011-10-01

    In order to estimate the influence of genetic and environmental factors on 'crying without a cause' and 'being easily upset' in 2-year-old children, a large twin study was carried out. Prospective data were available for ~18,000 2-year-old twin pairs from the Netherlands Twin Register. A bivariate genetic analysis was performed using structural equation modeling in the Mx software package. The influence of maternal personality characteristics and demographic and lifestyle factors was tested to identify specific risk factors that may underlie the shared environment of twins. Furthermore, it was tested whether crying without a cause and being easily upset were predictive of later internalizing, externalizing and attention problems. Crying without a cause yielded a heritability estimate of 60% in boys and girls. For easily upset, the heritability was estimated at 43% in boys and 31% in girls. The variance explained by shared environment varied between 35% and 63%. The correlation between crying without a cause and easily upset (r = .36) was explained both by genetic and shared environmental factors. Birth cohort, gestational age, socioeconomic status, parental age, parental smoking behavior and alcohol use during pregnancy did not explain the shared environmental component. Neuroticism of the mother explained a small proportion of the additive genetic, but not of the shared environmental effects for easily upset. Crying without a cause and being easily upset at age 2 were predictive of internalizing, externalizing and attention problems at age 7, with effect sizes of .28-.42. A large influence of shared environmental factors on crying without a cause and easily upset was detected. Although these effects could be specific to these items, we could not explain them by personality characteristics of the mother or by demographic and lifestyle factors, and we recognize that these effects may reflect other maternal characteristics. A substantial influence of genetic factors was found for the two items, which are predictive of later behavioral problems. PMID:21962130
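
    The study estimated heritability with bivariate structural equation modeling in Mx; as a much simpler stand-in, the sketch below applies Falconer's classical approximation to hypothetical MZ and DZ twin correlations to decompose variance into additive genetic (A), shared environmental (C) and unique environmental (E) components.

```python
# Falconer's approximation from twin correlations (a simplification of the SEM approach used in the paper)
r_mz, r_dz = 0.72, 0.42         # hypothetical MZ and DZ twin correlations

h2 = 2 * (r_mz - r_dz)          # additive genetic variance (A)
c2 = 2 * r_dz - r_mz            # shared environment (C)
e2 = 1 - r_mz                   # unique environment (E)

print(f"A = {h2:.2f}, C = {c2:.2f}, E = {e2:.2f}")
```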

  19. Quantum Mechanics and the Interpretation Problem

    NASA Astrophysics Data System (ADS)

    Lonney, Lawrence William, Jr.

    1990-01-01

    Although many well-articulated approaches to theory choice exist, no general approach to interpretation choice is available. This lack is particularly troublesome for quantum mechanics because its mathematical formalism is associated with many well-developed interpretations. The lack of a method for choosing among the various interpretations of quantum mechanics has motivated the construction of this dissertation. The search for an appropriate method focuses on two areas: attempts to establish the superiority of one particular interpretation of quantum mechanics over another, and general methods for choosing one theory over another. Regarding the former area, two attempts to choose the Statistical Ensemble interpretation of quantum mechanics over the Copenhagen interpretation are analyzed. One of these is authored by L. E. Ballentine and the other by J. L. Park. The conclusion of this analysis is that neither attempt succeeded and that a general approach to interpretation choice could not be extracted from either. The desired approach was eventually found in one of the general methods for choosing among theories. The essential element of this approach to interpretation choice lies in the recognition that each interpretation contains the seed of a unique research program. If the program is cultivated, it can eventually be judged relative to others which have sprouted from the same theory. The criteria for such a judgment are contained in the Methodology of Scientific Research Programmes approach to theory choice. This method is applied to the Statistical Ensemble and Copenhagen interpretations of quantum mechanics. Even though it did not result in an immediate choice between the two, it did provide guidance for identifying what is needed to make such a choice.

  20. Interpretation of psychophysics response curves using statistical physics.

    PubMed

    Knani, S; Khalfaoui, M; Hachicha, M A; Mathlouthi, M; Ben Lamine, A

    2014-05-15

    Experimental gustatory curves have been fitted for four sugars (sucrose, fructose, glucose and maltitol) using a double layer adsorption model. Three parameters of the model are fitted, namely the number of molecules per site n, the maximum response RM and the concentration at half saturation C1/2. The behaviours of these parameters are discussed in relation to each molecule's characteristics. Starting from the double layer adsorption model, we additionally determined the adsorption energy of each molecule on taste receptor sites. The use of the threshold expression allowed us to gain information about the adsorption occupation rate of a receptor site which fires a minimal response at a gustatory nerve. Finally, by means of this model we could calculate the configurational entropy of the adsorption system, which can describe the order and disorder of the adsorbent surface. PMID:24423561
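
    A minimal sketch, under stated assumptions, of the kind of fit described above: a least-squares fit of a saturating response curve with the three named parameters (RM, C1/2 and n). The exact double-layer adsorption expression is not reproduced in the abstract, so a generic Hill-type saturation form is used purely as a stand-in, and the data points are invented for illustration.

      # Fit a saturating response curve R(C) with parameters RM (maximum response),
      # C_half (half-saturation concentration) and n. The Hill-type form below is an
      # assumed stand-in, not the paper's double-layer adsorption model.
      import numpy as np
      from scipy.optimize import curve_fit

      def response(C, RM, C_half, n):
          x = (C / C_half) ** n
          return RM * x / (1.0 + x)

      # Hypothetical gustatory data: concentration (mol/L) vs. normalized nerve response.
      C = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0])
      R = np.array([0.05, 0.14, 0.35, 0.62, 0.83, 0.95])

      (RM, C_half, n), _ = curve_fit(response, C, R, p0=(1.0, 0.3, 1.0))
      print(f"RM={RM:.2f}, C1/2={C_half:.2f} mol/L, n={n:.2f}")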

  1. The Statistical Literacy Needed to Interpret School Assessment Data

    ERIC Educational Resources Information Center

    Chick, Helen; Pierce, Robyn

    2013-01-01

    State-wide and national testing in areas such as literacy and numeracy produces reports containing graphs and tables illustrating school and individual performance. These are intended to inform teachers, principals, and education organisations about student and school outcomes, to guide change and improvement. Given the complexity of the…

  2. Prosody and Interpretation

    ERIC Educational Resources Information Center

    Erekson, James A.

    2010-01-01

    Prosody is a means for "reading with expression" and is one aspect of oral reading competence. This theoretical inquiry asserts that prosody is central to interpreting text, and draws distinctions between "syntactic" prosody (for phrasing) and "emphatic" prosody (for interpretation). While reading with expression appears as a criterion in major…

  3. Higher Education Interpreting.

    ERIC Educational Resources Information Center

    Woll, Bencie; Porcari li Destri, Giulia

    This paper discusses issues related to the training and provision of interpreters for deaf students at institutions of higher education in the United Kingdom. Background information provided notes the increasing numbers of deaf and partially hearing students, the existence of funding to pay for interpreters, and trends in the availability of…

  4. Centralised interpretation of electrocardiograms.

    PubMed Central

    Macfarlane, P W; Watts, M P; Lawrie, T D; Walker, R S

    1977-01-01

    A system was devised so that a peripheral hospital could transmit electrocardiograms (ECGs) to a central computer for interpretation. The link that transmits both ECGs and reports is provided by the telephone network. Initial results showed that telephone transmission did not significantly affect the accuracy of the ECG interpretation. The centralised computer programme could be much more widely used to provide ECG interpretations. A telephone link would not be justified in health centres, where the demand for ECGs is fairly small, but ECGs recorded at a health centre can be sent to the computer for interpretation and returned the next day. The most cost-effective method of providing computer interpretation for several health centres in a large city would be to have a portable electrocardiograph and transmission facilities, which could be moved from centre to centre. PMID:319866

  5. Educational Statistics.

    ERIC Educational Resources Information Center

    Penfield, Douglas A.

    The 30 papers in the area of educational statistics that were presented at the 1972 AERA Conference are reviewed. The papers are categorized into five broad areas of interest: (1) theory of univariate analysis, (2) nonparametric methods, (3) regression-prediction theory, (4) multivariable methods, and (5) factor analysis. A list of the papers…

  6. Statistics Revelations

    ERIC Educational Resources Information Center

    Chicot, Katie; Holmes, Hilary

    2012-01-01

    The use, and misuse, of statistics is commonplace, yet in the printed format data representations can be either over simplified, supposedly for impact, or so complex as to lead to boredom, supposedly for completeness and accuracy. In this article the link to the video clip shows how dynamic visual representations can enliven and enhance the…

  7. Theory Interpretations in PVS

    NASA Technical Reports Server (NTRS)

    Owre, Sam; Shankar, Natarajan; Butler, Ricky W. (Technical Monitor)

    2001-01-01

    The purpose of this task was to provide a mechanism for theory interpretations in a prototype verification system (PVS) so that it is possible to demonstrate the consistency of a theory by exhibiting an interpretation that validates the axioms. The mechanization makes it possible to show that one collection of theories is correctly interpreted by another collection of theories under a user-specified interpretation for the uninterpreted types and constants. A theory instance is generated and imported, while the axiom instances are generated as proof obligations to ensure that the interpretation is valid. Interpretations can be used to show that an implementation is a correct refinement of a specification, that an axiomatically defined specification is consistent, or that an axiomatically defined specification captures its intended models. In addition, the theory parameter mechanism has been extended with a notion of theory as parameter so that a theory instance can be given as an actual parameter to an imported theory. Theory interpretations can thus be used to refine an abstract specification or to demonstrate the consistency of an axiomatic theory. In this report we describe the mechanism in detail. This extension is a part of PVS version 3.0, which will be publicly released in mid-2001.

  8. Smartphones for post-event analysis: a low-cost and easily accessible approach for mapping natural hazards

    NASA Astrophysics Data System (ADS)

    Tarolli, Paolo; Prosdocimi, Massimo; Sofia, Giulia; Dalla Fontana, Giancarlo

    2015-04-01

    A real opportunity and challenge for hazard mapping is offered by the use of smartphones and a low-cost, flexible photogrammetric technique ('Structure-from-Motion', SfM). Unlike other traditional photogrammetric methods, SfM allows three-dimensional geometries (Digital Surface Models, DSMs) to be reconstructed from randomly acquired images. The images can be acquired by standalone digital cameras (compact or reflex), or even by smartphones' built-in cameras. This represents a "revolutionary" advance compared with more expensive technologies and applications (e.g. Terrestrial Laser Scanner TLS, airborne lidar) (Tarolli, 2014). Through fast, simple and repeated field surveys, anyone with a smartphone can take numerous pictures of the same study area. In this way, high-resolution and multi-temporal DSMs may be obtained and used to better monitor and understand erosion and deposition processes. Furthermore, these topographic data can also help to quantify the volumes of material eroded by landslides and to recognize the major critical issues that usually occur during a natural hazard (e.g. river bank erosion and/or collapse due to floods). In this work we considered different case studies located in different environmental contexts of Italy, where extensive photosets were obtained using smartphones. TLS data were also included in the analysis as a benchmark against which to compare the SfM data. Digital Surface Models (DSMs) derived from SfM at centimeter grid-cell resolution proved effective for automatically recognizing areas subject to surface instabilities and for quantitatively estimating erosion and deposition volumes. Morphometric indexes such as landform curvature and surface roughness, and statistical thresholds (e.g. standard deviation) of these indices, served as the basis for the proposed analyses. The results indicate that the SfM technique applied through smartphones offers a fast, simple and affordable alternative to lidar technology. Anyone (including farmers, technicians, or Civil Protection staff) with a good smartphone can take photographs and easily obtain high-resolution DSMs from them. Therefore, SfM carried out with smartphones can be a very strategic tool for post-event field surveys, to increase the existing knowledge of such events, and to provide fast technical solutions for risk mitigation (e.g. landslide and flood risk management). The future challenge consists of using only a smartphone for local-scale post-event analyses. This could be further enhanced by the development of specific apps able to build a 3D view of the case study quickly and to arrange a preliminary quantitative analysis of the processes involved, ready to be sent to Civil Protection for further elaboration. Tarolli, P. (2014). High-resolution topography for understanding Earth surface processes: opportunities and challenges. Geomorphology, 216, 295-312, doi:10.1016/j.geomorph.2014.03.008.
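
    A minimal sketch, under stated assumptions, of the kind of morphometric screening mentioned above: surface roughness computed as the local standard deviation of a DSM and thresholded to flag candidate unstable cells. The window size, threshold and synthetic DSM are illustrative choices, not values from the study.

      # Roughness = local standard deviation of elevation in a moving window;
      # cells above mean + 1 s.d. are flagged (illustrative threshold).
      import numpy as np
      from scipy.ndimage import uniform_filter

      def local_roughness(dsm, size=5):
          mean = uniform_filter(dsm, size)
          mean_sq = uniform_filter(dsm * dsm, size)
          return np.sqrt(np.maximum(mean_sq - mean * mean, 0.0))

      dsm = np.random.default_rng(0).normal(100.0, 0.05, (200, 200))  # synthetic DSM (m)
      rough = local_roughness(dsm, size=5)
      flagged = rough > rough.mean() + rough.std()   # candidate unstable cells
      print(f"{flagged.mean():.1%} of cells flagged as rough")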

  9. Double copper sheath multiconductor instrumentation cable is durable and easily installed in high thermal or nuclear radiation area

    NASA Technical Reports Server (NTRS)

    Mc Crae, A. W., Jr.

    1967-01-01

    Multiconductor instrumentation cable in which the conducting wires are routed through two concentric copper tube sheaths, employing a compressed insulator between the conductors and between the inner and outer sheaths, is durable and easily installed in high thermal or nuclear radiation area. The double sheath is a barrier against moisture, abrasion, and vibration.

  10. Statistics Poster Challenge for Schools

    ERIC Educational Resources Information Center

    Payne, Brad; Freeman, Jenny; Stillman, Eleanor

    2013-01-01

    The analysis and interpretation of data are important life skills. A poster challenge for schoolchildren provides an innovative outlet for these skills and demonstrates their relevance to daily life. We discuss our Statistics Poster Challenge and the lessons we have learned.

  11. Singular statistics.

    PubMed

    Bogomolny, E; Gerland, U; Schmit, C

    2001-03-01

    We consider the statistical distribution of zeros of random meromorphic functions whose poles are independent random variables. It is demonstrated that correlation functions of these zeros can be computed analytically, and explicit calculations are performed for the two-point correlation function. This problem naturally appears in, e.g., rank-1 perturbation of an integrable Hamiltonian and, in particular, when a delta-function potential is added to an integrable billiard. PMID:11308740

  12. Statistical Optics

    NASA Astrophysics Data System (ADS)

    Goodman, Joseph W.

    2000-07-01

    The Wiley Classics Library consists of selected books that have become recognized classics in their respective fields. With these new unabridged and inexpensive editions, Wiley hopes to extend the life of these important works by making them available to future generations of mathematicians and scientists. Currently available in the Series: T. W. Anderson The Statistical Analysis of Time Series T. S. Arthanari & Yadolah Dodge Mathematical Programming in Statistics Emil Artin Geometric Algebra Norman T. J. Bailey The Elements of Stochastic Processes with Applications to the Natural Sciences Robert G. Bartle The Elements of Integration and Lebesgue Measure George E. P. Box & Norman R. Draper Evolutionary Operation: A Statistical Method for Process Improvement George E. P. Box & George C. Tiao Bayesian Inference in Statistical Analysis R. W. Carter Finite Groups of Lie Type: Conjugacy Classes and Complex Characters R. W. Carter Simple Groups of Lie Type William G. Cochran & Gertrude M. Cox Experimental Designs, Second Edition Richard Courant Differential and Integral Calculus, Volume I RIchard Courant Differential and Integral Calculus, Volume II Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume I Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume II D. R. Cox Planning of Experiments Harold S. M. Coxeter Introduction to Geometry, Second Edition Charles W. Curtis & Irving Reiner Representation Theory of Finite Groups and Associative Algebras Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume I Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume II Cuthbert Daniel Fitting Equations to Data: Computer Analysis of Multifactor Data, Second Edition Bruno de Finetti Theory of Probability, Volume I Bruno de Finetti Theory of Probability, Volume 2 W. Edwards Deming Sample Design in Business Research

  13. Easily repairable networks

    NASA Astrophysics Data System (ADS)

    Fink, Thomas

    2015-03-01

    We introduce a simple class of distribution networks which withstand damage by being repairable instead of redundant. Instead of asking how hard it is to disconnect nodes through damage, we ask how easy it is to reconnect nodes after damage. We prove that optimal networks on regular lattices have an expected cost of reconnection proportional to the lattice length, and that such networks have exactly three levels of structural hierarchy. We extend our results to networks subject to repeated attacks, in which the repairs themselves must be repairable. We find that, in exchange for a modest increase in repair cost, such networks are able to withstand any number of attacks. We acknowledge support from the Defense Threat Reduction Agency, BCG and EU FP7 (Growthcom).

  14. Programs for Training Interpreters.

    ERIC Educational Resources Information Center

    American Annals of the Deaf, 2003

    2003-01-01

    This listing provides directory information on U.S. programs for training interpreters for individuals with deafness. Schools are listed by state and include director and degree information. (Author/CR)

  15. Interpretation of Bernoulli's Equation.

    ERIC Educational Resources Information Center

    Bauman, Robert P.; Schwaneberg, Rolf

    1994-01-01

    Discusses Bernoulli's equation with regards to: horizontal flow of incompressible fluids, change of height of incompressible fluids, gases, liquids and gases, and viscous fluids. Provides an interpretation, properties, terminology, and applications of Bernoulli's equation. (MVL)

  16. Interpretation of Biosphere Reserves.

    ERIC Educational Resources Information Center

    Merriman, Tim

    1994-01-01

    Introduces the Man and the Biosphere Programme (MAB) to monitor the 193 biogeographical provinces of the Earth and the creation of biosphere reserves. Highlights the need for interpreters to become familiar or involved with MAB program activities. (LZ)

  17. Interpreter-mediated dentistry.

    PubMed

    Bridges, Susan; Drew, Paul; Zayts, Olga; McGrath, Colman; Yiu, Cynthia K Y; Wong, H M; Au, T K F

    2015-05-01

    The global movements of healthcare professionals and patient populations have increased the complexities of medical interactions at the point of service. This study examines interpreter mediated talk in cross-cultural general dentistry in Hong Kong where assisting para-professionals, in this case bilingual or multilingual Dental Surgery Assistants (DSAs), perform the dual capabilities of clinical assistant and interpreter. An initial language use survey was conducted with Polyclinic DSAs (n = 41) using a logbook approach to provide self-report data on language use in clinics. Frequencies of mean scores using a 10-point visual analogue scale (VAS) indicated that the majority of DSAs spoke mainly Cantonese in clinics and interpreted for postgraduates and professors. Conversation Analysis (CA) examined recipient design across a corpus (n = 23) of video-recorded review consultations between non-Cantonese speaking expatriate dentists and their Cantonese L1 patients. Three patterns of mediated interpreting indicated were: dentist designated expansions; dentist initiated interpretations; and assistant initiated interpretations to both the dentist and patient. The third, rather than being perceived as negative, was found to be framed either in response to patient difficulties or within the specific task routines of general dentistry. The findings illustrate trends in dentistry towards personalized care and patient empowerment as a reaction to product delivery approaches to patient management. Implications are indicated for both treatment adherence and the education of dental professionals. PMID:25828074

  18. Customizable tool for ecological data entry, assessment, monitoring, and interpretation

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The Database for Inventory, Monitoring and Assessment (DIMA) is a highly customizable tool for data entry, assessment, monitoring, and interpretation. DIMA is a Microsoft Access database that can easily be used without Access knowledge and is available at no cost. Data can be entered for common, nat...

  19. Interpreting the radon transform using Prolog

    NASA Astrophysics Data System (ADS)

    Batchelor, Bruce G.

    1992-03-01

    The Radon transform is an important method for identifying linear features in a digital image. However, the images which the Radon transform generates are complex and require intelligent interpretation, to identify lines in the input image correctly. This article describes how the images can be pre-processed to make the spots in the Radon transform image more easily identified and describes Prolog programs which can recognize constellations of points in the Radon transform image and thereby identify geometric figures within the input image.
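
    A minimal sketch (in Python rather than the article's Prolog) of the pipeline described above: compute the Radon transform of an image containing a line, then locate the brightest spot, whose angle/offset coordinates identify the line. The synthetic image and parameters are illustrative.

      # Radon transform of a synthetic diagonal line; the peak of the sinogram
      # gives the (angle, offset) of the line in the input image.
      import numpy as np
      from skimage.transform import radon

      img = np.zeros((100, 100))
      img[np.arange(100), np.arange(100)] = 1.0   # a diagonal line

      theta = np.linspace(0.0, 180.0, 180, endpoint=False)
      sinogram = radon(img, theta=theta, circle=False)   # rows: offsets, cols: angles

      offset_idx, angle_idx = np.unravel_index(np.argmax(sinogram), sinogram.shape)
      print(f"strongest line at angle ~{theta[angle_idx]:.0f} deg, offset bin {offset_idx}")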

  20. Nationally consistent and easily-implemented approach to evaluate littoral-riparian habitat quality in lakes and reservoirs

    EPA Science Inventory

    The National Lakes Assessment (NLA) and other lake survey and monitoring efforts increasingly rely upon biological assemblage data to define lake condition. Information concerning the multiple dimensions of physical and chemical habitat is necessary to interpret this biological ...

  1. Modification of codes NUALGAM and BREMRAD. Volume 3: Statistical considerations of the Monte Carlo method

    NASA Technical Reports Server (NTRS)

    Firstenberg, H.

    1971-01-01

    The statistics of the Monte Carlo method are considered in relation to the interpretation of the NUGAM2 and NUGAM3 computer code results. A numerical experiment using the NUGAM2 code is presented and the results are statistically interpreted.
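
    A small illustrative sketch of the statistical question the report addresses, namely how to attach an uncertainty to a Monte Carlo result; here pi is estimated by rejection sampling and the Monte Carlo standard error of the estimate is reported. This is a generic example, not the NUGAM2 calculation.

      # Estimate pi and its Monte Carlo standard error from per-sample estimates.
      import numpy as np

      rng = np.random.default_rng(42)
      n = 100_000
      hits = (rng.random(n) ** 2 + rng.random(n) ** 2) <= 1.0   # inside quarter circle
      estimates = 4.0 * hits                                     # per-sample estimator of pi
      mean = estimates.mean()
      stderr = estimates.std(ddof=1) / np.sqrt(n)                # Monte Carlo standard error
      print(f"pi ~ {mean:.4f} +/- {stderr:.4f}")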

  2. Geological interpretation of a Gemini photo

    USGS Publications Warehouse

    Hemphill, William R.; Danilchik, Walter

    1968-01-01

    Study of the Gemini V photograph of the Salt Range and Potwar Plateau, West Pakistan, indicates that small-scale orbital photographs permit recognition of the regional continuity of some geologic features, particularly faults and folds that could be easily overlooked on conventional air photographs of larger scale. Some stratigraphic relationships can also be recognized on the orbital photograph, but with only minimal previous geologic knowledge of the area, these interpretations are less conclusive or reliable than the interpretation of structure. It is suggested that improved atmospheric penetration could be achieved through the use of color infrared film. Photographic expression of topography could also be improved by deliberately photographing some areas during periods of low sun angle.

  3. Interpreting wireline measurements in coal beds

    SciTech Connect

    Johnston, D.J. )

    1991-06-01

    When logging coal seams with wireline tools, the interpretation method needed to evaluate the coals is different from that used for conventional oil and gas reservoirs. Wireline logs identify coals easily. For an evaluation, the contribution of each coal component to the raw measurements must be considered. This paper will discuss how each log measurement is affected by each component. The components of a coal will be identified as the mineral matter, macerals, moisture content, rank, gas content, and cleat porosity. The measurements illustrated are from the resistivity, litho-density, neutron, sonic, dielectric, and geochemical tools. Once the coal component effects have been determined, an interpretation of the logs can be made. This paper will illustrate how to use these corrected logs in a coal evaluation.

  4. SOCR: Statistics Online Computational Resource

    PubMed Central

    Dinov, Ivo D.

    2011-01-01

    The need for hands-on computer laboratory experience in undergraduate and graduate statistics education has been firmly established in the past decade. As a result a number of attempts have been undertaken to develop novel approaches for problem-driven statistical thinking, data analysis and result interpretation. In this paper we describe an integrated educational web-based framework for: interactive distribution modeling, virtual online probability experimentation, statistical data analysis, visualization and integration. Following years of experience in statistical teaching at all college levels using established licensed statistical software packages, like STATA, S-PLUS, R, SPSS, SAS, Systat, etc., we have attempted to engineer a new statistics education environment, the Statistics Online Computational Resource (SOCR). This resource performs many of the standard types of statistical analysis, much like other classical tools. In addition, it is designed in a plug-in object-oriented architecture and is completely platform independent, web-based, interactive, extensible and secure. Over the past 4 years we have tested, fine-tuned and reanalyzed the SOCR framework in many of our undergraduate and graduate probability and statistics courses and have evidence that SOCR resources build student’s intuition and enhance their learning. PMID:21451741

  5. SOCR: Statistics Online Computational Resource.

    PubMed

    Dinov, Ivo D

    2006-10-01

    The need for hands-on computer laboratory experience in undergraduate and graduate statistics education has been firmly established in the past decade. As a result a number of attempts have been undertaken to develop novel approaches for problem-driven statistical thinking, data analysis and result interpretation. In this paper we describe an integrated educational web-based framework for: interactive distribution modeling, virtual online probability experimentation, statistical data analysis, visualization and integration. Following years of experience in statistical teaching at all college levels using established licensed statistical software packages, like STATA, S-PLUS, R, SPSS, SAS, Systat, etc., we have attempted to engineer a new statistics education environment, the Statistics Online Computational Resource (SOCR). This resource performs many of the standard types of statistical analysis, much like other classical tools. In addition, it is designed in a plug-in object-oriented architecture and is completely platform independent, web-based, interactive, extensible and secure. Over the past 4 years we have tested, fine-tuned and reanalyzed the SOCR framework in many of our undergraduate and graduate probability and statistics courses and have evidence that SOCR resources build student's intuition and enhance their learning. PMID:21451741

  6. Ta3N5-Pt nonwoven cloth with hierarchical nanopores as efficient and easily recyclable macroscale photocatalysts

    PubMed Central

    Li, Shijie; Zhang, Lisha; Wang, Huanli; Chen, Zhigang; Hu, Junqing; Xu, Kaibing; Liu, Jianshe

    2014-01-01

    Traditional nanosized photocatalysts usually have high photocatalytic activity but cannot be efficiently recycled. Film-shaped photocatalysts on substrates can be easily recycled, but they have low surface area and/or high production cost. To solve these problems, we report on the design and preparation of efficient and easily recyclable macroscale photocatalysts with nanostructure, using Ta3N5 as a model semiconductor. Ta3N5-Pt nonwoven cloth has been prepared by an electrospinning-calcination-nitridation-wet impregnation method, and it is composed of Ta3N5 fibers with diameters of 150–200 nm and hierarchical pores. Furthermore, these fibers are constructed from Ta3N5 nanoparticles with a diameter of ~25 nm which are decorated with Pt nanoparticles with a diameter of ~2.5 nm. Importantly, the Ta3N5-Pt cloth can be used as an efficient and easily recyclable macroscale photocatalyst with a wide visible-light response for the degradation of methylene blue and parachlorophenol, suggesting a very promising application as a "photocatalyst dam" for polluted rivers. PMID:24496147

  7. Hold My Calls: An Activity for Introducing the Statistical Process

    ERIC Educational Resources Information Center

    Abel, Todd; Poling, Lisa

    2015-01-01

    Working with practicing teachers, this article demonstrates, through the facilitation of a statistical activity, how to introduce and investigate the unique qualities of the statistical process including: formulate a question, collect data, analyze data, and interpret data.

  8. Interpreting Astronomical Spectra

    NASA Astrophysics Data System (ADS)

    Emerson, D.

    1996-06-01

    Interpreting Astronomical Spectra D. Emerson Institute for Astronomy, Department of Physics and Astronomy, The University of Edinburgh "Interpreting Astronomical Spectra" describes how physical conditions such as temperature, density and composition can be obtained from the spectra of a broad range of astronomical environments ranging from the cold interstellar medium to very hot coronal gas and from stellar atmospheres to quasars. In this book the author has succeeded in providing a coherent and integrated approach to the interpretation of astronomical spectroscopy, placing the emphasis on the physical understanding of spectrum formation rather than on instrumental considerations. MKS units and consistent symbols are employed throughout so that the fundamental ideas common to diverse environments are made clear and the importance of different temperature ranges and densities can be seen. Aimed at senior undergraduates and graduates studying physics, astronomy and astrophysics, this book will also appeal to the professional astronomer.

  9. Interpreting Astronomical Spectra

    NASA Astrophysics Data System (ADS)

    Emerson, D.

    1999-03-01

    Interpreting Astronomical Spectra D. Emerson Institute for Astronomy, Department of Physics and Astronomy, The University of Edinburgh "Interpreting Astronomical Spectra" describes how physical conditions such as temperature, density and composition can be obtained from the spectra of a broad range of astronomical environments ranging from the cold interstellar medium to very hot coronal gas and from stellar atmospheres to quasars. In this book the author has succeeded in providing a coherent and integrated approach to the interpretation of astronomical spectroscopy, placing the emphasis on the physical understanding of spectrum formation rather than on instrumental considerations. MKS units and consistent symbols are employed throughout so that the fundamental ideas common to diverse environments are made clear and the importance of different temperature ranges and densities can be seen. Aimed at senior undergraduates and graduates studying physics, astronomy and astrophysics, this book will also appeal to the professional astronomer.

  10. Copenhagen and Transactional Interpretations

    NASA Astrophysics Data System (ADS)

    Görnitz, Th.; von Weizsäcker, C. F.

    1988-02-01

    The Copenhagen interpretation (CI) never received an authoritative codification. It was a minimum semantics of quantum mechanics. We assume that it expresses a theory identical with the Transactional Interpretation (TI) when the observer is included into the system described by the theory. A theory consists of a mathematical structure with a physical semantics. Now, CI rests on an implicit description of the modes of time which is also presupposed by the Second Law of Thermodynamics. Essential is the futuric meaning of probability as a prediction of a relative frequency. CI can be shown to be fully consistent on this basis. The TI and CI can be translated into each other by a simple dictionary. The TI describes all events as CI describes past events; CI calls future events possibilities, which TI treats like facts. All predictions of both interpretations agree; we suppose the difference to be linguistic.

  11. Considerations When Working with Interpreters.

    ERIC Educational Resources Information Center

    Hwa-Froelich, Deborah A.; Westby, Carol E.

    2003-01-01

    This article describes the current training and certification procedures in place for linguistic interpreters, the continuum of interpreter roles, and how interpreters' perspectives may influence the interpretive interaction. The specific skills needed for interpreting in either health care or educational settings are identified. A table compares…

  12. The ADAMS interactive interpreter

    SciTech Connect

    Rietscha, E.R.

    1990-12-17

    The ADAMS (Advanced DAta Management System) project is exploring next generation database technology. Database management does not follow the usual programming paradigm. Instead, the database dictionary provides an additional name space environment that should be interactively created and tested before writing application code. This document describes the implementation and operation of the ADAMS Interpreter, an interactive interface to the ADAMS data dictionary and runtime system. The Interpreter executes individual statements of the ADAMS Interface Language, providing a fast, interactive mechanism to define and access persistent databases. 5 refs.

  13. Social Maladjustment: An Interpretation.

    ERIC Educational Resources Information Center

    Center, David B.

    The exclusionary term "social maladjustment," part of the definition of serious emotional disturbance in Public Law 94-142 (the Education for All Handicapped Children Act), has been an enigma for special education. This paper attempts to limit the interpretation of social maladjustment in order to counter the effects of such decisions as "Honig vs. Doe" in…

  14. Deafness and Interpreting.

    ERIC Educational Resources Information Center

    New Jersey State Dept. of Labor, Trenton. Div. of the Deaf.

    This paper explains how the hearing loss of deaf persons affects communication, describes methods deaf individuals use to communicate, and addresses the role of interpreters in the communication process. The volume covers: communication methods such as speechreading or lipreading, written notes, gestures, or sign language (American Sign Language,

  15. Interpreting & Biomechanics. PEPNet Tipsheet

    ERIC Educational Resources Information Center

    PEPNet-Northeast, 2001

    2001-01-01

    Cumulative trauma disorder (CTD) refers to a collection of disorders associated with nerves, muscles, tendons, bones, and the neurovascular (nerves and related blood vessels) system. CTD symptoms may involve the neck, back, shoulders, arms, wrists, or hands. Interpreters with CTD may experience a variety of symptoms including: pain, joint…

  16. Fractal interpretation of intermittency

    SciTech Connect

    Hwa, R.C.

    1991-12-01

    Implication of intermittency in high-energy collisions is first discussed. Then follows a description of the fractal interpretation of intermittency. A basic quantity with asymptotic fractal behavior is introduced. It is then shown how the factorial moments and the G moments can be expressed in terms of it. The relationship between the intermittency indices and the fractal indices is made explicit.

  17. Interpreting the Constitution.

    ERIC Educational Resources Information Center

    Brennan, William J., Jr.

    1987-01-01

    Discusses constitutional interpretations relating to capital punishment and protection of human dignity. Points out the document's effectiveness in creating a new society by adapting its principles to current problems and needs. Considers two views of the Constitution that lead to controversy over the legitimacy of judicial decisions. (PS)

  18. Abstract Interpreters for Free

    NASA Astrophysics Data System (ADS)

    Might, Matthew

    In small-step abstract interpretations, the concrete and abstract semantics bear an uncanny resemblance. In this work, we present an analysis-design methodology that both explains and exploits that resemblance. Specifically, we present a two-step method to convert a small-step concrete semantics into a family of sound, computable abstract interpretations. The first step re-factors the concrete state-space to eliminate recursive structure; this refactoring of the state-space simultaneously determines a store-passing-style transformation on the underlying concrete semantics. The second step uses inference rules to generate an abstract state-space and a Galois connection simultaneously. The Galois connection allows the calculation of the "optimal" abstract interpretation. The two-step process is unambiguous, but nondeterministic: at each step, analysis designers face choices. Some of these choices ultimately influence properties such as flow-, field- and context-sensitivity. Thus, under the method, we can give the emergence of these properties a graph-theoretic characterization. To illustrate the method, we systematically abstract the continuation-passing style lambda calculus to arrive at two distinct families of analyses. The first is the well-known k-CFA family of analyses. The second consists of novel "environment-centric" abstract interpretations, none of which appear in the literature on static analysis of higher-order programs.
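
    For readers unfamiliar with the general idea, the toy sketch below illustrates abstract interpretation in miniature: an arithmetic expression is evaluated over an abstract sign domain instead of concrete integers, so the analysis is finite and over-approximates the concrete result. It is only a generic illustration, not the paper's CPS/k-CFA construction.

      # Abstract evaluation over the sign domain {-, 0, +, T} (T = unknown sign).
      NEG, ZERO, POS, TOP = "-", "0", "+", "T"

      def abs_add(a, b):
          if ZERO in (a, b):
              return b if a == ZERO else a
          return a if a == b and a != TOP else TOP   # mixed or unknown signs -> TOP

      def abs_mul(a, b):
          if ZERO in (a, b):
              return ZERO
          if TOP in (a, b):
              return TOP
          return POS if a == b else NEG

      # Abstractly evaluate (x + x) * y with x positive and y negative.
      x, y = POS, NEG
      print(abs_mul(abs_add(x, x), y))   # -> "-"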

  19. Translation and Interpretation.

    ERIC Educational Resources Information Center

    Richardson, Ian M.

    1989-01-01

    An examination of the logic of second language processing argues that knowledge of language structure does not necessarily result in effective language comprehension. Teachers must help students to bridge the gap between translation, which emphasizes lexical and syntactic meaning, and interpretation, which involves global comprehension. (CB)

  20. The interpretation of fuzziness.

    PubMed

    Wang, P

    1996-01-01

    By analyzing related issues in psychology and linguistics, two basic types of fuzziness can be attributed to similarity and relativity, respectively. In both cases, it is possible to interpret grade of membership as the proportion of positive evidence, so as to treat fuzziness and randomness uniformly. PMID:18263034

  1. Tokens: Facts and Interpretation.

    ERIC Educational Resources Information Center

    Schmandt-Besserat, Denise

    1986-01-01

    Summarizes some of the major pieces of evidence concerning the archeological clay tokens, specifically the technique for their manufacture, their geographic distribution, chronology, and the context in which they are found. Discusses the interpretation of tokens as the first example of visible language, particularly as an antecedent of Sumerian…

  2. Listening and Message Interpretation

    ERIC Educational Resources Information Center

    Edwards, Renee

    2011-01-01

    Message interpretation, the notion that individuals assign meaning to stimuli, is related to listening presage, listening process, and listening product. As a central notion of communication, meaning includes (a) denotation and connotation, and (b) content and relational meanings, which can vary in ambiguity and vagueness. Past research on message…

  3. Interpreting & Biomechanics. PEPNet Tipsheet

    ERIC Educational Resources Information Center

    PEPNet-Northeast, 2001

    2001-01-01

    Cumulative trauma disorder (CTD) refers to a collection of disorders associated with nerves, muscles, tendons, bones, and the neurovascular (nerves and related blood vessels) system. CTD symptoms may involve the neck, back, shoulders, arms, wrists, or hands. Interpreters with CTD may experience a variety of symptoms including: pain, joint

  4. INCREASING SCIENTIFIC POWER WITH STATISTICAL POWER

    EPA Science Inventory

    A brief survey of basic ideas in statistical power analysis demonstrates the advantages and ease of using power analysis throughout the design, analysis, and interpretation of research. The power of a statistical test is the probability of rejecting the null hypothesis of the test...
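
    A brief sketch of the kind of prospective power calculation the abstract advocates: the per-group sample size needed to detect a medium standardized effect (Cohen's d = 0.5) with 80% power at alpha = 0.05 in a two-sample t-test. The effect size and error rates are illustrative.

      # Solve for the per-group sample size of a two-sample t-test.
      from statsmodels.stats.power import TTestIndPower

      analysis = TTestIndPower()
      n_per_group = analysis.solve_power(effect_size=0.5, power=0.8, alpha=0.05)
      print(f"~{n_per_group:.0f} subjects per group")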

  5. Screencast Tutorials Enhance Student Learning of Statistics

    ERIC Educational Resources Information Center

    Lloyd, Steven A.; Robertson, Chuck L.

    2012-01-01

    Although the use of computer-assisted instruction has rapidly increased, there is little empirical research evaluating these technologies, specifically within the context of teaching statistics. The authors assessed the effect of screencast tutorials on learning outcomes, including statistical knowledge, application, and interpretation. Students…

  6. Graphs and Statistics: A Resource Handbook.

    ERIC Educational Resources Information Center

    New York State Education Dept., Albany. Bureau of General Education Curriculum Development.

    Graphical representation of statistical data is the focus of this resource handbook. Only graphs which present numerical information are discussed. Activities involving the making, interpreting, and use of various types of graphs and tables are included. Sections are also included which discuss statistical terms, normal distribution and…

  7. Highly concentrated synthesis of copper-zinc-tin-sulfide nanocrystals with easily decomposable capping molecules for printed photovoltaic applications

    NASA Astrophysics Data System (ADS)

    Kim, Youngwoo; Woo, Kyoohee; Kim, Inhyuk; Cho, Yong Soo; Jeong, Sunho; Moon, Jooho

    2013-10-01

    Among various candidate materials, Cu2ZnSnS4 (CZTS) is a promising earth-abundant semiconductor for low-cost thin film solar cells. We report a facile, less toxic, highly concentrated synthetic method utilizing the heretofore unrecognized, easily decomposable capping ligand of triphenylphosphate, where phase-pure, single-crystalline, and well-dispersed colloidal CZTS nanocrystals were obtained. The favorable influence of the easily decomposable capping ligand on the microstructural evolution of device-quality CZTS absorber layers was clarified based on a comparative study with commonly used oleylamine-capped CZTS nanoparticles. The resulting CZTS nanoparticles enabled us to produce a dense and crack-free absorbing layer through annealing under a N2 + H2S (4%) atmosphere, demonstrating a solar cell with an efficiency of 3.6% under AM 1.5 illumination. Electronic supplementary information (ESI) available: Experimental methods for CZTS nanocrystal synthesis, device fabrication, and characterization; the size distribution and energy dispersive X-ray (EDX) spectra of the synthesized CZTS nanoparticles; UV-vis spectra of the CZTS films; isothermal analysis of triphenylphosphate (TPP) and oleylamine (OLA); microstructural SEM images of annealed CZTS nanocrystal films. See DOI: 10.1039/c3nr03104g

  8. Sorting chromatic sextupoles for easily and effectively correcting second order chromaticity in the Relativistic Heavy Ion Collider

    SciTech Connect

    Luo,Y.; Tepikian, S.; Fischer, W.; Robert-Demolaize, G.; Trbojevic, D.

    2009-01-02

    Based on the contributions of the chromatic sextupole families to the half-integer resonance driving terms, we discuss how to sort the chromatic sextupoles in the arcs of the Relativistic Heavy Ion Collider (RHIC) to easily and effectively correct the second order chromaticities. We propose a method with 4 knobs corresponding to 4 pairs of chromatic sextupole families to correct the second order chromaticities online. Numerical simulation justifies this method, showing that it reduces the imbalance in the correction strengths of the sextupole families and avoids the reversal of sextupole polarities. Therefore, this method yields larger dynamic apertures for the proposed RHIC 2009 100 GeV polarized proton run lattices.

  9. Synthesis, characterization and application of water-soluble and easily removable cationic pressure-sensitive adhesives. Quarterly technical report

    SciTech Connect

    1999-09-30

    The Institute studied the adsorption of cationic pressure-sensitive adhesive (PSA) on wood fiber, and the buildup of PSA in a closed water system during paper recycling; the results are presented. Georgia Tech worked to develop an environmentally friendly polymerization process to synthesize a novel re-dispersible PSA by co-polymerizing an oil-soluble monomer (butyl acrylate) and a cationic monomer MAEPTAC; results are presented. At the University of Georgia at Athens the project focused on the synthesis of water-soluble and easily removable cationic polymer PSAs.

  10. Statistical ecology comes of age.

    PubMed

    Gimenez, Olivier; Buckland, Stephen T; Morgan, Byron J T; Bez, Nicolas; Bertrand, Sophie; Choquet, Rémi; Dray, Stéphane; Etienne, Marie-Pierre; Fewster, Rachel; Gosselin, Frédéric; Mérigot, Bastien; Monestiez, Pascal; Morales, Juan M; Mortier, Frédéric; Munoz, François; Ovaskainen, Otso; Pavoine, Sandrine; Pradel, Roger; Schurr, Frank M; Thomas, Len; Thuiller, Wilfried; Trenkel, Verena; de Valpine, Perry; Rexstad, Eric

    2014-12-01

    The desire to predict the consequences of global environmental change has been the driver towards more realistic models embracing the variability and uncertainties inherent in ecology. Statistical ecology has gelled over the past decade as a discipline that moves away from describing patterns towards modelling the ecological processes that generate these patterns. Following the fourth International Statistical Ecology Conference (1-4 July 2014) in Montpellier, France, we analyse current trends in statistical ecology. Important advances in the analysis of individual movement, and in the modelling of population dynamics and species distributions, are made possible by the increasing use of hierarchical and hidden process models. Exciting research perspectives include the development of methods to interpret citizen science data and of efficient, flexible computational algorithms for model fitting. Statistical ecology has come of age: it now provides a general and mathematically rigorous framework linking ecological theory and empirical data. PMID:25540151

  11. Statistical ecology comes of age

    PubMed Central

    Gimenez, Olivier; Buckland, Stephen T.; Morgan, Byron J. T.; Bez, Nicolas; Bertrand, Sophie; Choquet, Rémi; Dray, Stéphane; Etienne, Marie-Pierre; Fewster, Rachel; Gosselin, Frédéric; Mérigot, Bastien; Monestiez, Pascal; Morales, Juan M.; Mortier, Frédéric; Munoz, François; Ovaskainen, Otso; Pavoine, Sandrine; Pradel, Roger; Schurr, Frank M.; Thomas, Len; Thuiller, Wilfried; Trenkel, Verena; de Valpine, Perry; Rexstad, Eric

    2014-01-01

    The desire to predict the consequences of global environmental change has been the driver towards more realistic models embracing the variability and uncertainties inherent in ecology. Statistical ecology has gelled over the past decade as a discipline that moves away from describing patterns towards modelling the ecological processes that generate these patterns. Following the fourth International Statistical Ecology Conference (1–4 July 2014) in Montpellier, France, we analyse current trends in statistical ecology. Important advances in the analysis of individual movement, and in the modelling of population dynamics and species distributions, are made possible by the increasing use of hierarchical and hidden process models. Exciting research perspectives include the development of methods to interpret citizen science data and of efficient, flexible computational algorithms for model fitting. Statistical ecology has come of age: it now provides a general and mathematically rigorous framework linking ecological theory and empirical data. PMID:25540151

  12. Linking numbers, spin, and statistics of solitons

    NASA Technical Reports Server (NTRS)

    Wilczek, F.; Zee, A.

    1983-01-01

    The spin and statistics of solitons in the (2 + 1)- and (3 + 1)-dimensional nonlinear sigma models are considered. For the (2 + 1)-dimensional case, there is the possibility of fractional spin and exotic statistics; for 3 + 1 dimensions, the usual spin-statistics relation is demonstrated. The linking-number interpretation of the Hopf invariant and the use of suspension considerably simplify the analysis.

  13. Development and validation of a quick easily used biochemical assay for evaluating the viability of small immobile arthropods.

    PubMed

    Phillips, Craig B; Iline, Ilia I; Richards, Nicola K; Novoselov, Max; McNeill, Mark R

    2013-10-01

    Quickly, accurately, and easily assessing the efficacy of treatments to control sessile arthropods (e.g., scale insects) and stationary immature life stages (e.g., eggs and pupae) is problematic because it is difficult to tell whether treated organisms are alive or dead. Current approaches usually involve either maintaining organisms in the laboratory to observe them for development, gauging their response to physical stimulation, or assessing morphological characters such as turgidity and color. These can be slow, technically difficult, or subjective, and the validity of methods other than laboratory rearing has seldom been tested. Here, we describe development and validation of a quick easily used biochemical colorimetric assay for measuring the viability of arthropods that is sufficiently sensitive to test even very small organisms such as white fly eggs. The assay was adapted from a technique for staining the enzyme hexokinase to signal the presence of adenosine triphosphate in viable specimens by reducing a tetrazolium salt to formazan. Basic laboratory facilities and skills are required for production of the stain, but no specialist equipment, expertise, or facilities are needed for its use. PMID:24224241

  14. Hemoglobin levels and circulating blasts are two easily evaluable diagnostic parameters highly predictive of leukemic transformation in primary myelofibrosis.

    PubMed

    Rago, Angela; Latagliata, Roberto; Montanaro, Marco; Montefusco, Enrico; Andriani, Alessandro; Crescenzi, Sabrina Leonetti; Mecarocci, Sergio; Spirito, Francesca; Spadea, Antonio; Recine, Umberto; Cicconi, Laura; Avvisati, Giuseppe; Cedrone, Michele; Breccia, Massimo; Porrini, Raffaele; Villivà, Nicoletta; De Gregoris, Cinzia; Alimena, Giuliana; D'Arcangelo, Enzo; Guglielmelli, Paola; Lo-Coco, Francesco; Vannucchi, Alessandro; Cimino, Giuseppe

    2015-03-01

    To predict leukemic transformation (LT), we evaluated easily detectable diagnostic parameters in 338 patients with primary myelofibrosis (PMF) followed in the Latium region (Italy) between 1981 and 2010. Forty patients (11.8%) progressed to leukemia, with a resulting 10-year leukemia-free survival (LFS) rate of 72%. Hb (<10 g/dL) and circulating blasts (≥1%) were the only two independent prognostic factors for LT at the multivariate analysis. Two hundred fifty patients for whom both parameters were available were grouped as follows: low risk (no factor or one factor) = 216 patients; high risk (both factors) = 31 patients. The median LFS times were 269 and 45 months for the low- and high-risk groups, respectively (P<.0001). The LT predictive power of these two parameters was confirmed in an external series of 270 PMF patients from Tuscany, in whom the median LFS was not reached and was 61 months for the low- and high-risk groups, respectively (P<.0001). These results establish anemia and circulating blasts, two easily and universally available parameters, as strong predictors of LT in PMF, and may help to improve the prognostic stratification of these patients, particularly in countries with low resources where more sophisticated molecular testing is unavailable. PMID:25636356
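
    The two-parameter stratification described above reduces to a one-line rule; the sketch below encodes it directly, with thresholds taken from the abstract (Hb < 10 g/dL and circulating blasts >= 1%) and example values that are hypothetical.

      # Assign the risk group used in the abstract: "high risk" only when both
      # factors are present; otherwise "low risk".
      def pmf_lt_risk(hb_g_dl: float, blasts_pct: float) -> str:
          high = hb_g_dl < 10.0 and blasts_pct >= 1.0
          return "high risk" if high else "low risk"

      print(pmf_lt_risk(9.2, 2.0))    # -> high risk
      print(pmf_lt_risk(11.5, 0.0))   # -> low risk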

  15. Semantic interpretation of nominalizations

    SciTech Connect

    Hull, R.D.; Gomez, F.

    1996-12-31

    A computational approach to the semantic interpretation of nominalizations is described. Interpretation of nominalizations involves three tasks: deciding whether the nominalization is being used in a verbal or non-verbal sense; disambiguating the nominalized verb when a verbal sense is used; and determining the fillers of the thematic roles of the verbal concept or predicate of the nominalization. A verbal sense can be recognized by the presence of modifiers that represent the arguments of the verbal concept. It is these same modifiers which provide the semantic clues to disambiguate the nominalized verb. In the absence of explicit modifiers, heuristics are used to discriminate between verbal and non-verbal senses. A correspondence between verbs and their nominalizations is exploited so that only a small amount of additional knowledge is needed to handle the nominal form. These methods are tested in the domain of encyclopedic texts and the results are shown.

  16. Interpreting uncertainty terms.

    PubMed

    Holtgraves, Thomas

    2014-08-01

    Uncertainty terms (e.g., some, possible, good, etc.) are words that do not have a fixed referent and hence are relatively ambiguous. A model is proposed that specifies how, from the hearer's perspective, recognition of facework as a potential motive for the use of an uncertainty term results in a calibration of the intended meaning of that term. Four experiments are reported that examine the impact of face threat, and the variables that affect it (e.g., power), on the manner in which a variety of uncertainty terms (probability terms, quantifiers, frequency terms, etc.) are interpreted. Overall, the results demonstrate that increased face threat in a situation will result in a more negative interpretation of an utterance containing an uncertainty term. That the interpretation of so many different types of uncertainty terms is affected in the same way suggests the operation of a fundamental principle of language use, one with important implications for the communication of risk, subjective experience, and so on. PMID:25090127

  17. Tips for Mental Health Interpretation

    ERIC Educational Resources Information Center

    Whitsett, Margaret

    2008-01-01

    This paper offers tips for working with interpreters in mental health settings. These tips include: (1) Using trained interpreters, not bilingual staff or community members; (2) Explaining "interpreting procedures" to the providers and clients; (3) Addressing the stigma associated with mental health that may influence interpreters; (4) Defining…

  18. Data Interpretation: Using Probability

    ERIC Educational Resources Information Center

    Drummond, Gordon B.; Vowler, Sarah L.

    2011-01-01

    Experimental data are analysed statistically to allow researchers to draw conclusions from a limited set of measurements. The hard fact is that researchers can never be certain that measurements from a sample will exactly reflect the properties of the entire group of possible candidates available to be studied (although using a sample is often the…

  19. Cancer Survival: An Overview of Measures, Uses, and Interpretation

    PubMed Central

    Noone, Anne-Michelle; Howlader, Nadia; Cho, Hyunsoon; Keel, Gretchen E.; Garshell, Jessica; Woloshin, Steven; Schwartz, Lisa M.

    2014-01-01

    Survival statistics are of great interest to patients, clinicians, researchers, and policy makers. Although seemingly simple, survival can be confusing: there are many different survival measures with a plethora of names and statistical methods developed to answer different questions. This paper aims to describe and disseminate different survival measures and their interpretation in less technical language. In addition, we introduce templates to summarize cancer survival statistic organized by their specific purpose: research and policy versus prognosis and clinical decision making. PMID:25417231

  20. FIDEA: a server for the functional interpretation of differential expression analysis

    PubMed Central

    D’Andrea, Daniel; Grassi, Luigi; Mazzapioda, Mariagiovanna; Tramontano, Anna

    2013-01-01

    The results of differential expression analyses provide scientists with hundreds to thousands of differentially expressed genes that need to be interpreted in light of the biology of the specific system under study. This requires mapping the genes to functional classifications that can be, for example, the KEGG pathways or InterPro families they belong to, their GO Molecular Function, Biological Process or Cellular Component. A statistically significant overrepresentation of one or more category terms in the set of differentially expressed genes is an essential step for the interpretation of the biological significance of the results. Ideally, the analysis should be performed by scientists who are well acquainted with the biological problem, as they have a wealth of knowledge about the system and can, more easily than a bioinformatician, discover less obvious and, therefore, more interesting relationships. To allow experimentalists to explore their data in an easy and at the same time exhaustive fashion within a single tool and to test their hypothesis quickly and effortlessly, we developed FIDEA. The FIDEA server is located at http://www.biocomputing.it/fidea; it is free and open to all users, and there is no login requirement. PMID:23754850
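
    A compact sketch of the overrepresentation test that underlies tools of this kind: given N annotated genes, K of which belong to a category, and n differentially expressed genes of which k fall in that category, the enrichment p-value is the hypergeometric tail probability P(X >= k). The counts are illustrative, and this is a generic test rather than FIDEA's exact implementation.

      # Hypergeometric overrepresentation test for one functional category.
      from scipy.stats import hypergeom

      N, K = 20_000, 150     # annotated genes in total, genes in the category (illustrative)
      n, k = 500, 12         # differentially expressed genes, DE genes in the category
      p_value = hypergeom.sf(k - 1, N, K, n)   # P(X >= k)
      print(f"enrichment p-value: {p_value:.2e}")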

  1. LACIE analyst interpretation keys

    NASA Technical Reports Server (NTRS)

    Baron, J. G.; Payne, R. W.; Palmer, W. F. (principal investigators)

    1979-01-01

    Two interpretation aids, 'The Image Analysis Guide for Wheat/Small Grains Inventories' and 'The United States and Canadian Great Plains Regional Keys', were developed during LACIE phase 2 and implemented during phase 3 in order to provide analysts with a better understanding of the expected ranges in color variation of signatures for individual biostages and of the temporal sequences of LANDSAT signatures. The keys were tested using operational LACIE data, and the results demonstrate that their use provides improved labeling accuracy in all analyst experience groupings, in all geographic areas within the U.S. Great Plains, and during all periods of crop development.

  2. Cosmetic Plastic Surgery Statistics

    MedlinePlus

    2014 Cosmetic Plastic Surgery Statistics Cosmetic Procedure Trends 2014 Plastic Surgery Statistics Report Please credit the AMERICAN SOCIETY OF PLASTIC SURGEONS when citing statistical data or using ...

  3. Statistical Modeling of SAR Images: A Survey

    PubMed Central

    Gao, Gui

    2010-01-01

    Statistical modeling is essential to SAR (Synthetic Aperture Radar) image interpretation. It aims to describe SAR images through statistical methods and reveal the characteristics of these images. Moreover, statistical modeling can provide technical support for a comprehensive understanding of terrain scattering mechanisms, which helps to develop algorithms for effective image interpretation and credible image simulation. Numerous statistical models have been developed to describe SAR image data, and the purpose of this paper is to categorize and evaluate these models. We first summarize the development history and the current state of research in statistical modeling, and then discuss in detail the different SAR image models developed from the product model. Relevant issues are also discussed. Finally, several promising directions for future research are outlined. PMID:22315568
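
    As a rough illustration of the product model mentioned above, the following sketch simulates an L-look intensity image as the product of an underlying cross-section (texture) and unit-mean gamma speckle; all parameters are invented:

      # Product (multiplicative) model sketch: intensity = texture * speckle,
      # with unit-mean gamma speckle for an L-look image. Illustrative only.
      import numpy as np

      rng = np.random.default_rng(0)
      L = 4                          # number of looks
      dims = (256, 256)

      texture = rng.gamma(2.0, 0.5, size=dims)       # underlying cross-section
      speckle = rng.gamma(L, 1.0 / L, size=dims)     # unit-mean, L-look speckle
      intensity = texture * speckle                  # observed SAR intensity

      cv = speckle.std() / speckle.mean()            # speckle variability
      print(f"speckle CV = {cv:.3f} (theory 1/sqrt(L) = {1/np.sqrt(L):.3f})")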

  4. Analyzing spike trains with circular statistics

    NASA Astrophysics Data System (ADS)

    Takeshita, Daisuke; Gale, John T.; Montgomery, Erwin B.; Bahar, Sonya; Moss, Frank

    2009-05-01

    In neuroscience, specifically electrophysiology, it is common to replace a measured sequence of action potentials or spike trains with delta functions prior to analysis. We apply a method called circular statistics to a time series of delta functions and show that the method is equivalent to the power spectrum. This technique allows us to easily visualize the idea of the power spectrum of spike trains and easily reveals oscillatory and stochastic behavior. We provide several illustrations of the method and an example suitable for students, and suggest that the method might be useful for courses in introductory biophysics and neuroscience.
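
    A minimal sketch of the idea described above: each spike time is mapped to a phase at a test frequency, and the squared length of the resultant vector, scanned over frequency, reproduces the periodogram of the delta-function train. The spike times below are simulated, not taken from the paper:

      # Circular-statistics view of a spike train: the resultant-vector power at
      # each test frequency equals (up to normalization) the periodogram of the
      # train of delta functions. Spike times are simulated (8 Hz modulation).
      import numpy as np

      rng = np.random.default_rng(1)
      rate, f0, T = 20.0, 8.0, 30.0    # mean rate (Hz), modulation (Hz), duration (s)

      # Inhomogeneous Poisson spikes via thinning of a 2*rate homogeneous train
      t = np.sort(rng.uniform(0.0, T, rng.poisson(2 * rate * T)))
      keep = rng.uniform(size=t.size) < 0.5 * (1 + 0.8 * np.cos(2 * np.pi * f0 * t))
      spikes = t[keep]

      freqs = np.linspace(0.5, 20.0, 400)
      phases = 2 * np.pi * freqs[:, None] * spikes[None, :]
      power = np.abs(np.exp(1j * phases).sum(axis=1)) ** 2 / spikes.size

      print(f"peak near {freqs[np.argmax(power)]:.2f} Hz (expected ~{f0} Hz)")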

  5. Physical interpretation of antigravity

    NASA Astrophysics Data System (ADS)

    Bars, Itzhak; James, Albin

    2016-02-01

    Geodesic incompleteness is a problem in both general relativity and string theory. The Weyl-invariant Standard Model coupled to general relativity (SM +GR ), and a similar treatment of string theory, are improved theories that are geodesically complete. A notable prediction of this approach is that there must be antigravity regions of spacetime connected to gravity regions through gravitational singularities such as those that occur in black holes and cosmological bang/crunch. Antigravity regions introduce apparent problems of ghosts that raise several questions of physical interpretation. It was shown that unitarity is not violated, but there may be an instability associated with negative kinetic energies in the antigravity regions. In this paper we show that the apparent problems can be resolved with the interpretation of the theory from the perspective of observers strictly in the gravity region. Such observers cannot experience the negative kinetic energy in antigravity directly, but can only detect in and out signals that interact with the antigravity region. This is no different from a spacetime black box for which the information about its interior is encoded in scattering amplitudes for in/out states at its exterior. Through examples we show that negative kinetic energy in antigravity presents no problems of principles but is an interesting topic for physical investigations of fundamental significance.

  6. Structural interpretation of seismic data and inherent uncertainties

    NASA Astrophysics Data System (ADS)

    Bond, Clare

    2013-04-01

    Geoscience is perhaps unique in its reliance on incomplete datasets and on building knowledge from their interpretation. This interpretive basis for the science is fundamental at all levels, from creation of a geological map to interpretation of remotely sensed data. To teach and better understand the uncertainties in dealing with incomplete data, we need to understand the strategies that make individual practitioners effective interpreters. The nature of interpretation is such that the interpreter needs to use their cognitive ability in the analysis of the data to propose a sensible solution that is consistent not only with the original data but also with other knowledge and understanding. In a series of experiments, Bond et al. (2007, 2008, 2011, 2012) investigated the strategies and pitfalls of expert and non-expert interpretation of seismic images. These studies used large numbers of participants to provide a statistically sound basis for analysis of the results. The outcome of these experiments showed that a wide variety of conceptual models were applied to single seismic datasets, highlighting not only spatial variations in fault placement but also disagreement over whether faults existed at all and over their sense of movement. Further, statistical analysis suggests that the strategies an interpreter employs are more important than expert knowledge per se in developing successful interpretations; experts are successful because of their application of these techniques. A new set of experiments focuses on a small number of experts to determine how they use their cognitive and reasoning skills in the interpretation of 2D seismic profiles. Live video and practitioner commentary were used to track the evolving interpretation and to gain insight into their decision processes. The outputs of the study allow us to create an educational resource of expert interpretation through online video footage and commentary, with associated further interpretation and analysis of the techniques and strategies employed. This resource will be of use to undergraduate, postgraduate, industry and academic professionals seeking to improve their seismic interpretation skills, develop reasoning strategies for dealing with incomplete datasets, and assess the uncertainty in these interpretations. Bond, C.E. et al. (2012). 'What makes an expert effective at interpreting seismic images?' Geology, 40, 75-78. Bond, C. E. et al. (2011). 'When there isn't a right answer: interpretation and reasoning, key skills for 21st century geoscience'. International Journal of Science Education, 33, 629-652. Bond, C. E. et al. (2008). 'Structural models: Optimizing risk analysis by understanding conceptual uncertainty'. First Break, 26, 65-71. Bond, C. E. et al., (2007). 'What do you think this is?: "Conceptual uncertainty" in geoscience interpretation'. GSA Today, 17, 4-10.

  7. Model averaging methods to merge operational statistical and dynamic seasonal streamflow forecasts in Australia

    NASA Astrophysics Data System (ADS)

    Schepen, Andrew; Wang, Q. J.

    2015-03-01

    The Australian Bureau of Meteorology produces statistical and dynamic seasonal streamflow forecasts. The statistical and dynamic forecasts are similarly reliable in ensemble spread; however, skill varies by catchment and season. Therefore, it may be possible to optimize forecasting skill by weighting and merging statistical and dynamic forecasts. Two model averaging methods are evaluated for merging forecasts for 12 locations. The first method, Bayesian model averaging (BMA), applies averaging to forecast probability densities (and thus cumulative probabilities) for a given forecast variable value. The second method, quantile model averaging (QMA), applies averaging to forecast variable values (quantiles) for a given cumulative probability (quantile fraction). BMA and QMA are found to perform similarly in terms of overall skill scores and reliability in ensemble spread. Both methods improve forecast skill across catchments and seasons. However, when both the statistical and dynamical forecasting approaches are skillful but produce, on special occasions, very different event forecasts, the BMA merged forecasts for these events can have unusually wide and bimodal distributions. In contrast, the distributions of the QMA merged forecasts for these events are narrower, unimodal and generally more smoothly shaped, and are potentially more easily communicated to and interpreted by the forecast users. Such special occasions are found to be rare. However, every forecast counts in an operational service, and therefore the occasional contrast in merged forecasts between the two methods may be more significant than the indifference shown by the overall skill and reliability performance.
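
    A minimal sketch of the two averaging operations contrasted above, using two illustrative normal forecast densities with deliberately different means; the weights, means and spreads are invented and the code is not the Bureau's implementation:

      # BMA averages forecast probabilities; QMA averages forecast quantiles.
      # Two deliberately different normal forecasts illustrate the contrast.
      import numpy as np
      from scipy.stats import norm

      w = np.array([0.5, 0.5])            # model weights (illustrative)
      mu = np.array([40.0, 120.0])        # very different event forecasts
      sd = np.array([15.0, 20.0])

      p = np.linspace(0.01, 0.99, 99)     # quantile fractions

      # BMA: average the CDFs, then invert numerically (mixture: can be bimodal)
      x = np.linspace(-50.0, 250.0, 2001)
      bma_cdf = w[0] * norm.cdf(x, mu[0], sd[0]) + w[1] * norm.cdf(x, mu[1], sd[1])
      bma_q = np.interp(p, bma_cdf, x)

      # QMA: average the quantiles at each quantile fraction (stays unimodal)
      qma_q = w[0] * norm.ppf(p, mu[0], sd[0]) + w[1] * norm.ppf(p, mu[1], sd[1])

      print("BMA 10-90% range:", np.round(bma_q[[9, 89]], 1))
      print("QMA 10-90% range:", np.round(qma_q[[9, 89]], 1))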

  8. Monitoring and interpreting bioremediation effectiveness

    SciTech Connect

    Bragg, J.R.; Prince, R.C.; Harner, J.; Atlas, R.M.

    1993-12-31

    Following the Exxon Valdez oil spill in 1989, extensive research was conducted by the US Environmental Protection Agency and Exxon to develop and implement bioremediation techniques for oil spill cleanup. A key challenge of this program was to develop effective methods for monitoring and interpreting bioremediation effectiveness on extremely heterogeneous intertidal shorelines. Fertilizers were applied to shorelines at concentrations known to be safe, and the effectiveness achieved in accelerating biodegradation of oil residues was measured using several techniques. This paper describes the most definitive method identified, which monitors biodegradation loss by measuring changes in ratios of hydrocarbons to hopane, a cycloalkane present in the oil that showed no measurable degradation. Rates of loss measured by the hopane ratio method have high levels of statistical confidence, and show that the fertilizer addition stimulated biodegradation rates as much as fivefold. Multiple regression analyses of the data show that the concentration of fertilizer nitrogen in interstitial pore water per unit of oil load was the most important parameter affecting biodegradation rate, and the results suggest that monitoring nitrogen concentrations in the subsurface pore water is the preferred technique for determining fertilizer dosage and reapplication frequency.
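
    The hopane-ratio calculation described above amounts to normalizing each analyte to a conserved biomarker and fitting a first-order loss rate to the ratio. A minimal sketch with invented concentrations, not the study's data:

      # Hopane normalization removes dilution/physical-washout effects, so a
      # first-order biodegradation rate can be fitted to the analyte/hopane ratio.
      # All concentrations are invented for illustration.
      import numpy as np

      days = np.array([0.0, 14.0, 28.0, 56.0, 90.0])
      analyte = np.array([120.0, 70.0, 45.0, 21.0, 9.0])    # e.g. ug analyte / g sediment
      hopane = np.array([1.00, 0.95, 1.05, 0.98, 1.02])     # conserved biomarker

      ratio = analyte / hopane
      slope, intercept = np.polyfit(days, np.log(ratio), 1)  # ln(ratio) = a - k*t
      k = -slope

      print(f"first-order loss rate k = {k:.4f} /day, half-life = {np.log(2)/k:.1f} days")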

  9. Statistical analysis of arthroplasty data

    PubMed Central

    2011-01-01

    It is envisaged that guidelines for statistical analysis and presentation of results will improve the quality and value of research. The Nordic Arthroplasty Register Association (NARA) has therefore developed guidelines for the statistical analysis of arthroplasty register data. The guidelines are divided into two parts, one with an introduction and a discussion of the background to the guidelines (Ranstam et al. 2011a, see pages x-y in this issue), and this one with a more technical statistical discussion on how specific problems can be handled. This second part contains (1) recommendations for the interpretation of methods used to calculate survival, (2) recommendations on how to deal with bilateral observations, and (3) a discussion of problems and pitfalls associated with analysis of factors that influence survival or comparisons between outcomes extracted from different hospitals. PMID:21619500
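
    Implant survival of the kind the guidelines address is usually reported with a Kaplan-Meier (product-limit) estimate. A minimal sketch of that calculation with invented revision times and censoring flags, not the register's data or the guideline text:

      # Kaplan-Meier product-limit estimate of implant survival.
      # time = years to revision or last follow-up; event = 1 if revised, 0 if censored.
      import numpy as np

      time = np.array([1.2, 2.5, 2.5, 3.1, 4.0, 5.5, 6.0, 6.0, 7.2, 8.0])
      event = np.array([1, 0, 1, 1, 0, 1, 0, 1, 0, 0])

      surv = 1.0
      for t in np.unique(time[event == 1]):
          at_risk = np.sum(time >= t)
          failures = np.sum((time == t) & (event == 1))
          surv *= 1.0 - failures / at_risk
          print(f"t = {t:.1f} y   S(t) = {surv:.3f}")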

  10. Easily accessible polymer additives for tuning the crystal-growth of perovskite thin-films for highly efficient solar cells

    NASA Astrophysics Data System (ADS)

    Dong, Qingqing; Wang, Zhaowei; Zhang, Kaicheng; Yu, Hao; Huang, Peng; Liu, Xiaodong; Zhou, Yi; Chen, Ning; Song, Bo

    2016-03-01

    For perovskite solar cells (Pero-SCs), one of the key issues with respect to the power conversion efficiency (PCE) is the morphology control of the perovskite thin-films. In this study, an easily-accessible additive polyethylenimine (PEI) is utilized to tune the morphology of CH3NH3PbI3-xClx. With addition of 1.00 wt% of PEI, the smoothness and crystallinity of the perovskite were greatly improved, which were characterized by scanning electron microscopy (SEM) and X-ray diffraction (XRD). A summit PCE of 14.07% was achieved for the p-i-n type Pero-SC, indicating a 26% increase compared to those of the devices without the additive. Both photoluminescence (PL) and alternating current impedance spectroscopy (ACIS) analyses confirm the efficiency results after the addition of PEI. This study provides a low-cost polymer additive candidate for tuning the morphology of perovskite thin-films, and might be a new clue for the mass production of Pero-SCs. Electronic supplementary information (ESI) available: J-V curves & characteristics of Pero-SCs, UV-vis spectra and AFM images. See DOI: 10.1039/c6nr00206d

  11. A synthetic interpretation: the double-preparation theory

    NASA Astrophysics Data System (ADS)

    Gondran, Michel; Gondran, Alexandre

    2014-12-01

    In the 1927 Solvay conference, three apparently irreconcilable interpretations of the quantum mechanics wave function were presented: the pilot-wave interpretation by de Broglie, the soliton wave interpretation by Schrödinger and the Born statistical rule by Born and Heisenberg. In this paper, we demonstrate the complementarity of these interpretations corresponding to quantum systems that are prepared differently and we deduce a synthetic interpretation: the double-preparation theory. We first introduce in quantum mechanics the concept of semi-classical statistically prepared particles, and we show that in the Schrödinger equation these particles converge, when h → 0, to the equations of a statistical set of classical particles. These classical particles are undiscerned, and if we assume continuity between classical mechanics and quantum mechanics, we conclude the necessity of the de Broglie-Bohm interpretation for the semi-classical statistically prepared particles (statistical wave). We then introduce in quantum mechanics the concept of a semi-classical deterministically prepared particle, and we show that in the Schrödinger equation this particle converges, when h → 0, to the equations of a single classical particle. This classical particle is discerned and assuming continuity between classical mechanics and quantum mechanics, we conclude the necessity of the Schrödinger interpretation for the semi-classical deterministically prepared particle (the soliton wave). Finally we propose, in the semi-classical approximation, a new interpretation of quantum mechanics, the ‘theory of the double preparation’, which depends on the preparation of the particles.

  12. Enhancing the Teaching of Statistics: Portfolio Theory, an Application of Statistics in Finance

    ERIC Educational Resources Information Center

    Christou, Nicolas

    2008-01-01

    In this paper we present an application of statistics using real stock market data. Most, if not all, students have some familiarity with the stock market (or at least they have heard about it) and therefore can understand the problem easily. It is the real data analysis that students find interesting. Here we explore the building of efficient

  13. Enhancing the Teaching of Statistics: Portfolio Theory, an Application of Statistics in Finance

    ERIC Educational Resources Information Center

    Christou, Nicolas

    2008-01-01

    In this paper we present an application of statistics using real stock market data. Most, if not all, students have some familiarity with the stock market (or at least they have heard about it) and therefore can understand the problem easily. It is the real data analysis that students find interesting. Here we explore the building of efficient…

  14. Reverse Causation and the Transactional Interpretation of Quantum Mechanics

    NASA Astrophysics Data System (ADS)

    Cramer, John G.

    2006-10-01

    In the first part of the paper we present the transactional interpretation of quantum mechanics, a method of viewing the formalism of quantum mechanics that provides a way of visualizing quantum events and experiments. In the second part, we present an EPR gedankenexperiment that appears to lead to observer-level reverse causation. A transactional analysis of the experiment is presented. It easily accounts for the reported observations but does not reveal any barriers to its modification for reverse causation.

  15. Improve MWD data interpretation

    SciTech Connect

    Santley, D.J.; Ardrey, W.E.

    1987-01-01

    This article reports that measurement-while-drilling (MWD) technology is being used today in a broad range of real-time drilling applications. In its infancy, MWD was limited to providing directional survey and steering information. Today, the addition of formation sensors (resistivity, gamma) and drilling efficiency sensors (WOB, torque) has made MWD a much more useful drilling decision tool. In the process, the desirability of combining downhole MWD data with powerful analytical software and interpretive techniques has been recognized by both operators and service companies. However, the usual form in which MWD and wellsite analytical capabilities are combined leaves much to be desired. The most common approach is to incorporate MWD with large-scale computerized mud logging (CML) systems. Essentially, MWD decoding and display equipment is added to existing full-blown CML surface units.

  16. Impact of Statistical Learning Methods on the Predictive Power of Multivariate Normal Tissue Complication Probability Models

    SciTech Connect

    Xu Chengjian; Schaaf, Arjen van der; Schilstra, Cornelis; Langendijk, Johannes A.; Veld, Aart A. van't

    2012-03-15

    Purpose: To study the impact of different statistical learning methods on the prediction performance of multivariate normal tissue complication probability (NTCP) models. Methods and Materials: In this study, three learning methods, stepwise selection, least absolute shrinkage and selection operator (LASSO), and Bayesian model averaging (BMA), were used to build NTCP models of xerostomia following radiotherapy treatment for head and neck cancer. Performance of each learning method was evaluated by a repeated cross-validation scheme in order to obtain a fair comparison among methods. Results: It was found that the LASSO and BMA methods produced models with significantly better predictive power than that of the stepwise selection method. Furthermore, the LASSO method yields an easily interpretable model as the stepwise method does, in contrast to the less intuitive BMA method. Conclusions: The commonly used stepwise selection method, which is simple to execute, may be insufficient for NTCP modeling. The LASSO method is recommended.
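
    A minimal sketch contrasting L1-penalised (LASSO) logistic regression with greedy forward stepwise selection on synthetic predictors of a binary complication; the data, feature count and selection criterion are invented and are not the study's:

      # LASSO vs. forward stepwise selection for a binary (NTCP-like) outcome.
      # Synthetic data: only the first two of twelve predictors are informative.
      import numpy as np
      from sklearn.linear_model import LogisticRegression, LogisticRegressionCV
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(0)
      n, p = 300, 12
      X = rng.normal(size=(n, p))
      logit = 1.2 * X[:, 0] + 0.8 * X[:, 1] - 1.0
      y = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

      # LASSO: L1-penalised logistic regression, penalty chosen by cross-validation
      lasso = LogisticRegressionCV(penalty="l1", solver="liblinear", Cs=10, cv=5).fit(X, y)
      print("LASSO keeps features:", np.flatnonzero(lasso.coef_[0] != 0))

      # Greedy forward stepwise selection by cross-validated accuracy
      selected, remaining = [], list(range(p))
      for _ in range(3):
          scores = [cross_val_score(LogisticRegression(max_iter=1000),
                                    X[:, selected + [j]], y, cv=5).mean()
                    for j in remaining]
          best = remaining[int(np.argmax(scores))]
          selected.append(best)
          remaining.remove(best)
      print("stepwise picks (in order):", selected)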

  17. Interpretation and application of borehole televiewer surveys

    SciTech Connect

    Taylor, T.J.

    1983-01-01

    A borehole televiewer log is comparable to a picture of a continuous core and may yield even more information, since it is a picture of the core's host environment; i.e., the inside of the borehole as it exists in the subsurface. Important relationships are preserved which can be lost when cores are brought to the surface. Fractures, bedding planes, vugs, and lithology changes are identifiable on borehole televiewer logs. The travel time of the signal from the sonde to the borehole wall and back to the sonde has recently been used to form a second log: the transit time log. Interpretation problems due to noncircular boreholes and eccentered logging sondes are easily overcome using the combination of amplitude and transit time logs. Examples are given to demonstrate potential use.

  18. Easily accessible polymer additives for tuning the crystal-growth of perovskite thin-films for highly efficient solar cells.

    PubMed

    Dong, Qingqing; Wang, Zhaowei; Zhang, Kaicheng; Yu, Hao; Huang, Peng; Liu, Xiaodong; Zhou, Yi; Chen, Ning; Song, Bo

    2016-03-01

    For perovskite solar cells (Pero-SCs), one of the key issues with respect to the power conversion efficiency (PCE) is the morphology control of the perovskite thin-films. In this study, an easily-accessible additive polyethylenimine (PEI) is utilized to tune the morphology of CH3NH3PbI3-xClx. With addition of 1.00 wt% of PEI, the smoothness and crystallinity of the perovskite were greatly improved, which were characterized by scanning electron microscopy (SEM) and X-ray diffraction (XRD). A summit PCE of 14.07% was achieved for the p-i-n type Pero-SC, indicating a 26% increase compared to those of the devices without the additive. Both photoluminescence (PL) and alternating current impedance spectroscopy (ACIS) analyses confirm the efficiency results after the addition of PEI. This study provides a low-cost polymer additive candidate for tuning the morphology of perovskite thin-films, and might be a new clue for the mass production of Pero-SCs. PMID:26887633

  19. Easily implementable field programmable gate array-based adaptive optics system with state-space multichannel control.

    PubMed

    Chang, Chia-Yuan; Ke, Bo-Ting; Su, Hung-Wei; Yen, Wei-Chung; Chen, Shean-Jen

    2013-09-01

    In this paper, an easily implementable adaptive optics system (AOS) based on a real-time field programmable gate array (FPGA) platform with state-space multichannel control programmed by LabVIEW has been developed, and also integrated into a laser focusing system successfully. To meet the requirements of simple programming configuration and easy integration with other devices, the FPGA-based AOS introduces a standard operation procedure including AOS identification, computation, and operation. The overall system with a 32-channel driving signal for a deformable mirror (DM) as input and a Zernike polynomial via a lab-made Shack-Hartmann wavefront sensor (SHWS) as output is optimally identified to construct a multichannel state-space model off-line. In real-time operation, the FPGA platform first calculates the Zernike polynomial of the optical wavefront measured from the SHWS as the feedback signal. Then, a state-space multichannel controller according to the feedback signal and the identified model is designed and implemented in the FPGA to drive the DM for phase distortion compensation. The current FPGA-based AOS is capable of suppressing low-frequency thermal disturbances with a steady-state phase error of less than 0.1 π within less than 10 time steps when the control loop is operated at a frequency of 30 Hz. PMID:24089871
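
    A generic sketch of the closed-loop idea behind such a system: an identified linear map from deformable-mirror commands to Zernike coefficients is pseudo-inverted and used in a discrete integrator loop. This is a textbook AO loop with invented matrices, not the paper's FPGA state-space controller:

      # Generic adaptive-optics loop: invert an identified influence matrix and
      # integrate the correction. Matrices and the disturbance are invented.
      import numpy as np

      rng = np.random.default_rng(0)
      n_act, n_zern = 32, 14
      G = rng.normal(size=(n_zern, n_act))       # DM commands -> Zernike coefficients
      G_pinv = np.linalg.pinv(G)

      gain = 0.5                                 # integrator gain
      u = np.zeros(n_act)                        # DM command vector
      disturbance = rng.normal(size=n_zern)      # static aberration to correct

      for step in range(10):
          z = G @ u + disturbance                # wavefront measurement (Zernike terms)
          u -= gain * (G_pinv @ z)               # integrate the correction
          print(f"step {step:2d}  residual rms = {np.linalg.norm(z) / np.sqrt(n_zern):.4f}")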

  20. Changing Preservice Science Teachers' Views of Nature of Science: Why Some Conceptions May be More Easily Altered than Others

    NASA Astrophysics Data System (ADS)

    Mesci, Gunkut; Schwartz, Renee' S.

    2016-02-01

    The purpose of this study was to assess preservice teachers' views of Nature of Science (NOS), identify aspects that were challenging for conceptual change, and explore reasons why. This study particularly focused on why and how some concepts of NOS may be more easily altered than others. Fourteen preservice science teachers enrolled in a NOS and Science Inquiry course participated in this study. Data were collected by using a pre/post format with the Views of Nature of Science questionnaire (VNOS-270), the Views of Scientific Inquiry questionnaire (VOSI-270), follow-up interviews, and classroom artifacts. The results indicated that most students initially held naïve views about certain aspects of NOS like tentativeness and subjectivity. By the end of the semester, almost all students dramatically improved their understanding about almost all aspects of NOS. However, several students still struggled with certain aspects like the differences between scientific theory and law, tentativeness, and socio-cultural embeddedness. Results suggested that instructional, motivational, and socio-cultural factors may influence if and how students changed their views about targeted NOS aspects. Students thought that classroom activities, discussions, and readings were most helpful to improve their views about NOS. The findings from the research have the potential to translate as practical advice for teachers, science educators, and future researchers.

  1. Impact of an easily reducible disulfide bond on the oxidative folding rate of multi-disulfide-containing proteins.

    PubMed

    Leung, H J; Xu, G; Narayan, M; Scheraga, H A

    2005-01-01

    The burial of native disulfide bonds, formed within stable structure in the regeneration of multi-disulfide-containing proteins from their fully reduced states, is a key step in the folding process, as the burial greatly accelerates the oxidative folding rate of the protein by sequestering the native disulfide bonds from thiol-disulfide exchange reactions. Nevertheless, several proteins retain solvent-exposed disulfide bonds in their native structures. Here, we have examined the impact of an easily reducible native disulfide bond on the oxidative folding rate of a protein. Our studies reveal that the susceptibility of the (40-95) disulfide bond of Y92G bovine pancreatic ribonuclease A (RNase A) to reduction results in a reduced rate of oxidative regeneration, compared with wild-type RNase A. In the native state of RNase A, Tyr 92 lies atop its (40-95) disulfide bond, effectively shielding this bond from the reducing agent, thereby promoting protein oxidative regeneration. Our work sheds light on the unique contribution of a local structural element in promoting the oxidative folding of a multi-disulfide-containing protein. PMID:15686534

  2. A History of Oral Interpretation.

    ERIC Educational Resources Information Center

    Bahn, Eugene; Bahn, Margaret L.

    This historical account of the oral interpretation of literature establishes a chain of events comprehending 25 centuries of verbal tradition from the Homeric Age through 20th Century America. It deals in each era with the viewpoints and contributions of major historical figures to oral interpretation, as well as with oral interpretation's…

  3. Rapid Quantitative Analysis of Microcystins in Raw Surface Waters with MALDI MS Utilizing Easily Synthesized Internal Standards

    PubMed Central

    Roegner, Amber F.; Schirmer, Macarena Pírez; Puschner, Birgit; Brena, Beatriz; Gonzalez-Sapienza, Gualberto

    2014-01-01

    The freshwater cyanotoxins, microcystins (MCs), pose a global public health threat as potent hepatotoxins in cyanobacterial blooms; their persistence in drinking and recreational water has been associated with potential chronic effects in addition to acute intoxications. Rapid and accurate detection of the over 80 structural congeners is challenged by the rigorous and time-consuming cleanup required to overcome interference found in raw water samples. MALDI-MS has shown promise for rapid quantification of individual congeners in raw water samples, with very low operating cost, but so far limited sensitivity and the lack of available and versatile internal standards (ISs) have limited its use. Two easily synthesized S-hydroxyethyl–Cys(7)-MC-LR and –RR ISs were used to generate linear standard curves in a reflectron MALDI instrument, reproducible across several orders of magnitude for MC-LR, -RR and -YR. Minimum quantification limits in direct water samples, with no cleanup or concentration step involved, were consistently below 7 μg/L, with recoveries from spiked samples between 80 and 119%. This method improves sensitivity by 30-fold over previous reports of quantitative MALDI-TOF applications to MCs and provides a salient option for rapid-throughput analysis of multiple MC congeners in untreated raw surface water blooms, as a means to identify public health threats at their source and to target intervention strategies within a watershed. As demonstrated by analysis of a set of samples from Uruguay, by utilizing the reaction of different MC congeners with alternate sulfhydryl compounds, the m/z of the IS can be customized to avoid overlap with interfering compounds in local surface water samples. PMID:24388801
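
    Quantification against an internal standard of this kind reduces to regressing the analyte/IS intensity ratio on spiked concentration and inverting the fit for unknowns. A minimal sketch with invented numbers, not the paper's calibration data:

      # Internal-standard calibration: fit ratio = a*conc + b, then invert for an
      # unknown sample. Intensity ratios and concentrations are invented.
      import numpy as np

      conc = np.array([0.5, 1.0, 2.0, 5.0, 10.0, 20.0])        # ug/L standards
      ratio = np.array([0.11, 0.21, 0.40, 1.02, 2.05, 3.95])   # analyte / IS intensity

      slope, intercept = np.polyfit(conc, ratio, 1)
      unknown_ratio = 0.65                                      # measured in a raw sample
      estimate = (unknown_ratio - intercept) / slope
      print(f"estimated concentration = {estimate:.2f} ug/L")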

  4. Transport of sewage molecular markers through saturated soil column and effect of easily biodegradable primary substrate on their removal.

    PubMed

    Foolad, Mahsa; Ong, Say Leong; Hu, Jiangyong

    2015-11-01

    Pharmaceutical and personal care products (PPCPs) and artificial sweeteners (ASs) are emerging organic contaminants (EOCs) in the aquatic environment. The presence of PPCPs and ASs in water bodies poses a potential ecological risk and health concern. It is therefore necessary to detect pollution sources by understanding the transport behavior of sewage molecular markers in the subsurface. The aim of this study was to evaluate the transport of nine selected molecular markers through saturated soil column experiments. The selected sewage molecular markers in this study were six PPCPs, including acetaminophen (ACT), carbamazepine (CBZ), caffeine (CF), crotamiton (CTMT), diethyltoluamide (DEET) and salicylic acid (SA), and three ASs, including acesulfame (ACF), cyclamate (CYC) and saccharin (SAC). Results confirmed that ACF, CBZ, CTMT, CYC and SAC are suitable for use as sewage molecular markers, since they were almost stable against sorption and biodegradation during the soil column experiments. In contrast, transport of ACT, CF and DEET was limited by both sorption and biodegradation, and 100% removal efficiency was achieved in the biotic column. Moreover, this study also examined the effect of different acetate concentrations (0-100 mg/L), as an easily biodegradable primary substrate, on the removal of PPCPs and ASs. Results showed a negative correlation (r(2)>0.75) between acetate concentration and the removal of several of the selected sewage chemical markers, including ACF, CF, ACT, CYC and SAC. CTMT removal also decreased with the addition of acetate, but increasing the acetate concentration further did not affect its removal. CBZ and DEET removal did not depend on the presence of acetate. PMID:26210019

  5. Input of easily available organic C and N stimulates microbial decomposition of soil organic matter in arctic permafrost soil

    PubMed Central

    Wild, Birgit; Schnecker, Jörg; Alves, Ricardo J. Eloy; Barsukov, Pavel; Bárta, Jiří; Čapek, Petr; Gentsch, Norman; Gittel, Antje; Guggenberger, Georg; Lashchinskiy, Nikolay; Mikutta, Robert; Rusalimova, Olga; Šantrůčková, Hana; Shibistova, Olga; Urich, Tim; Watzka, Margarete; Zrazhevskaya, Galina; Richter, Andreas

    2014-01-01

    Rising temperatures in the Arctic can affect soil organic matter (SOM) decomposition directly and indirectly, by increasing plant primary production and thus the allocation of plant-derived organic compounds into the soil. Such compounds, for example root exudates or decaying fine roots, are easily available for microorganisms, and can alter the decomposition of older SOM (“priming effect”). We here report on a SOM priming experiment in the active layer of a permafrost soil from the central Siberian Arctic, comparing responses of organic topsoil, mineral subsoil, and cryoturbated subsoil material (i.e., poorly decomposed topsoil material subducted into the subsoil by freeze–thaw processes) to additions of 13C-labeled glucose, cellulose, a mixture of amino acids, and protein (added at levels corresponding to approximately 1% of soil organic carbon). SOM decomposition in the topsoil was barely affected by higher availability of organic compounds, whereas SOM decomposition in both subsoil horizons responded strongly. In the mineral subsoil, SOM decomposition increased by a factor of two to three after any substrate addition (glucose, cellulose, amino acids, protein), suggesting that the microbial decomposer community was limited in energy to break down more complex components of SOM. In the cryoturbated horizon, SOM decomposition increased by a factor of two after addition of amino acids or protein, but was not significantly affected by glucose or cellulose, indicating nitrogen rather than energy limitation. Since the stimulation of SOM decomposition in cryoturbated material was not connected to microbial growth or to a change in microbial community composition, the additional nitrogen was likely invested in the production of extracellular enzymes required for SOM decomposition. Our findings provide a first mechanistic understanding of priming in permafrost soils and suggest that an increase in the availability of organic carbon or nitrogen, e.g., by increased plant productivity, can change the decomposition of SOM stored in deeper layers of permafrost soils, with possible repercussions on the global climate. PMID:25089062

  6. Input of easily available organic C and N stimulates microbial decomposition of soil organic matter in arctic permafrost soil.

    PubMed

    Wild, Birgit; Schnecker, Jörg; Alves, Ricardo J Eloy; Barsukov, Pavel; Bárta, Jiří; Capek, Petr; Gentsch, Norman; Gittel, Antje; Guggenberger, Georg; Lashchinskiy, Nikolay; Mikutta, Robert; Rusalimova, Olga; Santrůčková, Hana; Shibistova, Olga; Urich, Tim; Watzka, Margarete; Zrazhevskaya, Galina; Richter, Andreas

    2014-08-01

    Rising temperatures in the Arctic can affect soil organic matter (SOM) decomposition directly and indirectly, by increasing plant primary production and thus the allocation of plant-derived organic compounds into the soil. Such compounds, for example root exudates or decaying fine roots, are easily available for microorganisms, and can alter the decomposition of older SOM ("priming effect"). We here report on a SOM priming experiment in the active layer of a permafrost soil from the central Siberian Arctic, comparing responses of organic topsoil, mineral subsoil, and cryoturbated subsoil material (i.e., poorly decomposed topsoil material subducted into the subsoil by freeze-thaw processes) to additions of (13)C-labeled glucose, cellulose, a mixture of amino acids, and protein (added at levels corresponding to approximately 1% of soil organic carbon). SOM decomposition in the topsoil was barely affected by higher availability of organic compounds, whereas SOM decomposition in both subsoil horizons responded strongly. In the mineral subsoil, SOM decomposition increased by a factor of two to three after any substrate addition (glucose, cellulose, amino acids, protein), suggesting that the microbial decomposer community was limited in energy to break down more complex components of SOM. In the cryoturbated horizon, SOM decomposition increased by a factor of two after addition of amino acids or protein, but was not significantly affected by glucose or cellulose, indicating nitrogen rather than energy limitation. Since the stimulation of SOM decomposition in cryoturbated material was not connected to microbial growth or to a change in microbial community composition, the additional nitrogen was likely invested in the production of extracellular enzymes required for SOM decomposition. Our findings provide a first mechanistic understanding of priming in permafrost soils and suggest that an increase in the availability of organic carbon or nitrogen, e.g., by increased plant productivity, can change the decomposition of SOM stored in deeper layers of permafrost soils, with possible repercussions on the global climate. PMID:25089062

  7. Appropriate use of medical interpreters.

    PubMed

    Juckett, Gregory; Unger, Kendra

    2014-10-01

    More than 25 million Americans speak English "less than very well," according to the U.S. Census Bureau. This population is less able to access health care and is at higher risk of adverse outcomes such as drug complications and decreased patient satisfaction. Title VI of the Civil Rights Act mandates that interpreter services be provided for patients with limited English proficiency who need this service, despite the lack of reimbursement in most states. Professional interpreters are superior to the usual practice of using ad hoc interpreters (i.e., family, friends, or untrained staff). Untrained interpreters are more likely to make errors, violate confidentiality, and increase the risk of poor outcomes. Children should never be used as interpreters except in emergencies. When using an interpreter, the clinician should address the patient directly and seat the interpreter next to or slightly behind the patient. Statements should be short, and the discussion should be limited to three major points. In addition to acting as a conduit for the discussion, the interpreter may serve as a cultural liaison between the physician and patient. When a bilingual clinician or a professional interpreter is not available, phone interpretation services or trained bilingual staff members are reasonable alternatives. The use of professional interpreters (in person or via telephone) increases patient satisfaction, improves adherence and outcomes, and reduces adverse events, thus limiting malpractice risk. PMID:25369625

  8. Predict! Teaching Statistics Using Informal Statistical Inference

    ERIC Educational Resources Information Center

    Makar, Katie

    2013-01-01

    Statistics is one of the most widely used topics for everyday life in the school mathematics curriculum. Unfortunately, the statistics taught in schools focuses on calculations and procedures before students have a chance to see it as a useful and powerful tool. Researchers have found that a dominant view of statistics is as an assortment of tools…

  9. Statistics Poker: Reinforcing Basic Statistical Concepts

    ERIC Educational Resources Information Center

    Leech, Nancy L.

    2008-01-01

    Learning basic statistical concepts does not need to be tedious or dry; it can be fun and interesting through cooperative learning in the small-group activity of Statistics Poker. This article describes a teaching approach for reinforcing basic statistical concepts that can help students who have high anxiety and makes learning and reinforcing…

  10. Statistics Poker: Reinforcing Basic Statistical Concepts

    ERIC Educational Resources Information Center

    Leech, Nancy L.

    2008-01-01

    Learning basic statistical concepts does not need to be tedious or dry; it can be fun and interesting through cooperative learning in the small-group activity of Statistics Poker. This article describes a teaching approach for reinforcing basic statistical concepts that can help students who have high anxiety and makes learning and reinforcing

  11. Neurological imaging: statistics behind the pictures

    PubMed Central

    Dinov, Ivo D

    2011-01-01

    Neurological imaging represents a powerful paradigm for investigation of brain structure, physiology and function across different scales. The diverse phenotypes and significant normal and pathological brain variability demand reliable and efficient statistical methodologies to model, analyze and interpret raw neurological images and derived geometric information from these images. The validity, reproducibility and power of any statistical brain map require appropriate inference on large cohorts, significant community validation, and multidisciplinary collaborations between physicians, engineers and statisticians. PMID:22180753

  12. Applications of Statistical Tests in Hand Surgery

    PubMed Central

    Song, Jae W.; Haas, Ann; Chung, Kevin C.

    2015-01-01

    During the nineteenth century, with the emergence of public health as a goal to improve hygiene and conditions of the poor, statistics established itself as a distinct scientific field important for critically interpreting studies of public health concerns. During the twentieth century, statistics began to evolve mathematically and methodologically with hypothesis testing and experimental design. Today, the design of medical experiments centers around clinical trials and observational studies, and with the use of statistics, the collected data are summarized, weighed, and presented to direct both physicians and the public towards Evidence-Based Medicine. Having a basic understanding of statistics is mandatory in evaluating the validity of published literature and applying it to patient care. In this review, we aim to apply a practical approach in discussing basic statistical tests by providing a guide to choosing the correct statistical test along with examples relevant to hand surgery research. PMID:19969193

  13. Easy Statistics. Supervising: Technical Aspects of Supervision. The Choice Series #34. A Self Learning Opportunity.

    ERIC Educational Resources Information Center

    Carlisle, Ysanne

    This learning unit on easy statistics is one in the Choice Series, a self-learning development program for supervisors. Purpose stated for the approximately eight-hour-long unit is to enable the supervisor to use statistics to improve his/her decision-making ability, interpret statistics and form judgments on the interpretations other people make,…

  14. Pornography and rape: theory and practice? Evidence from crime data in four countries where pornography is easily available.

    PubMed

    Kutchinsky, B

    1991-01-01

    We have looked at the empirical evidence of the well-known feminist dictum: "pornography is the theory--rape is the practice" (Morgan, 1980). While earlier research, notably that generated by the U.S. Commission on Obscenity and Pornography (1970) had found no evidence of a causal link between pornography and rape, a new generation of behavioral scientists have, for more than a decade, made considerable effort to prove such a connection, especially as far as "aggressive pornography" is concerned. The first part of the article examines and discusses the findings of this new research. A number of laboratory experiments have been conducted, much akin to the types of experiments developed by researchers of the effects of nonsexual media violence. As in the latter, a certain degree of increased "aggressiveness" has been found under certain circumstances, but to extrapolate from such laboratory effects to the commission of rape in real life is dubious. Studies of rapists' and nonrapists' immediate sexual reactions to presentations of pornography showed generally greater arousal to non-violent scenes, and no difference can be found in this regard between convicted rapists, nonsexual criminals and noncriminal males. In the second part of the paper an attempt was made to study the necessary precondition for a substantial causal relationship between the availability of pornography, including aggressive pornography, and rape--namely, that obviously increased availability of such material was followed by an increase in cases of reported rape. The development of rape and attempted rape during the period 1964-1984 was studied in four countries: the U.S.A., Denmark, Sweden and West Germany. In all four countries there is clear and undisputed evidence that during this period the availability of various forms of pictorial pornography including violent/dominant varieties (in the form of picture magazines, and films/videos used at home or shown in arcades or cinemas) has developed from extreme scarcity to relative abundance. If (violent) pornography causes rape, this exceptional development in the availability of (violent) pornography should definitely somehow influence the rape statistics. Since, however, the rape figures could not simply be expected to remain steady during the period in question (when it is well known that most other crimes increased considerably), the development of rape rates was compared with that of non-sexual violent offences and nonviolent sexual offences (in so far as available statistics permitted). The results showed that in none of the countries did rape increase more than nonsexual violent crimes. This finding in itself would seem sufficient to discard the hypothesis that pornography causes rape.(ABSTRACT TRUNCATED AT 400 WORDS) PMID:2032762

  15. Fundamentals of interpretation in echocardiography

    SciTech Connect

    Harrigan, P.; Lee, R.M.

    1985-01-01

    This illustrated book provides familiarity with the many clinical, physical, and electronic factors that bear on echocardiographic interpretation. Physical and clinical principles are integrated with considerations of anatomy and physiology to address interpretive problems. This approach yields, for example, sections on the physics and electronics of M-mode, cross-sectional, and Doppler systems which are informal, full of echocardiograms, virtually devoid of mathematics, and rigorously related to common issues faced by echocardiograph interpreters.

  16. Components of Simultaneous Interpreting: Comparing Interpreting with Shadowing and Paraphrasing

    ERIC Educational Resources Information Center

    Christoffels, Ingrid K.; de Groot, Annette M. B.

    2004-01-01

    Simultaneous interpreting is a complex task where the interpreter is routinely involved in comprehending, translating and producing language at the same time. This study assessed two components that are likely to be major sources of complexity in SI: The simultaneity of comprehension and production, and transformation of the input. Furthermore,…

  17. EXPERIMENTAL DESIGN: STATISTICAL CONSIDERATIONS AND ANALYSIS

    Technology Transfer Automated Retrieval System (TEKTRAN)

    In this book chapter, information on how field experiments in invertebrate pathology are designed and the data collected, analyzed, and interpreted is presented. The practical and statistical issues that need to be considered and the rationale and assumptions behind different designs or procedures ...

  18. The Medical Interpreter Training Project.

    ERIC Educational Resources Information Center

    Avery, Maria-Paz Beltran

    The Medical Interpreter Training Project, created as a collaborative effort of Northern Essex Community College (Massachusetts), private businesses, and medical care providers in Massachusetts, developed a 28-credit, competency-based certificate program to prepare bilingual adults to work as medical interpreters in a range of health care settings.…

  19. Interpreting Recoil for Undergraduate Students

    ERIC Educational Resources Information Center

    Elsayed, Tarek A.

    2012-01-01

    The phenomenon of recoil is usually explained to students in the context of Newton's third law. Typically, when a projectile is fired, the recoil of the launch mechanism is interpreted as a reaction to the ejection of the smaller projectile. The same phenomenon is also interpreted in the context of the conservation of linear momentum, which is…

  20. Interpreting Recoil for Undergraduate Students

    ERIC Educational Resources Information Center

    Elsayed, Tarek A.

    2012-01-01

    The phenomenon of recoil is usually explained to students in the context of Newton's third law. Typically, when a projectile is fired, the recoil of the launch mechanism is interpreted as a reaction to the ejection of the smaller projectile. The same phenomenon is also interpreted in the context of the conservation of linear momentum, which is

  1. Curriculum Guide for Interpreter Training.

    ERIC Educational Resources Information Center

    Sternberg, Martin L. A.; And Others

    Presented is a curriculum guide for the training of interpreters for the deaf consisting of 15 sections to be used as individual units or comprising a two part, 1 year course. The full course uses the text, Interpreting for Deaf People, as a guide and includes laboratory and practicum experiences. Curriculum guidelines include specific aims such…

  2. Remote sensing and image interpretation

    NASA Technical Reports Server (NTRS)

    Lillesand, T. M.; Kiefer, R. W. (Principal Investigator)

    1979-01-01

    A textbook prepared primarily for use in introductory courses in remote sensing is presented. Topics covered include concepts and foundations of remote sensing; elements of photographic systems; introduction to airphoto interpretation; airphoto interpretation for terrain evaluation; photogrammetry; radiometric characteristics of aerial photographs; aerial thermography; multispectral scanning and spectral pattern recognition; microwave sensing; and remote sensing from space.

  3. Semantic Interpretation in Generative Grammar.

    ERIC Educational Resources Information Center

    Jackendoff, Ray S.

    The author finds Katz and Postal's 1964 generative semantic theories concerning the organization of grammar incorrect and proposes an interpretive approach to semantics in which syntactic structures are given interpretations by an autonomous semantic component. The research reported leads the author to describe a generative grammar consisting of…

  4. Museum Docents' Understanding of Interpretation

    ERIC Educational Resources Information Center

    Neill, Amanda C.

    2010-01-01

    The purpose of this qualitative research study was to explore docents' perceptions of their interpretive role in art museums and determine how those perceptions shape docents' practice. The objective was to better understand how docents conceive of their role and what shapes the interpretation they give on tours to the public. The conceptual…

  5. Statistical Reference Datasets

    National Institute of Standards and Technology Data Gateway

    Statistical Reference Datasets (Web, free access)   The Statistical Reference Datasets is also supported by the Standard Reference Data Program. The purpose of this project is to improve the accuracy of statistical software by providing reference datasets with certified computational results that enable the objective evaluation of statistical software.

  6. Geological interpretation of potential fields

    NASA Astrophysics Data System (ADS)

    Starostenko, V. I.

    This volume contains papers from the Third All-Union School-Seminar on the Geological Interpretation of Gravitational and Magnetic Fields (Yalta, December 1980). Particular consideration is given to such topics as a method for constructing density models of the tectonosphere of platform and active regions; the interpretation of the gravitational field of the basic structures of the world ocean; the current status of gravitational surveying; and an algorithm for the regional interpretation of gravimetry data. Also considered are the inverse problem of magnetic surveying; the role of viscous magnetization in the formation of magnetic anomalies of the continental crust; calculation of mechanical stresses in the lithosphere on the basis of gravitational data; the deep structure of the Siberian platform as interpreted on the basis of gravimeter and magnetometer data; the equivalence of density models of deep structures; and a systems approach to the interpretation of gravimetry data. No individual items are abstracted in this volume.

  7. ADHD Rating Scale-IV: Checklists, Norms, and Clinical Interpretation

    ERIC Educational Resources Information Center

    Pappas, Danielle

    2006-01-01

    This article reviews the "ADHD Rating Scale-IV: Checklists, Norms, and Clinical Interpretation," a norm-referenced checklist that measures the symptoms of attention deficit/hyperactivity disorder (ADHD) according to the diagnostic criteria of the Diagnostic and Statistical Manual of Mental Disorders (DSM-IV; American Psychiatric Association,…

  8. Defining and Interpreting Suppressor Effects: Advantages and Limitations.

    ERIC Educational Resources Information Center

    Lancaster, Brian P.

    Suppressor effects are considered one of the most elusive dynamics in the interpretation of statistical data. A suppressor variable has been defined as a predictor that has a zero correlation with the dependent variable while still, paradoxically, contributing to the predictive validity of the test battery (P. Horst, 1941). This paper explores the…
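
    The pattern described above is easy to reproduce in a small simulation: a variable that is essentially uncorrelated with the criterion still raises R-squared because it removes criterion-irrelevant variance from another predictor. All variables below are simulated for illustration:

      # Classical suppression: x2 is ~uncorrelated with y, yet adding it to the
      # regression raises R^2 by "suppressing" the nuisance variance in x1.
      import numpy as np

      rng = np.random.default_rng(0)
      n = 5000
      signal = rng.normal(size=n)
      nuisance = rng.normal(size=n)

      y = signal + 0.5 * rng.normal(size=n)
      x1 = signal + nuisance        # valid predictor contaminated by nuisance
      x2 = nuisance                 # suppressor: correlates with x1, not with y

      def r_squared(predictors, y):
          X = np.column_stack([np.ones(len(y))] + list(predictors))
          beta, *_ = np.linalg.lstsq(X, y, rcond=None)
          resid = y - X @ beta
          return 1.0 - resid.var() / y.var()

      print(f"corr(x2, y)       = {np.corrcoef(x2, y)[0, 1]: .3f}")
      print(f"R^2 with x1 only  = {r_squared([x1], y):.3f}")
      print(f"R^2 with x1 + x2  = {r_squared([x1, x2], y):.3f}")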

  9. Developing and Assessing Students' Abilities To Interpret Research.

    ERIC Educational Resources Information Center

    Forsyth, G. Alfred; And Others

    A recent conference on statistics education recommended that more emphasis be placed on the interpretation of research (IOR). Ways for developing and assessing IOR and providing a systematic framework for creating and selecting instructional materials for the independent assessment of specific IOR concepts are the focus of this paper. The…

  10. USER'S GUIDE: CHROMOSOMAL ABERRATION DATA ANALYSIS AND INTERPRETATION SYSTEM

    EPA Science Inventory

    This user's manual provides guidance to researchers and the regulatory community for interacting with a data analysis and statistical interpretation system, designated as CA. CA is dedicated to the in vivo chromosome aberration assay, a routinely used genetic toxicology assay for ...

  11. Philosophical perspectives on quantum chaos: Models and interpretations

    NASA Astrophysics Data System (ADS)

    Bokulich, Alisa Nicole

    2001-09-01

    The problem of quantum chaos is a special case of the larger problem of understanding how the classical world emerges from quantum mechanics. While we have learned that chaos is pervasive in classical systems, it appears to be almost entirely absent in quantum systems. The aim of this dissertation is to determine what implications the interpretation of quantum mechanics has for attempts to explain the emergence of classical chaos. There are three interpretations of quantum mechanics that have set out programs for solving the problem of quantum chaos: the standard interpretation, the statistical interpretation, and the deBroglie-Bohm causal interpretation. One of the main conclusions of this dissertation is that an interpretation alone is insufficient for solving the problem of quantum chaos and that the phenomenon of decoherence must be taken into account. Although a completely satisfactory solution of the problem of quantum chaos is still outstanding, I argue that the deBroglie-Bohm interpretation with the help of decoherence outlines the most promising research program to pursue. In addition to making a contribution to the debate in the philosophy of physics concerning the interpretation of quantum mechanics, this dissertation reveals two important methodological lessons for the philosophy of science. First, issues of reductionism and intertheoretic relations cannot be divorced from questions concerning the interpretation of the theories involved. Not only is the exploration of intertheoretic relations a central part of the articulation and interpretation of an individual theory, but the very terms used to discuss intertheoretic relations, such as `state' and `classical limit', are themselves defined by particular interpretations of the theory. The second lesson that emerges is that, when it comes to characterizing the relationship between classical chaos and quantum mechanics, the traditional approaches to intertheoretic relations, namely reductionism and theoretical pluralism, are inadequate. The fruitful ways in which models have been used in quantum chaos research point to the need for a new framework for addressing intertheoretic relations that focuses on models rather than laws.

  12. a Contextualist Interpretation of Mathematics

    NASA Astrophysics Data System (ADS)

    Liu, Jie

    2014-03-01

    The nature of mathematics has been the subject of heated debate among mathematicians and philosophers throughout the ages. The realist and anti-realist positions have debated this problem at length, but some of the most important recent developments have focused on interpretations; each of the above positions has its own interpretation of the nature of mathematics. In this paper I argue for a contextualist interpretation of mathematics, one that elucidates the essential features of mathematical context: being integral and having concrete structure, mathematical context is a recontextualization process with a determinate boundary.

  13. Interpreting Results from Multiscore Batteries.

    ERIC Educational Resources Information Center

    Anastasi, Anne

    1985-01-01

    Describes the role of information on score reliabilities, significance of score differences, intercorrelations of scores, and differential validity of score patterns on the interpretation of results from multiscore batteries. (Author)

  14. Car Troubles: An Interpretive Approach.

    ERIC Educational Resources Information Center

    Dawson, Leslie

    1995-01-01

    The growing amount of U.S. surface area being paved increases interpretive opportunities for teaching about the environmental impacts of automobiles. Provides methods and suggestions for educating high school students. Provides several computer graphics. (LZ)

  15. ENVIRONMENTAL PHOTOGRAPHIC INTERPRETATION CENTER (EPIC)

    EPA Science Inventory

    The Environmental Sciences Division (ESD) in the National Exposure Research Laboratory (NERL) of the Office of Research and Development provides remote sensing technical support including aerial photograph acquisition and interpretation to the EPA Program Offices, ORD Laboratorie...

  16. Guidelines simplify well test interpretation

    SciTech Connect

    Ehlig-Economides, C.A.; Hegeman, P.; Vik, S.

    1994-07-18

    With a few simple guidelines, industry professionals, especially those who are not well-testing experts, can know more about well-test interpretation, and thus make more appropriate decisions for well tests. Today's well tests frequently provide much more than permeability, skin, and extrapolated pressure. Most managers, geoscientists, and petroleum engineers rely on specialists to interpret pressure-transient data from well tests. At times, however, valuable test results are overlooked when modern analysis techniques are not used to interpret the acquired data. The first in a series of three articles addresses what to expect from a well test interpretation. The second part will show how to design a test, and manage well site data acquisition to ensure optimum results. The concluding part will illustrate these concepts in two successful cases.

  17. Personalized Interpretation and Experience Enhancement.

    ERIC Educational Resources Information Center

    West, Robert Mac

    2001-01-01

    Presents a discussion on the interpretations of museums and zoos. Introduces the applications of living history, museum theater and explains the terms interactors, explainers, and curators; keepers; and technicians. Lists the locations having the explained applications. Includes 29 references. (YDS)

  18. QUANTIFICATION AND INTERPRETATION OF TOTAL PETROLEUM HYDROCARBONS IN SEDIMENT SAMPLES BY A GC/MS METHOD AND COMPARISON WITH EPA 418.1 AND A RAPID FIELD METHOD

    EPA Science Inventory

    ABSTRACT: Total Petroleum hydrocarbons (TPH) as a lumped parameter can be easily and rapidly measured or monitored. Despite interpretational problems, it has become an accepted regulatory benchmark used widely to evaluate the extent of petroleum product contamination. Three cu...

  19. Interpreter services in emergency medicine.

    PubMed

    Chan, Yu-Feng; Alagappan, Kumar; Rella, Joseph; Bentley, Suzanne; Soto-Greene, Marie; Martin, Marcus

    2010-02-01

    Emergency physicians are routinely confronted with problems associated with language barriers. It is important for emergency health care providers and the health system to strive for cultural competency when communicating with members of an increasingly diverse society. Possible solutions that can be implemented include appropriate staffing, use of new technology, and efforts to develop new kinds of ties to the community served. Linguistically specific solutions include professional interpretation, telephone interpretation, the use of multilingual staff members, the use of ad hoc interpreters, and, more recently, the use of mobile computer technology at the bedside. Each of these methods carries a specific set of advantages and disadvantages. Although professionally trained medical interpreters offer improved communication, improved patient satisfaction, and overall cost savings, they are often underutilized due to their perceived inefficiency and the inconclusive results of their effect on patient care outcomes. Ultimately, the best solution for each emergency department will vary depending on the population served and available resources. Access to the multiple interpretation options outlined above and solid support and commitment from hospital institutions are necessary to provide proper and culturally competent care for patients. Appropriate communications inclusive of interpreter services are essential for culturally and linguistically competent provider/health systems and overall improved patient care and satisfaction. PMID:18571358

  20. Students' Interpretation of a Function Associated with a Real-Life Problem from Its Graph

    ERIC Educational Resources Information Center

    Mahir, Nevin

    2010-01-01

    The properties of a function such as limit, continuity, derivative, growth, or concavity can be determined more easily from its graph than by doing any algebraic operation. For this reason, it is important for students of mathematics to interpret some of the properties of a function from its graph. In this study, we investigated the competence of…

  1. Overweight and Obesity Statistics

    MedlinePlus

    ... View the full list of resources ​​. Overweight and Obesity Statistics Page Content About Overweight and Obesity Prevalence ... Activity Statistics Clinical Trials Resources About Overweight and Obesity This publication describes the prevalence of overweight and ...

  2. Statistical analysis of planetary surfaces

    NASA Astrophysics Data System (ADS)

    Schmidt, Frederic; Landais, Francois; Lovejoy, Shaun

    2015-04-01

    In recent decades, a huge amount of topographic data has been obtained by several techniques (laser and radar altimetry, DTM…) for different bodies in the solar system, including Earth, Mars, and the Moon. In each case, topographic fields exhibit extremely high variability, with details at every scale from millimeters to thousands of kilometers. This complexity seems to prohibit global descriptions or global topography models. Nevertheless, this topographic complexity is well known to exhibit scaling laws that establish a similarity between scales and permit simpler descriptions and models. Indeed, efficient simulations can be made using the statistical properties of scaling fields (fractals). But realistic simulations of global topographic fields must exhibit multi- (not mono-) scaling behaviour, reflecting the extreme variability and intermittency observed in real fields that cannot be generated by simple scaling models. A multiscaling theory has been developed in order to model this high variability and intermittency. This theory is a good statistical candidate to model the topography field with a limited number of parameters (the multifractal parameters). In our study, we show that the statistical properties of the Martian topography are accurately reproduced by this model, leading to new interpretations of geomorphological processes.
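
    A minimal sketch of the simplest scaling diagnostic behind this kind of analysis (numpy only; the profile is synthetic): estimating the spectral exponent beta of a 1-D profile whose power spectrum behaves as P(k) ~ k^(-beta). The multifractal analysis in the record above generalizes this monoscaling picture.

      # Estimate the spectral scaling exponent of a synthetic 1-D profile.
      import numpy as np

      rng = np.random.default_rng(1)
      n = 2**14
      profile = np.cumsum(rng.normal(size=n))        # Brownian-like profile, beta ~ 2

      power = np.abs(np.fft.rfft(profile))**2
      k = np.fft.rfftfreq(n)
      fit_range = (k > 0) & (k < 0.05)               # low-frequency scaling range
      slope, _ = np.polyfit(np.log(k[fit_range]), np.log(power[fit_range]), 1)
      print("estimated beta:", round(-slope, 2))     # ~2 for a Brownian profile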

  3. Minnesota Health Statistics 1988.

    ERIC Educational Resources Information Center

    Minnesota State Dept. of Health, St. Paul.

    This document comprises the 1988 annual statistical report of the Minnesota Center for Health Statistics. After introductory technical notes on changes in format, sources of data, and geographic allocation of vital events, an overview is provided of vital health statistics in all areas. Thereafter, separate sections of the report provide tables…

  4. Ethics in Statistics

    ERIC Educational Resources Information Center

    Lenard, Christopher; McCarthy, Sally; Mills, Terence

    2014-01-01

    There are many different aspects of statistics. Statistics involves mathematics, computing, and applications to almost every field of endeavour. Each aspect provides an opportunity to spark someone's interest in the subject. In this paper we discuss some ethical aspects of statistics, and describe how an introduction to ethics has been…

  5. On suicide statistics.

    PubMed

    Thorslund, J; Misfeldt, J

    1989-07-01

    The classical methodological problem of suicidology is reliability of official statistics. In this article, some recent contributions to the debate, particularly concerning the increased problem of suicide among Inuit, are reviewed. Secondly the suicide statistics of Greenland are analyzed, with the conclusion that the official statistics, as published by the Danish Board of Health, are generally reliable concerning Greenland. PMID:2789569

  6. Avoiding Statistical Mistakes

    ERIC Educational Resources Information Center

    Strasser, Nora

    2007-01-01

    Avoiding statistical mistakes is important for educators at all levels. Basic concepts will help you to avoid making mistakes using statistics and to look at data with a critical eye. Statistical data is used at educational institutions for many purposes. It can be used to support budget requests, changes in educational philosophy, changes to…

  7. Faculty Salary Equity Cases: Combining Statistics with the Law

    ERIC Educational Resources Information Center

    Luna, Andrew L.

    2006-01-01

    Researchers have used many statistical models to determine whether an institution's faculty pay structure is equitable, with varying degrees of success. Little attention, however, has been given to court interpretations of statistical significance or to what variables courts have acknowledged should be used in an equity model. This article…

  8. ALISE Library and Information Science Education Statistical Report, 1999.

    ERIC Educational Resources Information Center

    Daniel, Evelyn H., Ed.; Saye, Jerry D., Ed.

    This volume is the twentieth annual statistical report on library and information science (LIS) education published by the Association for Library and Information Science Education (ALISE). Its purpose is to compile, analyze, interpret, and report statistical (and other descriptive) information about library/information science programs offered by

  9. The Effect Size Statistic: Overview of Various Choices.

    ERIC Educational Resources Information Center

    Mahadevan, Lakshmi

    Over the years, methodologists have been recommending that researchers use magnitude of effect estimates in result interpretation to highlight the distinction between statistical and practical significance (cf. R. Kirk, 1996). A magnitude of effect statistic (i.e., effect size) tells to what degree the dependent variable can be controlled,…
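
    A minimal sketch of the distinction drawn above (scipy/numpy assumed; the data are simulated): a magnitude-of-effect statistic such as Cohen's d reported alongside the p value of a two-sample t test.

      # Cohen's d alongside a t test for two simulated groups.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(2)
      treatment = rng.normal(0.4, 1.0, size=50)
      control = rng.normal(0.0, 1.0, size=50)

      t, p = stats.ttest_ind(treatment, control)
      pooled_sd = np.sqrt((treatment.var(ddof=1) + control.var(ddof=1)) / 2)
      cohens_d = (treatment.mean() - control.mean()) / pooled_sd
      print(f"p = {p:.3f}, Cohen's d = {cohens_d:.2f}")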

  10. What Is the next Trend in Usage Statistics in Libraries?

    ERIC Educational Resources Information Center

    King, Douglas

    2009-01-01

    In answering the question "What is the next trend in usage statistics in libraries?" an eclectic group of respondents has presented an assortment of possibilities, suggestions, complaints and, of course, questions of their own. Undoubtedly, usage statistics collection, interpretation, and application are areas of growth and increasing complexity…

  11. Evaluation of the TV Series "Statistics" (SABC-ERTV1).

    ERIC Educational Resources Information Center

    Stupart, J. D. C.; Duby, Aliza

    A summative evaluation of the effectiveness of the educational television series, "Statistics," that aired on South African television is presented. The two episodes chosen from the six-episode series covered pie charts, pictograms, and pictographs (episode 1); and point-of-view interpretations of statistics (episode 4). The evaluation was…

  12. Collaborative Teachback with a Statistical Cognitive Tool: A Formative Evaluation.

    ERIC Educational Resources Information Center

    Bain, John D.; Mavor, Ken

    A learning environment is described in which students collaborate in small groups to develop screen movies in which they use a statistical cognitive tool to interpret published research and to demonstrate their understanding of least squares statistical concepts. Evaluation data are reported, which indicate that, although some groups thrive in…

  13. The Role of Statistical Significance Testing in Educational Research.

    ERIC Educational Resources Information Center

    McLean, James E.; Ernest, James M.

    1998-01-01

    Although statistical significance testing as the sole basis for result interpretation is a flawed practice, significance tests can be useful as one of three criteria that must be demonstrated to establish a position empirically. Statistical significance testing provides evidence that an event did not happen by chance but gives no evidence of the…

  14. The Power of Teaching Activities: Statistical and Methodological Recommendations

    ERIC Educational Resources Information Center

    Tomcho, Thomas J.; Foels, Rob

    2009-01-01

    Researchers rarely mention statistical power in "Teaching of Psychology" teaching activity studies. Insufficiently powered tests promote uncertainty in the decision to accept or reject the tested null hypothesis and influence the interpretation of results. We analyzed the a priori power of statistical tests from 197 teaching activity effectiveness…
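
    A minimal sketch of an a priori power calculation of the kind recommended above (statsmodels is an assumption; any power calculator would do): the sample size per group needed to detect a medium effect in an independent-samples t test.

      # A priori power analysis for an independent-samples t test.
      from statsmodels.stats.power import TTestIndPower

      n_per_group = TTestIndPower().solve_power(effect_size=0.5, alpha=0.05, power=0.80)
      print(f"n per group for d = 0.5, alpha = .05, power = .80: {n_per_group:.0f}")  # ~64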

  15. Securing wide appreciation of health statistics

    PubMed Central

    Pyrrait, A. M. DO Amaral; Aubenque, M. J.; Benjamin, B.; DE Groot, Meindert J. W.; Kohn, R.

    1954-01-01

    All the authors are agreed on the need for a certain publicizing of health statistics, but do Amaral Pyrrait points out that the medical profession prefers to convince itself rather than to be convinced. While there is great utility in articles and reviews in the professional press (especially for paramedical personnel) Aubenque, de Groot, and Kohn show how appreciation can effectively be secured by making statistics more easily understandable to the non-expert by, for instance, including readable commentaries in official publications, simplifying charts and tables, and preparing simple manuals on statistical methods. Aubenque and Kohn also stress the importance of linking health statistics to other economic and social information. Benjamin suggests that the principles of market research could to advantage be applied to health statistics to determine the precise needs of the “consumers”. At the same time, Aubenque points out that the value of the ultimate results must be clear to those who provide the data; for this, Kohn suggests that the enumerators must know exactly what is wanted and why. There is general agreement that some explanation of statistical methods and their uses should be given in the curricula of medical schools and that lectures and postgraduate courses should be arranged for practising physicians. PMID:13199668

  16. Intelligent Collection Environment for an Interpretation System

    SciTech Connect

    Maurer, W J

    2001-07-19

    An Intelligent Collection Environment for a data interpretation system is described. The environment accepts two inputs: a data model and a number between 0.0 and 1.0. The data model can be as simple as a single word or as complex as a multi-level/multidimensional model. The number between 0.0 and 1.0 is a control knob indicating the user's desire to allow loose matching of the data (things are ambiguous and unknown) versus strict matching of the data (things are precise and known). The environment produces a set of possible interpretations, a set of requirements to further strengthen or to differentiate a particular subset of the possible interpretations from the others, a set of inconsistencies, and a logic map that graphically shows the lines of reasoning used to derive the above output. The environment comprises a knowledge editor, a model explorer, an expertise server, and the World Wide Web. The Knowledge Editor is used by a subject matter expert to define Linguistic Types, Term Sets, detailed explanations, and dynamically created URIs, and to create rule bases using a straightforward hyper-matrix representation. The Model Explorer allows rapid construction and browsing of multi-level models. A multi-level model is a model whose elements may themselves be models. The Expertise Server is an inference engine used to interpret the submitted data. It incorporates a semantic-network knowledge representation, an assumption-based truth maintenance system, and a fuzzy logic calculus. It can be extended by employing any classifier (e.g. statistical/neural networks) of complex data types. The World Wide Web is an unstructured data space accessed by the URIs supplied as part of the output of the environment. By recognizing the input data model as a query, the environment serves as a deductive search engine. Applications include (but are not limited to) interpretation of geophysical phenomena, a navigation aid for very large web sites, monitoring of computer or sensor networks, customer support, troubleshooting, and searching complex digital libraries (e.g. genome libraries).

  17. An Experimental Approach to Teaching and Learning Elementary Statistical Mechanics

    ERIC Educational Resources Information Center

    Ellis, Frank B.; Ellis, David C.

    2008-01-01

    Introductory statistical mechanics is studied for a simple two-state system using an inexpensive and easily built apparatus. A large variety of demonstrations, suitable for students in high school and introductory university chemistry courses, are possible. This article details demonstrations for exothermic and endothermic reactions, the dynamic

  18. An Experimental Approach to Teaching and Learning Elementary Statistical Mechanics

    ERIC Educational Resources Information Center

    Ellis, Frank B.; Ellis, David C.

    2008-01-01

    Introductory statistical mechanics is studied for a simple two-state system using an inexpensive and easily built apparatus. A large variety of demonstrations, suitable for students in high school and introductory university chemistry courses, are possible. This article details demonstrations for exothermic and endothermic reactions, the dynamic…

  19. Interpretational Confounding or Confounded Interpretations of Causal Indicators?

    PubMed Central

    Bainter, Sierra A.; Bollen, Kenneth A.

    2014-01-01

    In measurement theory causal indicators are controversial and little-understood. Methodological disagreement concerning causal indicators has centered on the question of whether causal indicators are inherently sensitive to interpretational confounding, which occurs when the empirical meaning of a latent construct departs from the meaning intended by a researcher. This article questions the validity of evidence used to claim that causal indicators are inherently susceptible to interpretational confounding. Further, a simulation study demonstrates that causal indicator coefficients are stable across correctly-specified models. Determining the suitability of causal indicators has implications for the way we conceptualize measurement and build and evaluate measurement models. PMID:25530730

  20. Water isotope systematics: Improving our palaeoclimate interpretations

    NASA Astrophysics Data System (ADS)

    Jones, M. D.; Dee, S.; Anderson, L.; Baker, A.; Bowen, G.; Noone, D. C.

    2016-01-01

    The stable isotopes of oxygen and hydrogen, measured in a variety of archives, are widely used proxies in Quaternary Science. Understanding the processes that control δ18O change has long been a focus of research (e.g. Shackleton and Opdyke, 1973; Talbot, 1990; Leng, 2006). Both the dynamics of water isotope cycling and the appropriate interpretation of geological water-isotope proxy time series remain subjects of active research and debate. It is clear that achieving a complete understanding of the isotope systematics for any given archive type, and ideally each individual archive, is vital if these palaeo-data are to be used to their full potential, including comparison with climate model experiments of the past. Combining information from modern monitoring and process studies, climate models, and proxy data is crucial for improving our statistical constraints on reconstructions of past climate variability.

  1. Explorations in statistics: statistical facets of reproducibility.

    PubMed

    Curran-Everett, Douglas

    2016-06-01

    Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This eleventh installment of Explorations in Statistics explores statistical facets of reproducibility. If we obtain an experimental result that is scientifically meaningful and statistically unusual, we would like to know that our result reflects a general biological phenomenon that another researcher could reproduce if (s)he repeated our experiment. But more often than not, we may learn this researcher cannot replicate our result. The National Institutes of Health and the Federation of American Societies for Experimental Biology have created training modules and outlined strategies to help improve the reproducibility of research. These particular approaches are necessary, but they are not sufficient. The principles of hypothesis testing and estimation are inherent to the notion of reproducibility in science. If we want to improve the reproducibility of our research, then we need to rethink how we apply fundamental concepts of statistics to our science. PMID:27231259

  2. Discovery of novel non-peptidic beta-alanine piperazine amide derivatives and their optimization to achiral, easily accessible, potent and selective somatostatin sst1 receptor antagonists.

    PubMed

    Troxler, Thomas; Hurth, Konstanze; Mattes, Henri; Prashad, Mahavir; Schoeffter, Philippe; Langenegger, Daniel; Enz, Albert; Hoyer, Daniel

    2009-03-01

    Structural simplification of the core moieties of obeline and ergoline somatostatin sst(1) receptor antagonists, followed by systematic optimization, led to the identification of novel, highly potent and selective sst(1) receptor antagonists. These achiral, non-peptidic compounds are easily prepared and show promising PK properties in rodents. PMID:19208473

  3. A Road More Easily Traveled

    ERIC Educational Resources Information Center

    Stanly, Pat

    2009-01-01

    Rough patches occur at both ends of the education pipeline, as students enter community colleges and move on to work or enrollment in four-year institutions. Career pathways--sequences of coherent, articulated, and rigorous career and academic courses that lead to an industry-recognized certificate or a college degree--are a promising approach to

  4. A Road More Easily Traveled

    ERIC Educational Resources Information Center

    Stanly, Pat

    2009-01-01

    Rough patches occur at both ends of the education pipeline, as students enter community colleges and move on to work or enrollment in four-year institutions. Career pathways--sequences of coherent, articulated, and rigorous career and academic courses that lead to an industry-recognized certificate or a college degree--are a promising approach to…

  5. Differences Help Recognition: A Probabilistic Interpretation

    PubMed Central

    Deng, Yue; Zhao, Yanyu; Liu, Yebin; Dai, Qionghai

    2013-01-01

    This paper presents a computational model to address one prominent psychological behavior of human beings in recognizing images. The basic premise of our method is that differences among multiple images help visual recognition. Generally speaking, we propose a statistical framework to distinguish what kind of image features capture sufficient category information and what kind of image features are common ones shared across multiple classes. Mathematically, the whole formulation is cast as a generative probabilistic model. Meanwhile, a discriminative functionality is incorporated into the model to interpret the differences among all kinds of images. The whole Bayesian formulation is solved in an Expectation-Maximization paradigm. After finding those discriminative patterns among different images, we design an image categorization algorithm to interpret how these differences help visual recognition within the bag-of-feature framework. The proposed method is verified on a variety of image categorization tasks including outdoor scene images, indoor scene images, as well as airborne SAR images from different perspectives. PMID:23755104

  6. A Local Interpretation of Quantum Mechanics

    NASA Astrophysics Data System (ADS)

    Lopez, Carlos

    2016-04-01

    A local interpretation of quantum mechanics is presented. Its main ingredients are: first, a label attached to one of the "virtual" paths in the path integral formalism, determining the output for measurement of position or momentum; second, a mathematical model for spin states, equivalent to the path integral formalism for point particles in space-time, with the corresponding label. The mathematical machinery of orthodox quantum mechanics is maintained, in particular amplitudes of probability and Born's rule; therefore, Bell-type inequality theorems do not apply. It is shown that statistical correlations for pairs of particles with entangled spins have a description completely equivalent to the two-slit experiment; that is, interference (wave-like behaviour) rather than non-locality accounts for the process. The interpretation is grounded in the experimental evidence for the point-like character of electrons and in the hypothetical existence of a wave-like companion system (the de Broglie wave). A correspondence between the extended Hilbert spaces of hidden physical states and the orthodox quantum mechanical Hilbert space shows the mathematical equivalence of both theories. Paradoxical behaviour with respect to the action-reaction principle is analysed, and an experimental setup, a modified two-slit experiment, is proposed to look for the companion system.

  7. Learning Interpretable SVMs for Biological Sequence Classification

    PubMed Central

    Rätsch, Gunnar; Sonnenburg, Sören; Schäfer, Christin

    2006-01-01

    Background Support Vector Machines (SVMs) – using a variety of string kernels – have been successfully applied to biological sequence classification problems. While SVMs achieve high classification accuracy they lack interpretability. In many applications, it does not suffice that an algorithm just detects a biological signal in the sequence, but it should also provide means to interpret its solution in order to gain biological insight. Results We propose novel and efficient algorithms for solving the so-called Support Vector Multiple Kernel Learning problem. The developed techniques can be used to understand the obtained support vector decision function in order to extract biologically relevant knowledge about the sequence analysis problem at hand. We apply the proposed methods to the task of acceptor splice site prediction and to the problem of recognizing alternatively spliced exons. Our algorithms compute sparse weightings of substring locations, highlighting which parts of the sequence are important for discrimination. Conclusion The proposed method is able to deal with thousands of examples while combining hundreds of kernels within reasonable time, and reliably identifies a few statistically significant positions. PMID:16723012

  8. Default Sarcastic Interpretations: On the Priority of Nonsalient Interpretations

    ERIC Educational Resources Information Center

    Giora, Rachel; Drucker, Ari; Fein, Ofer; Mendelson, Itamar

    2015-01-01

    Findings from five experiments support the view that negation generates sarcastic utterance-interpretations by default. When presented in isolation, novel negative constructions ("Punctuality is not his forte," "Thoroughness is not her most distinctive feature"), free of semantic anomaly or internal incongruity, were…

  9. SCIENCE INTERPRETIVE PROGRAM--SPERMACETI COVE INTERPRETIVE CENTER.

    ERIC Educational Resources Information Center

    COLE, RICHARD C.

    DESCRIBED IS THE OUTDOOR EDUCATION PROGRAM FOR THE MIDDLETOWN, NEW JERSEY ELEMENTARY SCHOOLS AT THE SPERMACETI COVE INTERPRETIVE CENTER IN SANDY HOOK STATE PARK. THE PROGRAM IS FUNDED UNDER PL89-10 OF THE ELEMENTARY AND SECONDARY EDUCATION ACT (ESEA). PHASE 1 (MARCH, 1966-JUNE, 1966) INVOLVED THE SELECTION OF NINE PUBLIC AND THREE PAROCHIAL FOURTH…

  10. The Interpretive Approach to Religious Education: Challenging Thompson's Interpretation

    ERIC Educational Resources Information Center

    Jackson, Robert

    2012-01-01

    In a recent book chapter, Matthew Thompson makes some criticisms of my work, including the interpretive approach to religious education and the research and activity of Warwick Religions and Education Research Unit. Against the background of a discussion of religious education in the public sphere, my response challenges Thompson's account,…

  11. Interpretational Confounding or Confounded Interpretations of Causal Indicators?

    ERIC Educational Resources Information Center

    Bainter, Sierra A.; Bollen, Kenneth A.

    2014-01-01

    In measurement theory, causal indicators are controversial and little understood. Methodological disagreement concerning causal indicators has centered on the question of whether causal indicators are inherently sensitive to interpretational confounding, which occurs when the empirical meaning of a latent construct departs from the meaning

  12. Interpretational Confounding or Confounded Interpretations of Causal Indicators?

    ERIC Educational Resources Information Center

    Bainter, Sierra A.; Bollen, Kenneth A.

    2014-01-01

    In measurement theory, causal indicators are controversial and little understood. Methodological disagreement concerning causal indicators has centered on the question of whether causal indicators are inherently sensitive to interpretational confounding, which occurs when the empirical meaning of a latent construct departs from the meaning…

  13. The Interpretive Approach to Religious Education: Challenging Thompson's Interpretation

    ERIC Educational Resources Information Center

    Jackson, Robert

    2012-01-01

    In a recent book chapter, Matthew Thompson makes some criticisms of my work, including the interpretive approach to religious education and the research and activity of Warwick Religions and Education Research Unit. Against the background of a discussion of religious education in the public sphere, my response challenges Thompson's account,

  14. Teaching Business Statistics with Real Data to Undergraduates and the Use of Technology in the Class Room

    ERIC Educational Resources Information Center

    Singamsetti, Rao

    2007-01-01

    In this paper an attempt is made to highlight some issues of interpretation of statistical concepts and interpretation of results as taught in undergraduate Business statistics courses. The use of modern technology in the class room is shown to have increased the efficiency and the ease of learning and teaching in statistics. The importance of…

  15. Measuring statistical heterogeneity: The Pietra index

    NASA Astrophysics Data System (ADS)

    Eliazar, Iddo I.; Sokolov, Igor M.

    2010-01-01

    There are various ways of quantifying the statistical heterogeneity of a given probability law: Statistics uses variance, which measures the law’s dispersion around its mean; Physics and Information Theory use entropy, which measures the law’s randomness; Economics uses the Gini index, which measures the law’s egalitarianism. In this research we explore an alternative to the Gini index, the Pietra index, which is a counterpart of the Kolmogorov-Smirnov statistic. The Pietra index is shown to be a natural and elemental measure of statistical heterogeneity, which is especially useful in the case of asymmetric and skewed probability laws, and in the case of asymptotically Paretian laws with finite mean and infinite variance. Moreover, the Pietra index is shown to have immediate and fundamental interpretations within the following applications: renewal processes and continuous time random walks; infinite-server queueing systems and shot noise processes; financial derivatives. The interpretation of the Pietra index within the context of financial derivatives implies that derivative markets, in effect, use the Pietra index as their benchmark measure of statistical heterogeneity.
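
    A minimal sketch of the Pietra index for a skewed sample (numpy only), computed two equivalent ways: as half the relative mean absolute deviation, and as the maximal gap between the Lorenz curve and the line of equality.

      # Pietra (Hoover) index of a heavy-tailed sample, two equivalent estimates.
      import numpy as np

      rng = np.random.default_rng(3)
      x = rng.pareto(3.0, size=100_000) + 1.0        # skewed, asymptotically Paretian

      pietra_mad = np.abs(x - x.mean()).mean() / (2 * x.mean())

      xs = np.sort(x)
      lorenz = np.cumsum(xs) / xs.sum()              # Lorenz curve at i/n
      pop_share = np.arange(1, len(xs) + 1) / len(xs)
      pietra_lorenz = np.max(pop_share - lorenz)

      print(round(pietra_mad, 4), round(pietra_lorenz, 4))   # agree to ~1/n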

  16. Statistical dynamics of religion evolutions

    NASA Astrophysics Data System (ADS)

    Ausloos, M.; Petroni, F.

    2009-10-01

    A religious affiliation can be considered a “degree of freedom” of an agent on the human genre network. A brief review is given of the state of the art in data analysis and modelling of religious “questions” in order to suggest, and if possible initiate, further research, after applying a “statistical physics filter”. We present a discussion of the evolution of 18 so-called religions, as measured through their number of adherents between 1900 and 2000. Some emphasis is placed on a few cases presenting a minimum or a maximum in the investigated time range, thereby suggesting a competitive ingredient to be considered, besides the well-accepted “at birth” attachment effect. The importance of the “external field” is also stressed through an Avrami late-stage crystal-growth-like parameter. The observed features and some intuitive interpretations point to opinion-based models with vector-like, rather than scalar-like, agents.

  17. Tractography atlas-based spatial statistics: Statistical analysis of diffusion tensor image along fiber pathways.

    PubMed

    Wang, Defeng; Luo, Yishan; Mok, Vincent C T; Chu, Winnie C W; Shi, Lin

    2016-01-15

    The quantitative analysis of diffusion tensor image (DTI) data has attracted increasing attention in recent decades for studying white matter (WM) integrity and development. Among current DTI analysis methods, tract-based spatial statistics (TBSS), as a pioneering approach for the voxelwise analysis of DTI data, has gained a lot of popularity due to its user-friendly framework. However, in recent years, the reliability and interpretability of TBSS have been challenged by several works, and several improvements over the original TBSS pipeline have been suggested. In this paper, we propose a new DTI statistical analysis method, named tractography atlas-based spatial statistics (TABSS). It does not rely on the accurate alignment of fractional anisotropy (FA) images for population analysis and gets rid of the skeletonization procedures of TBSS, which have been indicated as the major sources of error. Furthermore, TABSS improves the interpretability of results by directly reporting the resulting statistics on WM tracts, obviating the need for a WM atlas in the interpretation of the results. The feasibility of TABSS was evaluated in an example study showing the age-related FA alteration pattern of the healthy human brain. Through this preliminary study, it is validated that TABSS can provide detailed statistical results in a comprehensive and easy-to-understand way. PMID:26481677

  18. Calibrated Peer Review for Interpreting Linear Regression Parameters: Results from a Graduate Course

    ERIC Educational Resources Information Center

    Enders, Felicity B.; Jenkins, Sarah; Hoverman, Verna

    2010-01-01

    Biostatistics is traditionally a difficult subject for students to learn. While the mathematical aspects are challenging, it can also be demanding for students to learn the exact language to use to correctly interpret statistical results. In particular, correctly interpreting the parameters from linear regression is both a vital tool and a…
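
    A minimal sketch (hypothetical data, numpy only) of the kind of parameter interpretation the course above targets: fit a simple linear regression and state in words what the slope and intercept mean.

      # Fit y = a + b*x and phrase the textbook interpretation of a and b.
      import numpy as np

      rng = np.random.default_rng(4)
      dose = rng.uniform(0, 10, size=200)
      response = 2.0 + 0.8 * dose + rng.normal(0, 1, size=200)

      slope, intercept = np.polyfit(dose, response, 1)
      print(f"Each one-unit increase in dose is associated with a {slope:.2f}-unit "
            f"change in mean response; the expected response at dose 0 is {intercept:.2f}.")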

  19. The standard map: From Boltzmann-Gibbs statistics to Tsallis statistics

    NASA Astrophysics Data System (ADS)

    Tirnakli, Ugur; Borges, Ernesto P.

    2016-03-01

    As is well known, Boltzmann-Gibbs statistics is the correct way of thermostatistically approaching ergodic systems. On the other hand, nontrivial ergodicity breakdown and strong correlations typically drag the system into out-of-equilibrium states where Boltzmann-Gibbs statistics fails. For a wide class of such systems, it has been shown in recent years that the correct approach is to use Tsallis statistics instead. Here we show how the dynamics of the paradigmatic conservative (area-preserving) standard map exhibits, in an exceptionally clear manner, the crossing from one statistics to the other. Our results unambiguously illustrate the domains of validity of both Boltzmann-Gibbs and Tsallis statistical distributions. Since various important physical systems, from particle confinement in magnetic traps to autoionization of molecular Rydberg states, through particle dynamics in accelerators and comet dynamics, can be reduced to the standard map, our results are expected to enlighten and enable an improved interpretation of diverse experimental and observational results.
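
    A minimal sketch of the map itself (numpy only; K and the initial condition are arbitrary choices): the Chirikov standard map whose orbit statistics exhibit the Boltzmann-Gibbs versus Tsallis crossover discussed above.

      # Iterate the Chirikov standard map: p' = p + K sin(theta), theta' = theta + p'.
      import numpy as np

      def standard_map(theta, p, K, n_steps):
          traj = np.empty((n_steps, 2))
          for i in range(n_steps):
              p = (p + K * np.sin(theta)) % (2 * np.pi)
              theta = (theta + p) % (2 * np.pi)
              traj[i] = theta, p
          return traj

      orbit = standard_map(theta=0.5, p=0.1, K=1.5, n_steps=10_000)
      print(orbit[:3])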

  20. Is statistical significance always significant?

    PubMed

    Koretz, Ronald L

    2005-06-01

    One way in which we learn new information is to read the medical literature. Whether or not we do primary research, it is important to be able to read literature in a critical fashion. A seemingly simple concept in reading is to interpret p values. For most of us, if we find a p value that is <.05, we take the conclusion to heart and quote it at every opportunity. If the p value is >.05, we discard the paper and look elsewhere for useful information. Unfortunately, this is too simplistic an approach. The real utility of p values is to consider them within the context of the experiment being performed. Defects in study design can make an interpretation of a p value useless. One has to be wary of type I (seeing a "statistically significant" difference just because of chance) and type II (failing to see a difference that really exists) errors. Examples of the former are publication bias and the performance of multiple analyses; the latter refers to a trial that is too small to demonstrate the difference. Finding significant differences in surrogate or intermediate endpoints may not help us. We need to know if those endpoints reflect the behavior of clinical endpoints. Selectively citing significant differences and disregarding studies that do not find them is inappropriate. Small differences, even if they are statistically significant, may require too much resource expenditure to be clinically useful. This article explores these problems in depth and attempts to put p values in the context of studies. PMID:16207667
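
    A minimal simulation sketch of one problem named above, the performance of multiple analyses (numpy/scipy assumed; all numbers are arbitrary): when 20 outcomes are tested per study and no true effect exists, most studies still yield at least one p < .05.

      # Type I error inflation from testing many outcomes per study.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(5)
      studies, outcomes, n = 2000, 20, 30
      hits = 0
      for _ in range(studies):
          a = rng.normal(size=(outcomes, n))
          b = rng.normal(size=(outcomes, n))         # identical populations
          p = stats.ttest_ind(a, b, axis=1).pvalue
          hits += (p < 0.05).any()
      print("null studies with >= 1 'significant' result:", round(hits / studies, 2))  # ~0.64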

  1. Evaluation of Computer Simulated Baseline Statistics for Use in Item Bias Studies. [Revised].

    ERIC Educational Resources Information Center

    Rogers, H. Jane; Hambleton, Ronald K.

    Although item bias statistics are widely recommended for use in test development and test analysis work, problems arise in their interpretation. The purpose of the present research was to evaluate the validity of logistic test models and computer simulation methods for providing a frame of reference for item bias statistic interpretations.…

  2. 77 FR 32441 - Proposed Legal Interpretation

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-01

    ... Federal Aviation Administration 14 CFR Part 121 Proposed Legal Interpretation AGENCY: Federal Aviation Administration (FAA). ACTION: Proposed interpretation. SUMMARY: The FAA is considering clarifying prior legal..., 2010, the FAA received a request for a legal interpretation from the Independent Pilots...

  3. Medicare program; Medicare prescription drug benefit; interpretation. Final rule; interpretation.

    PubMed

    2005-03-21

    This final rule modifies or clarifies our interpretations in several areas of the final rule titled "Medicare Prescription Drug Benefit" published in the Federal Register on January 28, 2005. First, it clarifies our interpretation of "entity", to respond to inquiries we received subsequent to the publication of the Prescription Drug Benefit (Part D) final rule on January 28, 2005. We were asked whether a joint enterprise could be considered an "entity" under section 1860D-12(a)(1) of the Social Security Act (the Act), for purposes of offering a prescription drug plan (PDP). Our interpretation is discussed in the Supplementary Information section of this final rule. Second, also subsequent to the publication of the Prescription Drug Benefit (Part D) final rule on January 28, 2005, we received inquiries from parties about our discussion of the actuarial equivalence standard and the manner in which an employee health plan sponsor could apply the aggregate net value test in the regulatory text of the final rule. Our interpretation is discussed in the "Provisions" section of this final rule. In addition, subsequent to publishing the August 3, 2004 proposed rule (69 FR 46684), we received comments on how the late enrollment penalty would be coordinated with the late enrollment penalty for Part B, and whether the one percent penalty would be sufficient to control for adverse selection. We clarify in the Provisions section of this final rule that the example given in the proposed rule, published on August 3, 2004, did not accord with the proposed or final regulatory language because it did not account for the fact that the base beneficiary premium increases on an annual basis. To remedy this error and in response to comments received on the proposed rule, we provide an interpretation that as the base beneficiary premium increases, the late enrollment penalty must also increase, and is in keeping with how the Part B penalty is calculated. Finally, we are providing clarifying language related to transitioning Part D enrollees from their prior drug coverage to their new Part D plan coverage. The Medicare Prescription Drug Benefit final rule will take effect on March 22, 2005. Our interpretations are deemed to be included in that final rule. PMID:15786588

  4. Statistical mechanics of community detection

    NASA Astrophysics Data System (ADS)

    Reichardt, Jörg; Bornholdt, Stefan

    2006-07-01

    Starting from a general ansatz, we show how community detection can be interpreted as finding the ground state of an infinite range spin glass. Our approach applies to weighted and directed networks alike. It contains the ad hoc introduced quality function from [J. Reichardt and S. Bornholdt, Phys. Rev. Lett. 93, 218701 (2004)] and the modularity Q as defined by Newman and Girvan [Phys. Rev. E 69, 026113 (2004)] as special cases. The community structure of the network is interpreted as the spin configuration that minimizes the energy of the spin glass with the spin states being the community indices. We elucidate the properties of the ground state configuration to give a concise definition of communities as cohesive subgroups in networks that is adaptive to the specific class of network under study. Further, we show how hierarchies and overlap in the community structure can be detected. Computationally efficient local update rules for optimization procedures to find the ground state are given. We show how the ansatz may be used to discover the community around a given node without detecting all communities in the full network and we give benchmarks for the performance of this extension. Finally, we give expectation values for the modularity of random graphs, which can be used in the assessment of statistical significance of community structure.
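
    A minimal sketch of the energy function described above (numpy only; the graph is a toy example): with gamma = 1 and the configuration-model null term k_i k_j / 2m, minimizing this spin-glass energy is equivalent to maximizing modularity, since E = -m*Q.

      # Spin-glass energy of a partition; ground state <=> maximal modularity.
      import numpy as np

      def spin_glass_energy(A, s, gamma=1.0):
          k = A.sum(axis=1)
          two_m = A.sum()
          null = gamma * np.outer(k, k) / two_m
          same = s[:, None] == s[None, :]
          return -0.5 * ((A - null) * same).sum()    # 0.5 corrects double counting

      # Two triangles (nodes 0-2 and 3-5) joined by a single edge.
      A = np.zeros((6, 6))
      for i, j in [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]:
          A[i, j] = A[j, i] = 1
      s = np.array([0, 0, 0, 1, 1, 1])
      E = spin_glass_energy(A, s)
      print("energy:", E, " modularity:", -2 * E / A.sum())   # -2.5 and ~0.357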

  5. Statistical mechanics of community detection.

    PubMed

    Reichardt, Jörg; Bornholdt, Stefan

    2006-07-01

    Starting from a general ansatz, we show how community detection can be interpreted as finding the ground state of an infinite range spin glass. Our approach applies to weighted and directed networks alike. It contains the ad hoc introduced quality function from [J. Reichardt and S. Bornholdt, Phys. Rev. Lett. 93, 218701 (2004)] and the modularity Q as defined by Newman and Girvan [Phys. Rev. E 69, 026113 (2004)] as special cases. The community structure of the network is interpreted as the spin configuration that minimizes the energy of the spin glass with the spin states being the community indices. We elucidate the properties of the ground state configuration to give a concise definition of communities as cohesive subgroups in networks that is adaptive to the specific class of network under study. Further, we show how hierarchies and overlap in the community structure can be detected. Computationally efficient local update rules for optimization procedures to find the ground state are given. We show how the ansatz may be used to discover the community around a given node without detecting all communities in the full network and we give benchmarks for the performance of this extension. Finally, we give expectation values for the modularity of random graphs, which can be used in the assessment of statistical significance of community structure. PMID:16907154

  6. Design Document. EKG Interpretation Program.

    ERIC Educational Resources Information Center

    Webb, Sandra M.

    This teaching plan is designed to assist nursing instructors assigned to advanced medical surgical nursing courses in acquainting students with the basic skills needed to perform electrocardiographic (ECG or EKG) interpretations. The first part of the teaching plan contains a statement of purpose; audience recommendations; a flow chart detailing…

  7. Interpreting Data: The Hybrid Mind

    ERIC Educational Resources Information Center

    Heisterkamp, Kimberly; Talanquer, Vicente

    2015-01-01

    The central goal of this study was to characterize major patterns of reasoning exhibited by college chemistry students when analyzing and interpreting chemical data. Using a case study approach, we investigated how a representative student used chemical models to explain patterns in the data based on structure-property relationships. Our results

  8. Interpreter Training Program: Program Review.

    ERIC Educational Resources Information Center

    Massoud, LindaLee

    This report describes in detail the deaf interpreter training program offered at Mott Community College (Flint, Michigan). The program features field-based learning experiences, internships, team teaching, a field practicum, the goal of having students meet certification standards, and proficiency examinations. The program has special…

  9. [Attempt at multidimensional dream interpretation].

    PubMed

    Bengesser, G

    1995-01-01

    Recent developments in depth psychology and neurophysiology make a new orientation and interpretation of dreams necessary. Koella, for example, describes nightmares in persons taking beta-blockers. Of course, the great ideas of Freud and Adler should be integrated, not abandoned. Above all, the influence of biological factors on the quality of dreams should be discussed. PMID:7660671

  10. Interpreting chromosomal abnormalities using Prolog.

    PubMed

    Cooper, G; Friedman, J M

    1990-04-01

    This paper describes an expert system for interpreting the standard notation used to represent human chromosomal abnormalities, namely, the International System for Human Cytogenetic Nomenclature. Written in Prolog, this program is very powerful, easy to maintain, and portable. The system can be used as a front end to any database that employs cytogenetic notation, such as a patient registry. PMID:2185921

  11. Smartberries: Interpreting Erdrich's Love Medicine

    ERIC Educational Resources Information Center

    Treuer, David

    2005-01-01

    The structure of "Love Medicine" is interpreted by Hertha D. Sweet Wong, who claims that the book's "multiple narrators confound conventional Western expectations of an autonomous protagonist, a dominant narrative voice, and a consistently chronological narrative." "Love Medicine" is a brilliant use of the Western literary tactics that create the…

  12. Reference for radiographic film interpreters

    NASA Technical Reports Server (NTRS)

    Austin, D. L.

    1970-01-01

    Reference of X-ray film images provides examples of weld defects, film quality, stainless steel welded tubing, and acceptable weld conditions. A summary sheet details the discrepancies shown on the film strip. This reference aids in interpreting and evaluating radiographic film of weldments.

  13. Design Document. EKG Interpretation Program.

    ERIC Educational Resources Information Center

    Webb, Sandra M.

    This teaching plan is designed to assist nursing instructors assigned to advanced medical surgical nursing courses in acquainting students with the basic skills needed to perform electrocardiographic (ECG or EKG) interpretations. The first part of the teaching plan contains a statement of purpose; audience recommendations; a flow chart detailing

  14. EKG Interpretation Program. Trainers Manual.

    ERIC Educational Resources Information Center

    Webb, Sandra M.

    This trainer's manual is designed to assist nursing instructors assigned to advanced medical surgical nursing courses in teaching students how to make basic interpretations of their patients' electrocardiographic (EKG) strips. Included in the manual are pre- and posttests and instructional units dealing with the following topics: EKG indicators,

  15. Interpretive Reproduction in Children's Play

    ERIC Educational Resources Information Center

    Corsaro, William A.

    2012-01-01

    The author looks at children's play from the perspective of interpretive reproduction, emphasizing the way children create their own unique peer cultures, which he defines as a set of routines, artifacts, values, and concerns that children engage in with their playmates. The article focuses on two types of routines in the peer culture of preschool…

  16. Interpreting Data: The Hybrid Mind

    ERIC Educational Resources Information Center

    Heisterkamp, Kimberly; Talanquer, Vicente

    2015-01-01

    The central goal of this study was to characterize major patterns of reasoning exhibited by college chemistry students when analyzing and interpreting chemical data. Using a case study approach, we investigated how a representative student used chemical models to explain patterns in the data based on structure-property relationships. Our results…

  17. Studies in Interpretation. Volume II.

    ERIC Educational Resources Information Center

    Doyle, Esther M., Ed.; Floyd, Virginia Hastings, Ed.

    The purpose of this second book of 21 self-contained essays is the same as that of the first volume published in 1972: to bring together the scholarly theory and current research regarding oral interpretation. One third of the essays are centered on literature itself: prose fiction, poetry, and the drama. These essays discuss topics such as point…

  18. Clustering statistics in cosmology

    NASA Astrophysics Data System (ADS)

    Martinez, Vicent; Saar, Enn

    2002-12-01

    The main tools in cosmology for comparing theoretical models with the observations of the galaxy distribution are statistical. We will review the applications of spatial statistics to the description of the large-scale structure of the universe. Special topics discussed in this talk will be: description of the galaxy samples, selection effects and biases, correlation functions, Fourier analysis, nearest neighbor statistics, Minkowski functionals and structure statistics. Special attention will be devoted to scaling laws and the use of the lacunarity measures in the description of the cosmic texture.
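
    A minimal sketch of one of the descriptors listed above, nearest-neighbour statistics (numpy/scipy assumed; the point sets are synthetic): the mean nearest-neighbour distance is markedly smaller for a clustered distribution than for a Poisson one.

      # Nearest-neighbour distances for random vs. clustered point sets.
      import numpy as np
      from scipy.spatial import cKDTree

      rng = np.random.default_rng(8)
      random_pts = rng.uniform(size=(2000, 3))
      centers = rng.uniform(size=(40, 3))
      clustered = centers[rng.integers(0, 40, 2000)] + 0.02 * rng.normal(size=(2000, 3))

      def mean_nn_distance(points):
          d, _ = cKDTree(points).query(points, k=2)  # k=1 is the point itself
          return d[:, 1].mean()

      print("random   :", round(mean_nn_distance(random_pts), 4))
      print("clustered:", round(mean_nn_distance(clustered), 4))   # smaller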

  19. Statistical distribution sampling

    NASA Technical Reports Server (NTRS)

    Johnson, E. S.

    1975-01-01

    Determining the distribution of statistics by sampling was investigated. Characteristic functions, the quadratic regression problem, and the differential equations for the characteristic functions are analyzed.

  20. Interpretation of FTIR spectra of polymers and Raman spectra of car paints by means of likelihood ratio approach supported by wavelet transform for reducing data dimensionality.

    PubMed

    Martyna, Agnieszka; Michalska, Aleksandra; Zadora, Grzegorz

    2015-05-01

    The problem of interpreting whether samples share a common provenance was investigated for an infrared spectra database of polypropylene samples from car body parts and plastic containers, as well as Raman spectra databases of blue solid and metallic automotive paints. The research involved statistical tools such as the likelihood ratio (LR) approach for expressing the evidential value of observed similarities and differences in the recorded spectra. Since LR models can easily be proposed for databases described by a few variables, the research focused on reducing the dimensionality of spectra characterised by more than a thousand variables. The objective of the studies was to combine chemometric tools that deal easily with multidimensionality with the LR approach. The final variables used for constructing the LR models were derived from the discrete wavelet transform (DWT) as a data dimensionality reduction technique, supported by methods for variance analysis, and corresponded to chemical information, i.e. typical absorption bands for polypropylene and peaks associated with pigments present in the car paints. Univariate and multivariate LR models were proposed, aiming at obtaining more information about the chemical structure of the samples. Their performance was controlled by estimating the levels of false positive and false negative answers and by using the empirical cross entropy approach. The results for most of the LR models were satisfactory and enabled solving the stated comparison problems. The results show that the variables generated from the DWT preserve the signal's characteristics, being a sparse representation of the original signal that keeps its shape and relevant chemical information. PMID:25757825
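
    A minimal sketch of the dimensionality-reduction step described above (PyWavelets assumed; the spectrum is synthetic): a discrete wavelet transform compresses a thousand-point spectrum into a few dozen approximation coefficients on which simple LR models could then be built.

      # Compress a synthetic spectrum with a DWT and keep the coarse coefficients.
      import numpy as np
      import pywt

      rng = np.random.default_rng(6)
      wavenumber = np.linspace(400, 4000, 1024)
      spectrum = np.exp(-((wavenumber - 2900) / 40) ** 2) + 0.02 * rng.normal(size=1024)

      coeffs = pywt.wavedec(spectrum, 'db4', level=6)
      approx = coeffs[0]                             # coarse representation
      print(len(spectrum), "->", len(approx), "variables")   # 1024 -> ~22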

  1. A Positive Interpretation of Apparent "Cumulative Deficit."

    ERIC Educational Resources Information Center

    Kamin, Leon J.

    1978-01-01

    Suggests an alternate, and optimistic, interpretation of developmental data that has been interpreted as indicating cumulative deficit in IQ among socioeconomically deprived Black children. (Author/SS)

  2. Admixture, Population Structure, and F-Statistics.

    PubMed

    Peter, Benjamin M

    2016-04-01

    Many questions about human genetic history can be addressed by examining the patterns of shared genetic variation between sets of populations. A useful methodological framework for this purpose is F-statistics, which measure shared genetic drift between sets of two, three, and four populations and can be used to test simple and complex hypotheses about admixture between populations. This article provides context from phylogenetic and population genetic theory. I review how F-statistics can be interpreted as branch lengths or paths and derive new interpretations, using coalescent theory. I further show that the admixture tests can be interpreted as testing general properties of phylogenies, allowing extension of some ideas and applications to arbitrary phylogenetic trees. The new results are used to investigate the behavior of the statistics under different models of population structure and show how population substructure complicates inference. The results lead to simplified estimators in many cases, and I recommend replacing F3 with the average number of pairwise differences for estimating population divergence. PMID:26857625
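
    A minimal sketch of one such statistic (numpy only; the allele frequencies are a drift-free toy): f3(C; A, B) = E[(c - a)(c - b)], whose significantly negative values are the classic signal that population C is admixed between A-like and B-like sources.

      # f3 admixture statistic on toy allele frequencies.
      import numpy as np

      rng = np.random.default_rng(7)
      n_snps = 50_000
      a = rng.uniform(0.05, 0.95, n_snps)            # source population A
      b = rng.uniform(0.05, 0.95, n_snps)            # source population B
      c = 0.5 * a + 0.5 * b                          # C as a 50/50 admixture

      f3 = np.mean((c - a) * (c - b))
      print("f3(C; A, B) =", round(f3, 4))           # negative for an admixed C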

  3. Interpreting Sky-Averaged 21-cm Measurements

    NASA Astrophysics Data System (ADS)

    Mirocha, Jordan

    2015-01-01

    Within the first ~billion years after the Big Bang, the intergalactic medium (IGM) underwent a remarkable transformation, from a uniform sea of cold neutral hydrogen gas to a fully ionized, metal-enriched plasma. Three milestones during this epoch of reionization -- the emergence of the first stars, black holes (BHs), and full-fledged galaxies -- are expected to manifest themselves as extrema in sky-averaged ("global") measurements of the redshifted 21-cm background. However, interpreting these measurements will be complicated by the presence of strong foregrounds and non-trivialities in the radiative transfer (RT) modeling required to make robust predictions. I have developed numerical models that efficiently solve the frequency-dependent radiative transfer equation, which has led to two advances in studies of the global 21-cm signal. First, frequency-dependent solutions facilitate studies of how the global 21-cm signal may be used to constrain the detailed spectral properties of the first stars, BHs, and galaxies, rather than just the timing of their formation. And second, the speed of these calculations allows one to search vast expanses of a currently unconstrained parameter space, while simultaneously characterizing the degeneracies between parameters of interest. I find principally that (1) physical properties of the IGM, such as its temperature and ionization state, can be constrained robustly from observations of the global 21-cm signal without invoking models for the astrophysical sources themselves, (2) translating IGM properties to galaxy properties is challenging, in large part due to frequency-dependent effects. For instance, evolution in the characteristic spectrum of accreting BHs can modify the 21-cm absorption signal at levels accessible to first generation instruments, but could easily be confused with evolution in the X-ray luminosity star-formation rate relation. Finally, (3) the independent constraints most likely to aid in the interpretation of global 21-cm signal measurements are detections of Lyman Alpha Emitters at high redshifts and constraints on the midpoint of reionization, both of which are among the primary science objectives of ongoing or near-future experiments.

  4. Water Quality Statistics

    ERIC Educational Resources Information Center

    Hodgson, Ted; Andersen, Lyle; Robison-Cox, Jim; Jones, Clain

    2004-01-01

    Water quality experiments, especially the use of macroinvertebrates as indicators of water quality, offer an ideal context for connecting statistics and science. In the STAR program for secondary students and teachers, water quality experiments were also used as a context for teaching statistics. In this article, we trace one activity that uses…

  5. Application Statistics 1987.

    ERIC Educational Resources Information Center

    Council of Ontario Universities, Toronto.

    Summary statistics on application and registration patterns of applicants wishing to pursue full-time study in first-year places in Ontario universities (for the fall of 1987) are given. Data on registrations were received indirectly from the universities as part of their annual submission of USIS/UAR enrollment data to Statistics Canada and MCU.…

  6. Explorations in Statistics: Regression

    ERIC Educational Resources Information Center

    Curran-Everett, Douglas

    2011-01-01

    Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This seventh installment of "Explorations in Statistics" explores regression, a technique that estimates the nature of the relationship between two things for which we may only surmise a mechanistic or predictive connection.…

  7. Deconstructing Statistical Analysis

    ERIC Educational Resources Information Center

    Snell, Joel

    2014-01-01

    Using a very complex statistical analysis and research method for the sake of enhancing the prestige of an article or making a new product or service legitimate needs to be monitored and questioned for accuracy. 1) The more complicated the statistical analysis and research, the fewer learned readers can understand it. This adds a…

  8. Explorations in Statistics: Regression

    ERIC Educational Resources Information Center

    Curran-Everett, Douglas

    2011-01-01

    Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This seventh installment of "Explorations in Statistics" explores regression, a technique that estimates the nature of the relationship between two things for which we may only surmise a mechanistic or predictive connection.

  9. DISABILITY STATISTICS CENTER

    EPA Science Inventory

    The purpose of the Disability Statistics Center is to produce and disseminate statistical information on disability and the status of people with disabilities in American society and to establish and monitor indicators of how conditions are changing over time to meet their health...

  10. Statistical Mapping by Computer.

    ERIC Educational Resources Information Center

    Utano, Jack J.

    The function of a statistical map is to provide readers with a visual impression of the data so that they may be able to identify any geographic characteristics of the displayed phenomena. The increasingly important role played by the computer in the production of statistical maps is manifested by the varied examples of computer maps in recent…

  11. Overhead Image Statistics

    SciTech Connect

    Vijayaraj, Veeraraghavan; Cheriyadat, Anil M; Bhaduri, Budhendra L; Vatsavai, Raju; Bright, Eddie A

    2008-01-01

    Statistical properties of high-resolution overhead images representing different land use categories are analyzed using various local and global statistical image properties based on the shape of the power spectrum, image gradient distributions, edge co-occurrence, and inter-scale wavelet coefficient distributions. The analysis was performed on a database of high-resolution (1 meter) overhead images representing a multitude of different downtown, suburban, commercial, agricultural and wooded exemplars. Various statistical properties relating to these image categories and their relationships are discussed. The categorical variations in power spectrum contour shapes, the unique gradient distribution characteristics of wooded categories, the similarity in edge co-occurrence statistics for overhead and natural images, and the unique edge co-occurrence statistics of downtown categories are presented in this work. Though previous work on natural image statistics has shown some of the unique characteristics of different categories, the relationships for overhead images are not well understood. The statistical properties of natural images were used in previous studies to develop prior image models, to predict and index objects in a scene and to improve computer vision models. Our research findings can be used to augment and adapt computer vision algorithms that rely on prior image statistics to process overhead images, calibrate the performance of overhead image analysis algorithms, and derive features for better discrimination of overhead image categories.
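
    Two of the image statistics named in this abstract, the shape of the power spectrum and the gradient distribution, are straightforward to compute. The sketch below is a generic illustration (my own, not the authors' code) for a 2-D grayscale array.

      import numpy as np

      def image_statistics(img):
          """Radially averaged power spectrum and gradient histogram of a
          2-D grayscale image (two of the statistics discussed above)."""
          img = np.asarray(img, dtype=float)
          spectrum = np.abs(np.fft.fftshift(np.fft.fft2(img))) ** 2
          ny, nx = img.shape
          y, x = np.indices((ny, nx))
          r = np.hypot(x - nx // 2, y - ny // 2).astype(int)
          # average power in each ring of spatial-frequency magnitude r
          counts = np.bincount(r.ravel())
          radial_power = np.bincount(r.ravel(), spectrum.ravel()) / np.maximum(counts, 1)
          gy, gx = np.gradient(img)
          grad_hist, _ = np.histogram(np.concatenate([gx.ravel(), gy.ravel()]), bins=64)
          return radial_power, grad_hist

      rng = np.random.default_rng(0)
      power, grads = image_statistics(rng.random((128, 128)))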

  12. Explorations in Statistics: Correlation

    ERIC Educational Resources Information Center

    Curran-Everett, Douglas

    2010-01-01

    Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This sixth installment of "Explorations in Statistics" explores correlation, a familiar technique that estimates the magnitude of a straight-line relationship between two variables. Correlation is meaningful only when the…

  13. Explorations in Statistics: Power

    ERIC Educational Resources Information Center

    Curran-Everett, Douglas

    2010-01-01

    Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This fifth installment of "Explorations in Statistics" revisits power, a concept fundamental to the test of a null hypothesis. Power is the probability that we reject the null hypothesis when it is false. Four things affect…

  14. Multidimensional Visual Statistical Learning

    ERIC Educational Resources Information Center

    Turk-Browne, Nicholas B.; Isola, Phillip J.; Scholl, Brian J.; Treat, Teresa A.

    2008-01-01

    Recent studies of visual statistical learning (VSL) have demonstrated that statistical regularities in sequences of visual stimuli can be automatically extracted, even without intent or awareness. Despite much work on this topic, however, several fundamental questions remain about the nature of VSL. In particular, previous experiments have not

  15. Applied Statistics with SPSS

    ERIC Educational Resources Information Center

    Huizingh, Eelko K. R. E.

    2007-01-01

    Accessibly written and easy to use, "Applied Statistics Using SPSS" is an all-in-one self-study guide to SPSS and do-it-yourself guide to statistics. What is unique about Eelko Huizingh's approach is that this book is based around the needs of undergraduate students embarking on their own research project, and its self-help style is designed to…

  16. Water Quality Statistics

    ERIC Educational Resources Information Center

    Hodgson, Ted; Andersen, Lyle; Robison-Cox, Jim; Jones, Clain

    2004-01-01

    Water quality experiments, especially the use of macroinvertebrates as indicators of water quality, offer an ideal context for connecting statistics and science. In the STAR program for secondary students and teachers, water quality experiments were also used as a context for teaching statistics. In this article, we trace one activity that uses

  17. Deconstructing Statistical Analysis

    ERIC Educational Resources Information Center

    Snell, Joel

    2014-01-01

    Using a very complex statistical analysis and research method for the sake of enhancing the prestige of an article or making a new product or service legitimate needs to be monitored and questioned for accuracy. 1) The more complicated the statistical analysis and research, the fewer learned readers can understand it. This adds a

  18. College Students' Interpretation of Research Reports on Group Differences: The Tall-Tale Effect

    ERIC Educational Resources Information Center

    Hogan, Thomas P.; Zaboski, Brian A.; Perry, Tiffany R.

    2015-01-01

    How does the student untrained in advanced statistics interpret results of research that reports a group difference? In two studies, statistically untrained college students were presented with abstracts or professional associations' reports and asked for estimates of scores obtained by the original participants in the studies. These estimates…

  19. Interpretation of fluorescence correlation spectra of biopolymer solutions.

    PubMed

    Phillies, George D J

    2016-05-01

    Fluorescence correlation spectroscopy (FCS) is regularly used to study diffusion in non-dilute "crowded" biopolymer solutions, including the interior of living cells. For fluorophores in dilute solution, the relationship between the FCS spectrum G(t) and the diffusion coefficient D is well-established. However, the dilute-solution relationship between G(t) and D has sometimes been used to interpret FCS spectra of fluorophores in non-dilute solutions. Unfortunately, the relationship used to interpret FCS spectra in dilute solutions relies on an assumption that is not always correct in non-dilute solutions. This paper obtains the correct form for interpreting FCS spectra of non-dilute solutions, writing G(t) in terms of the statistical properties of the fluorophore motions. Approaches for applying this form are discussed. © 2016 Wiley Periodicals, Inc. Biopolymers 105: 260-266, 2016. PMID:26756528
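
    For reference, the well-established dilute-solution relationship alluded to above is, for a 3-D Gaussian observation volume, commonly written as (standard textbook form, not the non-dilute result derived in this paper):

      G(t) = \frac{1}{\langle N \rangle}\left(1 + \frac{t}{\tau_D}\right)^{-1}\left(1 + \frac{t}{\kappa^{2}\tau_D}\right)^{-1/2},
      \qquad \tau_D = \frac{w_{xy}^{2}}{4D},

    where ⟨N⟩ is the mean number of fluorophores in the observation volume, w_xy its lateral radius, and κ its axial-to-lateral aspect ratio. The non-dilute case treated in the paper replaces the underlying assumption of Gaussian fluorophore displacements with the actual statistical properties of the fluorophore motions.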

  20. Statistical controversies in clinical research: statistical significance-too much of a good thing ….

    PubMed

    Buyse, M; Hurvitz, S A; Andre, F; Jiang, Z; Burris, H A; Toi, M; Eiermann, W; Lindsay, M-A; Slamon, D

    2016-05-01

    The use and interpretation of P values is a matter of debate in applied research. We argue that P values are useful as a pragmatic guide to interpret the results of a clinical trial, not as a strict binary boundary that separates real treatment effects from lack thereof. We illustrate our point using the result of BOLERO-1, a randomized, double-blind trial evaluating the efficacy and safety of adding everolimus to trastuzumab and paclitaxel as first-line therapy for HER2+ advanced breast cancer. In this trial, the benefit of everolimus was seen only in the predefined subset of patients with hormone receptor-negative breast cancer at baseline (progression-free survival hazard ratio = 0.66, P = 0.0049). A strict interpretation of this finding, based on complex 'alpha splitting' rules to assess statistical significance, led to the conclusion that the benefit of everolimus was not statistically significant either overall or in the subset. We contend that this interpretation does not do justice to the data, and we argue that the benefit of everolimus in hormone receptor-negative breast cancer is both statistically compelling and clinically relevant. PMID:26861602

  1. Modelling Metamorphism by Abstract Interpretation

    NASA Astrophysics Data System (ADS)

    Dalla Preda, Mila; Giacobazzi, Roberto; Debray, Saumya; Coogan, Kevin; Townsend, Gregg M.

    Metamorphic malware apply semantics-preserving transformations to their own code in order to foil detection systems based on signature matching. In this paper we consider the problem of automatically extracting metamorphic signatures from these malware. We introduce a semantics for self-modifying code, hereafter called phase semantics, and prove its correctness by showing that it is an abstract interpretation of the standard trace semantics. Phase semantics precisely models the metamorphic code behavior by providing a set of traces of programs which correspond to the possible evolutions of the metamorphic code during execution. We show that metamorphic signatures can be automatically extracted by abstract interpretation of the phase semantics, and that regular metamorphism can be modelled as a finite state automata abstraction of the phase semantics.

  2. Pediatric DXA: technique and interpretation

    PubMed Central

    Henwood, Maria J.

    2006-01-01

    This article reviews dual X-ray absorptiometry (DXA) technique and interpretation with emphasis on the considerations unique to pediatrics. Specifically, the use of DXA in children requires the radiologist to be a “clinical pathologist” monitoring the technical aspects of the DXA acquisition, a “statistician” knowledgeable in the concepts of Z-scores and least significant changes, and a “bone specialist” providing the referring clinician a meaningful context for the numeric result generated by DXA. The patient factors that most significantly influence bone mineral density are discussed and are reviewed with respect to available normative databases. The effects the growing skeleton has on the DXA result are also presented. Most important, the need for the radiologist to be actively involved in the technical and interpretive aspects of DXA is stressed. Finally, the diagnosis of osteoporosis should not be made on DXA results alone but should take into account other patient factors. PMID:16715219

  3. Phonological Interpretation into Preordered Algebras

    NASA Astrophysics Data System (ADS)

    Kubota, Yusuke; Pollard, Carl

    We propose a novel architecture for categorial grammar that clarifies the relationship between semantically relevant combinatoric reasoning and semantically inert reasoning that only affects surface-oriented phonological form. To this end, we employ a level of structured phonology that mediates between syntax (abstract combinatorics) and phonology proper (strings). To notate structured phonologies, we employ a lambda calculus analogous to the φ-terms of [8]. However, unlike Oehrle's purely equational φ-calculus, our phonological calculus is inequational, in a way that is strongly analogous to the functional programming language LCF [10]. Like LCF, our phonological terms are interpreted into a Henkin frame of posets, with degree of definedness ('height' in the preorder that interprets the base type) corresponding to degree of pronounceability; only maximal elements are actual strings and therefore fully pronounceable. We illustrate with an analysis (also new) of some complex constituent-order phenomena in Japanese.

  4. Direct interpretation of dreams: neuropsychology.

    PubMed

    van den Daele, L

    1996-09-01

    Although the role and importance of the interpretation of dreams has been de-emphasized in clinical discussions for the past several decades, new models of dream physiology suggest the central role and importance of dreams in the regulation of behavior. According to a body of current research, dreams potentiate new pathways of problem solving. A review of the neurophysiological literature pertinent to direct interpretation suggests dreams are sustained by midbrain anatomical networks with feed-back and feed-forward links to the cortex. The anatomical networks are termed the endogenous-intraorganismic system, the exogenous-transactional system, and the relational system that correspond to subjective, objective, and relational dreams in direct interpretation. Just as ordinary thought is the province of the dominant or left hemisphere, dreams are the province of the nondominant or right hemisphere. During REM states new pathways of problem solving are laid down by the nondominant hemisphere. In the awake state, thought and behavior about content that relates to dream material follow these pathways. The new neuropsychology of dreams reaffirms the central role of dreams in the organization of affect, emotion, intention, and general adaptation. PMID:8886217

  5. 8 CFR 1240.5 - Interpreter.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... PROCEEDINGS TO DETERMINE REMOVABILITY OF ALIENS IN THE UNITED STATES Removal Proceedings § 1240.5 Interpreter. Any person acting as an interpreter in a hearing before an immigration judge under this part shall be sworn to interpret and translate accurately, unless the interpreter is an employee of the United...

  6. 8 CFR 1240.5 - Interpreter.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... PROCEEDINGS TO DETERMINE REMOVABILITY OF ALIENS IN THE UNITED STATES Removal Proceedings § 1240.5 Interpreter. Any person acting as an interpreter in a hearing before an immigration judge under this part shall be sworn to interpret and translate accurately, unless the interpreter is an employee of the United...

  7. 8 CFR 1240.5 - Interpreter.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... PROCEEDINGS TO DETERMINE REMOVABILITY OF ALIENS IN THE UNITED STATES Removal Proceedings § 1240.5 Interpreter. Any person acting as an interpreter in a hearing before an immigration judge under this part shall be sworn to interpret and translate accurately, unless the interpreter is an employee of the United...

  8. 8 CFR 1240.5 - Interpreter.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... PROCEEDINGS TO DETERMINE REMOVABILITY OF ALIENS IN THE UNITED STATES Removal Proceedings § 1240.5 Interpreter. Any person acting as an interpreter in a hearing before an immigration judge under this part shall be sworn to interpret and translate accurately, unless the interpreter is an employee of the United...

  9. 8 CFR 1240.5 - Interpreter.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... PROCEEDINGS TO DETERMINE REMOVABILITY OF ALIENS IN THE UNITED STATES Removal Proceedings § 1240.5 Interpreter. Any person acting as an interpreter in a hearing before an immigration judge under this part shall be sworn to interpret and translate accurately, unless the interpreter is an employee of the United...

  10. 15 CFR 770.2 - Item interpretations.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 15 Commerce and Foreign Trade 2 2012-01-01 2012-01-01 false Item interpretations. 770.2 Section 770.2 Commerce and Foreign Trade Regulations Relating to Commerce and Foreign Trade (Continued) BUREAU OF INDUSTRY AND SECURITY, DEPARTMENT OF COMMERCE EXPORT ADMINISTRATION REGULATIONS INTERPRETATIONS § 770.2 Item interpretations. (a) Interpretation...

  11. University Interpreting: Linguistic Issues for Consideration.

    ERIC Educational Resources Information Center

    Napier, Jemina

    2002-01-01

    A study investigated 10 Auslan/English interpreters' use of translation style when interpreting for a university lecture. Results found the interpreters predominantly used a free or literal interpretation approach, but switched between translation styles at particular points of a text, leading to the suggestion of the concept of translational…

  12. Computer Interpretations of ECGs in Rural Hospitals

    PubMed Central

    Thompson, James M.

    1992-01-01

    Computer-assisted interpretation of electrocardiograms offers theoretical benefits to rural physicians. This study compared computer-assisted interpretations by a rural physician certified to read ECGs with interpretations by the computer alone. The computer interpretation alone could have led to major errors in patient management, but was correct sufficiently often to warrant purchase by small rural hospitals. PMID:21221365

  13. Statistical physics on the light-front

    SciTech Connect

    Raufeisen, J.

    2005-06-14

    The formulation of statistical physics using light-front quantization, instead of conventional equal-time boundary conditions, has important advantages for describing relativistic statistical systems, such as heavy ion collisions. We develop light-front field theory at finite temperature and density with special attention to Quantum Chromodynamics. We construct the most general form of the statistical operator allowed by the Poincare algebra and introduce the chemical potential in a covariant way. In light-front quantization, the Green's functions of a quark in a medium can be defined in terms of just 2-component spinors and do not lead to doublers in the transverse directions. A seminal property of light-front Green's functions is that they are related to parton densities in coordinate space. Namely, the diagonal and off-diagonal parton distributions measured in hard scattering experiments can be interpreted as light-front density matrices.

  14. Statistical methods for environmental pollution monitoring

    SciTech Connect

    Gilbert, R.O.

    1986-01-01

    This volume covers planning, design, and data analysis. It offers statistical methods for designing environmental sampling and monitoring programs as well as analyzing the resulting data. The application of statistical sample survey methods to problems of estimating average and total amounts of environmental pollution is presented in detail. The book also provides a broad array of statistical analysis methods for many purposes...numerous examples...three case studies...end-of-chapter questions...computer codes (showing what output looks like along with its interpretation)...a discussion of Kriging methods for estimating pollution concentration contours over space and/or time...nomographs for determining the number of samples required to detect hot spots with specified confidence...and a description and tables for conducting Rosner's test to identify outlying (usually large) pollution measurements in a data set.

  15. Statistics at a glance.

    PubMed

    Ector, Hugo

    2010-12-01

    I still remember my first book on statistics: "Elementary statistics with applications in medicine and the biological sciences" by Frederick E. Croxton. For me, it has been the start of my pursuit of understanding statistics in daily life and in medical practice. It was the first volume in a long row of books. In his introduction, Croxton claims that "nearly everyone involved in any aspect of medicine needs to have some knowledge of statistics". The reality is that for many clinicians, statistics are limited to a "P < 0.05 = ok". I do not blame my colleagues who omit the paragraph on statistical methods. They have never had the opportunity to learn concise and clear descriptions of the key features. I have experienced how some authors can describe difficult methods in a clearly understandable language. Others fail completely. As a teacher, I tell my students that life is impossible without a basic knowledge of statistics. This feeling has resulted in an annual seminar of 90 minutes. This tutorial is the summary of this seminar. It is a summary and a transcription of the best pages I have detected. PMID:21302664

  16. The Sexual Experiences Survey: interpretation and validity.

    PubMed

    Karabatsos, G

    1997-01-01

    The Sexual Experiences Survey (Koss, Gidycz, & Wisniewski, 1987) is a commonly used instrument for assessing various degrees of sexual aggression and victimization among male offenders and female victims. Rasch analysis was used to transform qualitative raw score observations into objective linear measures using the responses of a national sample of 6,159 higher education men and women across the United States, aged 18-24. This paper supports the construct validity of the survey through evaluation of the item hierarchy, fit statistics, and separation indices. Findings confirm a "dimensional" perspective on rape, suggesting that sexually aggressive behaviors can be scaled along a single continuum from normal to extreme sexual behavior. The item hierarchy reveals an arrangement of sexually aggressive acts in an order of mild to severe, which compares with the one theorized by the authors of the SES. Identity plots demonstrate the validity of using a common set of SES item calibrations to measure both male and female respondents. For interpretation of person responses to the SES, three conclusions are suggested. First, Rasch analysis must be employed to examine item responses effectively. Second, when the survey is administered to a college sample aged 18-24, the item calibrations obtained in this paper can be used to measure offenders and victims. Third, a total raw score-to-measure conversion is not always sufficient to interpret person measures. Instead, a scalogram method needs to be added to the Rasch analysis to separate the measures of offenders and victims who complete the survey. Implications for future research are discussed. PMID:9661726
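
    For context, the dichotomous Rasch model that underlies this kind of scaling expresses the probability that respondent n endorses item i in terms of a person measure \theta_n and an item severity \delta_i on a common linear (logit) scale; this is the generic model, not an SES-specific result:

      P(X_{ni} = 1 \mid \theta_n, \delta_i) = \frac{e^{\theta_n - \delta_i}}{1 + e^{\theta_n - \delta_i}}

    The item calibrations discussed above correspond to the estimated \delta_i and the person measures to the estimated \theta_n.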

  17. "Just Another Statistic"

    PubMed

    Machtay; Glatstein

    1998-01-01

    On returning from a medical meeting, we learned that sadly a patient, "Mr. B.," had passed away. His death was a completely unexpected surprise. He had been doing well nine months after a course of intensive radiotherapy for a locally advanced head and neck cancer; in his most recent follow-up notes, he was described as a "complete remission." Nonetheless, he apparently died peacefully in his sleep from a cardiac arrest one night and was found the next day by a concerned neighbor. In our absence, after Mr. B. expired, his death certificate was filled out by a physician who didn't know him in detail, but did know why he recently was treated in our department. The cause of death was listed as head and neck cancer. It wasn't long after his death before we began to receive those notorious "requests for additional information," letters from the statistical office of a well-known cooperative group. Mr. B., as it turns out, was on a clinical trial, and it was "vital" to know further details of the circumstances of his passing. Perhaps this very large cancer had been controlled and Mr. B. succumbed to old age (helped along by the tobacco industry). On the other hand, maybe the residual "fibrosis" in his neck was actually packed with active tumor and his left carotid artery was finally 100% pinched off, or maybe he suffered a massive pulmonary embolism from cancer-related hypercoagulability. The forms and requests were completed with a succinct "cause of death uncertain," adding, "please have the Study Chairs call to discuss this difficult case." Often clinical reports of outcomes utilize and emphasize the endpoint "disease specific survival" (DSS). Like overall survival (OS), the DSS can be calculated by actuarial methods, with patients who have incomplete follow-up "censored" at the time of last follow-up pending further information. In the DSS, however, deaths unrelated to the index cancer of interest are censored at the time of death; thus, a death from intercurrent disease is considered a "success" (to the investigator, that is; obviously, not to the patient and his or her family). The DSS rate will always be superior to the OS rate. Obviously, for any OS curve, if one waits long enough it will ultimately come to zero. There is thus a very logical rationale for reporting the DSS separately, particularly in diseases where death from intercurrent disease is expected to be common. Analyzing the DSS allows researchers to better compare the biologic efficacy of two or more cancer treatments, since it does not necessarily come to zero. Unlike some other endpoints, including local-regional control or freedom from progression, it takes into account the possibility of salvage therapy. DSS also focuses on an endpoint of interest to the public-death from cancer. In a recent popular media survey in which people were asked how they would choose to die if they could, 0% selected cancer. However, there are two serious potential problems with heavy dependence on the DSS. First, since patients who die from intercurrent disease are considered "cured," it seriously inflates the apparent effectiveness of a cancer treatment. Given the same biologic disease and the same treatment, the DSS as calculated in an old, sick population at high risk of intercurrent death will be better than the DSS in a younger, healthier population whose major risk is from their cancer. This problem has been discussed with respect to early stage prostate cancer, in which the conservative approach of observation has been criticized. 
The studies at issue rely heavily on the DSS, suggesting a comparable DSS (90% at 10 years) with "watchful waiting" to other researchers' results with aggressive therapy. The problem is that these series of conservative management focus on a patient population (as opposed to individuals) with a high risk of competing causes of mortality, which is very different from the population of patients generally treated with aggressive therapy (in which some have shown overall survivals superior to age-matched controls). It is fallacious and illogical to compare nonrandomized series of observation to those of aggressive therapy. In addition to the above problem, the use of DSS introduces another potential issue which we will call the bias of cause-of-death-interpretation. All statistical endpoints (e.g., response rates, local-regional control, freedom from brain metastases), except OS, are known to depend heavily on the methods used to define the endpoint and are often subject to significant interobserver variability. There is no reason to believe that this problem does not occasionally occur with respect to defining a death as due to the index cancer or to intercurrent disease, even though this issue has been poorly studied. In many oncologic situations-for example, metastatic lung cancer-this form of bias does not exist. In some situations, such as head and neck cancer, this could be an intermediate problem (Was that lethal chest tumor a second primary or a metastasis? Would the fatal aspiration pneumonia have occurred if he still had a tongue? And what about Mr. B. described above?). In some situations, particularly relatively "good prognosis" neoplasms, this could be a substantial problem, particularly if the adjudication of whether or not a death is cancer-related is performed solely by researchers who have an "interest" in demonstrating a good DSS. What we are most concerned about with this form of bias relates to recent series on observation, such as in early prostate cancer. It is interesting to note that although only 10% of the "observed" patients die from prostate cancer, many develop distant metastases by 10 years (approximately 40% among patients with intermediate grade tumors). Thus, it is implied that many prostate cancer metastases are usually not of themselves lethal, which is a misconception to anyone experienced in taking care of prostate cancer patients. This is inconsistent with U.S. studies of metastatic prostate cancer in which the median survival is two to three years. It is possible that many deaths attributed to intercurrent disease in "watchful waiting" series were in fact prostate cancer-related, perhaps related to failure to thrive, urosepsis, or pulmonary emboli. We will not know without an independent review of the medical records of individual patients; in some cases, even the most detailed review, sometimes even an autopsy, will not be conclusive. There are only a few data available describing the problems created by cause-of-death-interpretation bias. One small study, presented only in abstract form, assessed the cause of death in 50 randomly selected prostate cancer patients who died. Five experts in prostate cancer were asked to assign the cause of death as due to or not due to prostate cancer. The DSS varied from 21% to 35% among the five reviewers, a relative difference of 66%.
Studies of autopsies, which are now rarely done in the U.S., have shown that fatal malignant tumors were occasionally missed by clinicians and-even more sobering-an occasional patient thought to have died from metastatic cancer is found to have no tumor but to have died from a "benign" cause such as TB. One study suggested an error rate of approximately 8%. Clearly the use of DSS is here to stay and is a useful adjunct to OS in analyzing randomized trials. There needs to be more research on the validity and interobserver reproducibility of the DSS. In the meantime, researchers should not report DSS without reporting OS and the reasons for intercurrent deaths should be described-peer reviewers should enforce this. As with so many other problems with statistics in the medical literature, it is the job of the reader to remain skeptical. The rate of intercurrent deaths in a study should reflect the age and demographics of the study population. If the DSS is far superior to the OS, the population being studied may be unusually sick (and thus unrealistic), or there may be a bias in classifying the causes of death. Similarly, if the DSS and OS are identical (unless a highly virulent malignancy is being studied), it may suggest the researchers have only included an unusually healthy (and thus unrealistic) patient population. Finally, we would also be a bit suspicious of a sizeable series that did not have any deaths that were considered of "uncertain" cause, unless the researchers specifically included them as being due to the cancer. We honestly think that everybody has a few patients like Mr. B. PMID:10388105
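
    The distinction drawn here between disease-specific survival (DSS) and overall survival (OS) comes down to how deaths are coded before a Kaplan-Meier estimate is formed: for OS every death is an event, while for DSS deaths from intercurrent disease are censored. The sketch below, with invented toy data, is only meant to make that coding step concrete; it is not taken from any trial or from this article.

      import numpy as np

      def kaplan_meier(times, events):
          """Kaplan-Meier survival estimate.
          times  : follow-up time for each patient
          events : 1 if the endpoint occurred at that time, 0 if censored
          """
          times = np.asarray(times, dtype=float)
          events = np.asarray(events, dtype=int)
          order = np.argsort(times)
          times, events = times[order], events[order]
          surv, curve = 1.0, []
          n_at_risk = len(times)
          for t in np.unique(times):
              here = times == t
              d = events[here].sum()              # endpoint events at time t
              if d > 0:
                  surv *= 1.0 - d / n_at_risk
                  curve.append((t, surv))
              n_at_risk -= here.sum()             # drop both events and censorings
          return curve

      # toy cohort: months of follow-up and cause of death (None = alive at last follow-up)
      follow_up = [12, 20, 26, 33, 40, 55, 60, 72]
      cause = ["cancer", None, "cardiac", "cancer", None, "cardiac", "cancer", None]

      os_events = [0 if c is None else 1 for c in cause]        # any death is an event
      dss_events = [1 if c == "cancer" else 0 for c in cause]   # intercurrent deaths censored
      print("OS :", kaplan_meier(follow_up, os_events))
      print("DSS:", kaplan_meier(follow_up, dss_events))

    With the same follow-up times, the DSS curve sits above the OS curve purely because the two cardiac deaths are treated as censorings rather than events.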

  18. Trichlorophenyl formate: highly reactive and easily accessible crystalline CO surrogate for palladium-catalyzed carbonylation of aryl/alkenyl halides and triflates.

    PubMed

    Ueda, Tsuyoshi; Konishi, Hideyuki; Manabe, Kei

    2012-10-19

    The high utility of 2,4,6-trichlorophenyl formate, a highly reactive and easily accessible crystalline CO surrogate, is demonstrated. The decarbonylation with NEt(3) to generate CO proceeded rapidly at rt, thereby allowing external-CO-free Pd-catalyzed carbonylation of aryl/alkenyl halides and triflates. The high reactivity of the CO surrogate enabled carbonylation at rt and significantly reduced the quantities of formate to near-stoichiometric levels. The obtained trichlorophenyl esters can be readily converted to a variety of carboxylic acid derivatives in high yields. PMID:23020164

  19. R.A. Fisher's contributions to genetical statistics.

    PubMed

    Thompson, E A

    1990-12-01

    R. A. Fisher (1890-1962) was a professor of genetics, and many of his statistical innovations found expression in the development of methodology in statistical genetics. However, whereas his contributions in mathematical statistics are easily identified, in population genetics he shares his preeminence with Sewall Wright (1889-1988) and J. B. S. Haldane (1892-1965). This paper traces some of Fisher's major contributions to the foundations of statistical genetics, and his interactions with Wright and with Haldane which contributed to the development of the subject. With modern technology, both statistical methodology and genetic data are changing. Nonetheless much of Fisher's work remains relevant, and may even serve as a foundation for future research in the statistical analysis of DNA data. For Fisher's work reflects his view of the role of statistics in scientific inference, expressed in 1949: There is no wide or urgent demand for people who will define methods of proof in set theory in the name of improving mathematical statistics. There is a widespread and urgent demand for mathematicians who understand that branch of mathematics known as theoretical statistics, but who are capable also of recognising situations in the real world to which such mathematics is applicable. In recognising features of the real world to which his models and analyses should be applicable, Fisher laid a lasting foundation for statistical inference in genetic analyses. PMID:2085639

  20. Hemophilia Data and Statistics

    MedlinePlus

    ... at a very young age. Based on CDC data, the median age at diagnosis is 36 months ...

  1. Data and Statistics

    MedlinePlus

    ... with sickle cell disease (SCD) by matching up data from studies that monitor all people with SCD ...

  2. Cooperative Learning in Statistics.

    ERIC Educational Resources Information Center

    Keeler, Carolyn M.; And Others

    1994-01-01

    Formal use of cooperative learning techniques proved effective in improving student performance and retention in a freshman level statistics course. Lectures interspersed with group activities proved effective in increasing conceptual understanding and overall class performance. (11 references) (Author)

  3. Mental Illness Statistics

    MedlinePlus

    ... The National Institute of Mental Health (NIMH) is part of the National Institutes of ...

  4. Titanic: A Statistical Exploration.

    ERIC Educational Resources Information Center

    Takis, Sandra L.

    1999-01-01

    Uses the available data about the Titanic's passengers to interest students in exploring categorical data and the chi-square distribution. Describes activities incorporated into a statistics class and gives additional resources for collecting information about the Titanic. (ASK)

  5. Uterine Cancer Statistics

    MedlinePlus

    ... Working Group. United States Cancer Statistics: 1999–2012 Incidence and Mortality Web-based Report. Atlanta (GA): Department of Health and Human Services, Centers for Disease Control and Prevention, and National Cancer Institute; 2015. ...

  6. Teaching Statistics with Minitab.

    ERIC Educational Resources Information Center

    Hubbard, Ruth

    1992-01-01

    Discusses the use of the computer software MINITAB in teaching statistics to explore concepts, simulate games of chance, transform the normal variable into a z-score, and stimulate small and large group discussions. (MDH)

  7. Understanding Solar Flare Statistics

    NASA Astrophysics Data System (ADS)

    Wheatland, M. S.

    2005-12-01

    A review is presented of work aimed at understanding solar flare statistics, with emphasis on the well known flare power-law size distribution. Although avalanche models are perhaps the favoured model to describe flare statistics, their physical basis is unclear, and they are divorced from developing ideas in large-scale reconnection theory. An alternative model, aimed at reconciling large-scale reconnection models with solar flare statistics, is revisited. The solar flare waiting-time distribution has also attracted recent attention. Observed waiting-time distributions are described, together with what they might tell us about the flare phenomenon. Finally, a practical application of flare statistics to flare prediction is described in detail, including the results of a year of automated (web-based) predictions from the method.
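
    As an aside to the power-law size distribution mentioned above, the index of such a distribution is usually estimated by maximum likelihood rather than by fitting a histogram slope. The sketch below is a generic illustration (my own, not from the review) and assumes a pure power law above a known threshold x_min.

      import numpy as np

      def power_law_index(sizes, x_min):
          """Maximum-likelihood estimate of alpha for p(x) ~ x**(-alpha), x >= x_min."""
          x = np.asarray(sizes, dtype=float)
          x = x[x >= x_min]
          return 1.0 + len(x) / np.sum(np.log(x / x_min))

      # synthetic "flare sizes" drawn from a power law with alpha = 1.8
      rng = np.random.default_rng(1)
      samples = (1.0 - rng.random(10_000)) ** (-1.0 / (1.8 - 1.0))  # inverse-CDF sampling, x_min = 1
      print(power_law_index(samples, x_min=1.0))                    # close to 1.8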

  8. Purposeful Statistical Investigations

    ERIC Educational Resources Information Center

    Day, Lorraine

    2014-01-01

    Lorraine Day provides us with a great range of statistical investigations using various resources such as maths300 and TinkerPlots. Each of the investigations link mathematics to students' lives and provide engaging and meaningful contexts for mathematical inquiry.

  9. Interpretation of a compositional time series

    NASA Astrophysics Data System (ADS)

    Tolosana-Delgado, R.; van den Boogaart, K. G.

    2012-04-01

    Common methods for multivariate time series analysis use linear operations, from the definition of a time-lagged covariance/correlation to the prediction of new outcomes. However, when the time series response is a composition (a vector of positive components showing the relative importance of a set of parts in a total, like percentages and proportions), then linear operations are afflicted by several problems. For instance, it has long been recognised that (auto/cross-)correlations between raw percentages are spurious, more dependent on which other components are being considered than on any natural link between the components of interest. Also, a long-term forecast of a composition in models with a linear trend will ultimately predict negative components. In general terms, compositional data should not be treated on a raw scale, but after a log-ratio transformation (Aitchison, 1986: The statistical analysis of compositional data. Chapman and Hall). This is so because the information conveyed by compositional data is relative, as stated in their definition. The principle of working in coordinates allows any sort of multivariate analysis to be applied to a log-ratio transformed composition, as long as this transformation is invertible. This principle applies fully to time series analysis. We will discuss how results (both auto/cross-correlation functions and predictions) can be back-transformed, viewed and interpreted in a meaningful way. One view is to use the exhaustive set of all possible pairwise log-ratios, which allows the results to be expressed as D(D - 1)/2 separate, interpretable one-dimensional models, one for each pairwise log-ratio. Another view is the interpretation of estimated coefficients or correlations back-transformed in terms of compositions. These two views are compatible and complementary. These issues are illustrated with time series of seasonal precipitation patterns at different rain gauges of the USA. In this data set, the proportion of annual precipitation falling in winter, spring, summer and autumn is considered a 4-component time series. Three invertible log-ratios are defined for calculations, balancing rainfall in autumn vs. winter, in summer vs. spring, and in autumn-winter vs. spring-summer. Results suggest a 2-year correlation range, and a certain oscillatory behaviour in the last balance, which does not occur in the other two.
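
    A minimal sketch of the pairwise log-ratio view described above, using made-up seasonal precipitation shares rather than the actual rain-gauge data; for a D-part composition it produces the D(D - 1)/2 one-dimensional series that can then be analysed with ordinary time series methods.

      import numpy as np
      from itertools import combinations

      def pairwise_logratios(comp, names):
          """All D(D-1)/2 pairwise log-ratio series of a compositional time series.
          comp  : array of shape (T, D); each row is a composition
          names : list of the D part names
          """
          comp = np.asarray(comp, dtype=float)
          return {f"{names[i]}/{names[j]}": np.log(comp[:, i] / comp[:, j])
                  for i, j in combinations(range(comp.shape[1]), 2)}

      # toy seasonal precipitation shares (winter, spring, summer, autumn)
      seasons = ["win", "spr", "sum", "aut"]
      shares = np.array([[0.35, 0.30, 0.15, 0.20],
                         [0.40, 0.25, 0.10, 0.25],
                         [0.30, 0.30, 0.20, 0.20]])
      for pair, series in pairwise_logratios(shares, seasons).items():
          print(pair, np.round(series, 2))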

  10. Presenting the statistical results.

    PubMed

    Ng, K H; Peh, W C G

    2009-01-01

    Statistical methods are reported in a scientific paper to summarise the data that has been collected for a study and to enable its analysis. These methods should be described with enough detail to allow a knowledgeable reader who has access to the original data to verify the reported results. This article provides basic guidelines to aid authors in reporting the statistical aspects of the results of their studies clearly and accurately. PMID:19224078

  11. A new statistical tool for NOAA local climate studies

    NASA Astrophysics Data System (ADS)

    Timofeyeva, M. M.; Meyers, J. C.; Hollingshead, A.

    2011-12-01

    The National Weather Service (NWS) Local Climate Analysis Tool (LCAT) is evolving out of a need to support and enhance the ability of National Oceanic and Atmospheric Administration (NOAA) NWS field offices to efficiently access, manipulate, and interpret local climate data and characterize climate variability and change impacts. LCAT will enable NOAA's staff to conduct regional and local climate studies using state-of-the-art station and reanalysis gridded data and various statistical techniques for climate analysis. The analysis results will be used for climate services to guide local decision makers in weather and climate sensitive actions and to deliver information to the general public. LCAT will augment current climate reference materials with information pertinent to the local and regional levels as they apply to diverse variables appropriate to each locality. The main emphasis of LCAT is to enable studies of extreme meteorological and hydrological events such as tornadoes, floods, droughts, severe storms, etc. LCAT will close a very critical gap in NWS local climate services because it will allow addressing climate variables beyond average temperature and total precipitation. NWS external partners and government agencies will benefit from the LCAT outputs, which can be easily incorporated into their own analysis and/or delivery systems. Presently, we have identified five requirements for local climate studies: (1) Local impacts of climate change; (2) Local impacts of climate variability; (3) Drought studies; (4) Attribution of severe meteorological and hydrological events; and (5) Climate studies for water resources. The methodologies for the first three requirements will be included in the first-phase implementation of LCAT. The local rate of climate change is defined as the slope of the mean trend estimated from an ensemble of three trend techniques: (1) hinge, (2) Optimal Climate Normals (running mean for optimal time periods), (3) exponentially-weighted moving average. Root mean squared error is used to determine which trend fits the observations with the least error. The studies of climate variability impacts on local extremes use composite techniques applied to various definitions of local variables: from specified percentiles to critical thresholds. Drought studies combine visual capabilities of Google maps with statistical estimates of drought severity indices. The process of development will be linked to local office interactions with users to ensure the tool will meet their needs as well as provide adequate training. A rigorous internal and tiered peer-review process will be implemented to ensure that the studies are scientifically sound before they are published and submitted to the local studies catalog (database) and, eventually, to external sources such as the Climate Portal.
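
    To make the trend-ensemble idea concrete, the sketch below fits two of the three named techniques (a running mean standing in for Optimal Climate Normals, and an exponentially weighted moving average) to a synthetic temperature series and selects the one with the smallest root mean squared error. It is a stand-in under my own assumptions, not LCAT code, and the hinge technique is omitted.

      import numpy as np

      def running_mean(y, window):
          """Centred running mean with edge padding (stand-in for a fixed-period climate normal)."""
          pad = window // 2
          kernel = np.ones(window) / window
          return np.convolve(np.pad(y, pad, mode="edge"), kernel, mode="valid")[: len(y)]

      def ewma(y, alpha):
          """Exponentially weighted moving average."""
          out = np.empty(len(y))
          out[0] = y[0]
          for t in range(1, len(y)):
              out[t] = alpha * y[t] + (1.0 - alpha) * out[t - 1]
          return out

      def rmse(y, fit):
          return float(np.sqrt(np.mean((y - fit) ** 2)))

      # synthetic annual temperature anomalies with a weak warming trend
      rng = np.random.default_rng(2)
      years = np.arange(1950, 2011)
      temps = 0.02 * (years - 1950) + rng.normal(0.0, 0.3, len(years))

      fits = {"running mean (11 yr)": running_mean(temps, 11), "EWMA (alpha=0.2)": ewma(temps, 0.2)}
      best = min(fits, key=lambda k: rmse(temps, fits[k]))
      print({k: round(rmse(temps, v), 3) for k, v in fits.items()}, "-> best fit:", best)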

  12. Transportation Statistics Annual Report 1997

    SciTech Connect

    Fenn, M.

    1997-01-01

    This document is the fourth Transportation Statistics Annual Report (TSAR) prepared by the Bureau of Transportation Statistics (BTS) for the President and Congress. As in previous years, it reports on the state of U.S. transportation system at two levels. First, in Part I, it provides a statistical and interpretive survey of the system—its physical characteristics, its economic attributes, aspects of its use and performance, and the scale and severity of unintended consequences of transportation, such as fatalities and injuries, oil import dependency, and environment impacts. Part I also explores the state of transportation statistics, and new needs of the rapidly changing world of transportation. Second, Part II of the report, as in prior years, explores in detail the performance of the U.S. transportation system from the perspective of desired social outcomes or strategic goals. This year, the performance aspect of transportation chosen for thematic treatment is “Mobility and Access,” which complements past TSAR theme sections on “The Economic Performance of Transportation” (1995) and “Transportation and the Environment” (1996). Mobility and access are at the heart of the transportation system’s performance from the user’s perspective. In what ways and to what extent does the geographic freedom provided by transportation enhance personal fulfillment of the nation’s residents and contribute to economic advancement of people and businesses? This broad question underlies many of the topics examined in Part II: What is the current level of personal mobility in the United States, and how does it vary by sex, age, income level, urban or rural location, and over time? What factors explain variations? Has transportation helped improve people’s access to work, shopping, recreational facilities, and medical services, and in what ways and in what locations? How have barriers, such as age, disabilities, or lack of an automobile, affected these accessibility patterns? How are commodity flows and transportation services responding to global competition, deregulation, economic restructuring, and new information technologies? How do U.S. patterns of personal mobility and freight movement compare with other advanced industrialized countries, formerly centrally planned economies, and major newly industrializing countries? Finally, how is the rapid adoption of new information technologies influencing the patterns of transportation demand and the supply of new transportation services? Indeed, how are information technologies affecting the nature and organization of transportation services used by individuals and firms?

  13. Genetics in geographically structured populations: defining, estimating and interpreting FST

    PubMed Central

    Holsinger, Kent E.; Weir, Bruce S.

    2015-01-01

    Wright’s F-statistics, and especially FST, provide important insights into the evolutionary processes that influence the structure of genetic variation within and among populations, and they are among the most widely used descriptive statistics in population and evolutionary genetics. Estimates of FST can identify regions of the genome that have been the target of selection, and comparisons of FST from different parts of the genome can provide insights into the demographic history of populations. For these reasons and others, FST has a central role in population and evolutionary genetics and has wide applications in fields that range from disease association mapping to forensic science. This Review clarifies how FST is defined, how it should be estimated, how it is related to similar statistics and how estimates of FST should be interpreted. PMID:19687804
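
    As a bare-bones illustration of the definition, the simplest heterozygosity-based estimator of FST for a single biallelic locus is (HT - HS)/HT. The sketch below is mine and ignores the sample-size corrections (e.g. Weir-Cockerham) that the Review explains are needed in practice.

      import numpy as np

      def fst_single_locus(allele_freqs):
          """(H_T - H_S) / H_T for one biallelic locus, given the frequency of
          one allele in each subpopulation (equal population weights assumed)."""
          p = np.asarray(allele_freqs, dtype=float)
          h_s = np.mean(2.0 * p * (1.0 - p))   # mean within-population heterozygosity
          p_bar = p.mean()
          h_t = 2.0 * p_bar * (1.0 - p_bar)    # heterozygosity of the pooled population
          return (h_t - h_s) / h_t

      print(fst_single_locus([0.2, 0.8]))      # strongly differentiated pair -> 0.36
      print(fst_single_locus([0.45, 0.55]))    # weakly differentiated pair -> 0.01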

  14. Statistical Physics of Fields

    NASA Astrophysics Data System (ADS)

    Kardar, Mehran

    2006-06-01

    While many scientists are familiar with fractals, fewer are familiar with the concepts of scale-invariance and universality which underlie the ubiquity of their shapes. These properties may emerge from the collective behaviour of simple fundamental constituents, and are studied using statistical field theories. Based on lectures for a course in statistical mechanics taught by Professor Kardar at Massachusetts Institute of Technology, this textbook demonstrates how such theories are formulated and studied. Perturbation theory, exact solutions, renormalization groups, and other tools are employed to demonstrate the emergence of scale invariance and universality, and the non-equilibrium dynamics of interfaces and directed paths in random media are discussed. Ideal for advanced graduate courses in statistical physics, it contains an integrated set of problems, with solutions to selected problems at the end of the book. A complete set of solutions is available to lecturers on a password protected website at www.cambridge.org/9780521873413. Based on lecture notes from a course on Statistical Mechanics taught by the author at MIT; contains 65 exercises, with solutions to selected problems; features a thorough introduction to the methods of Statistical Field theory; ideal for graduate courses in Statistical Physics.

  15. Interpreting Arterial Blood Gases Successfully.

    PubMed

    Larkin, Brenda G; Zimmanck, Robert J

    2015-10-01

    Arterial blood gas (ABG) analysis is a crucial skill for perioperative nurses, in particular the RN circulator. This article provides the physiological basis for assessing ABGs perioperatively and presents a systematic approach to blood gas analysis using the Romanski method. Blood gas sample data allow the reader to practice ABG interpretation. In addition, four case studies are presented that give the reader the opportunity to analyze ABGs within the context of surgical patient scenarios. The ability to accurately assess ABGs allows the perioperative nurse to assist surgical team members in restoring a patient's acid-base balance. PMID:26411819

  16. Catalytic Nonoxidation Dehydrogenation of Ethane Over Fe-Ni Catalysts Supported on Mg (Al)O to Produce Hydrogen and Easily Purified Carbon Nanotubes

    SciTech Connect

    Shen,W.; Wang, Y.; Shi, X.; Shah, N.; Huggins, F.; Bollineni, S.; Seehra, M.; Huffman, G.

    2007-01-01

    Nonoxidative decomposition of ethane was conducted over monometallic Ni and bimetallic Fe-Ni catalysts on basic Mg(Al)O support to produce H2 free of CO and CO2 and easily purified carbon nanotubes, a potentially valuable byproduct. The Mg(Al)O support was prepared by calcination of synthetic MgAl-hydrotalcite with a Mg to Al ratio of 5. The catalysts were prepared by incipient wetness with total metal loadings of 5 wt %. The dehydrogenation of undiluted ethane was conducted at temperatures of 500, 650, and 700 C. At 500 C, the Ni/Mg(Al)O catalyst was highly active and very stable with 100% conversion of ethane to 20 vol % H2 and 80 vol % CH4. However, the bimetallic Fe-Ni/Mg(Al)O exhibited its best performance at 650 C, yielding 65 vol % H2, 10 vol % CH4, and 25 vol % unreacted ethane. The product carbon was in the form of carbon nanotubes (CNT) at all three reaction temperatures, but the morphology of the CNT depended on both the catalyst composition and reaction temperature. The CNTs were formed by a tip-growth mechanism over the Mg(Al)O supported catalysts and were easily purified by a one-step dilute nitric acid treatment. Mossbauer spectroscopy, X-ray absorption fine structure spectroscopy, N2 adsorption-desorption isotherms, TEM, STEM, TGA, and XRD were used to characterize the catalysts and the CNT, revealing the catalytic mechanisms.

  17. Surveys Assessing Students' Attitudes toward Statistics: A Systematic Review of Validity and Reliability

    ERIC Educational Resources Information Center

    Nolan, Meaghan M.; Beran, Tanya; Hecker, Kent G.

    2012-01-01

    Students with positive attitudes toward statistics are likely to show strong academic performance in statistics courses. Multiple surveys measuring students' attitudes toward statistics exist; however, a comparison of the validity and reliability of interpretations based on their scores is needed. A systematic review of relevant electronic…

  18. Surveys Assessing Students' Attitudes toward Statistics: A Systematic Review of Validity and Reliability

    ERIC Educational Resources Information Center

    Nolan, Meaghan M.; Beran, Tanya; Hecker, Kent G.

    2012-01-01

    Students with positive attitudes toward statistics are likely to show strong academic performance in statistics courses. Multiple surveys measuring students' attitudes toward statistics exist; however, a comparison of the validity and reliability of interpretations based on their scores is needed. A systematic review of relevant electronic

  19. The Power of Statistical Tests for Moderators in Meta-Analysis

    ERIC Educational Resources Information Center

    Hedges, Larry V.; Pigott, Therese D.

    2004-01-01

    Calculation of the statistical power of statistical tests is important in planning and interpreting the results of research studies, including meta-analyses. It is particularly important in moderator analyses in meta-analysis, which are often used as sensitivity analyses to rule out moderator effects but also may have low statistical power. This…
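
    A common way to make such power calculations concrete is the normal approximation: the power of a two-sided test of a moderator contrast depends only on the contrast divided by its standard error and on the significance level. The sketch below is a generic illustration of that calculation (my own numbers), not code from Hedges and Pigott.

      from scipy.stats import norm

      def power_two_sided_z(effect, se, alpha=0.05):
          """Approximate power of a two-sided z-test for an effect (e.g. a
          between-group difference in mean effect sizes) with standard error se."""
          z_crit = norm.ppf(1.0 - alpha / 2.0)
          lam = effect / se                      # standardized (noncentrality) parameter
          return norm.cdf(lam - z_crit) + norm.cdf(-lam - z_crit)

      # a moderator difference of 0.3 estimated with SE 0.15 has only modest power
      print(round(power_two_sided_z(0.3, 0.15), 2))   # about 0.52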

  20. The interpretation of selection coefficients.

    PubMed

    Barton, N H; Servedio, M R

    2015-05-01

    Evolutionary biologists have an array of powerful theoretical techniques that can accurately predict changes in the genetic composition of populations. Changes in gene frequencies and genetic associations between loci can be tracked as they respond to a wide variety of evolutionary forces. However, it is often less clear how to decompose these various forces into components that accurately reflect the underlying biology. Here, we present several issues that arise in the definition and interpretation of selection and selection coefficients, focusing on insights gained through the examination of selection coefficients in multilocus notation. Using this notation, we discuss how its flexibility-which allows different biological units to be identified as targets of selection-is reflected in the interpretation of the coefficients that the notation generates. In many situations, it can be difficult to agree on whether loci can be considered to be under "direct" versus "indirect" selection, or to quantify this selection. We present arguments for what the terms direct and indirect selection might best encompass, considering a range of issues, from viability and sexual selection to kin selection. We show how multilocus notation can discriminate between direct and indirect selection, and describe when it can do so. PMID:25790030

  1. Conflicting Interpretations of Scientific Pedagogy

    NASA Astrophysics Data System (ADS)

    Galamba, Arthur

    2016-05-01

    Not surprisingly historical studies have suggested that there is a distance between concepts of teaching methods, their interpretations and their actual use in the classroom. This issue, however, is not always pitched to the personal level in historical studies, which may provide an alternative insight on how teachers conceptualise and engage with concepts of teaching methods. This article provides a case study on this level of conceptualisation by telling the story of Rómulo de Carvalho, an educator from mid-twentieth century Portugal, who for over 40 years engaged with the heuristic and Socratic methods. The overall argument is that concepts of teaching methods are open to different interpretations and are conceptualised within the melting pot of external social pressures and personal teaching preferences. The practice and thoughts of Carvalho about teaching methods are scrutinised to unveil his conflicting stances: Carvalho was a man able to question the tenets of heurism, but who publicly praised the heurism-like "discovery learning" method years later. The first part of the article contextualises the arrival of heurism in Portugal and how Carvalho attacked its philosophical tenets. In the second part, it dwells on his conflicting positions in relation to pupil-centred approaches. The article concludes with an appreciation of the embedded conflicting nature of the appropriation of concepts of teaching methods, and of Carvalho's contribution to the development of the philosophy of practical work in school science.

  2. Conflicting Interpretations of Scientific Pedagogy

    NASA Astrophysics Data System (ADS)

    Galamba, Arthur

    2016-03-01

    Not surprisingly historical studies have suggested that there is a distance between concepts of teaching methods, their interpretations and their actual use in the classroom. This issue, however, is not always pitched to the personal level in historical studies, which may provide an alternative insight on how teachers conceptualise and engage with concepts of teaching methods. This article provides a case study on this level of conceptualisation by telling the story of Rómulo de Carvalho, an educator from mid-twentieth century Portugal, who for over 40 years engaged with the heuristic and Socratic methods. The overall argument is that concepts of teaching methods are open to different interpretations and are conceptualised within the melting pot of external social pressures and personal teaching preferences. The practice and thoughts of Carvalho about teaching methods are scrutinised to unveil his conflicting stances: Carvalho was a man able to question the tenets of heurism, but who publicly praised the heurism-like "discovery learning" method years later. The first part of the article contextualises the arrival of heurism in Portugal and how Carvalho attacked its philosophical tenets. In the second part, it dwells on his conflicting positions in relation to pupil-centred approaches. The article concludes with an appreciation of the embedded conflicting nature of the appropriation of concepts of teaching methods, and of Carvalho's contribution to the development of the philosophy of practical work in school science.

  3. Directionality Effects in Simultaneous Language Interpreting: The Case of Sign Language Interpreters in the Netherlands

    ERIC Educational Resources Information Center

    van Dijk, Rick; Boers, Eveline; Christoffels, Ingrid; Hermans, Daan

    2011-01-01

    The quality of interpretations produced by sign language interpreters was investigated. Twenty-five experienced interpreters were instructed to interpret narratives from (a) spoken Dutch to Sign Language of the Netherlands (SLN), (b) spoken Dutch to Sign Supported Dutch (SSD), and (c) SLN to spoken Dutch. The quality of the interpreted narratives…

  4. How modern techniques improve seismic interpretation

    SciTech Connect

Risch, D.L.; Chowdhury, A.N.; Hannan, A.E.; Jamieson, G.A.

    1994-04-01

Reflection seismology was first applied to hydrocarbon exploration in the 1920s and today is an integral part of the oil and gas business. As technology evolves, more information is derived from seismic data and used in many interpretation modes. Interpreter demands are also becoming greater as more data of higher quality become available for incorporation into integrated interpretations. Part 1 of this article describes the planning involved for seismic surveys, along with current methods and equipment used for acquisition, processing, display, and interpretation of seismic data. Major points covered include: interpretation objectives; seismic acquisition, processing, and display; and structural interpretation organization and procedure.

  5. Biostratinomic utility of Archimedes in environmental interpretation

    SciTech Connect

Wulff, J.I.

    1990-04-01

Biostratinomic information from the bryozoan Archimedes can be used to infer paleocurrent senses when other, more traditional sedimentary structures are lacking. As with other elongate particles, Archimedes zoaria become oriented in the current and, upon settling, preserve a sense of the flow direction. Orientations and lengths were measured on over 200 individuals from bedding plane exposures in the Upper Mississippian Union Limestone (Greenbrier Group) of West Virginia. These were separated into long and short populations and plotted on rose diagrams. The results show that long and short segments become preferentially oriented in the current and that the bimodally distributed long segments can be used to infer the current sense. The current sense is defined by the line that bisects the obtuse angle created by the two maxima in the rose diagram for long segments. Statistical evaluation of the long and short populations indicates that they are significant at the 99.9 percent level. Elongate fossils such as Archimedes can be used in paleocurrent evaluations and can add more detail to the interpretation of paleodepositional conditions.
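
    A minimal sketch of the geometric construction described above (my own illustration, not the author's code): given the two modal azimuths of the long-segment rose diagram, treated as axial data on 0-180 degrees, the paleocurrent line is taken as the bisector of the obtuse angle between them. The example azimuths are invented.

    # Hedged sketch: infer a paleocurrent line from two modal azimuths of elongate
    # fossils (axial data, azimuths folded into 0-180 degrees).
    import math

    def obtuse_angle_bisector(axis1_deg, axis2_deg):
        """Return the azimuth (0-180 deg) bisecting the obtuse angle between two axes."""
        # Doubled-angle trick for axial data: the vector mean of 2*theta gives the
        # acute-angle bisector; the obtuse-angle bisector is perpendicular to it.
        x = math.cos(math.radians(2 * axis1_deg)) + math.cos(math.radians(2 * axis2_deg))
        y = math.sin(math.radians(2 * axis1_deg)) + math.sin(math.radians(2 * axis2_deg))
        acute_bisector = (math.degrees(math.atan2(y, x)) / 2) % 180
        return (acute_bisector + 90) % 180

    # Two hypothetical maxima from the long-segment rose diagram:
    print(obtuse_angle_bisector(40.0, 150.0))   # paleocurrent line, degrees east of north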

  6. Statistical Physics of Fracture

    SciTech Connect

    Alava, Mikko; Nukala, Phani K; Zapperi, Stefano

    2006-05-01

Disorder and long-range interactions are two of the key components that make material failure an interesting playfield for the application of statistical mechanics. The cornerstone in this respect has been lattice models of fracture, in which a network of elastic beams, bonds, or electrical fuses with random failure thresholds is subject to an increasing external load. These models describe on a qualitative level the failure processes of real, brittle, or quasi-brittle materials. This has been particularly important in solving the classical engineering problems of material strength: the size dependence of maximum stress and its sample-to-sample statistical fluctuations. At the same time, lattice models pose many new fundamental questions in statistical physics, such as the relation between fracture and phase transitions. Experimental results point to the existence of an intriguing crackling noise in the acoustic emission and of self-affine fractals in the crack surface morphology. Recent advances in computer power have enabled considerable progress in the understanding of such models. Among these partly still controversial issues are the scaling and size effects in material strength and accumulated damage, the statistics of avalanches or bursts of microfailures, and the morphology of the crack surface. Here we present an overview of the results obtained with lattice models for fracture, highlighting the relations with statistical physics theories and more conventional fracture mechanics approaches.
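
    A toy numerical sketch in the spirit of such lattice models (a global-load-sharing fiber bundle with random thresholds, chosen here for brevity rather than the fuse networks discussed above): each sample's strength is the peak stress the bundle supports, and repeating the experiment exposes the sample-to-sample fluctuations mentioned in the abstract. All parameter choices are illustrative.

    # Minimal global-load-sharing fiber bundle: N fibers with random failure
    # thresholds are loaded quasi-statically; we record the peak supported stress.
    import numpy as np

    def fiber_bundle_strength(n_fibers=10_000, rng=np.random.default_rng(0)):
        thresholds = np.sort(rng.uniform(0.0, 1.0, n_fibers))     # random failure thresholds
        k = np.arange(n_fibers)
        # When the k weakest fibers have failed, the surviving (n - k) fibers share the
        # load, so the stress per fiber the bundle can hold is threshold[k] * (n - k) / n.
        stress = thresholds * (n_fibers - k) / n_fibers
        return stress.max()                                        # sample strength

    strengths = [fiber_bundle_strength(rng=np.random.default_rng(seed)) for seed in range(20)]
    print(f"mean strength {np.mean(strengths):.3f}, std {np.std(strengths):.4f}")
    # For uniform thresholds the large-N strength tends to 1/4, with shrinking fluctuations.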

  7. Direct interpretation of dreams: typology.

    PubMed

    van den Daele, L

    1992-12-01

The dream typology assorts dreams into three major categories: dreams whose origin is endogenous, exogenous, or relational. Dreams of the first type arise from somatic needs, feelings, and states that accompany organismic adjustments to system requirements. Dreams of the second type are initiated by kinetic and dispositional tendencies toward engagement and exploration of the outer world. And dreams of the third type derive from interpersonal dispositions to interaction and relationship with other people. Within each category, dreams may occur at different levels of complexity. The dream typology permits the integration of psychoanalytic observations about dreams from a variety of perspectives within a common framework. Freud's view that a dream is a wish fulfillment finds its primary niche in endogenous need, wish fulfillment, and convenience dreams. Kohut's observations about self-state dreams and inner regulation (1971, 1977) are accommodated to the middle range of endogenous dreams, and Jung's individuation dreams (1930) occupy the advanced range. Similarly, Bonime's interpersonal approach to dream interpretation (1962) is encompassed by relational dreams of the middle level. In addition, types and modes of dreams that are only infrequently encountered in clinical psychoanalysis are accommodated. The dream typology suggests that different psychoanalytic theories are like the position papers that might have derived from the fabled committee of learned blind who were commissioned to determine the appearance of an elephant. Each individual got hold of some part, but could not see the whole; so for each, the part became the whole. The psychoanalytic theorist is in an exactly analogous position because, in fact, he is blind to the extent of the unconscious and is constrained to what he can infer. What he can infer depends on cohort, client population, and how he calibrates his observations. The result has been Procrustean interpretation, dissension, and a remarkable stasis in the psychoanalytic theory of the unconscious. The theory of the unconscious that arises from the method of direct interpretation reflects a differentiated inner world with variegated landscapes of images and frameworks. The derivatives of the unconscious are determined by complex decision rules, symbol systems, and syntax. Images and dreams possess a primary autonomy from the conscious mind and arise through the configural mind, which serves the construction and synthesis of experience and knowledge. The derivatives emerge out of common human nature conjoined with concrete human experience. For this reason, dreams and images appear universal. (ABSTRACT TRUNCATED AT 400 WORDS) PMID:1489016

  8. Ukrainian-Speaking Migrants' Attitudes Concerning the Use of Interpreters in Healthcare Service: A Pilot Study.

    PubMed

    Hadziabdic, Emina

    2016-01-01

The aim of this pilot study was to investigate Ukrainian-speaking migrants' attitudes to the use of interpreters in healthcare services, in order to test a developed questionnaire and recruitment strategy. A descriptive survey was conducted with 12 Ukrainian-speaking migrants using a 51-item structured self-administered questionnaire, and the responses were analyzed with descriptive statistics. Respondents wanted an interpreter to serve as an objective communication and practical aid, with personal qualities such as a good knowledge of languages and translation ability. In contrast, the clothes worn by the interpreter and the interpreter's religion were not viewed as important. The findings support the developed questionnaire and recruitment strategy, which in turn can be used in a larger planned investigation of the same topic, in order to arrange good interpretation situations in accordance with individuals' wishes irrespective of differences in countries' healthcare policies regarding interpretation. PMID:27014391

  9. Comparison of a Novel Computerized Analysis Program and Visual Interpretation of Cardiotocography

    PubMed Central

    Chen, Chen-Yu; Yu, Chun; Chang, Chia-Chen; Lin, Chii-Wann

    2014-01-01

    Objective To compare a novel computerized analysis program with visual cardiotocography (CTG) interpretation results. Methods Sixty-two intrapartum CTG tracings with 20- to 30-minute sections were independently interpreted using a novel computerized analysis program, as well as the visual interpretations of eight obstetricians, to evaluate the baseline fetal heart rate (FHR), baseline FHR variability, number of accelerations, number/type of decelerations, uterine contraction (UC) frequency, and the National Institute of Child Health and Human Development (NICHD) 3-Tier FHR classification system. Results There was no significant difference in interobserver variation after adding the components of computerized analysis to results from the obstetricians' visual interpretations, with excellent agreement for the baseline FHR (ICC 0.91), the number of accelerations (ICC 0.85), UC frequency (ICC 0.97), and NICHD category I (kappa statistic 0.91); good agreement for baseline variability (kappa statistic 0.68), the numbers of early decelerations (ICC 0.78) and late decelerations (ICC 0.67), category II (kappa statistic 0.78), and overall categories (kappa statistic 0.80); and moderate agreement for the number of variable decelerations (ICC 0.60), and category III (kappa statistic 0.50). Conclusions This computerized analysis program is not inferior to visual interpretation, may improve interobserver variations, and could play a vital role in prenatal telemedicine. PMID:25437442
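
    For readers unfamiliar with the agreement statistics quoted above, the sketch below computes Cohen's kappa for two raters assigning NICHD-style categories; the ratings are invented and this is not the study's analysis code (the paper also reports intraclass correlation coefficients, which are computed differently).

    # Illustrative sketch: Cohen's kappa for two raters assigning categorical labels.
    from collections import Counter

    def cohens_kappa(rater_a, rater_b):
        n = len(rater_a)
        observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
        pa, pb = Counter(rater_a), Counter(rater_b)
        expected = sum(pa[c] * pb[c] for c in set(pa) | set(pb)) / n**2
        return (observed - expected) / (1 - expected)

    # Hypothetical NICHD category assignments by two raters for eight tracings:
    a = ["I", "I", "II", "II", "III", "I", "II", "I"]
    b = ["I", "II", "II", "II", "III", "I", "II", "I"]
    print(f"kappa = {cohens_kappa(a, b):.2f}")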

  10. Candidate Assembly Statistical Evaluation

    Energy Science and Technology Software Center (ESTSC)

    1998-07-15

The Savannah River Site (SRS) receives aluminum clad spent Material Test Reactor (MTR) fuel from all over the world for storage and eventual reprocessing. There are hundreds of different kinds of MTR fuels and these fuels will continue to be received at SRS for approximately ten more years. SRS's current criticality evaluation methodology requires the modeling of all MTR fuels utilizing Monte Carlo codes, which is extremely time consuming and resource intensive. Now that a significant number of MTR calculations have been conducted it is feasible to consider building statistical models that will provide reasonable estimations of MTR behavior. These statistical models can be incorporated into a standardized model homogenization spreadsheet package to provide analysts with a means of performing routine MTR fuel analyses with a minimal commitment of time and resources. This became the purpose for development of the Candidate Assembly Statistical Evaluation (CASE) program at SRS.

  11. Candidate Assembly Statistical Evaluation

    SciTech Connect

    Cude, B. W.

    1998-07-15

The Savannah River Site (SRS) receives aluminum clad spent Material Test Reactor (MTR) fuel from all over the world for storage and eventual reprocessing. There are hundreds of different kinds of MTR fuels and these fuels will continue to be received at SRS for approximately ten more years. SRS's current criticality evaluation methodology requires the modeling of all MTR fuels utilizing Monte Carlo codes, which is extremely time consuming and resource intensive. Now that a significant number of MTR calculations have been conducted it is feasible to consider building statistical models that will provide reasonable estimations of MTR behavior. These statistical models can be incorporated into a standardized model homogenization spreadsheet package to provide analysts with a means of performing routine MTR fuel analyses with a minimal commitment of time and resources. This became the purpose for development of the Candidate Assembly Statistical Evaluation (CASE) program at SRS.

  12. Statistical electron densities

    SciTech Connect

    Pipek, J.; Varga, I.

    1996-12-31

It is known that in numerous interesting systems one-electron states appear with multifractal internal structure. Physical intuition suggests, however, that electron densities should be smooth both at atomic distances and close to the macroscopic limit. Multifractal behavior is expected at intermediate length scales, with observable non-trivial statistical properties in clusters of considerable, but far from macroscopic, size. We have demonstrated that differences of generalized Renyi entropies serve as relevant quantities for the global characterization of the statistical nature of such electron densities. Asymptotic expansion formulas are elaborated for these values as functions of the length scale of observation. The transition from deterministic electron densities to statistical ones across various length scales of resolution is traced both theoretically and by numerical calculations.
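
    A rough numerical sketch of the quantities involved (my own, with an invented toy density rather than a physical electron density): generalized Renyi entropies S_q = ln(sum_i p_i^q)/(1-q) of a normalized density sampled on a grid, and the difference S_1 - S_2 often used as a compact structural measure.

    # Illustrative sketch: Renyi entropies of a normalized density on a lattice.
    import numpy as np

    def renyi_entropy(p, q):
        p = p[p > 0]
        if np.isclose(q, 1.0):
            return -np.sum(p * np.log(p))            # Shannon (q -> 1) limit
        return np.log(np.sum(p ** q)) / (1.0 - q)

    # Toy density: an exponentially localized state on a 1D lattice of 256 sites.
    x = np.arange(256)
    density = np.exp(-np.abs(x - 128) / 10.0)
    density /= density.sum()
    print("S1 - S2 =", renyi_entropy(density, 1) - renyi_entropy(density, 2))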

  13. GSE statistics without spin

    NASA Astrophysics Data System (ADS)

    Joyner, Christopher H.; Müller, Sebastian; Sieber, Martin

    2014-09-01

    Energy level statistics following the Gaussian Symplectic Ensemble (GSE) of Random Matrix Theory have been predicted theoretically and observed numerically in numerous quantum chaotic systems. However, in all these systems there has been one unifying feature: the combination of half-integer spin and time-reversal invariance. Here we provide an alternative mechanism for obtaining GSE statistics that is derived from geometric symmetries of a quantum system which alleviates the need for spin. As an example, we construct a quantum graph with a discrete symmetry given by the quaternion group Q8 and observe GSE statistics within one of its subspectra. We then show how to isolate this subspectrum and construct a quantum graph with a scalar valued wave function and a pure GSE spectrum.
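
    As a purely illustrative check of what GSE statistics look like numerically (not code from the paper), one can sample quaternion self-dual random matrices, drop the Kramers degeneracy, and compute the mean consecutive-spacing ratio, a spectral statistic that requires no unfolding; the matrix size and ensemble size below are arbitrary.

    # Sketch: sample GSE matrices, remove the twofold Kramers degeneracy, and
    # compute the mean ratio of consecutive level spacings.
    import numpy as np

    def gse_matrix(n, rng):
        """Quaternion self-dual Hermitian matrix of size 2n (doubly degenerate spectrum)."""
        a = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
        b = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
        A = (a + a.conj().T) / 2                       # Hermitian block
        B = (b - b.T) / 2                              # complex antisymmetric block
        return np.block([[A, B], [-B.conj(), A.conj()]])

    def mean_spacing_ratio(eigs):
        s = np.diff(np.sort(eigs))
        r = np.minimum(s[1:], s[:-1]) / np.maximum(s[1:], s[:-1])
        return r.mean()

    rng = np.random.default_rng(0)
    ratios = []
    for _ in range(50):
        ev = np.linalg.eigvalsh(gse_matrix(50, rng))[::2]   # keep one of each Kramers pair
        ratios.append(mean_spacing_ratio(ev))
    print(f"<r> ~ {np.mean(ratios):.3f}  (an uncorrelated Poisson spectrum gives 2 ln 2 - 1 ~ 0.386)")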

  14. Suite versus composite statistics

    USGS Publications Warehouse

    Balsillie, J.H.; Tanner, W.F.

    1999-01-01

Suite and composite methodologies, two statistically valid approaches for producing statistical descriptive measures, are investigated for sample groups representing a probability distribution where, in addition, each sample is itself a probability distribution. Suite and composite means (first moment measures) are always equivalent. Composite standard deviations (second moment measures) are always larger than suite standard deviations. Suite and composite values for higher moment measures have more complex relationships. Very seldom, however, are they equivalent, and they normally yield statistically significant but different results. Multiple samples are preferable to single samples (including composites) because they permit the investigator to examine sample-to-sample variability. These and other relationships for suite and composite probability distribution analyses are investigated and reported using granulometric data.
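
    The distinction can be made concrete with a small sketch (synthetic data, not the granulometric data of the study): suite statistics average the per-sample moments, while composite statistics pool all observations into a single distribution before computing moments.

    # Suite versus composite moments for a group of equal-sized samples.
    import numpy as np

    samples = [np.random.default_rng(s).normal(loc=mu, scale=1.0, size=200)
               for s, mu in enumerate([2.0, 2.5, 3.5])]

    suite_mean = np.mean([s.mean() for s in samples])          # average of per-sample means
    suite_std = np.mean([s.std(ddof=1) for s in samples])      # average of per-sample std devs

    pooled = np.concatenate(samples)                            # composite: pool everything
    composite_mean = pooled.mean()
    composite_std = pooled.std(ddof=1)

    print(f"suite mean {suite_mean:.3f} vs composite mean {composite_mean:.3f}")
    print(f"suite std  {suite_std:.3f} vs composite std  {composite_std:.3f}")
    # With equal-sized samples the two means coincide, but the composite standard
    # deviation also absorbs the between-sample spread, so it is the larger of the two.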

  15. Computer-interpretable Guideline Formalisms

    PubMed Central

    CLERCQ, Paul DE; KAISER, Katharina; HASMAN, Arie

    2010-01-01

Implementing Computer-Interpretable Guidelines (CIGs) in active computer-based decision support systems promises to improve the acceptance and application of guidelines in daily practice. The model and underlying language are the core characteristics of every CIG approach. However, currently no standard model or language has been accepted by the CIG community. The aim of this chapter is to provide an overview of well-known approaches and to formulate a set of (minimal) requirements that can be used in the process of developing new CIG approaches or improving existing ones. It presents five CIG approaches (the Arden Syntax, GLIF, PROforma, Asbru and EON), followed by a general discussion of the strong points of each approach as well as their implications for future research. PMID:18806319

  16. Proverb interpretation changes in aging.

    PubMed

    Uekermann, Jennifer; Thoma, Patrizia; Daum, Irene

    2008-06-01

    Recent investigations have emphasized the involvement of fronto-subcortical networks to proverb comprehension. Although the prefrontal cortex is thought to be affected by normal aging, relatively little work has been carried out to investigate potential effects of aging on proverb comprehension. In the present investigation participants in three age groups were assessed on a proverb comprehension task and a range of executive function tasks. The older group showed impairment in selecting correct interpretations from alternatives. They also showed executive function deficits, as reflected by reduced working memory and deficient set shifting and inhibition abilities. The findings of the present investigation showed proverb comprehension deficits in normal aging which appeared to be related to reduced executive skills. PMID:18164527

  17. Interpreting neurodynamics: concepts and facts

    PubMed Central

    Rotter, Stefan

    2008-01-01

    The dynamics of neuronal systems, briefly neurodynamics, has developed into an attractive and influential research branch within neuroscience. In this paper, we discuss a number of conceptual issues in neurodynamics that are important for an appropriate interpretation and evaluation of its results. We demonstrate their relevance for selected topics of theoretical and empirical work. In particular, we refer to the notions of determinacy and stochasticity in neurodynamics across levels of microscopic, mesoscopic and macroscopic descriptions. The issue of correlations between neural, mental and behavioral states is also addressed in some detail. We propose an informed discussion of conceptual foundations with respect to neurobiological results as a viable step to a fruitful future philosophy of neuroscience. PMID:19003452

  18. Interpretation of rapidly rotating pulsars

    SciTech Connect

Weber, F. (Inst. fuer Theoretische Physik); Glendenning, N.K.

    1992-08-05

    The minimum possible rotational period of pulsars, which are interpreted as rotating neutron stars, is determined by applying a representative collection of realistic nuclear equations of state. It is found that none of the selected equations of state allows for neutron star rotation at periods below 0.8--0.9 ms. Thus, this work strongly supports the suggestion that if pulsars with shorter rotational periods were found, these are likely to be strange-quark-matter stars. The conclusion that the confined hadronic phase of nucleons and nuclei is only metastable would then be almost inescapable, and the plausible ground-state in that event is the deconfined phase of (3-flavor) strange-quark-matter.
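
    For orientation only (this is a back-of-envelope estimate, not a calculation from the paper), the Newtonian mass-shedding limit P = 2*pi*sqrt(R^3/(G*M)) sets the scale of the sub-millisecond periods discussed above; realistic equations of state and general relativity push the actual limit up toward the 0.8-0.9 ms quoted in the abstract.

    # Rough Newtonian Kepler (mass-shedding) period for a star of mass M and radius R.
    import math

    G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
    M_SUN = 1.989e30       # solar mass, kg

    def kepler_period_ms(mass_solar, radius_km):
        r = radius_km * 1e3
        return 2 * math.pi * math.sqrt(r**3 / (G * mass_solar * M_SUN)) * 1e3

    print(f"{kepler_period_ms(1.4, 10.0):.2f} ms")   # a canonical 1.4 solar-mass, 10 km star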

  19. MS1, MS2, and SQT-three unified, compact, and easily parsed file formats for the storage of shotgun proteomic spectra and identifications.

    PubMed

    McDonald, W Hayes; Tabb, David L; Sadygov, Rovshan G; MacCoss, Michael J; Venable, John; Graumann, Johannes; Johnson, Jeff R; Cociorva, Daniel; Yates, John R

    2004-01-01

As the speed with which proteomic labs generate data increases along with the scale of projects they are undertaking, the resulting data storage and data processing problems will continue to challenge computational resources. This is especially true for shotgun proteomic techniques that can generate tens of thousands of spectra per instrument each day. One design factor leading to many of these problems is the practice of storing spectra and the database identifications for a given spectrum as individual files. While these problems can be addressed by storing all of the spectra and search results in large relational databases, the infrastructure to implement such a strategy can be beyond the means of academic labs. We report here a series of unified text file formats for storing spectral data (MS1 and MS2) and search results (SQT) that are compact, easily parsed by both machines and humans, and yet flexible enough to be coupled with new algorithms and data-mining strategies. PMID:15317041
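
    A hedged sketch of how such a line-oriented spectrum file might be read (the field layout below is simplified and assumed for illustration; consult the published format description for the authoritative definition of the MS1/MS2/SQT records):

    # Simplified reader for an MS2-like text file: header lines, per-scan "S" records,
    # optional per-scan metadata lines, and "m/z intensity" data pairs.
    def parse_ms2_like(path):
        scans, current = [], None
        with open(path) as fh:
            for line in fh:
                line = line.strip()
                if not line or line.startswith("H"):      # header / comment lines
                    continue
                if line.startswith("S"):                  # start of a new scan record
                    current = {"scan": line.split()[1], "peaks": []}
                    scans.append(current)
                elif line[0].isalpha():                   # other per-scan metadata (e.g. charge)
                    current.setdefault("meta", []).append(line)
                else:                                     # "m/z intensity" data pairs
                    mz, intensity = map(float, line.split()[:2])
                    current["peaks"].append((mz, intensity))
        return scans

    # Usage: scans = parse_ms2_like("example.ms2"); print(len(scans), "spectra read")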

  20. QUALITATIVE INTERPRETATION OF GALAXY SPECTRA

    SciTech Connect

Sanchez Almeida, J.; Morales-Luis, A. B.; Terlevich, R.; Terlevich, E.; Cid Fernandes, R.

    2012-09-10

    We describe a simple step-by-step guide to qualitative interpretation of galaxy spectra. Rather than an alternative to existing automated tools, it is put forward as an instrument for quick-look analysis and for gaining physical insight when interpreting the outputs provided by automated tools. Though the recipe is for general application, it was developed for understanding the nature of the Automatic Spectroscopic K-means-based (ASK) template spectra. They resulted from the classification of all the galaxy spectra in the Sloan Digital Sky Survey data release 7, thus being a comprehensive representation of the galaxy spectra in the local universe. Using the recipe, we give a description of the properties of the gas and the stars that characterize the ASK classes, from those corresponding to passively evolving galaxies, to H II galaxies undergoing a galaxy-wide starburst. The qualitative analysis is found to be in excellent agreement with quantitative analyses of the same spectra. We compare the mean ages of the stellar populations with those inferred using the code STARLIGHT. We also examine the estimated gas-phase metallicity with the metallicities obtained using electron-temperature-based methods. A number of byproducts follow from the analysis. There is a tight correlation between the age of the stellar population and the metallicity of the gas, which is stronger than the correlations between galaxy mass and stellar age, and galaxy mass and gas metallicity. The galaxy spectra are known to follow a one-dimensional sequence, and we identify the luminosity-weighted mean stellar age as the affine parameter that describes the sequence. All ASK classes happen to have a significant fraction of old stars, although spectrum-wise they are outshined by the youngest populations. Old stars are metal-rich or metal-poor depending on whether they reside in passive galaxies or in star-forming galaxies.

  1. The wetland continuum: a conceptual framework for interpreting biological studies

    USGS Publications Warehouse

    Euliss, N.H., Jr.; LaBaugh, J.W.; Fredrickson, L.H.; Mushet, D.M.; Swanson, G.A.; Winter, T.C.; Rosenberry, D.O.; Nelson, R.D.

    2004-01-01

We describe a conceptual model, the wetland continuum, which allows wetland managers, scientists, and ecologists to consider simultaneously the influence of climate and hydrologic setting on wetland biological communities. Although multidimensional, the wetland continuum is most easily represented as a two-dimensional gradient, with ground water and atmospheric water constituting the horizontal and vertical axes, respectively. By locating the position of a wetland on both axes of the continuum, the potential biological expression of the wetland can be predicted at any point in time. The model provides a framework useful in the organization and interpretation of biological data from wetlands by incorporating the dynamic changes these systems undergo as a result of normal climatic variation rather than placing them into static categories common to many wetland classification systems. While we developed this model from the literature available for depressional wetlands in the prairie pothole region of North America, we believe the concept has application to wetlands in many other geographic locations.

  2. Interpretation and knowledge of human rights in mental health practice.

    PubMed

    Dickens, Geoffrey; Sugarman, Philip

    The Human Rights Act is sometimes misunderstood as being an obstruction to the provision of safe and effective mental health care, allowing patients to cry 'human rights abuse' too easily. In reality, however, little is known about how human rights are protected and promoted in psychiatric care. This article provides an overview, for nurses, of how human rights are currently understood to be protected in mental health care and steps that could improve the protection of rights. Additionally, an overview of the relevant case law is presented to enable nurses to understand how human rights law is ever-evolving, how cases may be interpreted, and the implications that this has for mental health nursing practice. PMID:18563009

  3. Glaciation of northwestern Wyoming interpreted from ERTS-1

    NASA Technical Reports Server (NTRS)

    Breckenridge, R. M.

    1973-01-01

Analysis of ERTS imagery has shown that a number of alpine glacial features can be recognized and mapped successfully. Although the Wyoming mountains are generally regarded as the type locality for Rocky Mountain glaciation, some areas have not been studied from a glacial standpoint because of inaccessibility or lack of topographic control. ERTS imagery provides an excellent base for this type of regional geomorphic study. A map of the maximum extent of Wisconsin ice, flow directions, and major glacial features was compiled from interpretation of the ERTS imagery. Features which can be mapped are large moraines, outwash fans, and terraces. Present-day glaciers and snowfields are easily discriminated and mapped. Glaciers and glacial deposits which serve as aquifers play a significant role in the hydrologic cycle and are important because of the increasing demand placed on our water resources. ERTS provides a quick and effective method for change detection and inventory of these vital resources.

  4. The Interpreted Executive: Theory, Models, and Implications.

    ERIC Educational Resources Information Center

    Sussman, Lyle; Johnson, Denise M.

    1993-01-01

    Analyzes the interpreter's role in international business ventures. Presents descriptive models of the role, highlights major implications executives should consider before hiring an interpreter, and poses research questions based on these implications. (SR)

  5. Weighted order statistic classifiers with large rank-order margin.

    SciTech Connect

    Porter, R. B.; Hush, D. R.; Theiler, J. P.; Gokhale, M.

    2003-01-01

We describe how Stack Filters and Weighted Order Statistic function classes can be used for classification problems. This leads to a new design criterion for linear classifiers when inputs are binary-valued and weights are positive. We present a rank-based measure of margin that can be directly optimized as a standard linear program and investigate its effect on generalization error experimentally. Our approach can robustly combine large numbers of base hypotheses and easily implement known priors through regularization.
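
    A minimal sketch of the underlying idea (my own illustration, not the paper's implementation): a weighted order statistic replicates each binary input according to its positive integer weight and returns the r-th largest replicated value, which for 0/1 inputs is equivalent to a weighted threshold test.

    # Weighted order statistic on binary inputs and its threshold-function equivalent.
    import numpy as np

    def wos_classify(x, weights, rank):
        """x: 0/1 feature vector; weights: positive integers; rank: order statistic index."""
        replicated = np.repeat(x, weights)                 # duplicate each input weight[i] times
        return int(np.sort(replicated)[::-1][rank - 1])    # r-th largest replicated value

    def wos_as_threshold(x, weights, rank):
        """Equivalent form for binary inputs: output 1 iff the weighted vote count >= rank."""
        return int(np.dot(x, weights) >= rank)

    x = np.array([1, 0, 1, 1, 0])
    w = np.array([3, 1, 2, 1, 2])
    for r in (2, 4, 7):
        assert wos_classify(x, w, r) == wos_as_threshold(x, w, r)
        print(r, wos_classify(x, w, r))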

  6. Statistical model with a standard Gamma distribution

    NASA Astrophysics Data System (ADS)

    Chakraborti, Anirban; Patriarca, Marco

    2005-03-01

We study a statistical model consisting of N basic units which interact with each other by exchanging a physical entity, according to a given microscopic random law, depending on a parameter λ. We focus on the equilibrium or stationary distribution of the entity exchanged and verify through numerical fitting of the simulation data that the final form of the equilibrium distribution is that of a standard Gamma distribution. The model can be interpreted as a simple closed economy in which economic agents trade money and a saving criterion is fixed by the saving propensity λ. Alternatively, from the nature of the equilibrium distribution, we show that the model can also be interpreted as a perfect gas at an effective temperature T(λ), where particles exchange energy in a space with an effective dimension D(λ).
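
    A brief simulation sketch of this class of exchange models (a minimal implementation of my own, with illustrative choices of N, number of exchanges, and λ): pairs of agents pool the non-saved fraction of their money and split it at random, and the stationary distribution is then summarized by fitting the Gamma shape parameter from the sample moments.

    # Kinetic exchange model with a fixed saving propensity lambda; total money is conserved.
    import numpy as np

    def simulate(n_agents=1000, n_steps=200_000, lam=0.5, seed=1):
        rng = np.random.default_rng(seed)
        money = np.ones(n_agents)                      # everyone starts with one unit
        for _ in range(n_steps):
            i, j = rng.integers(n_agents, size=2)
            if i == j:
                continue
            pot = (1 - lam) * (money[i] + money[j])    # non-saved wealth put on the table
            eps = rng.random()
            money[i] = lam * money[i] + eps * pot
            money[j] = lam * money[j] + (1 - eps) * pot
        return money

    m = simulate()
    # For a Gamma distribution with unit mean, the shape parameter equals 1 / variance.
    print(f"mean {m.mean():.3f}, variance {m.var():.3f}, fitted Gamma shape {1.0 / m.var():.2f}")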

  7. Statistical model with a standard Γ distribution

    NASA Astrophysics Data System (ADS)

    Patriarca, Marco; Chakraborti, Anirban; Kaski, Kimmo

    2004-07-01

We study a statistical model consisting of N basic units which interact with each other by exchanging a physical entity, according to a given microscopic random law, depending on a parameter λ. We focus on the equilibrium or stationary distribution of the entity exchanged and verify through numerical fitting of the simulation data that the final form of the equilibrium distribution is that of a standard Gamma distribution. The model can be interpreted as a simple closed economy in which economic agents trade money and a saving criterion is fixed by the saving propensity λ. Alternatively, from the nature of the equilibrium distribution, we show that the model can also be interpreted as a perfect gas at an effective temperature T(λ), where particles exchange energy in a space with an effective dimension D(λ).

  8. FIR statistics of paired galaxies

    NASA Astrophysics Data System (ADS)

    Sulentic, Jack W.

    1990-11-01

    Much progress has been made in understanding the effects of interaction on galaxies (see reviews in this volume by Heckman and Kennicutt). Evidence for enhanced emission from galaxies in pairs first emerged in the radio (Sulentic 1976) and optical (Larson and Tinsley 1978) domains. Results in the far infrared (FIR) lagged behind until the advent of the Infrared Astronomy Satellite (IRAS). The last five years have seen numerous FIR studies of optical and IR selected samples of interacting galaxies (e.g., Cutri and McAlary 1985; Joseph and Wright 1985; Kennicutt et al. 1987; Haynes and Herter 1988). Despite all of this work, there are still contradictory ideas about the level and, even, the reality of an FIR enhancement in interacting galaxies. Much of the confusion originates in differences between the galaxy samples that were studied (i.e., optical morphology and redshift coverage). Here, the authors report on a study of the FIR detection properties for a large sample of interacting galaxies and a matching control sample. They focus on the distance independent detection fraction (DF) statistics of the sample. The results prove useful in interpreting the previously published work. A clarification of the phenomenology provides valuable clues about the physics of the FIR enhancement in galaxies.

  9. FIR statistics of paired galaxies

    NASA Technical Reports Server (NTRS)

    Sulentic, Jack W.

    1990-01-01

    Much progress has been made in understanding the effects of interaction on galaxies (see reviews in this volume by Heckman and Kennicutt). Evidence for enhanced emission from galaxies in pairs first emerged in the radio (Sulentic 1976) and optical (Larson and Tinsley 1978) domains. Results in the far infrared (FIR) lagged behind until the advent of the Infrared Astronomy Satellite (IRAS). The last five years have seen numerous FIR studies of optical and IR selected samples of interacting galaxies (e.g., Cutri and McAlary 1985; Joseph and Wright 1985; Kennicutt et al. 1987; Haynes and Herter 1988). Despite all of this work, there are still contradictory ideas about the level and, even, the reality of an FIR enhancement in interacting galaxies. Much of the confusion originates in differences between the galaxy samples that were studied (i.e., optical morphology and redshift coverage). Here, the authors report on a study of the FIR detection properties for a large sample of interacting galaxies and a matching control sample. They focus on the distance independent detection fraction (DF) statistics of the sample. The results prove useful in interpreting the previously published work. A clarification of the phenomenology provides valuable clues about the physics of the FIR enhancement in galaxies.

  10. Proteny: discovering and visualizing statistically significant syntenic clusters at the proteome level

    PubMed Central

    Gehrmann, Thies; Reinders, Marcel J.T.

    2015-01-01

    Background: With more and more genomes being sequenced, detecting synteny between genomes becomes more and more important. However, for microorganisms the genomic divergence quickly becomes large, resulting in different codon usage and shuffling of gene order and gene elements such as exons. Results: We present Proteny, a methodology to detect synteny between diverged genomes. It operates on the amino acid sequence level to be insensitive to codon usage adaptations and clusters groups of exons disregarding order to handle diversity in genomic ordering between genomes. Furthermore, Proteny assigns significance levels to the syntenic clusters such that they can be selected on statistical grounds. Finally, Proteny provides novel ways to visualize results at different scales, facilitating the exploration and interpretation of syntenic regions. We test the performance of Proteny on a standard ground truth dataset, and we illustrate the use of Proteny on two closely related genomes (two different strains of Aspergillus niger) and on two distant genomes (two species of Basidiomycota). In comparison to other tools, we find that Proteny finds clusters with more true homologies in fewer clusters that contain more genes, i.e. Proteny is able to identify a more consistent synteny. Further, we show how genome rearrangements, assembly errors, gene duplications and the conservation of specific genes can be easily studied with Proteny. Availability and implementation: Proteny is freely available at the Delft Bioinformatics Lab website http://bioinformatics.tudelft.nl/dbl/software. Contact: t.gehrmann@tudelft.nl Supplementary information: Supplementary data are available at Bioinformatics online. PMID:26116928

  11. Two Interpretations of the Discrimination Parameter

    ERIC Educational Resources Information Center

    Tuerlinckx, Francis; De Boeck, Paul

    2005-01-01

    In this paper we propose two interpretations for the discrimination parameter in the two-parameter logistic model (2PLM). The interpretations are based on the relation between the 2PLM and two stochastic models. In the first interpretation, the 2PLM is linked to a diffusion model so that the probability of absorption equals the 2PLM. The…

  12. The Role of Interpreters in Inclusive Classrooms.

    ERIC Educational Resources Information Center

    Antia, Shirin D.; Kreimeyer, Kathryn H.

    2001-01-01

    A qualitative 3-year case study followed three deaf interpreters in an inclusive school. Results of interviews indicated that, in addition to sign interpreting, the interpreters clarified teacher directions, facilitated peer interaction, tutored the deaf children, and kept teachers and special educators informed of the deaf children's progress.…

  13. An Online Synchronous Test for Professional Interpreters

    ERIC Educational Resources Information Center

    Chen, Nian-Shing; Ko, Leong

    2010-01-01

    This article is based on an experiment designed to conduct an interpreting test for multiple candidates online, using web-based synchronous cyber classrooms. The test model was based on the accreditation test for Professional Interpreters produced by the National Accreditation Authority of Translators and Interpreters (NAATI) in Australia.

  14. 10 CFR 70.6 - Interpretations.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 2 2011-01-01 2011-01-01 false Interpretations. 70.6 Section 70.6 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DOMESTIC LICENSING OF SPECIAL NUCLEAR MATERIAL General Provisions § 70.6 Interpretations. Except as specifically authorized by the Commission in writing, no interpretation of the...

  15. 10 CFR 70.6 - Interpretations.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 2 2012-01-01 2012-01-01 false Interpretations. 70.6 Section 70.6 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DOMESTIC LICENSING OF SPECIAL NUCLEAR MATERIAL General Provisions § 70.6 Interpretations. Except as specifically authorized by the Commission in writing, no interpretation of the...

  16. 10 CFR 7.3 - Interpretations.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 1 2012-01-01 2012-01-01 false Interpretations. 7.3 Section 7.3 Energy NUCLEAR REGULATORY COMMISSION ADVISORY COMMITTEES § 7.3 Interpretations. Except as specifically authorized by the Commission in writing, no interpretation of the meaning of the regulations in this part by an NRC officer or...

  17. 10 CFR 110.3 - Interpretations.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 2 2011-01-01 2011-01-01 false Interpretations. 110.3 Section 110.3 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) EXPORT AND IMPORT OF NUCLEAR EQUIPMENT AND MATERIAL General Provisions § 110.3 Interpretations. Except as authorized by the Commission in writing, no interpretation of...

  18. 10 CFR 7.3 - Interpretations.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 1 2013-01-01 2013-01-01 false Interpretations. 7.3 Section 7.3 Energy NUCLEAR REGULATORY COMMISSION ADVISORY COMMITTEES § 7.3 Interpretations. Except as specifically authorized by the Commission in writing, no interpretation of the meaning of the regulations in this part by an NRC officer or...

  19. 10 CFR 110.3 - Interpretations.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 2 2012-01-01 2012-01-01 false Interpretations. 110.3 Section 110.3 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) EXPORT AND IMPORT OF NUCLEAR EQUIPMENT AND MATERIAL General Provisions § 110.3 Interpretations. Except as authorized by the Commission in writing, no interpretation of...

  20. 10 CFR 25.7 - Interpretations.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 1 2013-01-01 2013-01-01 false Interpretations. 25.7 Section 25.7 Energy NUCLEAR REGULATORY COMMISSION ACCESS AUTHORIZATION General Provisions § 25.7 Interpretations. Except as specifically authorized by the Commission in writing, no interpretation of the meaning of the regulations in this part...

  1. 10 CFR 110.3 - Interpretations.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 2 2013-01-01 2013-01-01 false Interpretations. 110.3 Section 110.3 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) EXPORT AND IMPORT OF NUCLEAR EQUIPMENT AND MATERIAL General Provisions § 110.3 Interpretations. Except as authorized by the Commission in writing, no interpretation of...

  2. 10 CFR 70.6 - Interpretations.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 2 2014-01-01 2014-01-01 false Interpretations. 70.6 Section 70.6 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DOMESTIC LICENSING OF SPECIAL NUCLEAR MATERIAL General Provisions § 70.6 Interpretations. Except as specifically authorized by the Commission in writing, no interpretation of the...

  3. 10 CFR 9.5 - Interpretations.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 1 2012-01-01 2012-01-01 false Interpretations. 9.5 Section 9.5 Energy NUCLEAR REGULATORY COMMISSION PUBLIC RECORDS § 9.5 Interpretations. Except as specifically authorized by the Commission in writing, no interpretation of the meaning of the regulations in this part by an officer or employee of...

  4. 10 CFR 9.5 - Interpretations.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 1 2011-01-01 2011-01-01 false Interpretations. 9.5 Section 9.5 Energy NUCLEAR REGULATORY COMMISSION PUBLIC RECORDS § 9.5 Interpretations. Except as specifically authorized by the Commission in writing, no interpretation of the meaning of the regulations in this part by an officer or employee of...

  5. 10 CFR 110.3 - Interpretations.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 2 2014-01-01 2014-01-01 false Interpretations. 110.3 Section 110.3 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) EXPORT AND IMPORT OF NUCLEAR EQUIPMENT AND MATERIAL General Provisions § 110.3 Interpretations. Except as authorized by the Commission in writing, no interpretation of...

  6. 10 CFR 7.3 - Interpretations.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 1 2010-01-01 2010-01-01 false Interpretations. 7.3 Section 7.3 Energy NUCLEAR REGULATORY COMMISSION ADVISORY COMMITTEES § 7.3 Interpretations. Except as specifically authorized by the Commission in writing, no interpretation of the meaning of the regulations in this part by an NRC officer or...

  7. 10 CFR 9.5 - Interpretations.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 1 2014-01-01 2014-01-01 false Interpretations. 9.5 Section 9.5 Energy NUCLEAR REGULATORY COMMISSION PUBLIC RECORDS § 9.5 Interpretations. Except as specifically authorized by the Commission in writing, no interpretation of the meaning of the regulations in this part by an officer or employee of...

  8. 10 CFR 70.6 - Interpretations.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 2 2013-01-01 2013-01-01 false Interpretations. 70.6 Section 70.6 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DOMESTIC LICENSING OF SPECIAL NUCLEAR MATERIAL General Provisions § 70.6 Interpretations. Except as specifically authorized by the Commission in writing, no interpretation of the...

  9. 10 CFR 70.6 - Interpretations.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 2 2010-01-01 2010-01-01 false Interpretations. 70.6 Section 70.6 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DOMESTIC LICENSING OF SPECIAL NUCLEAR MATERIAL General Provisions § 70.6 Interpretations. Except as specifically authorized by the Commission in writing, no interpretation of the...

  10. 10 CFR 9.5 - Interpretations.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 1 2010-01-01 2010-01-01 false Interpretations. 9.5 Section 9.5 Energy NUCLEAR REGULATORY COMMISSION PUBLIC RECORDS § 9.5 Interpretations. Except as specifically authorized by the Commission in writing, no interpretation of the meaning of the regulations in this part by an officer or employee of...

  11. 10 CFR 7.3 - Interpretations.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 1 2011-01-01 2011-01-01 false Interpretations. 7.3 Section 7.3 Energy NUCLEAR REGULATORY COMMISSION ADVISORY COMMITTEES § 7.3 Interpretations. Except as specifically authorized by the Commission in writing, no interpretation of the meaning of the regulations in this part by an NRC officer or...

  12. 10 CFR 9.5 - Interpretations.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 1 2013-01-01 2013-01-01 false Interpretations. 9.5 Section 9.5 Energy NUCLEAR REGULATORY COMMISSION PUBLIC RECORDS § 9.5 Interpretations. Except as specifically authorized by the Commission in writing, no interpretation of the meaning of the regulations in this part by an officer or employee of...

  13. 10 CFR 7.3 - Interpretations.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 1 2014-01-01 2014-01-01 false Interpretations. 7.3 Section 7.3 Energy NUCLEAR REGULATORY COMMISSION ADVISORY COMMITTEES § 7.3 Interpretations. Except as specifically authorized by the Commission in writing, no interpretation of the meaning of the regulations in this part by an NRC officer or...

  14. 10 CFR 110.3 - Interpretations.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 2 2010-01-01 2010-01-01 false Interpretations. 110.3 Section 110.3 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) EXPORT AND IMPORT OF NUCLEAR EQUIPMENT AND MATERIAL General Provisions § 110.3 Interpretations. Except as authorized by the Commission in writing, no interpretation of...

  15. The transactional interpretation of quantum mechanics

    NASA Astrophysics Data System (ADS)

    Cramer, John G.

    1986-07-01

    The interpretational problems of quantum mechanics are considered. The way in which the standard Copenhagen interpretation of quantum mechanics deals with these problems is reviewed. A new interpretation of the formalism of quantum mechanics, the transactional interpretation, is presented. The basic element of this interpretation is the transaction describing a quantum event as an exchange of advanced and retarded waves, as implied by the work of Wheeler and Feynman, Dirac, and others. The transactional interpretation is explicitly nonlocal and thereby consistent with recent tests of the Bell inequality, yet is relativistically invariant and fully causal. A detailed comparison of the transactional and Copenhagen interpretations is made in the context of well-known quantum-mechanical Gedankenexperimente and "paradoxes." The transactional interpretation permits quantum-mechanical wave functions to be interpreted as real waves physically present in space rather than as "mathematical representations of knowledge" as in the Copenhagen interpretation. The transactional interpretation is shown to provide insight into the complex character of the quantum-mechanical state vector and the mechanism associated with its "collapse." It also leads in a natural way to justification of the Heisenberg uncertainty principle and the Born probability law (P=ψψ*), basic elements of the Copenhagen interpretation.

  16. 10 CFR 76.6 - Interpretations.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 2 2010-01-01 2010-01-01 false Interpretations. 76.6 Section 76.6 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) CERTIFICATION OF GASEOUS DIFFUSION PLANTS General Provisions § 76.6 Interpretations. Except as specifically authorized by the Commission in writing, no interpretation of the...

  17. 10 CFR 76.6 - Interpretations.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 2 2014-01-01 2014-01-01 false Interpretations. 76.6 Section 76.6 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) CERTIFICATION OF GASEOUS DIFFUSION PLANTS General Provisions § 76.6 Interpretations. Except as specifically authorized by the Commission in writing, no interpretation of the...

  18. 10 CFR 76.6 - Interpretations.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 2 2013-01-01 2013-01-01 false Interpretations. 76.6 Section 76.6 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) CERTIFICATION OF GASEOUS DIFFUSION PLANTS General Provisions § 76.6 Interpretations. Except as specifically authorized by the Commission in writing, no interpretation of the...

  19. 10 CFR 76.6 - Interpretations.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 2 2011-01-01 2011-01-01 false Interpretations. 76.6 Section 76.6 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) CERTIFICATION OF GASEOUS DIFFUSION PLANTS General Provisions § 76.6 Interpretations. Except as specifically authorized by the Commission in writing, no interpretation of the...

  20. 10 CFR 76.6 - Interpretations.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 2 2012-01-01 2012-01-01 false Interpretations. 76.6 Section 76.6 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) CERTIFICATION OF GASEOUS DIFFUSION PLANTS General Provisions § 76.6 Interpretations. Except as specifically authorized by the Commission in writing, no interpretation of the...

  1. Two Interpretations of the Discrimination Parameter

    ERIC Educational Resources Information Center

    Tuerlinckx, Francis; De Boeck, Paul

    2005-01-01

    In this paper we propose two interpretations for the discrimination parameter in the two-parameter logistic model (2PLM). The interpretations are based on the relation between the 2PLM and two stochastic models. In the first interpretation, the 2PLM is linked to a diffusion model so that the probability of absorption equals the 2PLM. The

  2. Using Playing Cards to Differentiate Probability Interpretations

    ERIC Educational Resources Information Center

    López Puga, Jorge

    2014-01-01

    The aprioristic (classical, naïve and symmetric) and frequentist interpretations of probability are commonly known. Bayesian or subjective interpretation of probability is receiving increasing attention. This paper describes an activity to help students differentiate between the three types of probability interpretations.

  3. An Online Synchronous Test for Professional Interpreters

    ERIC Educational Resources Information Center

    Chen, Nian-Shing; Ko, Leong

    2010-01-01

    This article is based on an experiment designed to conduct an interpreting test for multiple candidates online, using web-based synchronous cyber classrooms. The test model was based on the accreditation test for Professional Interpreters produced by the National Accreditation Authority of Translators and Interpreters (NAATI) in Australia.…

  4. 10 CFR 20.1006 - Interpretations.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 1 2014-01-01 2014-01-01 false Interpretations. 20.1006 Section 20.1006 Energy NUCLEAR REGULATORY COMMISSION STANDARDS FOR PROTECTION AGAINST RADIATION General Provisions 20.1006 Interpretations. Except as specifically authorized by the Commission in writing, no interpretation of the meaning of...

  5. 10 CFR 26.7 - Interpretations.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 1 2011-01-01 2011-01-01 false Interpretations. 26.7 Section 26.7 Energy NUCLEAR REGULATORY COMMISSION FITNESS FOR DUTY PROGRAMS Administrative Provisions § 26.7 Interpretations. Except as specifically authorized by the Commission in writing, no interpretation of the meaning of the regulations...

  6. 10 CFR 26.7 - Interpretations.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 1 2014-01-01 2014-01-01 false Interpretations. 26.7 Section 26.7 Energy NUCLEAR REGULATORY COMMISSION FITNESS FOR DUTY PROGRAMS Administrative Provisions § 26.7 Interpretations. Except as specifically authorized by the Commission in writing, no interpretation of the meaning of the regulations...

  7. 10 CFR 26.7 - Interpretations.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 1 2012-01-01 2012-01-01 false Interpretations. 26.7 Section 26.7 Energy NUCLEAR REGULATORY COMMISSION FITNESS FOR DUTY PROGRAMS Administrative Provisions § 26.7 Interpretations. Except as specifically authorized by the Commission in writing, no interpretation of the meaning of the regulations...

  8. 10 CFR 26.7 - Interpretations.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 1 2010-01-01 2010-01-01 false Interpretations. 26.7 Section 26.7 Energy NUCLEAR REGULATORY COMMISSION FITNESS FOR DUTY PROGRAMS Administrative Provisions § 26.7 Interpretations. Except as specifically authorized by the Commission in writing, no interpretation of the meaning of the regulations...

  9. 10 CFR 26.7 - Interpretations.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 1 2013-01-01 2013-01-01 false Interpretations. 26.7 Section 26.7 Energy NUCLEAR REGULATORY COMMISSION FITNESS FOR DUTY PROGRAMS Administrative Provisions § 26.7 Interpretations. Except as specifically authorized by the Commission in writing, no interpretation of the meaning of the regulations...

  10. Comprehension and Error Monitoring in Simultaneous Interpreters

    ERIC Educational Resources Information Center

    Yudes, Carolina; Macizo, Pedro; Morales, Luis; Bajo, M. Teresa

    2013-01-01

    In the current study we explored lexical, syntactic, and semantic processes during text comprehension in English monolinguals and Spanish/English (first language/second language) bilinguals with different experience in interpreting (nontrained bilinguals, interpreting students and professional interpreters). The participants performed an…

  11. 10 CFR 39.5 - Interpretations.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 1 2012-01-01 2012-01-01 false Interpretations. 39.5 Section 39.5 Energy NUCLEAR REGULATORY COMMISSION LICENSES AND RADIATION SAFETY REQUIREMENTS FOR WELL LOGGING General Provisions § 39.5 Interpretations. Except as specifically authorized by the Commission in writing, no interpretation of the...

  12. 10 CFR 39.5 - Interpretations.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 1 2013-01-01 2013-01-01 false Interpretations. 39.5 Section 39.5 Energy NUCLEAR REGULATORY COMMISSION LICENSES AND RADIATION SAFETY REQUIREMENTS FOR WELL LOGGING General Provisions § 39.5 Interpretations. Except as specifically authorized by the Commission in writing, no interpretation of the...

  13. 10 CFR 39.5 - Interpretations.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 1 2011-01-01 2011-01-01 false Interpretations. 39.5 Section 39.5 Energy NUCLEAR REGULATORY COMMISSION LICENSES AND RADIATION SAFETY REQUIREMENTS FOR WELL LOGGING General Provisions § 39.5 Interpretations. Except as specifically authorized by the Commission in writing, no interpretation of the...

  14. 10 CFR 39.5 - Interpretations.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 1 2014-01-01 2014-01-01 false Interpretations. 39.5 Section 39.5 Energy NUCLEAR REGULATORY COMMISSION LICENSES AND RADIATION SAFETY REQUIREMENTS FOR WELL LOGGING General Provisions § 39.5 Interpretations. Except as specifically authorized by the Commission in writing, no interpretation of the...

  15. 10 CFR 39.5 - Interpretations.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 1 2010-01-01 2010-01-01 false Interpretations. 39.5 Section 39.5 Energy NUCLEAR REGULATORY COMMISSION LICENSES AND RADIATION SAFETY REQUIREMENTS FOR WELL LOGGING General Provisions § 39.5 Interpretations. Except as specifically authorized by the Commission in writing, no interpretation of the...

  16. 10 CFR 60.5 - Interpretations.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 2 2014-01-01 2014-01-01 false Interpretations. 60.5 Section 60.5 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES General Provisions § 60.5 Interpretations. Except as specifically authorized by the Commission, in writing, no interpretation of the meaning of...

  17. Court Interpreting: The Anatomy of a Profession.

    ERIC Educational Resources Information Center

    de Jongh, Elena M.

    For both translators and interpreters, language proficiency is only the starting point for professional work. The equivalence of both meaning and style are necessary for faithful translation. The legal interpreter or translator must understand the complex characteristics and style of legal language. Court interpreting is a relatively young…

  18. Sex differences in interpretation bias in adolescents.

    PubMed

    Gluck, Rachel L; Lynn, Debra A; Dritschel, Barbara; Brown, Gillian R

    2014-03-01

    Interpretation biases, in which ambiguous information is interpreted negatively, have been hypothesized to place adolescent females at greater risk of developing anxiety and mood disorders than same-aged males. We tested the hypothesis that adolescent girls interpret ambiguous scenarios more negatively, and/or less positively, than same-aged males using the Adolescent Interpretation and Belief Questionnaire (N = 67, 11-15 years old). We also tested whether adolescent girls and boys differed in judging positive or negative interpretations to be more believable and whether the scenario content (social vs. non-social) affected any sex difference in interpretation bias. The results showed that girls had higher average negative interpretation scores than boys, with no sex differences in positive interpretation scores. Girls and boys did not differ on which interpretation they found to be most believable. Both sexes reported that positive interpretations were less likely to come to mind, and were less believable, for social than for non-social scenarios. These results provide preliminary evidence for sex differences in interpretation biases in adolescence and support the hypothesis that social scenarios are a specific source of anxiety to this age group. A greater understanding of the aetiology of interpretation biases will potentially enhance sex- and age-specific interventions for anxiety and mood disorders. PMID:24417225

  19. 10 CFR 73.3 - Interpretations.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 2 2010-01-01 2010-01-01 false Interpretations. 73.3 Section 73.3 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) PHYSICAL PROTECTION OF PLANTS AND MATERIALS General Provisions § 73.3 Interpretations. Except as specifically authorized by the Commission in writing, no interpretations of the...

  20. 10 CFR 73.3 - Interpretations.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 2 2014-01-01 2014-01-01 false Interpretations. 73.3 Section 73.3 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) PHYSICAL PROTECTION OF PLANTS AND MATERIALS General Provisions § 73.3 Interpretations. Except as specifically authorized by the Commission in writing, no interpretations of the...