Science.gov

Sample records for easily interpretable statistics

  1. An easily extendable interpreter comprising MCA and CAMAC commands

    NASA Astrophysics Data System (ADS)

    Bakkum, E. L.; Elsenaar, R. J.

    1984-12-01

A BASIC interpreter is a useful tool for writing small programs in a quick and easy way. For the control of experiments, however, it lacks a number of essential features. A BASIC-like command interpreter, BACO, has therefore been developed. It runs on PDP-11 computers with the RSX-11M operating system. Its major advantages over BASIC are: (1) new FORTRAN routines can be implemented simply, and (2) interrupts can be processed at interpreter level. As an application, the implementation of routines to control a CAMAC system and of a multichannel analyzer simulation is described. The CAMAC commands follow the line proposed by the ESONE committee. Since an interpreter is inherently rather slow, the commands are intended for moderately fast data transfer and interrupt handling, which suffices for the control of many experiments.

  2. An interpretation of cloud overlap statistics

    NASA Astrophysics Data System (ADS)

Tompkins, Adrian; Di Giuseppe, Francesca

    2015-04-01

Previous studies using ground-based and satellite observations show that the total cloud cover of cloudy layers separated by clear sky is close to, but can statistically exceed, that given by the random overlap assumption, suggesting a tendency towards minimum overlap. In addition, vertically continuous clouds, which are maximally overlapped in adjacent layers, decorrelate as the separation distance increases, with the resulting decorrelation length-scale found to be sensitive to the horizontal scale of the cloud scenes used to conduct the analysis. No satisfactory explanation has been given for the minimal overlap and scene-scale sensitivity of the cloud statistics. Using simple heuristic arguments, it is suggested that both these phenomena can be expected due to the statistical truncation that results from the omission of overcast cloudy layers from the analysis, which occurs more frequently as the scene length falls progressively below the typical cloud system scale. We first validate this claim using an easily interpreted system of repeating cyclic clouds sampled at various length-scales, which reproduces both of the above phenomena. This analysis is then repeated with realistic fractal clouds from a cloud generator, which demonstrates that the degree of minimal overlap diagnosed in previous studies for discontinuous clouds would result from sampling randomly overlapped clouds at spatial scales that are 30% to 80% of the cloud system scale. Based on this, a simple filter is suggested for cloudy scenes which removes the diagnosis of minimal overlap for discontinuous clouds, and results in a scene-length invariant calculation of the cloud overlap decorrelation for continuous clouds. Using CloudSat-CALIPSO data for 6 months, a scale-invariant decorrelation length-scale of 3.7 km is found. Using this filter we analyse a special application.
By processing more than eight million cloud scenes from CloudSat observations in conjunction with co-located ECMWF analysis data, we identify an empirical relationship between cloud overlap and wind-shear that can be applied to global models with confidence. The analysis confirms that clouds separated by clear sky gaps are randomly overlapped while continuous cloud layers decorrelate from maximum towards random overlap as the separation distance increases. There is a clear and systematic impact of wind-shear on the decorrelation length-scale, with cloud decorrelating over smaller distances as wind shear increases, as expected. A simple empirical linear-fit parametrisation is suggested that is straightforward to add to existing radiation schemes.
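    The blended overlap described above is commonly expressed as a weighting between the maximum and random overlap limits by an exponential decorrelation factor. A minimal Python sketch of that calculation (the 3.7 km length-scale is taken from the abstract; the layer cloud fractions and the specific blend form, in the style of exponential-random overlap schemes, are illustrative assumptions):

```python
import math

def combined_cover(c1, c2, separation_km, decorr_km=3.7):
    """Combined cloud cover of two layers under an exponential
    maximum-random overlap blend (illustrative form)."""
    alpha = math.exp(-separation_km / decorr_km)  # overlap parameter
    c_max = max(c1, c2)            # maximum-overlap limit
    c_ran = c1 + c2 - c1 * c2      # random-overlap limit
    return alpha * c_max + (1.0 - alpha) * c_ran

# Adjacent layers are near-maximally overlapped; widely separated
# layers relax towards random overlap.
print(round(combined_cover(0.3, 0.4, 0.0), 3))   # -> 0.4 (maximum overlap)
print(round(combined_cover(0.3, 0.4, 50.0), 3))  # -> 0.58 (near the random-overlap limit)
```

As the separation grows, the overlap parameter decays and the combined cover relaxes from the maximum-overlap limit towards the random-overlap limit.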

  3. Interpreting statistics of small lunar craters

    NASA Technical Reports Server (NTRS)

    Schultz, P. H.; Gault, D.; Greeley, R.

    1977-01-01

    Some of the wide variations in the crater-size distributions in lunar photography and in the resulting statistics were interpreted as different degradation rates on different surfaces, different scaling laws in different targets, and a possible population of endogenic craters. These possibilities are reexamined for statistics of 26 different regions. In contrast to most other studies, crater diameters as small as 5 m were measured from enlarged Lunar Orbiter framelets. According to the results of the reported analysis, the different crater distribution types appear to be most consistent with the hypotheses of differential degradation and a superposed crater population. Differential degradation can account for the low level of equilibrium in incompetent materials such as ejecta deposits, mantle deposits, and deep regoliths where scaling law changes and catastrophic processes introduce contradictions with other observations.

  4. Interpreting Educational Research Using Statistical Software.

    ERIC Educational Resources Information Center

    Evans, Elizabeth A.

A live demonstration of how a typical set of educational data can be examined using quantitative statistical software was conducted. The topic of tutorial support was chosen. Setting up a hypothetical research scenario, the researcher created 300 cases from random data generation adjusted to correct obvious errors. Each case represented a student…

  5. The Statistical Interpretation of Entropy: An Activity

    ERIC Educational Resources Information Center

Timberlake, Todd

    2010-01-01

    The second law of thermodynamics, which states that the entropy of an isolated macroscopic system can increase but will not decrease, is a cornerstone of modern physics. Ludwig Boltzmann argued that the second law arises from the motion of the atoms that compose the system. Boltzmann's statistical mechanics provides deep insight into the…

  7. The Statistical Interpretation of Entropy: An Activity

    NASA Astrophysics Data System (ADS)

Timberlake, Todd

    2010-11-01

    The second law of thermodynamics, which states that the entropy of an isolated macroscopic system can increase but will not decrease, is a cornerstone of modern physics. Ludwig Boltzmann argued that the second law arises from the motion of the atoms that compose the system. Boltzmann's statistical mechanics provides deep insight into the functioning of the second law and also provided evidence for the existence of atoms at a time when many scientists (like Ernst Mach and Wilhelm Ostwald) were skeptical.

  8. The Statistical Interpretation of Classical Thermodynamic Heating and Expansion Processes

    ERIC Educational Resources Information Center

    Cartier, Stephen F.

    2011-01-01

    A statistical model has been developed and applied to interpret thermodynamic processes typically presented from the macroscopic, classical perspective. Through this model, students learn and apply the concepts of statistical mechanics, quantum mechanics, and classical thermodynamics in the analysis of the (i) constant volume heating, (ii)…

  10. Statistical Interpretation of Natural and Technological Hazards in China

    NASA Astrophysics Data System (ADS)

Borthwick, Alistair, Prof.; Ni, Jinren, Prof.

    2010-05-01

China is prone to catastrophic natural hazards from floods, droughts, earthquakes, storms, cyclones, landslides, epidemics, extreme temperatures, forest fires, avalanches, and even tsunamis. This paper will list statistics related to the six worst natural disasters in China over the past 100 or so years, ranked according to number of fatalities. The corresponding data for the six worst natural disasters in China over the past decade will also be considered. [The data are abstracted from the International Disaster Database, Centre for Research on the Epidemiology of Disasters (CRED), Université Catholique de Louvain, Brussels, Belgium, http://www.cred.be/ where a disaster is defined as occurring if one of the following criteria is fulfilled: 10 or more people reported killed; 100 or more people reported affected; a call for international assistance; or declaration of a state of emergency.] The statistics include the number of occurrences of each type of natural disaster, the number of deaths, the number of people affected, and the cost in billions of US dollars. Over the past hundred years, the largest disasters may be related to the overabundance or scarcity of water, and to earthquake damage. However, there has been a substantial relative reduction in fatalities due to water-related disasters over the past decade, even though the overall numbers of people affected remain huge, as does the economic damage. This change is largely due to the efforts put in by China's water authorities to establish effective early warning systems, the construction of engineering countermeasures for flood protection, and the implementation of water pricing and other measures for reducing excessive consumption during times of drought. It should be noted that the dreadful death toll due to the Sichuan Earthquake dominates recent data.
Joint research has been undertaken between the Department of Environmental Engineering at Peking University and the Department of Engineering Science at Oxford University on the production of zonation maps of certain natural hazards in China. Data at city and county level have been interpreted using a hierarchical system of indices, which are then ranked according to severity. Zonation maps will be presented for debris flows, landslide and rockfall hazards, flood risk in mainland China, and for soil erosion processes in the Yellow River basin. The worst debris flow hazards are to be found in southwest China as the land begins to become mountainous. Just over 20% of the land area is at high or very high risk of landslide and rockfall hazards, especially in Yunnan, Sichuan, Gansu and Shaanxi provinces. Flood risk is concentrated towards the eastern part of China, where the major rivers meet the sea. The paper will also consider data on technological disasters in China from 1900 to 2010, using data supplied by CRED. In terms of fatalities, industrial accidents appear to be dominated by explosion events. However, gas leaks have affected the largest number of people. Transport accidents are ranked in terms of fatalities as follows: water - road - rail - air. Fire is a major cause of loss of life, whereas chemical spills and poisoning seem to lead to fewer deaths.

  11. A Critique of Divorce Statistics and Their Interpretation.

    ERIC Educational Resources Information Center

    Crosby, John F.

    1980-01-01

Increasingly, appeals to the divorce statistic are employed to substantiate claims that the family is in a state of breakdown and marriage is passé. This article considers reasons why divorce statistics are invalid and/or unreliable as indicators of the present state of marriage and the family. (Author)

  12. Graph Interpretation Aspects of Statistical Literacy: A Japanese Perspective

    ERIC Educational Resources Information Center

    Aoyama, Kazuhiro; Stephens, Max

    2003-01-01

    Many educators and researchers are trying to define statistical literacy for the 21st century. Kimura, a Japanese science educator, has suggested that a key task of statistical literacy is the ability to extract qualitative information from quantitative information, and/or to create new information from qualitative and quantitative information.…

  13. Workplace Statistical Literacy for Teachers: Interpreting Box Plots

    ERIC Educational Resources Information Center

    Pierce, Robyn; Chick, Helen

    2013-01-01

    As a consequence of the increased use of data in workplace environments, there is a need to understand the demands that are placed on users to make sense of such data. In education, teachers are being increasingly expected to interpret and apply complex data about student and school performance, and, yet it is not clear that they always have the…

  14. A novel statistical analysis and interpretation of flow cytometry data

    PubMed Central

    Banks, H.T.; Kapraun, D.F.; Thompson, W. Clayton; Peligero, Cristina; Argilaguet, Jordi; Meyerhans, Andreas

    2013-01-01

    A recently developed class of models incorporating the cyton model of population generation structure into a conservation-based model of intracellular label dynamics is reviewed. Statistical aspects of the data collection process are quantified and incorporated into a parameter estimation scheme. This scheme is then applied to experimental data for PHA-stimulated CD4+ T and CD8+ T cells collected from two healthy donors. This novel mathematical and statistical framework is shown to form the basis for accurate, meaningful analysis of cellular behaviour for a population of cells labelled with the dye carboxyfluorescein succinimidyl ester and stimulated to divide. PMID:23826744

  15. Statistical characteristics of MST radar echoes and its interpretation

    NASA Technical Reports Server (NTRS)

    Woodman, Ronald F.

    1989-01-01

    Two concepts of fundamental importance are reviewed: the autocorrelation function and the frequency power spectrum. In addition, some turbulence concepts, the relationship between radar signals and atmospheric medium statistics, partial reflection, and the characteristics of noise and clutter interference are discussed.

  16. Statistical Interpretation of the Local Field Inside Dielectrics.

    ERIC Educational Resources Information Center

    Berrera, Ruben G.; Mello, P. A.

    1982-01-01

    Compares several derivations of the Clausius-Mossotti relation to analyze consistently the nature of approximations used and their range of applicability. Also presents a statistical-mechanical calculation of the local field for classical system of harmonic oscillators interacting via the Coulomb potential. (Author/SK)

  17. Interpretation of gamma-ray burst source count statistics

    NASA Technical Reports Server (NTRS)

    Petrosian, Vahe

    1993-01-01

Ever since the discovery of gamma-ray bursts, the so-called log N-log S relation has been used for the determination of their distances and distribution. This task has not been straightforward because of varying thresholds for the detection of bursts. Most of the current analyses of these data are couched in terms of ambiguous distributions, such as the distribution of Cp/Clim, the ratio of peak to threshold photon count rates, or the distribution of V/Vmax = (Cp/Clim)^(-3/2). It is shown that these distributions are not always a true reflection of the log N-log S relation. Some kind of deconvolution is required for obtaining the true log N-log S. Therefore, care is required in the interpretation of results of such analyses. A new method of analysis of these data is described, whereby the bivariate distribution of Cp and Clim is obtained directly from the data.
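    The V/Vmax statistic quoted above follows directly from the peak and threshold count rates. A small sketch with hypothetical burst rates (for a homogeneous population in Euclidean space the mean of V/Vmax is expected to be 0.5):

```python
def v_over_vmax(c_peak, c_lim):
    """V/Vmax = (Cp/Clim)^(-3/2): the fractional maximum volume within
    which a burst of peak rate Cp would still exceed threshold Clim."""
    return (c_peak / c_lim) ** -1.5

# Hypothetical peak count rates for a fixed detection threshold of 1.0
peaks = [1.2, 2.0, 3.5, 1.1, 5.0, 1.6]
values = [v_over_vmax(cp, 1.0) for cp in peaks]
mean = sum(values) / len(values)  # compare against the Euclidean expectation 0.5
```

A mean significantly below 0.5 would indicate a deficit of faint bursts, but, as the abstract stresses, the varying thresholds Clim mean such a test is not by itself a true reflection of log N-log S.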

  18. Interpreting the flock algorithm from a statistical perspective.

    PubMed

    Anderson, Eric C; Barry, Patrick D

    2015-09-01

    We show that the algorithm in the program flock (Duchesne & Turgeon 2009) can be interpreted as an estimation procedure based on a model essentially identical to the structure (Pritchard et al. 2000) model with no admixture and without correlated allele frequency priors. Rather than using MCMC, the flock algorithm searches for the maximum a posteriori estimate of this structure model via a simulated annealing algorithm with a rapid cooling schedule (namely, the exponent on the objective function →∞). We demonstrate the similarities between the two programs in a two-step approach. First, to enable rapid batch processing of many simulated data sets, we modified the source code of structure to use the flock algorithm, producing the program flockture. With simulated data, we confirmed that results obtained with flock and flockture are very similar (though flockture is some 200 times faster). Second, we simulated multiple large data sets under varying levels of population differentiation for both microsatellite and SNP genotypes. We analysed them with flockture and structure and assessed each program on its ability to cluster individuals to their correct subpopulation. We show that flockture yields results similar to structure albeit with greater variability from run to run. flockture did perform better than structure when genotypes were composed of SNPs and differentiation was moderate (FST= 0.022-0.032). When differentiation was low, structure outperformed flockture for both marker types. On large data sets like those we simulated, it appears that flock's reliance on inference rules regarding its 'plateau record' is not helpful. Interpreting flock's algorithm as a special case of the model in structure should aid in understanding the program's output and behaviour. PMID:25913195
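    The rapid-cooling limit described above (the exponent on the objective function → ∞) can be illustrated in a few lines: raising a posterior to a growing power and renormalizing concentrates the weight on the maximum a posteriori assignment. The posterior values below are hypothetical:

```python
def sharpened(posterior, beta):
    """Normalize p_i**beta; as beta grows this converges to a point
    mass on the MAP assignment (the rapid-cooling limit)."""
    weights = [p ** beta for p in posterior]
    total = sum(weights)
    return [w / total for w in weights]

posterior = [0.5, 0.3, 0.2]     # hypothetical posterior over three assignments
print(sharpened(posterior, 1))   # essentially unchanged
print(sharpened(posterior, 50))  # nearly all mass on the argmax
```

This is why a simulated annealing search with a rapid cooling schedule behaves like a MAP estimator rather than an MCMC sampler of the full posterior.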

  19. A statistical model for interpreting computerized dynamic posturography data

    NASA Technical Reports Server (NTRS)

    Feiveson, Alan H.; Metter, E. Jeffrey; Paloski, William H.

    2002-01-01

    Computerized dynamic posturography (CDP) is widely used for assessment of altered balance control. CDP trials are quantified using the equilibrium score (ES), which ranges from zero to 100, as a decreasing function of peak sway angle. The problem of how best to model and analyze ESs from a controlled study is considered. The ES often exhibits a skewed distribution in repeated trials, which can lead to incorrect inference when applying standard regression or analysis of variance models. Furthermore, CDP trials are terminated when a patient loses balance. In these situations, the ES is not observable, but is assigned the lowest possible score--zero. As a result, the response variable has a mixed discrete-continuous distribution, further compromising inference obtained by standard statistical methods. Here, we develop alternative methodology for analyzing ESs under a stochastic model extending the ES to a continuous latent random variable that always exists, but is unobserved in the event of a fall. Loss of balance occurs conditionally, with probability depending on the realized latent ES. After fitting the model by a form of quasi-maximum-likelihood, one may perform statistical inference to assess the effects of explanatory variables. An example is provided, using data from the NIH/NIA Baltimore Longitudinal Study on Aging.
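    The latent-variable structure described above is easy to simulate: a continuous latent ES always exists, while a fall censors the observation to zero with a probability that grows as the latent score drops. A sketch under assumed Gaussian and logistic forms (all parameter values are hypothetical, not those fitted in the study):

```python
import math
import random

def simulate_es(n_trials, latent_mean=70.0, latent_sd=12.0, seed=1):
    """Simulate observed equilibrium scores: the latent ES is Gaussian;
    loss of balance occurs with a logistic probability in the latent ES,
    in which case the observed score is censored to zero."""
    rng = random.Random(seed)
    observed = []
    for _ in range(n_trials):
        latent = rng.gauss(latent_mean, latent_sd)
        p_fall = 1.0 / (1.0 + math.exp((latent - 40.0) / 8.0))  # low latent ES -> likely fall
        if rng.random() < p_fall:
            observed.append(0.0)                       # fall: score assigned zero
        else:
            observed.append(min(max(latent, 0.0), 100.0))
    return observed

scores = simulate_es(1000)
```

The resulting sample mixes a discrete spike at zero with a continuous component, which is exactly the mixed discrete-continuous distribution that defeats standard regression or ANOVA models.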

  20. Impact of Equity Models and Statistical Measures on Interpretations of Educational Reform

    ERIC Educational Resources Information Center

    Rodriguez, Idaykis; Brewe, Eric; Sawtelle, Vashti; Kramer, Laird H.

    2012-01-01

    We present three models of equity and show how these, along with the statistical measures used to evaluate results, impact interpretation of equity in education reform. Equity can be defined and interpreted in many ways. Most equity education reform research strives to achieve equity by closing achievement gaps between groups. An example is given…

  1. On the physical interpretation of statistical data from black-box systems

    NASA Astrophysics Data System (ADS)

    Eliazar, Iddo I.; Cohen, Morrel H.

    2013-07-01

    In this paper we explore the physical interpretation of statistical data collected from complex black-box systems. Given the output statistics of a black-box system, and considering a class of relevant Markov dynamics which are physically meaningful, we reverse-engineer the Markov dynamics to obtain an equilibrium distribution that coincides with the output statistics observed. This reverse-engineering scheme provides us with a conceptual physical interpretation of the black-box system investigated. Five specific reverse-engineering methodologies are developed, based on the following dynamics: Langevin, geometric Langevin, diffusion, growth-collapse, and decay-surge. In turn, these methodologies yield physical interpretations of the black-box system in terms of conceptual intrinsic forces, temperatures, and instabilities. The application of these methodologies is exemplified in the context of the distribution of wealth and income in human societies, which are outputs of the complex black-box system called “the economy”.

  2. Interpreting Statistical Significance Test Results: A Proposed New "What If" Method.

    ERIC Educational Resources Information Center

    Kieffer, Kevin M.; Thompson, Bruce

As the 1994 publication manual of the American Psychological Association emphasized, "p" values are affected by sample size. As a result, it can be helpful to interpret the results of statistical significance tests in a sample size context by conducting so-called "what if" analyses. However, these methods can be inaccurate unless "corrected" effect…

  3. Material Phase Causality or a Dynamics-Statistical Interpretation of Quantum Mechanics

    SciTech Connect

    Koprinkov, I. G.

    2010-11-25

The internal phase dynamics of a quantum system interacting with an electromagnetic field is revealed in detail. Theoretical and experimental evidence of a causal relation of the phase of the wave function to the dynamics of the quantum system is presented systematically for the first time. A dynamics-statistical interpretation of quantum mechanics is introduced.

  4. Report: New analytical and statistical approaches for interpreting the relationships among environmental stressors and biomarkers

    EPA Science Inventory

    The broad topic of biomarker research has an often-overlooked component: the documentation and interpretation of the surrounding chemical environment and other meta-data, especially from visualization, analytical, and statistical perspectives (Pleil et al. 2014; Sobus et al. 2011...

  5. New physicochemical interpretations for the adsorption of food dyes on chitosan films using statistical physics treatment.

    PubMed

    Dotto, G L; Pinto, L A A; Hachicha, M A; Knani, S

    2015-03-15

In this work, statistical physics treatment was employed to study the adsorption of food dyes onto chitosan films, in order to obtain new physicochemical interpretations at the molecular level. Experimental equilibrium curves were obtained for the adsorption of four dyes (FD&C red 2, FD&C yellow 5, FD&C blue 2, Acid Red 51) at different temperatures (298, 313 and 328 K). A statistical physics formula was used to interpret these curves, and parameters such as the number of adsorbed dye molecules per site (n), anchorage number (n'), receptor site density (NM), adsorbed quantity at saturation (Nasat), steric hindrance, concentration at half saturation (c1/2) and molar adsorption energy (ΔEa) were estimated. The relation of the above-mentioned parameters to the chemical structure of the dyes and temperature was evaluated and interpreted. PMID:25308634

  6. Menzerath-Altmann Law: Statistical Mechanical Interpretation as Applied to a Linguistic Organization

    NASA Astrophysics Data System (ADS)

    Eroglu, Sertac

    2014-10-01

The distribution behavior described by the empirical Menzerath-Altmann law is frequently encountered during the self-organization of linguistic and non-linguistic natural organizations at various structural levels. This study presents a statistical mechanical derivation of the law based on the analogy between the classical particles of a statistical mechanical organization and the distinct words of a textual organization. The derived model, a transformed (generalized) form of the Menzerath-Altmann model, was termed the statistical mechanical Menzerath-Altmann model. The derived model allows interpreting the model parameters in terms of physical concepts. We also propose that many organizations presenting the Menzerath-Altmann law behavior, whether linguistic or not, can be methodically examined by the transformed distribution model through the properly defined structure-dependent parameter and the energy-associated states.

  7. Two Easily Made Astronomical Telescopes.

    ERIC Educational Resources Information Center

    Hill, M.; Jacobs, D. J.

    1991-01-01

    The directions and diagrams for making a reflecting telescope and a refracting telescope are presented. These telescopes can be made by students out of plumbing parts and easily obtainable, inexpensive, optical components. (KR)

  8. An Easily Constructed Dodecahedron Model.

    ERIC Educational Resources Information Center

    Yamana, Shukichi

    1984-01-01

    A model of a dodecahedron which is necessary for teaching stereochemistry (for example, that of dodecahedrane) can be made easily by using a sealed, empty envelope. The steps necessary for accomplishing this task are presented. (JN)

  9. An Easily Constructed Cube Model.

    ERIC Educational Resources Information Center

    Yamana, Shukichi; Kawaguchi, Makoto

    1984-01-01

    A model of a cube which is necessary for teaching stereochemistry (especially of inorganic compounds) can be made easily, by using a sealed, empty envelope. The steps necessary to accomplish this task are presented. (JN)

  10. Application of machine learning and expert systems to Statistical Process Control (SPC) chart interpretation

    NASA Technical Reports Server (NTRS)

    Shewhart, Mark

    1991-01-01

    Statistical Process Control (SPC) charts are one of several tools used in quality control. Other tools include flow charts, histograms, cause and effect diagrams, check sheets, Pareto diagrams, graphs, and scatter diagrams. A control chart is simply a graph which indicates process variation over time. The purpose of drawing a control chart is to detect any changes in the process signalled by abnormal points or patterns on the graph. The Artificial Intelligence Support Center (AISC) of the Acquisition Logistics Division has developed a hybrid machine learning expert system prototype which automates the process of constructing and interpreting control charts.
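    The interpretation step that the prototype automates amounts to rule checks against the control limits. A minimal sketch of two classic signals (a point beyond the three-sigma limits, and a run of eight consecutive points on one side of the centre line), with hypothetical data and limits:

```python
def chart_signals(points, centre, sigma):
    """Flag abnormal points or patterns on a control chart:
    rule 1: a point beyond the 3-sigma control limits;
    rule 2: eight consecutive points on one side of the centre line."""
    signals = []
    for i, x in enumerate(points):
        if abs(x - centre) > 3 * sigma:
            signals.append((i, "beyond 3-sigma limit"))
    for i in range(len(points) - 7):
        window = points[i:i + 8]
        if all(x > centre for x in window) or all(x < centre for x in window):
            signals.append((i, "run of 8 on one side"))
    return signals

# Hypothetical process measurements around a centre line of 10.0
data = [10.1, 9.8, 10.0, 13.9, 10.2, 10.3, 10.1, 10.4, 10.2, 10.5, 10.3, 10.2]
print(chart_signals(data, centre=10.0, sigma=1.0))
```

A full rule set (e.g. the Western Electric rules) adds further patterns, such as two of three points beyond two sigma, in the same style.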

  11. Editorial: new analytical and statistical approaches for interpreting the relationships among environmental stressors and biomarkers.

    PubMed

    Bean, Heather D; Pleil, Joachim D; Hill, Jane E

    2015-02-01

    The broad topic of biomarker research has an often-overlooked component: the documentation and interpretation of the surrounding chemical environment and other meta-data, especially from visualization, analytical and statistical perspectives. A second concern is how the environment interacts with human systems biology, what the variability is in "normal" subjects, and how such biological observations might be reconstructed to infer external stressors. In this article, we report on recent research presentations from a symposium at the 248th American Chemical Society meeting held in San Francisco, 10-14 August 2014, that focused on providing some insight into these important issues. PMID:25444302

  12. Misuse of statistics in the interpretation of data on low-level radiation

    SciTech Connect

    Hamilton, L.D.

    1982-01-01

    Four misuses of statistics in the interpretation of data of low-level radiation are reviewed: (1) post-hoc analysis and aggregation of data leading to faulty conclusions in the reanalysis of genetic effects of the atomic bomb, and premature conclusions on the Portsmouth Naval Shipyard data; (2) inappropriate adjustment for age and ignoring differences between urban and rural areas leading to potentially spurious increase in incidence of cancer at Rocky Flats; (3) hazard of summary statistics based on ill-conditioned individual rates leading to spurious association between childhood leukemia and fallout in Utah; and (4) the danger of prematurely published preliminary work with inadequate consideration of epidemiological problems - censored data - leading to inappropriate conclusions, needless alarm at the Portsmouth Naval Shipyard, and diversion of scarce research funds.

  13. Soil VisNIR chemometric performance statistics should be interpreted as random variables

    NASA Astrophysics Data System (ADS)

    Brown, David J.; Gasch, Caley K.; Poggio, Matteo; Morgan, Cristine L. S.

    2015-04-01

    Chemometric models are normally evaluated using performance statistics such as the Standard Error of Prediction (SEP) or the Root Mean Squared Error of Prediction (RMSEP). These statistics are used to evaluate the quality of chemometric models relative to other published work on a specific soil property or to compare the results from different processing and modeling techniques (e.g. Partial Least Squares Regression or PLSR and random forest algorithms). Claims are commonly made about the overall success of an application or the relative performance of different modeling approaches assuming that these performance statistics are fixed population parameters. While most researchers would acknowledge that small differences in performance statistics are not important, rarely are performance statistics treated as random variables. Given that we are usually comparing modeling approaches for general application, and given that the intent of VisNIR soil spectroscopy is to apply chemometric calibrations to larger populations than are included in our soil-spectral datasets, it is more appropriate to think of performance statistics as random variables with variation introduced through the selection of samples for inclusion in a given study and through the division of samples into calibration and validation sets (including spiking approaches). Here we look at the variation in VisNIR performance statistics for the following soil-spectra datasets: (1) a diverse US Soil Survey soil-spectral library with 3768 samples from all 50 states and 36 different countries; (2) 389 surface and subsoil samples taken from US Geological Survey continental transects; (3) the Texas Soil Spectral Library (TSSL) with 3000 samples; (4) intact soil core scans of Texas soils with 700 samples; (5) approximately 400 in situ scans from the Pacific Northwest region; and (6) miscellaneous local datasets. We find the variation in performance statistics to be surprisingly large. 
This has important implications for the interpretation of soil VisNIR model results. Particularly for smaller datasets, the relative success of a given application or modeling approach may well be due in part to chance.
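    Treating a performance statistic as a random variable, as argued above, means recomputing it over repeated random calibration/validation splits and examining its spread. A minimal sketch with synthetic data and a univariate least-squares calibration standing in for PLSR (all details are illustrative assumptions):

```python
import random

def rmsep_distribution(xs, ys, n_splits=200, cal_frac=0.75, seed=0):
    """Repeatedly split the data into calibration/validation sets,
    fit a least-squares line on the calibration set and collect the
    Root Mean Squared Error of Prediction on the validation set."""
    rng = random.Random(seed)
    idx = list(range(len(xs)))
    rmseps = []
    for _ in range(n_splits):
        rng.shuffle(idx)
        n_cal = int(cal_frac * len(idx))
        cal, val = idx[:n_cal], idx[n_cal:]
        # ordinary least squares on the calibration subset
        mx = sum(xs[i] for i in cal) / len(cal)
        my = sum(ys[i] for i in cal) / len(cal)
        sxx = sum((xs[i] - mx) ** 2 for i in cal)
        b = sum((xs[i] - mx) * (ys[i] - my) for i in cal) / sxx
        a = my - b * mx
        mse = sum((ys[i] - (a + b * xs[i])) ** 2 for i in val) / len(val)
        rmseps.append(mse ** 0.5)
    return rmseps

# synthetic "soil property vs. spectral index" data
rng = random.Random(42)
xs = [rng.uniform(0, 10) for _ in range(60)]
ys = [2.0 * x + 1.0 + rng.gauss(0, 0.5) for x in xs]
rmseps = rmsep_distribution(xs, ys)
```

The spread of `rmseps` (rather than a single value) is the appropriate basis for claiming that one modeling approach outperforms another.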

  14. On Improving the Quality and Interpretation of Environmental Assessments using Statistical Analysis and Geographic Information Systems

    NASA Astrophysics Data System (ADS)

    Karuppiah, R.; Faldi, A.; Laurenzi, I.; Usadi, A.; Venkatesh, A.

    2014-12-01

An increasing number of studies are focused on assessing the environmental footprint of different products and processes, especially using life cycle assessment (LCA). This work shows how combining statistical methods and Geographic Information Systems (GIS) with environmental analyses can help improve the quality of results and their interpretation. Most environmental assessments in the literature yield single numbers that characterize the environmental impact of a process/product - typically global or country averages, often unchanging in time. In this work, we show how statistical analysis and GIS can help address these limitations. For example, we demonstrate a method to separately quantify uncertainty and variability in the results of LCA models using a power generation case study. This is important for rigorous comparisons between the impacts of different processes. Another challenge is lack of data, which can affect the rigor of LCAs. We have developed an approach to estimate the environmental impacts of incompletely characterized processes using predictive statistical models. This method is applied to estimate unreported coal power plant emissions in several world regions. There is also a general lack of spatio-temporal characterization of results in environmental analyses. For instance, studies that focus on water usage do not put in context where and when water is withdrawn. Through the use of hydrological modeling combined with GIS, we quantify water stress on a regional and seasonal basis to understand water supply and demand risks for multiple users. Another example where it is important to consider the regional dependency of impacts is when characterizing how agricultural land occupation affects biodiversity in a region. We developed a data-driven methodology, used in conjunction with GIS, to determine if there is a statistically significant difference between the impacts of growing different crops on different species in various biomes of the world.

  15. Crossing statistic: Bayesian interpretation, model selection and resolving dark energy parametrization problem

    SciTech Connect

    Shafieloo, Arman

    2012-05-01

    By introducing Crossing functions and hyper-parameters I show that the Bayesian interpretation of the Crossing Statistics [1] can be used trivially for the purpose of model selection among cosmological models. In this approach, to falsify a cosmological model there is no need to compare it with other models or to assume any particular parametrization of cosmological quantities such as the luminosity distance, the Hubble parameter or the equation of state of dark energy. Instead, the hyper-parameters of the Crossing functions act as discriminators between correct and wrong models. Using this approach one can falsify any assumed cosmological model without placing priors on the underlying actual model of the universe and its parameters; hence the issue of dark energy parametrization is resolved. It is also shown that the method's sensitivity to the intrinsic dispersion of the data is small, another important characteristic when testing cosmological models against data with high uncertainties.

  16. Differences in paleomagnetic interpretations due to the choice of statistical, demagnetization and correction techniques: Kapuskasing Structural Zone, northern Ontario, Canada

    NASA Astrophysics Data System (ADS)

    Borradaile, Graham J.; Werner, Tomasz; Lagroix, France

    2003-02-01

    The Kapuskasing Structural Zone (KSZ) reveals a section through the Archean lower crustal granoblastic gneisses. Our new paleomagnetic data largely agree with previous work but we show that interpretations vary according to the choices of statistical, demagnetization and field-correction techniques. First, where the orientation distribution of characteristic remanence directions on the sphere is not symmetrically circular, the commonly used statistical model is invalid [Fisher, R.A., Proc. R. Soc. A217 (1953) 295]. Any tendency to form an elliptical distribution indicates that the sample is drawn from a Bingham-type population [Bingham, C., 1964. Distributions on the sphere and on the projective plane. PhD thesis, Yale University]. Fisher and Bingham statistics produce different confidence estimates from the same data and the traditionally defined mean vector may differ from the maximum eigenvector of an orthorhombic Bingham distribution. It seems prudent to apply both models wherever a non-Fisher population is suspected and that may be appropriate in any tectonized rocks. Non-Fisher populations require larger sample sizes so that focussing on individual sites may not be the most effective policy in tectonized rocks. More dispersed sampling across tectonic structures may be more productive. Second, from the same specimens, mean vectors isolated by thermal and alternating field (AF) demagnetization differ. Which treatment gives more meaningful results is difficult to decipher, especially in metamorphic rocks where the history of the magnetic minerals is not easily related to the ages of tectonic and petrological events. In this study, thermal demagnetization gave lower inclinations for paleomagnetic vectors and thus more distant paleopoles. Third, of more parochial significance, tilt corrections may be unnecessary in the KSZ because magnetic fabrics and thrust ramp are constant in orientation to the depth at which they level off, at approximately 15-km depth. 
With Archean geothermal gradients, primary remanences were blocked after the foliation was tilted to rise on the thrust ramp. Therefore, the rocks were probably magnetized in their present orientation; tilting largely or entirely predates magnetization.
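
    The contrast drawn above between Fisher and Bingham-type treatments can be sketched numerically: the Fisher mean is the normalized vector sum of the unit remanence directions, while a Bingham-style analysis takes the principal eigenvector of the orientation (scatter) matrix; for a tight, circular cluster the two nearly coincide, but they diverge for elliptical distributions. The site directions below are hypothetical:

```python
import math

def to_cartesian(dec_deg, inc_deg):
    """Convert declination/inclination (degrees) to a unit vector."""
    d, i = math.radians(dec_deg), math.radians(inc_deg)
    return (math.cos(i) * math.cos(d), math.cos(i) * math.sin(d), math.sin(i))

def fisher_mean(dirs):
    """Fisher mean direction: normalized vector sum; also returns the
    resultant length R (close to n for a concentrated sample)."""
    sx = sum(v[0] for v in dirs)
    sy = sum(v[1] for v in dirs)
    sz = sum(v[2] for v in dirs)
    r = math.sqrt(sx * sx + sy * sy + sz * sz)
    return (sx / r, sy / r, sz / r), r

def principal_eigenvector(dirs, iters=200):
    """Dominant eigenvector of the orientation matrix T = sum(v v^T),
    found by power iteration (the Bingham-style maximum axis)."""
    t = [[sum(v[a] * v[b] for v in dirs) for b in range(3)] for a in range(3)]
    x = [1.0, 1.0, 1.0]
    for _ in range(iters):
        y = [sum(t[a][b] * x[b] for b in range(3)) for a in range(3)]
        n = math.sqrt(sum(c * c for c in y))
        x = [c / n for c in y]
    return x

# Hypothetical site directions (declination, inclination in degrees):
dirs = [to_cartesian(d, i) for d, i in [(10, 45), (20, 50), (15, 40), (25, 55)]]
mean_dir, resultant = fisher_mean(dirs)
axis = principal_eigenvector(dirs)
```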

  17. Statistical and population genetics issues of two Hungarian datasets from the aspect of DNA evidence interpretation.

    PubMed

    Szabolcsi, Zoltán; Farkas, Zsuzsa; Borbély, Andrea; Bárány, Gusztáv; Varga, Dániel; Heinrich, Attila; Völgyi, Antónia; Pamjav, Horolma

    2015-11-01

    When the DNA profile from a crime scene matches that of a suspect, the weight of the DNA evidence depends on an unbiased estimation of the match probability of the profiles. For this reason, it is necessary to establish and expand databases that reflect the actual allele frequencies in the relevant population. 21,473 complete DNA profiles from Databank samples were used to establish the allele frequency database representing the population of Hungarian suspects. We used fifteen STR loci (PowerPlex ESI16), including five new ESS loci. The aim was to calculate the statistical and forensic efficiency parameters for the Databank samples and compare the newly obtained data to the earlier report. Population substructure caused by relatedness may influence the estimated profile frequencies. As our Databank profiles were considered non-random samples, possible relationships between the suspects can be assumed. Therefore, the population inbreeding effect was estimated using the FIS calculation. The overall inbreeding parameter was found to be 0.0106. Furthermore, we tested the impact of the two allele frequency datasets on 101 randomly chosen STR profiles, including full and partial profiles. The 95% confidence interval estimates for the profile frequencies (pM) resulted in a tighter range when we used the new dataset compared with the previously published ones. We found that the FIS had less effect on frequency values in the 21,473 samples than the application of a minimum allele frequency. No genetic substructure was detected by STRUCTURE analysis. Due to the low level of the inbreeding effect and the high number of samples, the new dataset provides unbiased and precise estimates of LR for statistical interpretation of forensic casework and allows us to use lower allele frequencies. PMID:26036185
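
    The kind of inbreeding-corrected match probability discussed above can be sketched with the standard single-locus genotype-frequency formulas (homozygote: p² + p(1−p)F; heterozygote: 2pq(1−F)), using the reported overall F = 0.0106. The allele frequencies and loci below are hypothetical placeholders:

```python
def genotype_frequency(p, q=None, f=0.0106):
    """Single-locus genotype frequency with an inbreeding correction F
    (F = 0.0106 as reported for the Hungarian dataset).
    Homozygote p/p: p^2 + p(1-p)F; heterozygote p/q: 2pq(1-F)."""
    if q is None:  # homozygote
        return p * p + p * (1.0 - p) * f
    return 2.0 * p * q * (1.0 - f)

def profile_frequency(loci, f=0.0106):
    """Random-match probability of a multi-locus profile: product of
    per-locus frequencies. `loci` is a list of (p,) tuples for
    homozygous loci and (p, q) tuples for heterozygous loci."""
    pm = 1.0
    for alleles in loci:
        pm *= genotype_frequency(*alleles, f=f)
    return pm

# Hypothetical allele frequencies at three loci:
pm = profile_frequency([(0.12,), (0.08, 0.21), (0.15, 0.05)])
```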

  18. Uses and Misuses of Student Evaluations of Teaching: The Interpretation of Differences in Teaching Evaluation Means Irrespective of Statistical Information

    ERIC Educational Resources Information Center

    Boysen, Guy A.

    2015-01-01

    Student evaluations of teaching are among the most accepted and important indicators of college teachers' performance. However, faculty and administrators can overinterpret small variations in mean teaching evaluations. The current research examined the effect of including statistical information on the interpretation of teaching evaluations.…

  20. Dose impact in radiographic lung injury following lung SBRT: Statistical analysis and geometric interpretation

    SciTech Connect

    Yu, Victoria; Kishan, Amar U.; Cao, Minsong; Low, Daniel; Lee, Percy; Ruan, Dan

    2014-03-15

    Purpose: To demonstrate a new method of evaluating the dose response of treatment-induced radiographic lung injury following SBRT (stereotactic body radiotherapy), and the discovery of bimodal dose behavior within clinically identified injury volumes. Methods: Follow-up CT scans at 3, 6, and 12 months were acquired from 24 patients treated with SBRT for stage-1 primary lung cancers or oligometastatic lesions. Injury regions in these scans were propagated to the planning CT coordinates by performing deformable registration of the follow-ups to the planning CTs. A bimodal behavior was repeatedly observed in the probability distribution of dose values within the deformed injury regions. Based on a mixture-Gaussian assumption, an Expectation-Maximization (EM) algorithm was used to obtain characteristic parameters for this distribution. Geometric analysis was performed to interpret these parameters and infer the critical dose level potentially inductive of post-SBRT lung injury. Results: The Gaussian mixture obtained from the EM algorithm closely approximates the empirical dose histogram within the injury volume. The average Kullback-Leibler divergence values between the empirical differential dose volume histogram and the EM-obtained Gaussian mixture distribution were calculated to be 0.069, 0.063, and 0.092 for the 3, 6, and 12 month follow-up groups, respectively. The lower Gaussian component was located at approximately 70% of the prescription dose (35 Gy) for all three follow-up time points. The higher Gaussian component, contributed by the dose received by the planning target volume, was located at around 107% of the prescription dose. Geometrical analysis suggests the mean of the lower Gaussian component, located at 35 Gy, as a possible indicator of a critical dose that induces lung injury after SBRT.
Conclusions: An innovative and improved method for analyzing the correspondence between lung radiographic injury and SBRT treatment dose has been demonstrated. Bimodal behavior was observed in the dose distribution of lung injury after SBRT. Novel statistical and geometrical analysis has shown that the systematically quantified low-dose peak at approximately 35 Gy, or 70% prescription dose, is a good indication of a critical dose for injury. The determined critical dose of 35 Gy resembles the critical dose volume limit of 30 Gy for ipsilateral bronchus in RTOG 0618 and results from previous studies. The authors seek to further extend this improved analysis method to a larger cohort to better understand the interpatient variation in radiographic lung injury dose response post-SBRT.
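
    The EM fit described above can be sketched for a two-component 1-D Gaussian mixture. The synthetic "dose" sample below is hypothetical, with peaks placed near 35 and 53 (roughly 70% and 107% of a 50 Gy prescription, for illustration only):

```python
import math
import random

def em_two_gaussians(data, iters=100):
    """EM for a two-component 1-D Gaussian mixture.
    Returns (weights, means, standard deviations)."""
    lo, hi = min(data), max(data)
    mu = [lo + 0.25 * (hi - lo), lo + 0.75 * (hi - lo)]
    sigma = [(hi - lo) / 4.0] * 2
    w = [0.5, 0.5]
    for _ in range(iters):
        # E-step: responsibility of each component for each point.
        resp = []
        for x in data:
            p = [w[k] / (sigma[k] * math.sqrt(2 * math.pi)) *
                 math.exp(-0.5 * ((x - mu[k]) / sigma[k]) ** 2) for k in (0, 1)]
            s = p[0] + p[1]
            resp.append((p[0] / s, p[1] / s))
        # M-step: re-estimate weights, means, standard deviations.
        for k in (0, 1):
            nk = sum(r[k] for r in resp)
            w[k] = nk / len(data)
            mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
            var = sum(r[k] * (x - mu[k]) ** 2 for r, x in zip(resp, data)) / nk
            sigma[k] = math.sqrt(max(var, 1e-9))
    return w, mu, sigma

# Hypothetical bimodal dose sample:
rng = random.Random(0)
data = ([rng.gauss(35, 3) for _ in range(300)] +
        [rng.gauss(53, 2) for _ in range(200)])
w, mu, sigma = em_two_gaussians(data)
```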

  1. "What If" Analyses: Ways to Interpret Statistical Significance Test Results Using EXCEL or "R"

    ERIC Educational Resources Information Center

    Ozturk, Elif

    2012-01-01

    The present paper aims to review two motivations to conduct "what if" analyses using Excel and "R" to understand the statistical significance tests through the sample size context. "What if" analyses can be used to teach students what statistical significance tests really do and in applied research either prospectively to estimate what sample size…
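
    The "what if" idea - holding the observed standardized effect fixed while varying the sample size - can be sketched with a simple two-sided z-test using only the standard library (the effect size d = 0.2 is a hypothetical example):

```python
import math

def z_test_p(effect_size, n):
    """Two-sided p-value for a one-sample z-test of standardized
    effect size d with n observations: z = d * sqrt(n)."""
    z = effect_size * math.sqrt(n)
    return math.erfc(abs(z) / math.sqrt(2))  # equals 2 * (1 - Phi(|z|))

# The same small effect is "non-significant" at n = 25 but highly
# "significant" at n = 400: significance tracks sample size, not
# the size of the effect itself.
for n in (25, 100, 400):
    print(n, round(z_test_p(0.2, n), 4))
```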

  2. Statistics for the time-dependent failure of Kevlar-49/epoxy composites: micromechanical modeling and data interpretation

    SciTech Connect

    Phoenix, S.L.; Wu, E.M.

    1983-03-01

    This paper presents some new data on the strength and stress-rupture of Kevlar-49 fibers, fiber/epoxy strands and pressure vessels, together with consolidated data obtained at LLNL over the past 10 years. These data are interpreted using recent theoretical results from a micromechanical model of the statistical failure process, thereby gaining understanding of the roles of the epoxy matrix and ultraviolet radiation in long-term lifetime.

  3. Boyle temperature as a point of ideal gas in Gentile statistics and its economic interpretation

    NASA Astrophysics Data System (ADS)

    Maslov, V. P.; Maslova, T. V.

    2014-07-01

    Boyle temperature is interpreted as the temperature at which the formation of dimers becomes impossible. To Irving Fisher's correspondence principle we assign two more quantities: the number of degrees of freedom, and credit. We determine the danger level of the mass of money M when the mutual trust between economic agents begins to fall.

  4. Feature combination networks for the interpretation of statistical machine learning models: application to Ames mutagenicity

    PubMed Central

    2014-01-01

    Background A new algorithm has been developed to enable the interpretation of black box models. The developed algorithm is agnostic to the learning algorithm and open to all structure-based descriptors such as fragments, keys and hashed fingerprints. The algorithm has provided meaningful interpretation of Ames mutagenicity predictions from both random forest and support vector machine models built on a variety of structural fingerprints. A fragmentation algorithm is utilised to investigate the model's behaviour on specific substructures present in the query. An output is formulated summarising causes of activation and deactivation. The algorithm is able to identify multiple causes of activation or deactivation, in addition to identifying localised deactivations where the prediction for the query is active overall. No loss in performance is seen, as there is no change in the prediction; the interpretation is produced directly on the model's behaviour for the specific query. Results Models have been built using multiple learning algorithms, including support vector machine and random forest. The models were built on public Ames mutagenicity data, and a variety of fingerprint descriptors were used. These models produced good performance in both internal and external validation, with accuracies around 82%. The models were used to evaluate the interpretation algorithm. The resulting interpretations link closely with understood mechanisms for Ames mutagenicity. Conclusion This methodology allows for a greater utilisation of the predictions made by black box models and can expedite further study based on the output of a (quantitative) structure-activity model. Additionally, the algorithm could be utilised for chemical dataset investigation and knowledge extraction/human SAR development. PMID:24661325

  5. Statistics Translated: A Step-by-Step Guide to Analyzing and Interpreting Data

    ERIC Educational Resources Information Center

    Terrell, Steven R.

    2012-01-01

    Written in a humorous and encouraging style, this text shows how the most common statistical tools can be used to answer interesting real-world questions, presented as mysteries to be solved. Engaging research examples lead the reader through a series of six steps, from identifying a researchable problem to stating a hypothesis, identifying…

  7. Statistical interpretation of the correlation between intermediate mass fragment multiplicity and transverse energy

    NASA Astrophysics Data System (ADS)

    Phair, L.; Beaulieu, L.; Moretto, L. G.; Wozniak, G. J.; Bowman, D. R.; Carlin, N.; Celano, L.; Colonna, N.; Dinius, J. D.; Ferrero, A.; Gelbke, C. K.; Glasmacher, T.; Gramegna, F.; Handzy, D. O.; Hsi, W. C.; Huang, M. J.; Iori, I.; Kim, Y. D.; Lisa, M. A.; Lynch, W. G.; Margagliotti, G. V.; Mastinu, P. F.; Milazzo, P. M.; Montoya, C. P.; Moroni, A.; Peaslee, G. F.; Rui, R.; Schwarz, C.; Tsang, M. B.; Tso, K.; Vannini, G.; Zhu, F.

    1999-11-01

    Multifragment emission following 129Xe+197Au collisions at 30A, 40A, 50A, and 60A MeV has been studied with multidetector systems covering nearly 4π in solid angle. The correlations of both the intermediate mass fragment and light charged particle multiplicities with the transverse energy are explored. A comparison is made with results from the similar system 136Xe+209Bi at 28A MeV. The experimental trends are compared to statistical model predictions.

  8. Statistical Model for the Interpretation of Evidence for Bio-Signatures Simulated in virtual Mars Samples.

    NASA Astrophysics Data System (ADS)

    Mani, Peter; Heuer, Markus; Hofmann, Beda A.; Milliken, Kitty L.; West, Julia M.

    This paper evaluates a mathematical model of bio-signature search processes on Mars samples returned to Earth and studied inside a Mars Sample Return Facility (MSRF). A simple porosity model for a returned Mars sample, based on initial observations on Mars meteorites, has been stochastically simulated and the data analysed in a computer study. The resulting false positive, true negative and false negative values - a typical output of the simulations - were statistically analysed. The results were used in Bayes' statistics to correct the a priori probability of the presence of a bio-signature, and the resulting a posteriori probability was used in turn to improve the initial assumption about the presence of life forms in Mars material. Such an iterative algorithm can lead to a better estimate of the positive predictive value for life on Mars and therefore, together with Poisson statistics for a null result, it should be possible to bound the probability of the presence of extra-terrestrial bio-signatures to an upper level.
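
    The iterative Bayesian update described above can be sketched as follows. The prior, sensitivity, false-positive rate, and sequence of test results are all hypothetical placeholders, not values from the paper:

```python
def posterior(prior, sensitivity, false_positive_rate, observed_positive=True):
    """One application of Bayes' rule for a single bio-signature test.
    sensitivity = P(positive | life); false_positive_rate = P(positive | no life)."""
    if observed_positive:
        num = sensitivity * prior
        den = num + false_positive_rate * (1.0 - prior)
    else:
        num = (1.0 - sensitivity) * prior
        den = num + (1.0 - false_positive_rate) * (1.0 - prior)
    return num / den

# Iterative refinement: each analysed sample updates the prior.
p = 0.01  # hypothetical a priori probability of a bio-signature
for result in (True, True, False):
    p = posterior(p, sensitivity=0.8, false_positive_rate=0.1,
                  observed_positive=result)
```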

  9. Inhalation experiments with mixtures of hydrocarbons. Experimental design, statistics and interpretation of kinetics and possible interactions.

    PubMed

    Eide, I; Zahlsen, K

    1996-01-01

    The paper describes experimental and statistical methods for the toxicokinetic evaluation of mixtures in inhalation experiments. Synthetic mixtures of three C9 n-paraffinic, naphthenic and aromatic hydrocarbons (n-nonane, trimethylcyclohexane and trimethylbenzene, respectively) were studied in the rat after inhalation for 12 h. The hydrocarbons were mixed according to principles of statistical experimental design, using mixture design at four vapour levels (75, 150, 300 and 450 ppm) to support an empirical model with linear, interaction and quadratic terms (Taylor polynomial). Immediately after exposure, concentrations of hydrocarbons were measured by head space gas chromatography in blood, brain, liver, kidneys and perirenal fat. Multivariate data analysis and modelling were performed with PLS (projections to latent structures). The best models were obtained after removing all interaction terms, suggesting that there were no interactions between the hydrocarbons with respect to absorption and distribution. Uptake of paraffins and particularly aromatics is best described by quadratic models, whereas the uptake of the naphthenic hydrocarbons is nearly linear. All models are good, with high correlation (r2) and prediction properties (Q2), the latter after cross-validation. The concentrations of aromatics in blood were high compared with the other hydrocarbons. At concentrations below 250 ppm, the naphthene reached higher concentrations in the brain than the paraffin and the aromatic. Statistical experimental design, multivariate data analysis and modelling have proved useful for the evaluation of synthetic mixtures. The principles may also be used in the design of liquid mixtures, which may be evaporated partially or completely. PMID:8740533

  10. Logical, epistemological and statistical aspects of nature-nurture data interpretation.

    PubMed

    Kempthorne, O

    1978-03-01

    In this paper the nature of the reasoning processes applied to the nature-nurture question is discussed in general and with particular reference to mental and behavioral traits. The nature of data analysis and analysis of variance is discussed. Necessarily, the nature of causation is considered. The notion that mere data analysis can establish "real" causation is attacked. Logic of quantitative genetic theory is reviewed briefly. The idea that heritability is meaningful in the human mental and behavioral arena is attacked. The conclusion is that the heredity-IQ controversy has been a "tale full of sound and fury, signifying nothing". To suppose that one can establish effects of an intervention process when it does not occur in the data is plainly ludicrous. Mere observational studies can easily lead to stupidities, and it is suggested that this has happened in the heredity-IQ arena. The idea that there are racial-genetic differences in mental abilities and behavioral traits of humans is, at best, no more than idle speculation. PMID:637918

  11. Statistical Approaches to Interpretation of Local, Regional, and National Highway-Runoff and Urban-Stormwater Data

    USGS Publications Warehouse

    Tasker, Gary D.; Granato, Gregory E.

    2000-01-01

    Decision makers need viable methods for the interpretation of local, regional, and national highway-runoff and urban-stormwater data, including flows, concentrations and loads of chemical constituents and sediment, potential effects on receiving waters, and the potential effectiveness of various best management practices (BMPs). Valid (useful for intended purposes), current, and technically defensible stormwater-runoff models are needed to interpret data collected in field studies, to support existing highway and urban-runoff planning processes, to meet National Pollutant Discharge Elimination System (NPDES) requirements, and to provide methods for computation of Total Maximum Daily Loads (TMDLs) systematically and economically. Historically, conceptual, simulation, empirical, and statistical models of varying levels of detail, complexity, and uncertainty have been used to meet various data-quality objectives in the decision-making processes necessary for the planning, design, construction, and maintenance of highways and for other land-use applications. Water-quality simulation models attempt a detailed representation of the physical processes and mechanisms at a given site. Empirical and statistical regional water-quality assessment models provide a more general picture of water quality or changes in water quality over a region. All these modeling techniques share one common aspect: their predictive ability is poor without suitable site-specific data for calibration. To apply the correct model properly, one must understand the classification of variables, the unique characteristics of water-resources data, and the concept of population structure and analysis. Classifying the variables being used to analyze data may determine which statistical methods are appropriate for data analysis.
An understanding of the characteristics of water-resources data is necessary to evaluate the applicability of different statistical methods, to interpret the results of these techniques, and to use tools and techniques that account for the unique nature of water-resources data sets. Populations of data on stormwater-runoff quantity and quality are often best modeled as logarithmic transformations. Therefore, these factors need to be considered to form valid, current, and technically defensible stormwater-runoff models. Regression analysis is an accepted method for interpretation of water-resources data and for prediction of current or future conditions at sites that fit the input data model. Regression analysis is designed to provide an estimate of the average response of a system as it relates to variation in one or more known variables. To produce valid models, however, regression analysis should include visual analysis of scatterplots, an examination of the regression equation, evaluation of the method design assumptions, and regression diagnostics. A number of statistical techniques are described in the text and in the appendixes to provide the information necessary to interpret data by use of appropriate methods. Uncertainty is an important part of any decision-making process. In order to deal with uncertainty problems, the analyst needs to know the severity of the statistical uncertainty of the methods used to predict water quality. Statistical models need to be based on information that is meaningful, representative, complete, precise, accurate, and comparable to be deemed valid, up to date, and technically supportable. To assess uncertainty in the analytical tools, the modeling methods, and the underlying data set, all of these components need to be documented and communicated in an accessible format within project publications.
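
    Because stormwater quantity and quality data are often close to lognormal, the regression step discussed above is commonly done in log space and back-transformed, giving a power-law rating of the form load = c * flow^b. A minimal ordinary-least-squares sketch; the flow and load values are hypothetical:

```python
import math

def ols(x, y):
    """Simple least-squares fit y = a + b*x; returns (a, b)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) /
         sum((xi - mx) ** 2 for xi in x))
    return my - b * mx, b

# Hypothetical storm events: runoff volume and constituent load.
flows = [12.0, 45.0, 88.0, 150.0, 310.0]
loads = [3.1, 9.8, 17.0, 30.0, 64.0]

# Regress in log space, where the data are closer to normal.
a, b = ols([math.log(f) for f in flows], [math.log(l) for l in loads])

def predict(flow):
    """Back-transform to the power-law form load = exp(a) * flow ** b."""
    return math.exp(a) * flow ** b
```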

  12. Statistical and Methodological Considerations for the Interpretation of Intranasal Oxytocin Studies.

    PubMed

    Walum, Hasse; Waldman, Irwin D; Young, Larry J

    2016-02-01

    Over the last decade, oxytocin (OT) has received focus in numerous studies associating intranasal administration of this peptide with various aspects of human social behavior. These studies in humans are inspired by animal research, especially in rodents, showing that central manipulations of the OT system affect behavioral phenotypes related to social cognition, including parental behavior, social bonding, and individual recognition. Taken together, these studies in humans appear to provide compelling, but sometimes bewildering, evidence for the role of OT in influencing a vast array of complex social cognitive processes in humans. In this article, we investigate to what extent the human intranasal OT literature lends support to the hypothesis that intranasal OT consistently influences a wide spectrum of social behavior in humans. We do this by considering statistical features of studies within this field, including factors like statistical power, prestudy odds, and bias. Our conclusion is that intranasal OT studies are generally underpowered and that there is a high probability that most of the published intranasal OT findings do not represent true effects. Thus, the remarkable reports that intranasal OT influences a large number of human social behaviors should be viewed with healthy skepticism, and we make recommendations to improve the reliability of human OT studies in the future. PMID:26210057
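
    The link the authors draw between statistical power, prestudy odds, and the probability that a published finding is true can be sketched with the standard (bias-free) post-study probability calculation, PPV = power·R / (power·R + α). The power and prestudy-odds values below are illustrative, not estimates from the paper:

```python
def positive_predictive_value(power, prestudy_odds, alpha=0.05):
    """Probability that a statistically significant finding reflects
    a true effect, given power, prestudy odds R, and significance
    level alpha (Ioannidis-style calculation, no bias term)."""
    return (power * prestudy_odds) / (power * prestudy_odds + alpha)

# Underpowered study of an a-priori unlikely effect:
low = positive_predictive_value(power=0.2, prestudy_odds=0.1)
# Well-powered study of the same hypothesis:
high = positive_predictive_value(power=0.8, prestudy_odds=0.1)
# Even a "significant" result from the underpowered study is more
# likely false than true, which is the article's central concern.
```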

  13. Statistic-mathematical interpretation of some assessment parameters of the grassland ecosystem according to soil characteristics

    NASA Astrophysics Data System (ADS)

    Samfira, Ionel; Boldea, Marius; Popescu, Cosmin

    2012-09-01

    Significant parameters of permanent grasslands are represented by the pastoral value and Shannon and Simpson biodiversity indices. The dynamics of these parameters has been studied in several plant associations in Banat Plain, Romania. From the point of view of their typology, these permanent grasslands belong to the steppe area, series Festuca pseudovina, type Festuca pseudovina-Achilea millefolium, subtype Lolium perenne. The methods used for the purpose of this research included plant cover analysis (double meter method, calculation of Shannon and Simpson indices), and statistical methods of regression and correlation. The results show that, in the permanent grasslands in the plain region, when the pastoral value is average to low, the level of interspecific biodiversity is on the increase.
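
    The Shannon and Simpson biodiversity indices used as assessment parameters above are straightforward to compute from species abundance data; the cover counts below are hypothetical:

```python
import math

def shannon_index(counts):
    """Shannon diversity H' = -sum(p_i * ln p_i) over species proportions."""
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)

def simpson_index(counts):
    """Simpson diversity 1 - sum(p_i^2); higher means more diverse."""
    total = sum(counts)
    return 1.0 - sum((c / total) ** 2 for c in counts)

# Hypothetical species cover counts from a double-meter transect:
counts = [40, 25, 20, 10, 5]
h = shannon_index(counts)
d = simpson_index(counts)
```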

  14. Statistical interpretation of transient current power-law decay in colloidal quantum dot arrays

    NASA Astrophysics Data System (ADS)

    Sibatov, R. T.

    2011-08-01

    A new statistical model of the charge transport in colloidal quantum dot arrays is proposed. It takes into account Coulomb blockade forbidding multiple occupancy of nanocrystals and the influence of energetic disorder of interdot space. The model explains power-law current transients and the presence of the memory effect. The fractional differential analogue of the Ohm law is found phenomenologically for nanocrystal arrays. The model combines ideas that were considered as conflicting by other authors: the Scher-Montroll idea about the power-law distribution of waiting times in localized states for disordered semiconductors is applied taking into account Coulomb blockade; Novikov's condition about the asymptotic power-law distribution of time intervals between successful current pulses in conduction channels is fulfilled; and the carrier injection blocking predicted by Ginger and Greenham (2000 J. Appl. Phys. 87 1361) takes place.

  15. A statistical approach to the interpretation of aliphatic hydrocarbon distributions in marine sediments

    USGS Publications Warehouse

    Rapp, J.B.

    1991-01-01

    Q-mode factor analysis was used to quantitate the distribution of the major aliphatic hydrocarbon (n-alkanes, pristane, phytane) systems in sediments from a variety of marine environments. The compositions of the pure end members of the systems were obtained from factor scores and the distribution of the systems within each sample was obtained from factor loadings. All the data, from the diverse environments sampled (estuarine (San Francisco Bay), fresh-water (San Francisco Peninsula), polar-marine (Antarctica) and geothermal-marine (Gorda Ridge) sediments), were reduced to three major systems: a terrestrial system (mostly high molecular weight aliphatics with odd-numbered-carbon predominance), a mature system (mostly low molecular weight aliphatics without predominance) and a system containing mostly high molecular weight aliphatics with even-numbered-carbon predominance. With this statistical approach, it is possible to assign the percentage contribution from various sources to the observed distribution of aliphatic hydrocarbons in each sediment sample. © 1991.

  16. Interpreting CMB anisotropy observations: Trying to tell the truth with statistics

    NASA Astrophysics Data System (ADS)

    Gawiser, Eric

    2001-10-01

    A conflict has been reported between the baryon density inferred from deuterium observations and that found from recent CMB observations by BOOMERanG and MAXIMA. Despite the flurry of papers that attempt to resolve this conflict by adding new physics to the early universe, we will show that it can instead be resolved via a more careful usage of statistics. Indeed, the Bayesian analyses that produce this conflict are by their nature poorly suited for drawing this type of conclusion. A properly defined frequentist analysis can address this question directly and appears not to find a conflict. Finally, a conservative accounting of systematic uncertainties in measuring the deuterium abundance could reduce what is nominally a 3σ conflict to 1σ.

  17. A Comprehensive Statistically-Based Method to Interpret Real-Time Flowing Measurements

    SciTech Connect

    Keita Yoshioka; Pinan Dawkrajai; Analis A. Romero; Ding Zhu; A. D. Hill; Larry W. Lake

    2007-01-15

    With the recent development of temperature measurement systems, continuous temperature profiles can be obtained with high precision. Small temperature changes can be detected by modern temperature measuring instruments such as fiber optic distributed temperature sensor (DTS) in intelligent completions and will potentially aid the diagnosis of downhole flow conditions. In vertical wells, since elevational geothermal changes make the wellbore temperature sensitive to the amount and the type of fluids produced, temperature logs can be used successfully to diagnose the downhole flow conditions. However, geothermal temperature changes along the wellbore being small for horizontal wells, interpretations of a temperature log become difficult. The primary temperature differences for each phase (oil, water, and gas) are caused by frictional effects. Therefore, in developing a thermal model for horizontal wellbore, subtle temperature changes must be accounted for. In this project, we have rigorously derived governing equations for a producing horizontal wellbore and developed a prediction model of the temperature and pressure by coupling the wellbore and reservoir equations. Also, we applied Ramey's model (1962) to the build section and used an energy balance to infer the temperature profile at the junction. The multilateral wellbore temperature model was applied to a wide range of cases at varying fluid thermal properties, absolute values of temperature and pressure, geothermal gradients, flow rates from each lateral, and the trajectories of each build section. With the prediction models developed, we present inversion studies of synthetic and field examples. These results are essential to identify water or gas entry, to guide flow control devices in intelligent completions, and to decide if reservoir stimulation is needed in particular horizontal sections. This study will complete and validate these inversion studies.

  18. A Comprehensive Statistically-Based Method to Interpret Real-Time Flowing Measurements

    SciTech Connect

    Pinan Dawkrajai; Keita Yoshioka; Analis A. Romero; Ding Zhu; A.D. Hill; Larry W. Lake

    2005-10-01

    This project is motivated by the increasing use of distributed temperature sensors for real-time monitoring of complex wells (horizontal, multilateral and multi-branching wells) to infer the profiles of oil, gas, and water entry. The measured information can be used to interpret flow profiles along the wellbore, including the junction and build sections. In this second project year, we have completed a forward model to predict temperature and pressure profiles in complex wells. As a comprehensive temperature model, we have developed an analytical reservoir flow model that takes into account Joule-Thomson effects in the near-well vicinity, a multiphase non-isothermal producing-wellbore model, and a coupling of the two that accounts for mass and heat transfer between them. For further inferences, such as water coning or gas evaporation, we will need a numerical non-isothermal reservoir simulator; unlike existing (thermal recovery, geothermal) simulators, it must capture the subtle temperature changes that occur during normal production. We show results from the analytical coupled model (analytical reservoir solution coupled with a numerical multi-segment well model) used to infer anomalous temperature or pressure profiles under various conditions, and preliminary results from the numerical coupled reservoir model, which solves the full matrix including wellbore grids. We applied Ramey's model to the build section and used an enthalpy balance to infer the temperature profile at the junction. The multilateral wellbore temperature model was applied to a wide range of cases with varying fluid thermal properties, absolute values of temperature and pressure, geothermal gradients, flow rates from each lateral, and trajectories of each build section.

  19. A COMPREHENSIVE STATISTICALLY-BASED METHOD TO INTERPRET REAL-TIME FLOWING MEASUREMENTS

    SciTech Connect

    Pinan Dawkrajai; Analis A. Romero; Keita Yoshioka; Ding Zhu; A.D. Hill; Larry W. Lake

    2004-10-01

    In this project, we are developing new methods for interpreting measurements in complex wells (horizontal, multilateral and multi-branching wells) to determine the profiles of oil, gas, and water entry. These methods are needed to take full advantage of "smart" well instrumentation, a technology that is rapidly evolving to provide the ability to continuously and permanently monitor downhole temperature, pressure, volumetric flow rate, and perhaps other fluid flow properties at many locations along a wellbore; and hence, to control and optimize well performance. In this first year, we have made considerable progress in the development of the forward model of temperature and pressure behavior in complex wells. In this period, we have progressed on three major parts of the forward problem of predicting the temperature and pressure behavior in complex wells. These three parts are the temperature and pressure behaviors in the reservoir near the wellbore, in the wellbore or laterals in the producing intervals, and in the build sections connecting the laterals, respectively. Many models exist to predict pressure behavior in reservoirs and wells, but these are almost always isothermal models. To predict temperature behavior we derived general mass, momentum, and energy balance equations for these parts of the complex well system. Analytical solutions for the reservoir and wellbore parts for certain special conditions show the magnitude of thermal effects that could occur. Our preliminary sensitivity analyses show that thermal effects caused by near-wellbore reservoir flow can cause temperature changes that are measurable with smart well technology. This is encouraging for the further development of the inverse model.

  20. Adsorption of ethanol onto activated carbon: Modeling and consequent interpretations based on statistical physics treatment

    NASA Astrophysics Data System (ADS)

    Bouzid, Mohamed; Sellaoui, Lotfi; Khalfaoui, Mohamed; Belmabrouk, Hafedh; Lamine, Abdelmottaleb Ben

    2016-02-01

    In this work, we studied the adsorption of ethanol on three types of activated carbon, namely the parent Maxsorb III and two chemically modified activated carbons (H2-Maxsorb III and KOH-H2-Maxsorb III). This investigation was conducted on the basis of the grand canonical formalism of statistical physics and on simplified assumptions, leading to three-parameter equations describing the adsorption of ethanol onto the three types of activated carbon. There was a good correlation between the experimental data and the results obtained with the newly proposed equation. The parameters characterizing the adsorption isotherm were the number of adsorbed molecules per site n, the density of receptor sites per unit mass of adsorbent Nm, and the energetic parameter p1/2. They were estimated for the studied systems by nonlinear least-squares regression. The results show that the ethanol molecules were adsorbed in a perpendicular (or at least non-parallel) position with respect to the adsorbent surface. The magnitude of the calculated adsorption energies reveals that ethanol is physisorbed onto activated carbon, with both van der Waals and hydrogen interactions involved in the adsorption process. The calculated values of the specific surface area AS prove that the three types of activated carbon have a highly microporous surface.
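
    The abstract does not write out the three-parameter equation; a common closed form obtained from the grand canonical monolayer treatment is Q(p) = n·Nm / (1 + (p1/2/p)^n). The sketch below, using synthetic data rather than the paper's measurements, shows how such a model can be fitted by nonlinear least-squares regression; all numerical values are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import curve_fit

def monolayer_model(p, n, Nm, p_half):
    """Three-parameter monolayer isotherm of the kind obtained from the
    grand canonical formalism: Q(p) = n*Nm / (1 + (p_half/p)**n)."""
    return n * Nm / (1.0 + (p_half / p) ** n)

# Synthetic "experimental" isotherm; all values are illustrative only.
rng = np.random.default_rng(0)
p = np.linspace(0.5, 40.0, 25)                 # pressure (arbitrary units)
q_true = monolayer_model(p, 2.0, 5.0, 8.0)     # n=2, Nm=5, p1/2=8
q_obs = q_true + rng.normal(0.0, 0.05, p.size)

# Nonlinear least-squares estimation of (n, Nm, p1/2).
popt, _ = curve_fit(monolayer_model, p, q_obs, p0=[1.0, 1.0, 1.0])
n_fit, Nm_fit, p_half_fit = popt
```

The fitted p1/2 locates the half-saturation pressure, and parameter uncertainties could be read off the covariance matrix that curve_fit also returns.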

  1. Hysteresis model and statistical interpretation of energy losses in non-oriented steels

    NASA Astrophysics Data System (ADS)

    Mănescu (Păltânea), Veronica; Păltânea, Gheorghe; Gavrilă, Horia

    2016-04-01

    In this paper the hysteresis energy losses in two non-oriented industrial steels (M400-65A and M800-65A) were determined by means of an efficient classical Preisach model, based on the Pescetti-Biorci method for identifying the Preisach density. The excess and total energy losses were also determined, using a statistical framework based on magnetic object theory. The hysteresis energy losses in a non-oriented steel alloy depend on the peak magnetic polarization, and they can be computed using a Preisach model because in these materials there is a direct link between the elementary rectangular loops and the discontinuous character of the magnetization process (Barkhausen jumps). To determine the Preisach density it was necessary to measure the normal magnetization curve and the saturation hysteresis cycle. A system of equations was deduced and the Preisach density was calculated for a magnetic polarization of 1.5 T; the hysteresis cycle was then reconstructed. Using the same Preisach distribution, the hysteresis cycle for 1 T was computed. The classical losses were calculated using a well-known formula, and the excess energy losses were determined by means of magnetic object theory. The total energy losses were mathematically reconstructed and compared with those measured experimentally.
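
    As a minimal illustration of the classical Preisach construction (not the authors' implementation, and with a uniform density in place of one identified by the Pescetti-Biorci method), the sketch below discretizes the Preisach triangle into rectangular hysterons and plays a field history through them:

```python
import numpy as np

def preisach_magnetization(h_history, h_sat=1.0, n=60):
    """Minimal scalar Preisach model: rectangular hysterons on the
    triangle alpha >= beta with a uniform Preisach density. Returns the
    normalized magnetization after applying the field history."""
    alpha = np.linspace(-h_sat, h_sat, n)   # up-switching thresholds
    beta = np.linspace(-h_sat, h_sat, n)    # down-switching thresholds
    A, B = np.meshgrid(alpha, beta, indexing="ij")
    valid = A >= B                          # Preisach triangle
    state = np.where(valid, -1.0, 0.0)      # start at negative saturation
    for h in h_history:
        state[valid & (A <= h)] = 1.0       # switch up once h exceeds alpha
        state[valid & (B >= h)] = -1.0      # switch down once h drops to beta
    return state[valid].sum() / valid.sum()

m_up = preisach_magnetization([1.0])          # driven to positive saturation
m_down = preisach_magnetization([1.0, -1.0])  # then to negative saturation
```

Sweeping the field history over a full cycle traces out a hysteresis loop; replacing the uniform weight with a density identified from the measured normal magnetization curve and saturation cycle would reproduce the procedure described in the abstract.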

  2. Chemical and statistical interpretation of sized aerosol particles collected at an urban site in Thessaloniki, Greece.

    PubMed

    Tsitouridou, Roxani; Papazova, Petia; Simeonova, Pavlina; Simeonov, Vasil

    2013-01-01

    The size distribution of aerosol particles (PM0.015-PM18) in relation to their soluble inorganic species and total water soluble organic compounds (WSOC) was investigated at an urban site of Thessaloniki, Northern Greece. The sampling period was from February to July 2007. The determined compounds were compared with mass concentrations of the PM fractions for nano (N: 0.015 < Dp < 0.06), ultrafine (UFP: 0.015 < Dp < 0.125), fine (FP: 0.015 < Dp < 2.0) and coarse particles (CP: 2.0 < Dp < 8.0) in order to perform mass closure of the water soluble content for the respective fractions. Electrolytes were the dominant species in all fractions (24-27%), followed by WSOC (16-23%). The water soluble inorganic and organic content was found to account for 53% of the nanoparticle, 48% of the ultrafine particle, 45% of the fine particle and 44% of the coarse particle mass. Correlations between the analyzed species were performed and the effect of local and long-range transported emissions was examined by wind direction and backward air mass trajectories. Multivariate statistical analysis (cluster analysis and principal components analysis) of the collected data was performed in order to reveal the specific data structure. Possible sources of air pollution were identified and an attempt is made to find patterns of similarity between the different sized aerosols and the seasons of monitoring. It was proven that several major latent factors are responsible for the data structure despite the size of the aerosols - mineral (soil) dust, sea sprays, secondary emissions, combustion sources and industrial impact. The seasonal separation proved to be not very specific. PMID:24007436

  3. An Easily Constructed Trigonal Prism Model.

    ERIC Educational Resources Information Center

    Yamana, Shukichi

    1984-01-01

    A model of a trigonal prism which is useful for teaching stereochemistry (especially of the neodymium enneahydrate ion), can be made easily by using a sealed, empty envelope. The steps necessary to accomplish this task are presented. (JN)

  4. Improved solderless connector is easily disconnected

    NASA Technical Reports Server (NTRS)

    1965-01-01

    Compression-type solderless connector is easily disconnected and reassembled and resists vibration. The connector, which uses a tapered split sleeve that is tightened by a nut into a mating lug, is used in place of standard solder lugs and to connect unsolderable wire.

  5. Geovisual analytics to enhance spatial scan statistic interpretation: an analysis of U.S. cervical cancer mortality

    PubMed Central

    Chen, Jin; Roth, Robert E; Naito, Adam T; Lengerich, Eugene J; MacEachren, Alan M

    2008-01-01

    Background Kulldorff's spatial scan statistic and its software implementation – SaTScan – are widely used for detecting and evaluating geographic clusters. However, two issues make using the method and interpreting its results non-trivial: (1) the method lacks cartographic support for understanding the clusters in geographic context and (2) results from the method are sensitive to parameter choices related to cluster scaling (abbreviated as scaling parameters), but the system provides no direct support for making these choices. We employ both established and novel geovisual analytics methods to address these issues and to enhance the interpretation of SaTScan results. We demonstrate our geovisual analytics approach in a case study analysis of cervical cancer mortality in the U.S. Results We address the first issue by providing an interactive visual interface to support the interpretation of SaTScan results. Our research to address the second issue prompted a broader discussion about the sensitivity of SaTScan results to parameter choices. Sensitivity has two components: (1) the method can identify clusters that, while being statistically significant, have heterogeneous contents comprised of both high-risk and low-risk locations and (2) the method can identify clusters that are unstable in location and size as the spatial scan scaling parameter is varied. To investigate cluster result stability, we conducted multiple SaTScan runs with systematically selected parameters. The results, when scanning a large spatial dataset (e.g., U.S. data aggregated by county), demonstrate that no single spatial scan scaling value is optimal for identifying clusters that exist at different scales; instead, multiple scans that vary the parameters are necessary. We introduce a novel method of measuring and visualizing reliability that facilitates identification of homogeneous clusters that are stable across analysis scales. Finally, we propose a logical approach to proceed through the analysis of SaTScan results. Conclusion The geovisual analytics approach described in this manuscript facilitates the interpretation of spatial cluster detection methods by providing cartographic representation of SaTScan results and by providing visualization methods and tools that support selection of SaTScan parameters. Our methods distinguish between heterogeneous and homogeneous clusters and assess the stability of clusters across analytic scales. Method We analyzed cervical cancer mortality data for the United States aggregated by county between 2000 and 2004. We ran SaTScan on the dataset fifty times with different parameter choices. Our geovisual analytics approach couples SaTScan with our visual analytic platform, allowing users to interactively explore and compare SaTScan results produced by different parameter choices. The Standardized Mortality Ratio and reliability scores are visualized for all the counties to identify stable, homogeneous clusters. We evaluated our analysis result by comparing it to that produced by other independent techniques including the Empirical Bayes Smoothing and Kafadar spatial smoother methods. The geovisual analytics approach introduced here is developed and implemented in our Java-based Visual Inquiry Toolkit. PMID:18992163
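
    The abstract does not give the reliability measure in closed form; one plausible reading, used in the sketch below with hypothetical cluster flags, is the fraction of parameterized scan runs in which a location falls inside a statistically significant cluster:

```python
import numpy as np

def reliability_scores(cluster_flags):
    """cluster_flags: (n_runs, n_locations) boolean array, True where a
    location fell inside a statistically significant cluster in that run.
    Returns, per location, the fraction of runs that flagged it."""
    flags = np.asarray(cluster_flags, dtype=float)
    return flags.mean(axis=0)

# Toy example: 4 scan runs over 3 counties (hypothetical flags).
flags = [[True, False, False],
         [True, True, False],
         [True, False, False],
         [True, True, False]]
scores = reliability_scores(flags)   # county 0 is flagged in every run
```

Locations with scores near 1 would correspond to the stable, homogeneous clusters the authors seek; scores near 0.5 indicate clusters whose membership depends on the scaling parameter.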

  6. Statistics

    Cancer.gov

    Links to sources of cancer-related statistics, including the Surveillance, Epidemiology and End Results (SEER) Program, SEER-Medicare datasets, cancer survivor prevalence data, and the Cancer Trends Progress Report.

  7. Acquire Commodities Easily Card

    Energy Science and Technology Software Center (ESTSC)

    1998-05-29

    Acquire Commodities Easily Card (AceCard) provides an automated end-user method to distribute company credit card charges to internal charge numbers. AceCard will allow cardholders to record card purchases in an on-line order log, enter multiple account distributions per order that can be posted to the General Ledger, track orders, and receipt information, and provide a variety of cardholder and administrative reports. Please note: Customers must contact Ed Soler (423)-576-6151, Lockheed Martin Energy Systems, for help with the installation of the package. The fee for this installation help will be coordinated by the customer and Lockheed Martin and is in addition to cost of the package from ESTSC. Customers should contact Sandy Presley (423)-576-4708 for user help.

  8. Acquire Commodities Easily Card

    SciTech Connect

    Soler, E. E.

    1998-05-29

    Acquire Commodities Easily Card (AceCard) provides an automated end-user method to distribute company credit card charges to internal charge numbers. AceCard will allow cardholders to record card purchases in an on-line order log, enter multiple account distributions per order that can be posted to the General Ledger, track orders, and receipt information, and provide a variety of cardholder and administrative reports. Please note: Customers must contact Ed Soler (423)-576-6151, Lockheed Martin Energy Systems, for help with the installation of the package. The fee for this installation help will be coordinated by the customer and Lockheed Martin and is in addition to cost of the package from ESTSC. Customers should contact Sandy Presley (423)-576-4708 for user help.

  9. ACECARD. Acquire Commodities Easily Card

    SciTech Connect

    Soler, E.E.

    1996-09-01

    Acquire Commodities Easily Card (AceCard) provides an automated end-user method to distribute company credit card charges to internal charge numbers. AceCard will allow cardholders to record card purchases in an on-line order log, enter multiple account distributions per order that can be posted to the General Ledger, track orders, and receipt information, and provide a variety of cardholder and administrative reports. Please note: Customers must contact Ed Soler (423)-576-6151, Lockheed Martin Energy Systems, for help with the installation of the package. The fee for this installation help will be coordinated by the customer and Lockheed Martin and is in addition to cost of the package from ESTSC. Customers should contact Sandy Presley (423)-576-4708 for user help.

  10. Statistical interpretation of the rate of carbon isotope changes at the onset of the Paleocene-Eocene Thermal Maximum

    NASA Astrophysics Data System (ADS)

    Urban, N. M.; Bralower, T. J.; Keller, K.; Kump, L. R.

    2009-12-01

    Abrupt warming during the Paleocene-Eocene Thermal Maximum (PETM) 55 Ma event was driven by major input of greenhouse gas. Primary evidence for this is a sharp 4-5 per mil decrease in carbon isotope values at the onset of the event. Interpretation of the dynamics of the warming and identification of the ultimate source of the carbon relies on precise estimates of the rate of carbon addition at the onset of the event. A step toward this goal is to determine the rate of change of carbon isotope values in the major PETM sections. Although terrestrial and continental shelf PETM records are undoubtedly more stratigraphically expanded, deep-sea records provide more precise time control. Key deep-sea sections have been studied at high levels of resolution. However, their stratigraphy is complicated by condensed sections or possible unconformities at the base of the PETM. As a result, many PETM records are characterized by sizeable variation in sample spacing in terms of depth and age. We have developed a Bayesian inversion technique that accounts for the effects of variable sample spacing and uncertainties about the statistical variability of the isotope excursion and the termination of the PETM interval. We apply this technique using extraterrestrial helium and orbital age models to deep sea PETM carbon isotope data (Ocean Drilling Program Site 690, Maud Rise, Southern Ocean and Site 1262, Walvis Ridge).We use this technique to place probabilistic limits on the rate of carbon isotope change during the PETM. We compare these rates with modern rates of carbon isotopic change.

  11. Compositionality and Statistics in Adjective Acquisition: 4-Year-Olds Interpret "Tall" and "Short" Based on the Size Distributions of Novel Noun Referents

    ERIC Educational Resources Information Center

    Barner, David; Snedeker, Jesse

    2008-01-01

    Four experiments investigated 4-year-olds' understanding of adjective-noun compositionality and their sensitivity to statistics when interpreting scalar adjectives. In Experiments 1 and 2, children selected "tall" and "short" items from 9 novel objects called "pimwits" (1-9 in. in height) or from this array plus 4 taller or shorter distractor…

  13. Easily available enzymes as natural retting agents.

    PubMed

    Antonov, Viktor; Marek, Jan; Bjelkova, Marie; Smirous, Prokop; Fischer, Holger

    2007-03-01

    Easily available commercial enzymes currently have great potential in bast fibre processing and can be adapted for different end uses. There are several new enzyme-based technologies that are able to modify fibre parameters, achieve requested properties, improve processing results, and are more environmentally benign in bast fibre processing and fabric finishing. Enzymatic methods for retting of flax, "cottonisation" of bast fibres, hemp separation, processing of flax rovings before wet spinning, etc., fall into this group of new technologies. Such enzymatic biotechnologies can provide benefits in textile, composite, reinforced-plastic and other technical applications. Laboratory, pilot and industrial scale results and experiences have demonstrated the ability of selected enzymes to decompose the interfibre bonding layers based on pectin, lignin and hemicelluloses. Texazym SER spray is able to increase flax long-fibre yields by more than 40%. Other enzymes in combination with mild mechanical treatment can replace aggressive and energy-intensive processing such as Laroche "cottonisation". Texazym SCW and DLG pretreatments of flax rovings are presented. PMID:17309044

  14. Quantum of area ΔA = 8πl_P² and a statistical interpretation of black hole entropy

    SciTech Connect

    Ropotenko, Kostiantyn

    2010-08-15

    In contrast to alternative values, the quantum of area ΔA = 8πl_P² does not follow from the usual statistical interpretation of black hole entropy; on the contrary, a statistical interpretation follows from it. This interpretation is based on two concepts: nonadditivity of black hole entropy and Landau quantization. Using nonadditivity, a microcanonical distribution for a black hole is found and it is shown that the statistical weight of a black hole should be proportional to its area. By analogy with conventional Landau quantization, it is shown that quantization of a black hole is nothing but Landau quantization. The Landau levels of a black hole and their degeneracy are found. The degree of degeneracy is equal to the number of ways to distribute a patch of area 8πl_P² over the horizon. Taking into account these results, it is argued that the black hole entropy should be of the form S_bh = 2π·ΔΓ, where the number of microstates is ΔΓ = A/(8πl_P²). The nature of the degrees of freedom responsible for black hole entropy is elucidated. Applications of the new interpretation are presented. The effect of noncommuting coordinates is discussed.
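
    Written out, the quoted formulas are ΔA = 8πl_P² for the area quantum and S_bh = 2π·ΔΓ with ΔΓ = A/(8πl_P²). As a one-line consistency check (not part of the abstract itself), combining them recovers the Bekenstein-Hawking entropy:

```latex
S_{\mathrm{bh}} \;=\; 2\pi\,\Delta\Gamma
             \;=\; 2\pi\cdot\frac{A}{8\pi l_P^{2}}
             \;=\; \frac{A}{4\,l_P^{2}}
```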

  15. A Note on the Calculation and Interpretation of the Delta-p Statistic for Categorical Independent Variables

    ERIC Educational Resources Information Center

    Cruce, Ty M.

    2009-01-01

    This methodological note illustrates how a commonly used calculation of the Delta-p statistic is inappropriate for categorical independent variables, and this note provides users of logistic regression with a revised calculation of the Delta-p statistic that is more meaningful when studying the differences in the predicted probability of an…
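
    As an illustration of the Delta-p idea for a dummy-coded categorical predictor (the note's exact revised formula is not reproduced in this abstract), the sketch below evaluates the change in predicted probability when the dummy switches from the reference group to the group of interest, holding the other predictors at their sample means; all coefficient values are hypothetical.

```python
import math

def delta_p(intercept, coefs, means, cat_index):
    """Delta-p for a dummy-coded categorical predictor: difference in
    predicted probability between the group of interest (dummy = 1) and
    the reference group (dummy = 0), other predictors held at their means."""
    def prob(x):
        logit = intercept + sum(b * v for b, v in zip(coefs, x))
        return 1.0 / (1.0 + math.exp(-logit))
    base = list(means)
    base[cat_index] = 0.0          # reference group
    comp = list(base)
    comp[cat_index] = 1.0          # group of interest
    return prob(comp) - prob(base)

# Hypothetical fitted logistic model: intercept -1.0, dummy coefficient
# 0.8, and one continuous covariate (coefficient 0.3, sample mean 2.0).
dp = delta_p(-1.0, [0.8, 0.3], [0.0, 2.0], cat_index=0)
```

Evaluating both probabilities at the covariate means, rather than applying the coefficient to an overall base rate, is what keeps the statistic meaningful for categorical predictors.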

  16. On Item Mappings and Statistical Rules for Selecting Binary Items for Criterion-Referenced Interpretation and Bookmark Standard Settings.

    ERIC Educational Resources Information Center

    Huynh, Huynh

    Item mappings are widely used in educational assessment for applications such as test administration (through test form assembly and computer assisted testing) and for criterion-referenced (CR) interpretation of test scores or scale anchoring. Item mappings are also used to construct ordered item booklets in the CTB/McGraw Hill Bookmark standard…

  17. The use of easily debondable orthodontic adhesives with ceramic brackets.

    PubMed

    Ryu, Chiyako; Namura, Yasuhiro; Tsuruoka, Takashi; Hama, Tomohiko; Kaji, Kaori; Shimizu, Noriyoshi

    2011-01-01

    We experimentally produced an easily debondable orthodontic adhesive (EDA) containing heat-expandable microcapsules. The purpose of this in vitro study was to determine the optimal debonding conditions when EDA is used with ceramic brackets. Shear bond strengths were measured before and after heating and were compared statistically. Temperatures of the bracket base and pulp wall were also monitored during heating. Bond strengths of EDA containing 30 wt% and 40 wt% heat-expandable microcapsules were 13.4 and 12.9 MPa, respectively, and decreased significantly to 3.8 and 3.7 MPa, respectively, after heating. The temperature of the pulp wall increased by 1.8-3.6°C during heating, less than the rise required to induce pulp damage. Based on these results, we conclude that heating for 8 s while debonding ceramic brackets bonded with EDA containing 40 wt% heat-expandable microcapsules is the most effective and safest method for the enamel and pulp. PMID:21946484

  18. Statistical factor analysis technique for characterizing basalt through interpreting nuclear and electrical well logging data (case study from Southern Syria).

    PubMed

    Asfahani, Jamal

    2014-02-01

    A factor analysis technique is proposed in this research for interpreting the combination of nuclear well logs (natural gamma ray, density, and neutron porosity) and electrical well logs (long and short normal), in order to characterize the large extended basaltic areas in southern Syria. Kodana well logging data are used for testing and applying the proposed technique. The four resulting score logs make it possible to establish the lithological score cross-section of the studied well. The established cross-section clearly shows the distribution and identification of four kinds of basalt: hard massive basalt, hard basalt, pyroclastic basalt, and clay, the alteration product of basalt. The factor analysis technique is successfully applied to the Kodana well logging data in southern Syria, and can be used efficiently when several wells and large volumes of well logging data with a high number of variables must be interpreted. PMID:24296157

  19. Collegiate Enrollments in the U.S., 1979-80. Statistics, Interpretations, and Trends in 4-Year and Related Institutions.

    ERIC Educational Resources Information Center

    Mickler, J. Ernest

    This 60th annual report on collegiate enrollments in the United States is based on data received from 1,635 four-year institutions in the U.S., Puerto Rico, and the U.S. Territories. General notes, survey methodology notes, and a summary of findings are presented. Detailed statistical charts present institutional data on men and women students and…

  20. Analysis of the procedures used to evaluate suicide crime scenes in Brazil: a statistical approach to interpret reports.

    PubMed

    Bruni, Aline Thaís; Velho, Jesus Antonio; Ferreira, Arthur Serra Lopes; Tasso, Maria Júlia; Ferrari, Raíssa Santos; Yoshida, Ricardo Luís; Dias, Marcos Salvador; Leite, Vitor Barbanti Pereira

    2014-08-01

    This study uses statistical techniques to evaluate reports on suicide scenes; it utilizes 80 reports from different locations in Brazil, randomly collected from both federal and state jurisdictions. We aimed to assess a heterogeneous group of cases in order to obtain an overall perspective of the problem. We evaluated variables regarding the characteristics of the crime scene, such as the traces detected (blood, instruments, and clothes), and we addressed the methodology employed by the experts. A qualitative approach using basic statistics revealed a wide distribution in how the issue was addressed in the documents. We examined a quantitative approach involving an empirical equation, and we used multivariate procedures to validate the proposed quantitative methodology. The methodology successfully identified the main differences in the information presented in the reports, showing that there is no standardized method of analyzing evidence. PMID:25066170

  1. Skew-Laplace and Cell-Size Distribution in Microbial Axenic Cultures: Statistical Assessment and Biological Interpretation

    PubMed Central

    Julià, Olga; Vidal-Mas, Jaume; Panikov, Nicolai S.; Vives-Rego, Josep

    2010-01-01

    We report a skew-Laplace statistical analysis of both flow cytometry scatters and cell size from microbial strains primarily grown in batch cultures, others in chemostat cultures and bacterial aquatic populations. Cytometry scatters best fit the skew-Laplace distribution while cell size as assessed by an electronic particle analyzer exhibited a moderate fitting. Unlike the cultures, the aquatic bacterial communities clearly do not fit to a skew-Laplace distribution. Due to its versatile nature, the skew-Laplace distribution approach offers an easy, efficient, and powerful tool for distribution of frequency analysis in tandem with the flow cytometric cell sorting. PMID:20592754
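
    The skew-Laplace density used in such analyses is commonly written as f(x) = e^{(x-μ)/a}/(a+b) for x < μ and e^{-(x-μ)/b}/(a+b) otherwise. Assuming that form, the following sketch fits the three parameters by maximum likelihood on synthetic data (not the cytometry data of the study):

```python
import numpy as np
from scipy.optimize import minimize

def neg_loglik(params, x):
    """Negative log-likelihood of the skew-Laplace density
    f(x) = exp((x-mu)/a)/(a+b) for x < mu, exp(-(x-mu)/b)/(a+b) otherwise."""
    mu, log_a, log_b = params
    a, b = np.exp(log_a), np.exp(log_b)   # keep both scales positive
    z = x - mu
    ll = np.where(z < 0, z / a, -z / b) - np.log(a + b)
    return -ll.sum()

# Synthetic skewed sample: X = mu + b*E1 - a*E2 with E1, E2 ~ Exp(1)
# draws from exactly this skew-Laplace law (mu=2, a=0.5, b=1.5).
rng = np.random.default_rng(1)
mu, a, b = 2.0, 0.5, 1.5
x = mu + b * rng.exponential(size=4000) - a * rng.exponential(size=4000)

res = minimize(neg_loglik, x0=[np.median(x), 0.0, 0.0],
               args=(x,), method="Nelder-Mead")
mu_hat, a_hat, b_hat = res.x[0], np.exp(res.x[1]), np.exp(res.x[2])
```

The fitted a and b quantify the left and right tails separately, which is what makes the distribution suitable for the asymmetric scatter signals described in the abstract.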

  3. Compositionality and statistics in adjective acquisition: 4-year-olds interpret tall and short based on the size distributions of novel noun referents.

    PubMed

    Barner, David; Snedeker, Jesse

    2008-01-01

    Four experiments investigated 4-year-olds' understanding of adjective-noun compositionality and their sensitivity to statistics when interpreting scalar adjectives. In Experiments 1 and 2, children selected tall and short items from 9 novel objects called pimwits (1-9 in. in height) or from this array plus 4 taller or shorter distractor objects of the same kind. Changing the height distributions of the sets shifted children's tall and short judgments. However, when distractors differed in name and surface features from targets, in Experiment 3, judgments did not shift. In Experiment 4, dissimilar distractors did affect judgments when they received the same name as targets. It is concluded that 4-year-olds deploy a compositional semantics that is sensitive to statistics and mediated by linguistic labels. PMID:18489415

  4. Statistical physics modeling of hydrogen desorption from LaNi4.75Fe0.25: Stereographic and energetic interpretations

    NASA Astrophysics Data System (ADS)

    Wjihi, Sarra; Dhaou, Houcine; Yahia, Manel Ben; Knani, Salah; Jemni, Abdelmajid; Lamine, Abdelmottaleb Ben

    2015-12-01

    Statistical physics treatment is used to study the desorption of hydrogen from LaNi4.75Fe0.25, in order to obtain new physicochemical interpretations at the molecular level. Experimental desorption isotherms of hydrogen on LaNi4.75Fe0.25 are fitted at three temperatures (293 K, 303 K and 313 K), using a monolayer desorption model. Six parameters of the model are fitted, namely the numbers of molecules per site nα and nβ, the receptor site densities NαM and NβM, and the energetic parameters Pα and Pβ. The behaviors of these parameters are discussed in relation to the desorption process. A dynamic study of the α and β phases in the desorption process was then carried out. Finally, the different thermodynamic potential functions are derived by statistical physics calculations from the adopted model.

  5. Making the right conclusions based on wrong results and small sample sizes: interpretation of statistical tests in ecotoxicology.

    PubMed

    Wang, Magnus; Riffel, Michael

    2011-05-01

    In environmental risk assessments, statistical tests are a standard tool for evaluating the significance of pesticide effects. While it has rarely been assessed how likely it is to detect effects given a specific sample size, it has never been analysed how reliable results are if the test preconditions, particularly of parametric tests, are not fulfilled, or how likely it is that deviations from these preconditions will be detected. We therefore analyse the performance of a parametric and a non-parametric test using Monte Carlo simulation, focussing on typical data used in ecotoxicological risk assessments. We show that none of the data distributions are normal and that for typical sample sizes of N<20 it is very unlikely to detect deviations from normality. Non-parametric tests performed markedly better than parametric tests, except when data were in fact normally distributed. We finally discuss the impact of using different tests on pesticide risk assessments. PMID:21035855
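
    A minimal version of such a Monte Carlo comparison might look as follows; the lognormal endpoint, the sample size, and the effect size are illustrative assumptions, not the authors' exact simulation design:

```python
import numpy as np
from scipy import stats

def rejection_rates(n=10, n_sim=2000, effect=0.0, seed=0):
    """Monte Carlo comparison of a parametric (Welch t) and a
    non-parametric (Mann-Whitney U) test on skewed, non-normal data of
    the kind common for ecotoxicological endpoints. `effect` shifts the
    log-mean of the treated group."""
    rng = np.random.default_rng(seed)
    t_rej = u_rej = 0
    for _ in range(n_sim):
        control = rng.lognormal(mean=0.0, sigma=1.0, size=n)
        treated = rng.lognormal(mean=-effect, sigma=1.0, size=n)
        t_rej += stats.ttest_ind(control, treated,
                                 equal_var=False).pvalue < 0.05
        u_rej += stats.mannwhitneyu(control, treated,
                                    alternative="two-sided").pvalue < 0.05
    return t_rej / n_sim, u_rej / n_sim

t_alpha, u_alpha = rejection_rates(effect=0.0)   # type I error under H0
t_power, u_power = rejection_rates(effect=1.0)   # power under a true effect
```

Under the null both tests should reject near the nominal 5% level, while under the skewed alternative the non-parametric test typically rejects more often, matching the abstract's conclusion.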

  6. Chemical data and statistical interpretations for rocks and ores from the Ranger uranium mine, Northern Territory, Australia

    USGS Publications Warehouse

    Nash, J. Thomas; Frishman, David

    1983-01-01

    Analytical results for 61 elements in 370 samples from the Ranger Mine area are reported. Most of the rocks come from drill core in the Ranger No. 1 and Ranger No. 3 deposits, but 20 samples are from unmineralized drill core more than 1 km from ore. Statistical tests show that the elements Mg, Fe, F, Be, Co, Li, Ni, Pb, Sc, Th, Ti, V, Cl, As, Br, Au, Ce, Dy, La, Sc, Eu, Tb, Yb, and Tb have positive association with uranium, and Si, Ca, Na, K, Sr, Ba, Ce, and Cs have negative association. For most lithologic subsets Mg, Fe, Li, Cr, Ni, Pb, V, Y, Sm, Sc, Eu, and Yb are significantly enriched in ore-bearing rocks, whereas Ca, Na, K, Sr, Ba, Mn, Ce, and Cs are significantly depleted. These results are consistent with petrographic observations on altered rocks. Lithogeochemistry can aid exploration, but for these rocks requires methods that are expensive and not amenable to routine use.

  7. Hydrochemical and multivariate statistical interpretations of spatial controls of nitrate concentrations in a shallow alluvial aquifer around oxbow lakes (Osong area, central Korea).

    PubMed

    Kim, Kyoung-Ho; Yun, Seong-Taek; Choi, Byoung-Young; Chae, Gi-Tak; Joo, Yongsung; Kim, Kangjoo; Kim, Hyoung-Soo

    2009-07-21

    Hydrochemical and multivariate statistical interpretations of 16 physicochemical parameters of 45 groundwater samples from a riverside alluvial aquifer underneath an agricultural area in Osong, central Korea, were performed in this study to understand the spatial controls of nitrate concentrations in terms of biogeochemical processes occurring near oxbow lakes within a fluvial plain. Nitrate concentrations in groundwater showed a large variability from 0.1 to 190.6 mg/L (mean=35.0 mg/L) with significantly lower values near oxbow lakes. The evaluation of hydrochemical data indicated that the groundwater chemistry (especially, degree of nitrate contamination) is mainly controlled by two competing processes: 1) agricultural contamination and 2) redox processes. In addition, results of factorial kriging, consisting of two steps (i.e., co-regionalization and factor analysis), reliably showed a spatial control of the concentrations of nitrate and other redox-sensitive species; in particular, significant denitrification was observed restrictedly near oxbow lakes. The results of this study indicate that sub-oxic conditions in an alluvial groundwater system are developed geologically and geochemically in and near oxbow lakes, which can effectively enhance the natural attenuation of nitrate before the groundwater discharges to nearby streams. This study also demonstrates the usefulness of multivariate statistical analysis in groundwater study as a supplementary tool for interpretation of complex hydrochemical data sets. PMID:19524319

  8. Hydrochemical and multivariate statistical interpretations of spatial controls of nitrate concentrations in a shallow alluvial aquifer around oxbow lakes (Osong area, central Korea)

    NASA Astrophysics Data System (ADS)

    Kim, Kyoung-Ho; Yun, Seong-Taek; Choi, Byoung-Young; Chae, Gi-Tak; Joo, Yongsung; Kim, Kangjoo; Kim, Hyoung-Soo

    2009-07-01

    Hydrochemical and multivariate statistical interpretations of 16 physicochemical parameters of 45 groundwater samples from a riverside alluvial aquifer underneath an agricultural area in Osong, central Korea, were performed in this study to understand the spatial controls of nitrate concentrations in terms of biogeochemical processes occurring near oxbow lakes within a fluvial plain. Nitrate concentrations in groundwater showed a large variability from 0.1 to 190.6 mg/L (mean = 35.0 mg/L) with significantly lower values near oxbow lakes. The evaluation of hydrochemical data indicated that the groundwater chemistry (especially, degree of nitrate contamination) is mainly controlled by two competing processes: 1) agricultural contamination and 2) redox processes. In addition, results of factorial kriging, consisting of two steps (i.e., co-regionalization and factor analysis), reliably showed a spatial control of the concentrations of nitrate and other redox-sensitive species; in particular, significant denitrification was observed restrictedly near oxbow lakes. The results of this study indicate that sub-oxic conditions in an alluvial groundwater system are developed geologically and geochemically in and near oxbow lakes, which can effectively enhance the natural attenuation of nitrate before the groundwater discharges to nearby streams. This study also demonstrates the usefulness of multivariate statistical analysis in groundwater study as a supplementary tool for interpretation of complex hydrochemical data sets.

  9. Proper interpretation of chronic toxicity studies and their statistics: A critique of "Which level of evidence does the US National Toxicology Program provide? Statistical considerations using the Technical Report 578 on Ginkgo biloba as an example".

    PubMed

    Kissling, Grace E; Haseman, Joseph K; Zeiger, Errol

    2015-09-01

    A recent article by Gaus (2014) demonstrates a serious misunderstanding of the NTP's statistical analysis and interpretation of rodent carcinogenicity data as reported in Technical Report 578 (Ginkgo biloba) (NTP, 2013), as well as a failure to acknowledge the abundant literature on false positive rates in rodent carcinogenicity studies. The NTP reported Ginkgo biloba extract to be carcinogenic in mice and rats. Gaus claims that, in this study, 4800 statistical comparisons were possible, and that 209 of them were statistically significant (p<0.05) compared with 240 (4800×0.05) expected by chance alone; thus, the carcinogenicity of Ginkgo biloba extract cannot be definitively established. However, his assumptions and calculations are flawed since he incorrectly assumes that the NTP uses no correction for multiple comparisons, and that significance tests for discrete data operate at exactly the nominal level. He also misrepresents the NTP's decision making process, overstates the number of statistical comparisons made, and ignores the fact that the mouse liver tumor effects were so striking (e.g., p<0.0000000000001) that it is virtually impossible that they could be false positive outcomes. Gaus' conclusion that such obvious responses merely "generate a hypothesis" rather than demonstrate a real carcinogenic effect has no scientific credibility. Moreover, his claims regarding the high frequency of false positive outcomes in carcinogenicity studies are misleading because of his methodological misconceptions and errors. PMID:25261588
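    The arithmetic at issue is easy to reproduce. The sketch below uses the counts quoted above (4800 comparisons, 209 significant) to show that 209 significant results is, if anything, fewer than chance alone would produce under the naive all-nulls-true assumption, and what a Bonferroni-type family-wise correction would look like.

```python
from scipy import stats

m, alpha = 4800, 0.05             # comparisons and nominal level assumed by Gaus
expected_false_pos = m * alpha    # 240 expected by chance if all nulls were true
# Chance of seeing at least the reported 209 significant results by chance alone:
p_at_least_209 = stats.binom.sf(208, m, alpha)
# A Bonferroni-type family-wise correction would instead test each comparison at:
bonferroni_level = alpha / m
```

    The binomial tail probability is close to 1, i.e. 209 hits out of 4800 is entirely unremarkable even before any multiplicity correction; effects at p<0.0000000000001 are on a different footing altogether.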

  10. On the likelihood of post-perovskite near the core-mantle boundary: A statistical interpretation of seismic observations

    NASA Astrophysics Data System (ADS)

    Cobden, Laura; Mosca, Ilaria; Trampert, Jeannot; Ritsema, Jeroen

    2012-11-01

    Recent experimental studies indicate that perovskite, the dominant lower mantle mineral, undergoes a phase change to post-perovskite at high pressures. However, it has been unclear whether this transition occurs within the Earth's mantle, due to uncertainties in both the thermochemical state of the lowermost mantle and the pressure-temperature conditions of the phase boundary. In this study we compare the relative fit to global seismic data of mantle models which do and do not contain post-perovskite, following a statistical approach. Our data comprise more than 10,000 Pdiff and Sdiff travel-times, global in coverage, from which we extract the global distributions of dln VS and dln VP near the core-mantle boundary (CMB). These distributions are sensitive to the underlying lateral variations in mineralogy and temperature even after seismic uncertainties are taken into account, and are ideally suited for investigating the likelihood of the presence of post-perovskite. A post-perovskite-bearing CMB region provides a significantly closer fit to the seismic data than a post-perovskite-free CMB region on both a global and regional scale. These results complement previous local seismic reflection studies, which have shown a consistency between seismic observations and the physical properties of post-perovskite inside the deep Earth.

  11. The Role of Experimental and Statistical Uncertainty in Interpretation of Immersion Freezing: A Case for Classical Nucleation Theory

    NASA Astrophysics Data System (ADS)

    Alpert, P. A.; Knopf, D. A.

    2014-12-01

    Ice nucleation is the initial step in forming mixed-phase and cirrus clouds, and is well established as an important influence on global climate. Laboratory studies investigate at which cloud-relevant conditions of temperature (T) and relative humidity (RH) ice nucleation occurs and, as a result, numerous fundamentally different ice nucleation descriptions have been proposed for implementation in cloud and climate models. We introduce a new immersion freezing model based on first principles of statistics to simulate individual droplet freezing, requiring only three experimental parameters: the total number of droplets, the uncertainty of applied surface area per droplet, and the heterogeneous ice nucleation rate coefficient, Jhet, as a function of T and water activity (aw), where in equilibrium RH=aw. Previous studies reporting frozen fractions (f) or Jhet for a droplet population are described by our model for mineral, inorganic, organic, and biological ice nuclei and different techniques including cold stage, oil-immersion, continuous flow diffusion chamber, flow tube, cloud chamber, acoustic levitation and wind levitation experiments. Taking advantage of the physically based parameterization of Jhet by Knopf and Alpert (Faraday Discuss., 165, 513-534, 2013), our model can predict immersion freezing for the entire atmospherically relevant range of T, RH, particle surface area, and time scales, even for conditions unattainable in a laboratory setting. Lastly, we present a rigorous experimental uncertainty analysis of laboratory-derived Jhet and f using a Monte Carlo method. These results imply that classical nucleation theory is universal for immersion freezing. In combination with an aw-based description of Jhet, this approach allows for a physically based and computationally undemanding implementation in climate and cloud models.
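    Droplet-population statistics of this kind are commonly built on the Poisson ansatz f = 1 − exp(−Jhet·A·t). A minimal simulation sketch, with illustrative values for Jhet and a lognormal surface-area distribution standing in for the per-droplet area uncertainty (not the authors' fitted parameters), is:

```python
import numpy as np

def frozen_fraction(J_het, A, t):
    # Poisson ansatz of CNT-based immersion freezing: a droplet of immersed
    # surface area A freezes with constant rate J_het * A, so the frozen
    # fraction after time t is 1 - exp(-J_het * A * t).
    return 1.0 - np.exp(-J_het * A * t)

rng = np.random.default_rng(1)
N = 1000                     # total number of droplets
J_het = 1.0e4                # cm^-2 s^-1 (illustrative)
t_obs = 10.0                 # s
# Lognormal surface areas model the applied-area uncertainty per droplet
A = rng.lognormal(mean=np.log(1.0e-5), sigma=0.5, size=N)   # cm^2

# Simulate individual stochastic freezing times and compare with the model
t_freeze = rng.exponential(1.0 / (J_het * A))
f_simulated = np.mean(t_freeze <= t_obs)
f_model = np.mean(frozen_fraction(J_het, A, t_obs))
```

    Repeating the simulation many times with resampled areas gives the kind of Monte Carlo uncertainty bands on f that the abstract describes.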

  12. Biological drugs for the treatment of rheumatoid arthritis by the subcutaneous route: interpreting efficacy data to assess statistical equivalence

    PubMed Central

    Fadda, Valeria; Maratea, Dario; Trippoli, Sabrina; Gatto, Roberta; De Rosa, Mauro; Marinai, Claudio

    2014-01-01

    Background: No equivalence analysis has yet been conducted on the effectiveness of biologics in rheumatoid arthritis. Equivalence testing has a specific scientific interest, but can also be useful for deciding whether acquisition tenders are feasible for the pharmacological agents being compared. Methods: Our search covered the literature up to August 2014. Our methodology was a combination of standard pairwise meta-analysis, Bayesian network meta-analysis and equivalence testing. The agents examined for their potential equivalence were etanercept, adalimumab, golimumab, certolizumab, and tocilizumab, each in combination with methotrexate (MTX). The reference treatment was MTX monotherapy. The endpoint was ACR50 achievement at 12 months. Odds ratio was the outcome measure. The equivalence margins were established by analyzing the statistical power data of the trials. Results: Our search identified seven randomized controlled trials (2846 patients). No study was retrieved for tocilizumab, and so only four biologics were evaluable. The equivalence range was set at odds ratio from 0.56 to 1.78. There were 10 head-to-head comparisons (4 direct, 6 indirect). Bayesian network meta-analysis estimated the odds ratio (with 90% credible intervals) for each of these comparisons. Between-trial heterogeneity was marked. According to our results, all credible intervals of the 10 comparisons were wide and none of them satisfied the equivalence criterion. A superiority finding was confirmed for the treatment with MTX plus adalimumab or certolizumab in comparison with MTX monotherapy, but not for the other two biologics. Conclusion: Our results indicate that these four biologics improved the rates of ACR50 achievement, but there was an evident between-study heterogeneity. The head-to-head indirect comparisons between individual biologics showed no significant difference, but failed to demonstrate the proof of no difference (i.e. equivalence). 
This body of evidence presently precludes any option of undertaking competitive tendering for the procurement of these agents. PMID:25435923
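    The equivalence criterion used in such analyses can be sketched as a simple interval-inclusion check: the whole 90% interval of the odds ratio must fall inside the pre-set margins (here the 0.56 to 1.78 range quoted above; the example intervals are hypothetical, not the paper's estimates).

```python
def is_equivalent(ci_low, ci_high, margin_low=0.56, margin_high=1.78):
    # Interval-inclusion rule (two one-sided tests logic): the whole 90%
    # credible/confidence interval of the odds ratio must sit inside the
    # equivalence margins. The margins are symmetric on the log-odds scale,
    # since 1/0.56 is approximately 1.78.
    return margin_low < ci_low and ci_high < margin_high

# Hypothetical head-to-head comparisons (illustrative numbers only)
narrow = is_equivalent(0.70, 1.55)   # interval inside margins -> equivalence
wide = is_equivalent(0.60, 1.95)     # interval too wide -> equivalence not shown
```

    A non-significant difference corresponds only to the interval crossing 1; equivalence is the stronger claim that the interval is entirely inside the margins, which is what the ten comparisons failed to show.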

  13. Multivariate Statistical Analysis as a Supplementary Tool for Interpretation of Variations in Salivary Cortisol Level in Women with Major Depressive Disorder

    PubMed Central

    Dziurkowska, Ewelina; Wesolowski, Marek

    2015-01-01

    Multivariate statistical analysis is widely used in medical studies as a profitable tool facilitating diagnosis of some diseases, for instance, cancer, allergy, pneumonia, or Alzheimer's and psychiatric diseases. Taking this into consideration, the aim of this study was to use two multivariate techniques, hierarchical cluster analysis (HCA) and principal component analysis (PCA), to disclose the relationship between the drugs used in the therapy of major depressive disorder and the salivary cortisol level and the period of hospitalization. The cortisol contents in saliva of depressed women were quantified by HPLC with UV detection day-to-day during the whole period of hospitalization. A data set with 16 variables (e.g., the patients' age, multiplicity and period of hospitalization, initial and final cortisol level, highest and lowest hormone level, mean contents, and medians) characterizing 97 subjects was used for HCA and PCA calculations. Multivariate statistical analysis reveals that various groups of antidepressants affect the salivary cortisol level to varying degrees. The SSRIs, SNRIs, and the polypragmasy reduce the hormone secretion most effectively. Thus, both unsupervised pattern recognition methods, HCA and PCA, can be used as complementary tools for interpretation of the results obtained by laboratory diagnostic methods. PMID:26380376
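    The two unsupervised methods named here can be sketched with standard tools. The data matrix below is synthetic, with dimensions matching the study design (97 subjects × 16 variables) and an artificial subgroup added purely for illustration.

```python
import numpy as np
from collections import Counter
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

rng = np.random.default_rng(0)
# Synthetic 97 subjects x 16 variables; the first 40 rows get a shift in four
# variables to create an artificial subgroup for the demonstration.
X = rng.normal(size=(97, 16))
X[:40, :4] += 3.0

Xs = (X - X.mean(axis=0)) / X.std(axis=0)      # autoscaling before PCA/HCA

# PCA via SVD of the standardized matrix
U, s, Vt = np.linalg.svd(Xs, full_matrices=False)
scores = U * s                                  # principal-component scores
explained = s ** 2 / np.sum(s ** 2)             # variance fraction per component

# HCA: Ward linkage on Euclidean distances, cut into two clusters
Z = linkage(pdist(Xs), method="ward")
labels = fcluster(Z, t=2, criterion="maxclust")
```

    Plotting the first two score columns and colouring by the HCA labels is the usual way the two methods are read side by side as complementary pattern-recognition tools.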

  14. Pulsar statistics and their interpretations

    NASA Technical Reports Server (NTRS)

    Arnett, W. D.; Lerche, I.

    1981-01-01

    It is shown that a lack of knowledge concerning interstellar electron density, the true spatial distribution of pulsars, the radio luminosity source distribution of pulsars, the real ages and real aging rates of pulsars, the beaming factor (and other unknown factors causing the known sample of about 350 pulsars to be incomplete to an unknown degree) is sufficient to cause a minimum uncertainty of a factor of 20 in any attempt to determine pulsar birth or death rates in the Galaxy. It is suggested that this uncertainty must impact on suggestions that the pulsar rates can be used to constrain possible scenarios for neutron star formation and stellar evolution in general.

  15. Facies control on sandstone composition (and influence of statistical methods on interpretations) in the John Henry Member, Straight Cliffs Formation, Southern Utah, USA

    NASA Astrophysics Data System (ADS)

    Allen, Jessica L.; Johnson, Cari L.

    2010-10-01

    The Upper Cretaceous John Henry Member of the Straight Cliffs Formation preserves regressive shoreface and channel facies, and transgressive lagoonal tidal inlet facies that suggest distinct modal sandstone compositions. Detrital modes from six sandstone facies (upper shoreface, lower shoreface, deflected mouth bars, fluvial channels, tidal inlets and washover fans), and their spatial and temporal variations, provide additional data regarding the depositional environments and setting of the John Henry Member. Three statistical methods were utilized: univariate standard deviation, univariate confidence intervals, and multivariate logratio transformations, to assess the compositional differences between facies using estimates of means. Both univariate statistical methods have some inherent problems. Standard deviation does not account for sample size, while the confidence interval method incorporates a t-test to account for only some of the uncertainty of small sample size. The results of the univariate methods differ slightly and both indicate statistically distinct sandstone compositions based on facies. Multivariate statistics are much more robust and suggest similar trends, yet display different compositional means and have inconclusive confidence fields as they incorporate all of the uncertainty associated with small sampling sizes. In general, sandstone compositions become more quartz-rich and compositionally mature as sediment is transported from proximal fluvial environments to distal upper shoreface environments. This trend reflects the degree of preferential reworking or winnowing of unstable grains prior to lithification. More complicated relationships are observed as facies distributions shift through space and time. Sandstone compositions support a deflected wave-dominated deltaic interpretation for the John Henry Member in this area. 
Sandstone compositions become more lithic-rich from north to south, corresponding to the facies distribution of upper shoreface and deflected mouth bars, respectively. Mouth bars occur at the intersection of the fluvial channel and the coastline where sand is deflected via longshore drift to the southern portion of the field area. Upper shoreface facies located further away from mouth bar facies are heavily reworked by wave processes and contain more mature quartz-rich modal compositions. Additionally, high feldspathic and lithic concentrations in tidal inlet facies suggest that this facies was sourced closer to mouth bars rather than updrift upper shoreface sediments. A shift in sandstone composition occurs at the end of the second transgressive-regressive cycle of the John Henry Member throughout the Rogers Canyon area. This compositional change is concurrent with a relatively large basinward shift in facies, which suggests that it reflects the transition from regressive to transgressive facies associated with the relative sea level fall rather than a change in the main sediment source area.
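    The multivariate logratio approach mentioned above typically means transforming closed compositional data (parts that sum to 100%) with Aitchison's centered logratio before computing means or confidence regions. A minimal sketch with hypothetical quartz-feldspar-lithic modal percentages:

```python
import numpy as np

def clr(comp):
    # Aitchison's centered logratio: maps compositional parts (which sum to
    # a constant, e.g. 100%) into unconstrained real space where ordinary
    # means and confidence regions behave sensibly.
    comp = np.asarray(comp, dtype=float)
    gmean = np.exp(np.mean(np.log(comp), axis=-1, keepdims=True))
    return np.log(comp / gmean)

# Hypothetical QFL (quartz, feldspar, lithics) modes for two facies
upper_shoreface = np.array([[78, 12, 10], [82, 9, 9], [75, 14, 11]], float)
mouth_bar = np.array([[55, 20, 25], [60, 18, 22], [58, 21, 21]], float)

# Compare facies means in clr space rather than on raw closed percentages
clr_diff = clr(upper_shoreface).mean(axis=0) - clr(mouth_bar).mean(axis=0)
```

    Working in clr space avoids the spurious negative correlations that the closure constraint induces in raw percentage data, which is why the multivariate method is described as more robust than the univariate ones.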

  16. Easily constructed mini-sextant demonstrates optical principles

    NASA Astrophysics Data System (ADS)

    Nenninger, Garet G.

    2000-04-01

    An easily constructed optical instrument for measuring the angle between the Sun and the horizon is described. The miniature sextant relies on multiple reflections to produce multiple images of the sun at fixed angles away from the true Sun.

  17. An Easily Constructed Model of a Square Antiprism.

    ERIC Educational Resources Information Center

    Yamana, Shukichi

    1984-01-01

    A model of a square antiprism which is necessary for teaching stereochemistry (for example, of the octafluorotantalate ion) can be made easily by using a sealed, empty envelope. The steps necessary to accomplish this task are presented. (JN)

  18. iCFD: Interpreted Computational Fluid Dynamics - Degeneration of CFD to one-dimensional advection-dispersion models using statistical experimental design - The secondary clarifier.

    PubMed

    Guyonvarch, Estelle; Ramin, Elham; Kulahci, Murat; Plósz, Benedek Gy

    2015-10-15

    The present study aims at using statistically designed computational fluid dynamics (CFD) simulations as numerical experiments for the identification of one-dimensional (1-D) advection-dispersion models - computationally light tools, used e.g., as sub-models in systems analysis. The objective is to develop a new 1-D framework, referred to as interpreted CFD (iCFD) models, in which statistical meta-models are used to calculate the pseudo-dispersion coefficient (D) as a function of design and flow boundary conditions. The method - presented in a straightforward and transparent way - is illustrated using the example of a circular secondary settling tank (SST). First, the significant design and flow factors are screened out by applying the statistical method of two-level fractional factorial design of experiments. Second, based on the number of significant factors identified through the factor screening study and system understanding, 50 different sets of design and flow conditions are selected using Latin Hypercube Sampling (LHS). The boundary condition sets are imposed on a 2-D axi-symmetrical CFD simulation model of the SST. In the framework, to degenerate the 2-D model structure, CFD model outputs are approximated by the 1-D model through the calibration of three different model structures for D. Correlation equations for the D parameter then are identified as a function of the selected design and flow boundary conditions (meta-models), and their accuracy is evaluated against D values estimated in each numerical experiment. The evaluation and validation of the iCFD model structure is carried out using scenario simulation results obtained with parameters sampled from the corners of the LHS experimental region. 
For the studied SST, additional iCFD model development was carried out in terms of (i) assessing different density current sub-models; (ii) implementation of a combined flocculation, hindered, transient and compression settling velocity function; and (iii) assessment of modelling the onset of transient and compression settling. Furthermore, the optimal level of model discretization both in 2-D and 1-D was undertaken. Results suggest that the iCFD model developed for the SST through the proposed methodology is able to predict solid distribution with high accuracy - taking a reasonable computational effort - when compared to multi-dimensional numerical experiments, under a wide range of flow and design conditions. iCFD tools could play a crucial role in reliably predicting systems' performance under normal and shock events. PMID:26248321
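    Latin Hypercube Sampling as used in the second step can be sketched in a few lines: each factor's range is stratified into as many bins as samples, one draw is taken per bin, and the bin order is shuffled independently per factor. The factor bounds below are hypothetical, not the SST study's boundary conditions.

```python
import numpy as np

def latin_hypercube(n_samples, bounds, seed=None):
    # Stratify each factor's range into n_samples equal bins, draw one point
    # per bin, then shuffle the bin order independently for every factor.
    rng = np.random.default_rng(seed)
    k = len(bounds)
    strata = np.tile(np.arange(n_samples), (k, 1))
    u = (rng.permuted(strata, axis=1).T + rng.random((n_samples, k))) / n_samples
    lo, hi = np.asarray(bounds, dtype=float).T
    return lo + u * (hi - lo)

# e.g. 50 boundary-condition sets over three hypothetical design/flow factors
sets = latin_hypercube(50, [(0.5, 2.0), (10.0, 100.0), (1.0, 5.0)], seed=0)
```

    The stratification guarantees that each factor's full range is covered even with only 50 CFD runs, which is what makes LHS attractive for expensive numerical experiments.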

  19. Modular thermoelectric cell is easily packaged in various arrays

    NASA Technical Reports Server (NTRS)

    Epstein, J.

    1965-01-01

    Modular thermoelectric cells are easily packaged in various arrays to form power supplies and have desirable voltage and current output characteristics. The cells employ two pairs of thermoelectric elements, each pair being connected in parallel between two sets of aluminum plates. They can be used as solar energy conversion devices.

  20. Easily disassembled electrical connector for high voltage, high frequency connections

    DOEpatents

    Milner, Joseph R. (Livermore, CA)

    1994-01-01

    An easily accessible electrical connector capable of rapid assembly and disassembly wherein a wide metal conductor sheet may be evenly contacted over the entire width of the conductor sheet by opposing surfaces on the connector which provide an even clamping pressure against opposite surfaces of the metal conductor sheet using a single threaded actuating screw.

  1. Easily disassembled electrical connector for high voltage, high frequency connections

    DOEpatents

    Milner, J.R.

    1994-05-10

    An easily accessible electrical connector capable of rapid assembly and disassembly is described wherein a wide metal conductor sheet may be evenly contacted over the entire width of the conductor sheet by opposing surfaces on the connector which provide an even clamping pressure against opposite surfaces of the metal conductor sheet using a single threaded actuating screw. 13 figures.

  2. Combining data visualization and statistical approaches for interpreting measurements and meta-data: Integrating heatmaps, variable clustering, and mixed regression models

    EPA Science Inventory

    The advent of new higher throughput analytical instrumentation has put a strain on interpreting and explaining the results from complex studies. Contemporary human, environmental, and biomonitoring data sets comprise tens or hundreds of analytes, multiple repeat measures...

  3. Between Teacher & Parent: Helping the Child Who Cries Easily

    ERIC Educational Resources Information Center

    Brodkin, Adele M.

    2004-01-01

    Parents need to remember that crying is the first method of communication for children younger than 5 or 6. It is their way of getting attention. While it isn't easy for new parents to interpret their baby's cries, most learn to distinguish the "I am hungry--feed me" cry from the "My tummy hurts" or the "I am just fussy and bored" cry. This…

  4. Interpreting the Evidence for Effective Interventions to Increase the Academic Performance of Students with ADHD: Relevance of the Statistical Significance Controversy

    ERIC Educational Resources Information Center

    Harrison, Judith; Thompson, Bruce; Vannest, Kimberly J.

    2009-01-01

    This article reviews the literature on interventions targeting the academic performance of students with attention-deficit/hyperactivity disorder (ADHD) and does so within the context of the statistical significance testing controversy. Both the arguments for and against null hypothesis statistical significance tests are reviewed. Recent standards…

  5. Plasmonic Films Can Easily Be Better: Rules and Recipes

    PubMed Central

    2015-01-01

    High-quality materials are critical for advances in plasmonics, especially as researchers now investigate quantum effects at the limit of single surface plasmons or exploit ultraviolet- or CMOS-compatible metals such as aluminum or copper. Unfortunately, due to inexperience with deposition methods, many plasmonics researchers deposit metals under the wrong conditions, severely limiting performance unnecessarily. This is then compounded as others follow their published procedures. In this perspective, we describe simple rules collected from the surface-science literature that allow high-quality plasmonic films of aluminum, copper, gold, and silver to be easily deposited with commonly available equipment (a thermal evaporator). Recipes are also provided so that films with optimal optical properties can be routinely obtained. PMID:25950012

  6. Metview and VAPOR: Exploring ECMWF forecasts easily in four dimensions

    NASA Astrophysics Data System (ADS)

    Siemen, Stephan; Kertesz, Sandor; Carver, Glenn

    2014-05-01

    The European Centre for Medium-Range Weather Forecasts (ECMWF) is an international organisation providing its member states and co-operating states with forecasts in the medium time range of up to 15 days as well as other forecasts and analyses. As part of its mission, ECMWF generates an increasing number of forecast data products for its users. To support the work of forecasters and researchers and to let them make best use of ECMWF forecasts, the Centre also provides tools and interfaces to visualise their products. This allows users to make use of and explore forecasts without having to transfer large amounts of raw data. This is especially true for products based on ECMWF's 50 member ensemble forecast. Users can choose to explore ECMWF's forecasts from the web or through visualisation tools installed locally or at ECMWF. ECMWF also develops in co-operation with INPE, Brazil, the Metview meteorological workstation and batch system. Metview enables users to easily analyse and visualise forecasts, and is routinely used by scientists and forecasters at ECMWF and other institutions. While Metview offers high quality visualisation in two-dimensional plots and animations, it uses external tools to visualise data in four dimensions. VAPOR is the Visualization and Analysis Platform for Ocean, Atmosphere, and Solar Researchers. VAPOR provides an interactive 3D visualisation environment that runs on most UNIX and Windows systems equipped with modern 3D graphics cards. VAPOR development is led by the National Center for Atmospheric Research's Scientific Computing Division in collaboration with U.C. Davis and Ohio State University. In this paper we will give an overview of how users, with Metview and access to ECMWF's archive, can visualise forecast data in four dimensions within VAPOR. The process of preparing the data in Metview is the key step and is described in detail. The benefits to researchers are highlighted with a case study analysing a given weather scenario.

  7. An easily constructed, very inexpensive, solar cell transmissometer (SCT)

    SciTech Connect

    Knowles, S.C.; Wells, J.T.

    1998-01-01

    Suspended sediment concentration (SSC), one of the standard measures of water quality in aquatic systems, is of widespread interest because of its role in particulate flux and light attenuation. The conventional method of sampling and filtration to determine SSC is rather slow and cumbersome, and does not allow resolution of high-frequency variations. In contrast, the beam transmissometer has been used as a convenient way to obtain time-series estimates of SSC in environments ranging from the deep ocean to the continental shelf and estuaries. In situ observation of suspended particles in fluvial, estuarine, lagoonal, and inner-shelf environments has in fact revealed that most of the volume of material in suspension exists as aggregates (> 50-100 µm diameter) of smaller components and that aggregate properties may change on time scales of only a few minutes and length scales of less than one meter. Here, the authors describe a very inexpensive, easily constructed solar cell transmissometer (SCT) that has been developed for use with an in situ suspended-sediment photography system. Conventional optical SSC sensors can provide excellent predictive capability when deployed during conditions of relatively uniform aggregate characteristics. However, this system may provide better results than commercially available systems because it disaggregates suspended material prior to measurement of light attenuation, thereby reducing the effects of aggregation and changes in aggregate characteristics, which can occur over very short temporal and spatial scales. This paper describes the solar cell transmissometer and gives results of laboratory calibration, design, construction, and field testing alongside an optical backscatter sensor (OBS).

  8. THE PROBLEM OF CLASSIFYING MEMBERS OF A POPULATION ON A CONTINUOUS SCALE--STATISTICAL MODELS FOR THE EVALUATIONS AND INTERPRETATION OF EDUCATIONAL CRITERIA, PART 1.

    ERIC Educational Resources Information Center

    BARNETT, F.C.; SAW, J.G.

    A working model capable of ranking individuals in a random sample from a multivariate population by some criterion of interest was developed. The multiple correlation coefficient of ranks with measured variates, as a statistic in testing whether ranks are associated with measurements, was employed and dubbed "quasi-rank multiple correlation…

  9. The allelic correlation structure of Gainj- and Kalam-speaking people. I. The estimation and interpretation of Wright's F-statistics.

    PubMed

    Long, J C

    1986-03-01

    The internal patterning of allelic correlations in the Gainj and Kalam swidden horticulturalists of highland Papua New Guinea is examined within the context of Sewall Wright's F-statistic model. A multiallelic extension of the model is given first, and multivariate variance-component estimators for the parameters are suggested. Then, it is shown that the expectation of the F-statistic set depends on the age structure of the population and that knowledge of the population and sample age structure is critical for meaningful analysis. The array of F-statistics estimated jointly over five polymorphic enzyme loci reveals the following features of Gainj and Kalam population structure: (1) significant departures from panmictic expectations and (2) characteristics of a continuously distributed breeding population, rather than those expected for populations subdivided into demes with discrete boundaries. Finally, the F-statistics estimated for the Gainj and Kalam are briefly compared to estimates obtained from other tribal populations. It is seen that the level of differentiation observed in the Gainj and Kalam is only about one-third that observed in South American swidden horticulturalists. Consequently, some conventional wisdom regarding the interrelationship of socioecological settings and genetic structures may require reevaluation. PMID:3957006
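    For a biallelic locus, Wright's F-statistics can be computed from subpopulation allele frequencies and observed heterozygosities. A minimal sketch follows; the frequencies are illustrative, and the identity (1 − FIT) = (1 − FIS)(1 − FST) holds by construction.

```python
import numpy as np

def wright_f_statistics(p_sub, het_obs, weights=None):
    # F-statistics for one biallelic locus. p_sub: frequency of one allele in
    # each subpopulation; het_obs: observed heterozygosity in each subpopulation.
    p_sub = np.asarray(p_sub, dtype=float)
    het_obs = np.asarray(het_obs, dtype=float)
    w = (np.full(p_sub.size, 1.0 / p_sub.size) if weights is None
         else np.asarray(weights, dtype=float))
    H_I = np.sum(w * het_obs)                      # mean observed heterozygosity
    H_S = np.sum(w * 2.0 * p_sub * (1.0 - p_sub))  # expected het within subpops
    p_bar = np.sum(w * p_sub)
    H_T = 2.0 * p_bar * (1.0 - p_bar)              # expected het, pooled population
    return 1.0 - H_I / H_S, 1.0 - H_S / H_T, 1.0 - H_I / H_T  # F_IS, F_ST, F_IT

F_IS, F_ST, F_IT = wright_f_statistics(p_sub=[0.7, 0.3], het_obs=[0.40, 0.30])
```

    The multiallelic, age-structured extension the paper develops generalizes exactly these heterozygosity ratios; weighting subpopulations by sample size (the `weights` argument) is one of the choices that the age-structure argument shows must be made carefully.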

  11. Making large amounts of meteorological plots easily accessible to users

    NASA Astrophysics Data System (ADS)

    Lamy-Thepaut, Sylvie; Siemen, Stephan; Sahin, Cihan; Raoult, Baudouin

    2015-04-01

    The European Centre for Medium-Range Weather Forecasts (ECMWF) is an international organisation providing its member organisations with forecasts in the medium time range of 3 to 15 days, and some longer-range forecasts for up to a year ahead, with varying degrees of detail. As part of its mission, ECMWF generates an increasing number of forecast data products for its users. To support the work of forecasters and researchers and to let them make the best use of ECMWF forecasts, the Centre also provides tools and interfaces to visualise its products. This allows users to explore forecasts without having to transfer large amounts of raw data, which is especially valuable for products based on ECMWF's 50-member ensemble forecast, where specific processing and visualisation are applied to extract information. Every day, thousands of raw data fields are pushed to ECMWF's interactive web charts application, ecCharts, and thousands of products are processed and published on ECMWF's institutional web site. ecCharts is a highly interactive application that lets forecasters in national weather services and ECMWF's commercial customers display and manipulate recent numerical forecasts. With ecCharts, forecasters are able to explore ECMWF's medium-range forecasts in far greater detail than has previously been possible on the web, as soon as the forecast becomes available. All ecCharts products are also available through a machine-to-machine web map service based on the OGC Web Map Service (WMS) standard. The ECMWF institutional web site provides access to a large number of graphical products. It was entirely redesigned last year and now shares the same infrastructure as ecCharts, so it can benefit from some ecCharts functionality, for example the dashboard. The dashboard, initially developed for ecCharts, allows users to organise their own collection of products depending on their workflow, and is being further developed. 
In its first implementation, it presents the user's products in a single interface with fast access to the original product and the possibility of synchronised animation between products. Its functionality is being extended to give users the freedom to collect not only ecCharts 2D maps and graphs, but also other ECMWF web products such as monthly and seasonal products, scores, and observation monitoring. The dashboard will play a key role in helping users interpret the large amount of information that ECMWF provides. This talk will present examples of how the new user interface can organise complex meteorological maps and graphs and will show the new possibilities users have gained by using the web as a medium.

  12. Statistics Clinic

    NASA Technical Reports Server (NTRS)

    Feiveson, Alan H.; Foy, Millennia; Ploutz-Snyder, Robert; Fiedler, James

    2014-01-01

    Do you have elevated p-values? Is the data analysis process getting you down? Do you experience anxiety when you need to respond to criticism of statistical methods in your manuscript? You may be suffering from Insufficient Statistical Support Syndrome (ISSS). For symptomatic relief of ISSS, come for a free consultation with JSC biostatisticians at our help desk during the poster sessions at the HRP Investigators Workshop. Get answers to common questions about sample size, missing data, multiple testing, when to trust the results of your analyses and more. Side effects may include sudden loss of statistics anxiety, improved interpretation of your data, and increased confidence in your results.

  13. Enhancing the interpretation of statistical P values in toxicology studies: implementation of linear mixed models (LMMs) and standardized effect sizes (SESs).

    PubMed

    Schmidt, Kerstin; Schmidtke, Jörg; Kohl, Christian; Wilhelm, Ralf; Schiemann, Joachim; van der Voet, Hilko; Steinberg, Pablo

    2016-03-01

    In this paper, we compare the traditional ANOVA approach to analysing data from 90-day toxicity studies with a more modern LMM approach, and we investigate the use of standardized effect sizes. The LMM approach is used to analyse weight or feed consumption data. When compared to the week-by-week ANOVA with multiple test results per week, this approach results in only one statement on differences in weight development between groups. Standardized effect sizes are calculated for the endpoints: weight, relative organ weights, haematology and clinical biochemistry. The endpoints are standardized, allowing different endpoints of the same study to be compared and providing an overall picture of group differences at a glance. Furthermore, in terms of standardized effect sizes, statistical significance and biological relevance are displayed simultaneously in a graph. PMID:25724152
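    The standardized-effect-size idea in this abstract, expressing group differences on a common scale so that different endpoints can be compared at a glance, can be sketched with a Cohen's d-style statistic. This is a minimal illustration, not the authors' exact pipeline; the endpoint names and measurements are invented.

```python
# Hedged sketch: standardized effect sizes (mean difference / pooled SD) for
# several endpoints of a toxicity study. Data are invented for illustration.
from statistics import mean, stdev

def cohens_d(treated, control):
    """Standardized effect size: (mean difference) / (pooled standard deviation)."""
    n1, n2 = len(treated), len(control)
    s1, s2 = stdev(treated), stdev(control)
    pooled = (((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2)) ** 0.5
    return (mean(treated) - mean(control)) / pooled

# Hypothetical endpoints on very different scales
endpoints = {
    "body weight (g)":      ([310, 325, 298, 340], [300, 295, 310, 289]),
    "rel. liver weight (%)": ([3.1, 3.4, 3.0, 3.3], [2.9, 3.0, 2.8, 3.1]),
}
for name, (t, c) in endpoints.items():
    print(f"{name}: d = {cohens_d(t, c):+.2f}")
```

    Because each d is unitless, the two endpoints can be placed on one chart despite their different measurement scales.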

  14. Easily Installable Wireless Behavioral Monitoring System with Electric Field Sensor for Ordinary Houses

    PubMed Central

    Tsukamoto, S; Hoshino, H; Tamura, T

    2008-01-01

    This paper describes an indoor behavioral monitoring system for improving the quality of life in ordinary houses. It employs a device that uses weak radio waves for transmitting the obtained data and it is designed such that it can be installed by a user without requiring any technical knowledge or extra constructions. This study focuses on determining the usage statistics of home electric appliances by using an electromagnetic field sensor as a detection device. The usage of the home appliances is determined by measuring the electromagnetic field that can be observed in an area near the appliance. It is assumed that these usage statistics could provide information regarding the indoor behavior of a subject. Since the sensor is not direction sensitive and does not require precise positioning and wiring, it can be easily installed in ordinary houses by the end users. For evaluating the practicability of the sensor unit, several simple tests have been performed. The results indicate that the proposed system could be useful for collecting the usage statistics of home appliances. PMID:19415135
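    The usage statistics described here reduce, in the simplest case, to thresholding the field-strength trace near an appliance and totalling the "on" time. The sketch below assumes that reading; the samples, threshold, and sampling interval are invented.

```python
# Hedged sketch: derive appliance usage time from a field-strength trace by
# simple thresholding. Trace values and threshold are invented.

def usage_seconds(samples, threshold, dt=1.0):
    """samples: field-strength readings taken every dt seconds."""
    return sum(dt for s in samples if s > threshold)

trace = [0.1, 0.1, 2.3, 2.4, 2.2, 0.2, 2.5, 2.6, 0.1]
print(usage_seconds(trace, threshold=1.0))  # → 5.0
```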

  16. Statistical treatment and preliminary interpretation of chemical data from a uranium deposit in the northeast part of the Church Rock area, Gallup mining district, New Mexico

    USGS Publications Warehouse

    Spirakis, C.S.; Pierson, C.T.; Santos, E.S.; Fishman, N.S.

    1983-01-01

    Statistical treatment of analytical data from 106 samples of uranium-mineralized and unmineralized or weakly mineralized rocks of the Morrison Formation from the northeastern part of the Church Rock area of the Grants uranium region indicates that along with uranium, the deposits in the northeast Church Rock area are enriched in barium, sulfur, sodium, vanadium and equivalent uranium. Selenium and molybdenum are sporadically enriched in the deposits and calcium, manganese, strontium, and yttrium are depleted. Unlike the primary deposits of the San Juan Basin, the deposits in the northeast part of the Church Rock area contain little organic carbon and several elements that are characteristically enriched in the primary deposits are not enriched or are enriched to a much lesser degree in the Church Rock deposits. The suite of elements associated with the deposits in the northeast part of the Church Rock area is also different from the suite of elements associated with the redistributed deposits in the Ambrosia Lake district. This suggests that the genesis of the Church Rock deposits is different, at least in part, from the genesis of the primary deposits of the San Juan Basin or the redistributed deposits at Ambrosia Lake.

  17. Palaeomagnetic analysis on pottery as indicator of the pyroclastic flow deposits temperature: new data and statistical interpretation from the Minoan eruption of Santorini, Greece

    NASA Astrophysics Data System (ADS)

    Tema, E.; Zanella, E.; Pavón-Carrasco, F. J.; Kondopoulou, D.; Pavlides, S.

    2015-10-01

    We present the results of palaeomagnetic analysis of Late Bronze Age pottery from Santorini, carried out in order to estimate the thermal effect of the Minoan eruption on the pre-Minoan habitation level. A total of 170 specimens from 108 ceramic fragments have been studied. The ceramics were collected from the surface of the pre-Minoan palaeosol at six different sites, including samples from the Akrotiri archaeological site. The deposition temperatures of the first pyroclastic products have been estimated from the maximum overlap of the re-heating temperature intervals given by the individual fragments at site level. A new statistical treatment of the temperature data is also proposed, estimating the re-heating temperatures at each site at the 95 per cent probability level. The results show that the precursor tephra layer and the first pumice fall of the eruption were hot enough to re-heat the underlying ceramics to temperatures of 160-230 °C at the non-inhabited sites, while the temperatures recorded inside the Akrotiri village are slightly lower, varying from 130 to 200 °C. The lower temperatures registered in the human settlements suggest that there was some interaction between the buildings and the pumice fallout deposits, while the building-debris layer produced by the preceding and syn-eruption earthquakes probably also contributed to the decrease in the recorded re-heating temperatures.
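    The site-level estimate rests on finding the temperature range shared by the most per-fragment re-heating intervals. A minimal sketch of that interval arithmetic, with invented intervals, is a classic endpoint sweep:

```python
# Hedged sketch: each fragment yields a re-heating temperature interval;
# the site-level estimate is the range covered by the most intervals.
# The intervals below are invented for illustration.

def max_overlap(intervals):
    """Return (count, lo, hi): the range covered by the largest number of
    intervals, found by sweeping over sorted endpoints."""
    events = []
    for lo, hi in intervals:
        events.append((lo, +1))   # interval opens
        events.append((hi, -1))   # interval closes
    events.sort()
    best, cur, best_lo, best_hi = 0, 0, None, None
    for i, (t, step) in enumerate(events):
        cur += step
        if cur > best:
            best, best_lo = cur, t
            best_hi = events[i + 1][0] if i + 1 < len(events) else t
    return best, best_lo, best_hi

print(max_overlap([(160, 240), (180, 260), (150, 230)]))  # → (3, 180, 230)
```

    Here all three hypothetical fragments agree only on 180-230 °C, which would be the site-level estimate.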

  18. Statistical Analysis and Interpretation of Building Characterization, Indoor Environmental Quality Monitoring and Energy Usage Data from Office Buildings and Classrooms in the United States

    SciTech Connect

    Linda Stetzenbach; Lauren Nemnich; Davor Novosel

    2009-08-31

    Three independent tasks were performed (Stetzenbach 2008, Stetzenbach 2008b, Stetzenbach 2009) to measure a variety of parameters in normative buildings across the United States. For each of these tasks 10 buildings were selected as normative indoor environments. Task 1 focused on office buildings, Task 13 focused on public schools, and Task 0606 focused on high performance buildings. To perform the present task it was necessary to restructure the database for the Indoor Environmental Quality (IEQ) and sound measurement data, as several issues were identified and resolved prior to and during the transfer of these data sets into SPSS. During overview discussions with the statistician engaged for this task it was determined that, because indoor zones (1-6) were selected independently within each task, zones were not related by location across tasks. Therefore, no comparison across zones would be valid for the 30 buildings, so the by-location (zone) analyses were limited to three sets, one for the buildings within each task. In addition, Task 0606 used different collection procedures for lighting than Tasks 01 & 13 in order to improve sample collection. These data sets could therefore not be merged and compared, so by-day analyses were run separately for Task 0606 and only the Task 01 & 13 data were merged. Results of the statistical analysis of the IEQ parameters show that statistically significant differences were found among days and zones for all tasks, although no differences were found by day for the Draft Rate data from Task 0606 (p>0.05). Thursday measurements of IEQ parameters were significantly different from Tuesday measurements, and from most Wednesday measurements, for all variables of Tasks 1 & 13. Data for all three days appeared to vary for Operative Temperature, whereas only Tuesday and Thursday differed for Draft Rate 1m. 
Although no Draft Rate measures within Task 0606 were found to significantly differ by-day, Temperature measurements for Tuesday and Thursday showed variation. Moreover, Wednesday measurements of Relative Humidity within Task 0606 varied significantly from either Tuesday or Thursday. The majority of differences in IEQ measurements by-zone were highly significant (p<0.001), with the exception of Relative Humidity in some buildings. When all task data were combined (30 buildings) neither the airborne culturable fungi nor the airborne non-culturable spore data differed in the concentrations found at any indoor location in terms of day of collection. However, the concentrations of surface-associated fungi varied among the day of collection. Specifically, there was a lower concentration of mold on Tuesday than on Wednesday, for all tasks combined. As expected, variation was found in the concentrations of both airborne culturable fungi and airborne non-culturable fungal spores between indoor zones (1-6) and the outdoor zone (zone 0). No variation was found among the indoor zones of office buildings for Task 1 in the concentrations of airborne culturable fungi. However, airborne non-culturable spores did vary among zones in one building in Task 1 and variation was noted between zones in surface-associated fungi. Due to the lack of multiple lighting measurements for Tasks 13 and 0606, by-day comparisons were only performed for Task 1. No statistical differences were observed in lighting with respect to the day of collection. There was a wide range of variability by-zone among seven of the office buildings. Although few differences were found for the brightest illumination of the worksurface (IllumWkSfcBrtst) and the darkest illumination of the worksurface (IllumWkSfcDrkst) in Task 1, there was considerable variation for these variables in Task 13 and Task 0606 (p < 0.001). Other variables that differed by-zone in Task 13 include CombCCT and AmbCCT1 for S03, S07, and S08. 
Additionally, AmbChromX1, CombChromY, and CombChromX varied by-zone for school buildings S02, S04, and S05, respectively. Although all tasks demonstrated significant differences in sound measurements by zone, some of the buildings within each task did not appear to differ in sound quality. Hence, post-hoc tests were not appropriate and individual zones were not compared for these buildings. It is interesting to note that sound measurements in some buildings varied widely across most zone comparisons, while other buildings varied between only a few zones.

  19. Landslides triggered by the 12 January 2010 Port-au-Prince, Haiti, Mw = 7.0 earthquake: visual interpretation, inventory compiling, and spatial distribution statistical analysis

    NASA Astrophysics Data System (ADS)

    Xu, C.; Shyu, J. B. H.; Xu, X.

    2014-07-01

    The 12 January 2010 Port-au-Prince, Haiti, earthquake (Mw = 7.0) triggered tens of thousands of landslides. The purpose of this study is to investigate the correlations of landslide occurrence and erosion thickness with topographic, geologic, and seismic parameters. A total of 30 828 landslides triggered by the earthquake covered a total area of 15.736 km2, distributed in an area of more than 3000 km2, and the volume of landslide accumulation materials is estimated to be about 29 700 000 m3. These landslides are of various types, mostly belonging to shallow disrupted landslides and rock falls, but also include coherent deep-seated landslides and rock slides. These landslides were delineated using pre- and post-earthquake high-resolution satellite images. Spatial distribution maps and contour maps of landslide number density, landslide area percentage, and landslide erosion thickness were constructed in order to analyze the spatial distribution patterns of co-seismic landslides. Statistics of size distribution and morphometric parameters of co-seismic landslides were carried out and were compared with other earthquake events in the world. Four proxies of co-seismic landslide abundance, including landslide centroid number density (LCND), landslide top number density (LTND), landslide area percentage (LAP), and landslide erosion thickness (LET), were used to correlate co-seismic landslides with various environmental parameters. These parameters include elevation, slope angle, slope aspect, slope curvature, topographic position, distance from drainages, lithology, distance from the epicenter, distance from the Enriquillo-Plantain Garden fault, distance along the fault, and peak ground acceleration (PGA). A comparison of these impact parameters on co-seismic landslides shows that slope angle is the strongest impact parameter on co-seismic landslide occurrence. 
Our co-seismic landslide inventory is much more detailed than other inventories in several previous publications. Therefore, we carried out comparisons of inventories of landslides triggered by the Haiti earthquake with other published results and proposed possible reasons for any differences. We suggest that the empirical functions between earthquake magnitude and co-seismic landslides need to be updated on the basis of the abundant and more complete co-seismic landslide inventories recently available.
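    One of the abundance proxies named above, landslide area percentage (LAP), can be sketched as the landslide-covered fraction of terrain cells within each bin of a controlling parameter such as slope angle. The cell values and bin edges below are invented; only the binning pattern is real.

```python
# Hedged sketch: landslide area percentage (LAP) per slope-angle bin,
# i.e. the landslide-covered share of cells in each bin. Data invented.
import numpy as np

slope = np.array([5, 12, 18, 25, 33, 41, 27, 15])       # slope angle per cell (deg)
landslide = np.array([0, 0, 1, 1, 1, 1, 0, 0], bool)    # cell within a landslide?
edges = [0, 10, 20, 30, 50]

lap = {}
for lo, hi in zip(edges[:-1], edges[1:]):
    in_bin = (slope >= lo) & (slope < hi)
    if in_bin.any():
        lap[(lo, hi)] = 100.0 * landslide[in_bin].mean()
        print(f"{lo:2d}-{hi:2d} deg: LAP = {lap[(lo, hi)]:.1f}%")
```

    In this toy grid LAP rises monotonically with slope angle, the kind of pattern behind the conclusion that slope angle is the strongest controlling parameter.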

  20. Landslides triggered by the 12 January 2010 Mw 7.0 Port-au-Prince, Haiti, earthquake: visual interpretation, inventory compiling and spatial distribution statistical analysis

    NASA Astrophysics Data System (ADS)

    Xu, C.; Shyu, J. B. H.; Xu, X.-W.

    2014-02-01

    The 12 January 2010 Port-au-Prince, Haiti, earthquake (Mw 7.0) triggered tens of thousands of landslides. The purpose of this study is to investigate the correlations of landslide occurrence and erosion thickness with topographic factors, seismic parameters, and distance from roads. A total of 30 828 landslides triggered by the earthquake covered a total area of 15.736 km2, distributed in an area of more than 3000 km2, and the volume of landslide accumulation materials is estimated to be about 29 700 000 m3. These landslides are of various types, mostly belonging to shallow disrupted landslides and rock falls, but also include coherent deep-seated landslides and rock slides. These landslides were delineated using pre- and post-earthquake high-resolution satellite images. Spatial distribution maps and contour maps of landslide number density, landslide area percentage, and landslide erosion thickness were constructed in order to analyze the spatial distribution patterns of co-seismic landslides. Statistics of size distribution and morphometric parameters of co-seismic landslides were carried out and were compared with other earthquake events in the world. Four proxies of co-seismic landslide abundance, including landslide centroid number density (LCND), landslide top number density (LTND), landslide area percentage (LAP), and landslide erosion thickness (LET), were used to correlate co-seismic landslides with various landslide controlling parameters. These controlling parameters include elevation, slope angle, slope aspect, slope curvature, topographic position, distance from drainages, lithology, distance from the epicenter, distance from the Enriquillo-Plantain Garden fault, distance along the fault, and peak ground acceleration (PGA). A comparison of these impact parameters on co-seismic landslides shows that slope angle is the strongest impact parameter on co-seismic landslide occurrence. 
Our co-seismic landslide inventory is much more detailed than other inventories in several previous publications. Therefore, we carried out comparisons of inventories of landslides triggered by the Haiti earthquake with other published results and proposed possible reasons for any differences. We suggest that the empirical functions between earthquake magnitude and co-seismic landslides need to be updated on the basis of the abundant and more complete co-seismic landslide inventories recently available.

  1. Long-Term Amorphous Drug Stability Predictions Using Easily Calculated, Predicted, and Measured Parameters.

    PubMed

    Nurzyńska, Katarzyna; Booth, Jonathan; Roberts, Clive J; McCabe, James; Dryden, Ian; Fischer, Peter M

    2015-09-01

    The purpose of this study was to develop a predictive model of the amorphous stability of drugs with particular relevance for poorly water-soluble compounds. Twenty-five representative neutral poorly soluble compounds with a diverse range of physicochemical properties and chemical structures were systematically selected from an extensive library of marketed drug products. The physical stability of the amorphous form, measured over a 6 month period by the onset of crystallization of amorphous films prepared by melting and quench-cooling, was assessed using polarized light microscopy. The data were used as a response variable in a statistical model with calculated/predicted or measured molecular, thermodynamic, and kinetic parameters as explanatory variables. Several multiple linear regression models were derived, with varying balance between calculated/predicted and measured parameters. It was shown that inclusion of measured parameters significantly improves the predictive ability of the model. The best model demonstrated a prediction accuracy of 82% and included the following as parameters: melting and glass transition temperatures, enthalpy of fusion, configurational free energy, relaxation time, number of hydrogen bond donors, lipophilicity, and the ratio of carbon to heteroatoms. Good predictions were also obtained with a simpler model, which was comprised of easily acquired quantities: molecular weight and enthalpy of fusion. Statistical models are proposed to predict long-term amorphous drug stability. The models include readily accessible parameters, which are potentially the key factors influencing amorphous stability. The derived models can support faster decision making in drug formulation development. PMID:26236939
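    The simpler model class mentioned at the end of this abstract, an ordinary least-squares fit of a stability measure on easily acquired quantities such as molecular weight and enthalpy of fusion, can be sketched as follows. All numbers are invented; only the multiple-linear-regression pattern reflects the abstract.

```python
# Hedged sketch: OLS fit of an amorphous-stability measure on molecular
# weight and enthalpy of fusion. Compounds and responses are invented.
import numpy as np

# Hypothetical compounds: [molecular weight, enthalpy of fusion (kJ/mol)]
X = np.array([[250.0, 25.0], [310.0, 32.0], [410.0, 22.0], [530.0, 18.0]])
y = np.array([30.0, 60.0, 120.0, 170.0])   # e.g. days to onset of crystallization

A = np.column_stack([np.ones(len(X)), X])  # add intercept column
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
pred = A @ coef
r2 = 1 - ((y - pred) ** 2).sum() / ((y - y.mean()) ** 2).sum()
print("coefficients:", np.round(coef, 3))
print("R^2:", round(float(r2), 3))
```

    In practice the response would be a measured crystallization-onset quantity and the fit would be validated out of sample rather than judged on in-sample R^2 alone.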

  2. Localized Smart-Interpretation

    NASA Astrophysics Data System (ADS)

    Lundh Gulbrandsen, Mats; Mejer Hansen, Thomas; Bach, Torben; Pallesen, Tom

    2014-05-01

    The complex task of setting up a geological model consists not only of combining available geological information into a conceptually plausible model, but also of ensuring consistency with available data, e.g. geophysical data. However, in many cases the direct geological information, e.g. borehole samples, is very sparse, so in order to create a geological model the geologist needs to rely on the geophysical data. The problem, however, is that the amount of geophysical data is in many cases so vast that it is practically impossible to integrate all of it into the manual interpretation process. This means that much of the information available from the geophysical surveys goes unexploited, which is a problem because the resulting geological model does not reach its full potential and hence is less trustworthy. We suggest an approach to geological modeling that (1) allows all geophysical data to be considered when building the geological model, (2) is fast, and (3) allows quantification of the geological modeling. The method is constructed to build a statistical model, f(d,m), describing the relation between what the geologist interprets, d, and what the geologist knows, m. The parameter m reflects any available information that can be quantified, such as geophysical data, the result of a geophysical inversion, elevation maps, etc. The parameter d reflects an actual interpretation, such as the depth to the base of a groundwater reservoir. First we infer a statistical model f(d,m) by examining sets of actual interpretations made by a geological expert, [d1, d2, ...], and the information used to perform the interpretations, [m1, m2, ...]. This makes it possible to quantify, through f(d,m), how the geological expert performs interpretation. As the geological expert proceeds interpreting, the number of interpreted data points from which the statistical model is inferred increases, and therefore the accuracy of the statistical model increases. 
When a model f(d,m) has been successfully inferred, we are able to simulate how the geological expert would perform an interpretation given some external information m, through f(d|m). We will demonstrate this method applied to geological interpretation and densely sampled airborne electromagnetic data. In short, our goal is to build a statistical model describing how a geological expert performs geological interpretation given some geophysical data. We then wish to use this statistical model to perform semi-automatic interpretation everywhere such geophysical data exist, in a manner consistent with the choices made by a geological expert. The benefits of such a statistical model are that (1) it provides a quantification of how a geological expert performs interpretation based on available diverse data, (2) all available geophysical information can be used, and (3) it allows much faster interpretation of large data sets.
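    The f(d|m) idea above can be sketched with a deliberately simple choice: learn a linear relation between expert picks d (e.g. depth to a reservoir base) and one auxiliary datum m per location, then propose a pick, with an uncertainty band, where only m exists. The training pairs and the linear form of f are illustrative assumptions, not the authors' model.

```python
# Hedged sketch: infer a simple f(d|m) from expert-interpreted points and
# use it to propose a pick at a new location. Training pairs are invented.
import numpy as np

# (m_i, d_i): auxiliary datum and the expert's interpreted depth
m = np.array([10.0, 20.0, 30.0, 40.0])
d = np.array([5.2, 9.8, 15.1, 20.2])

# Fit f(d|m) as a linear trend plus a residual spread
A = np.column_stack([np.ones_like(m), m])
coef, *_ = np.linalg.lstsq(A, d, rcond=None)
resid_sd = np.std(d - A @ coef, ddof=2)

m_new = 25.0
d_hat = coef[0] + coef[1] * m_new
print(f"proposed pick: {d_hat:.2f} ± {2 * resid_sd:.2f}")
```

    As the expert interprets more points, the training set grows and both the trend and the residual spread are re-estimated, mirroring the accuracy improvement described in the abstract.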

  3. Interpreting Bones.

    ERIC Educational Resources Information Center

    Weymouth, Patricia P.

    1986-01-01

    Describes an activity which introduces students to the nature and challenges of paleoanthropology. In the exercise, students identify diagrammed bones and make interpretations about the creature. Presents questions and tasks employed in the lesson. (ML)

  4. Interpretive Experiments

    ERIC Educational Resources Information Center

    DeHaan, Frank, Ed.

    1977-01-01

    Describes an interpretative experiment involving the application of symmetry and temperature-dependent proton and fluorine nmr spectroscopy to the solution of structural and kinetic problems in coordination chemistry. (MLH)

  5. An Easily Accessible Web-Based Minimization Random Allocation System for Clinical Trials

    PubMed Central

    Xiao, Lan; Huang, Qiwen; Yank, Veronica

    2013-01-01

    Background Minimization as an adaptive allocation technique has been recommended in the literature for use in randomized clinical trials. However, it remains uncommonly used due in part to a lack of easily accessible implementation tools. Objective To provide clinical trialists with a robust, flexible, and readily accessible tool for implementing covariate-adaptive biased-coin randomization. Methods We developed a Web-based random allocation system, MinimRan, that applies Pocock–Simon (for trials with 2 or more arms) and 2-way (currently limited to 2-arm trials) minimization methods for trials using only categorical prognostic factors or the symmetric Kullback–Leibler divergence minimization method for trials (currently limited to 2-arm trials) using continuous prognostic factors with or without categorical factors, in covariate-adaptive biased-coin randomization. Results In this paper, we describe the system’s essential statistical and computer programming features and provide as an example the randomization results generated by it in a recently completed trial. The system can be used in single- and double-blind trials as well as single-center and multicenter trials. Conclusions We expect the system to facilitate the translation of the 3 validated random allocation methods into broad, efficient clinical research practice. PMID:23872035
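    The Pocock–Simon method named in this abstract can be sketched for a 2-arm trial with categorical prognostic factors: for each candidate arm, compute the marginal imbalance that would result if the new patient joined it, then assign the imbalance-minimizing arm with high probability (a biased coin). The factor names, probability, and range-based imbalance measure below are illustrative assumptions, not MinimRan's actual implementation.

```python
# Hedged sketch: Pocock-Simon minimization with a biased coin for a 2-arm
# trial. Factor names and the 0.8 coin probability are invented.
import random

def minimize_assign(new_patient, counts, arms=("A", "B"), p_best=0.8, rng=random):
    """counts[arm][factor][level] = patients already on `arm` with that level."""
    imbalance = {}
    for arm in arms:
        total = 0
        for factor, level in new_patient.items():
            trial = {a: counts[a][factor].get(level, 0) for a in arms}
            trial[arm] += 1  # pretend the new patient joins `arm`
            total += max(trial.values()) - min(trial.values())  # range imbalance
        imbalance[arm] = total
    best = min(arms, key=lambda a: imbalance[a])
    others = [a for a in arms if a != best]
    arm = best if rng.random() < p_best else rng.choice(others)
    for factor, level in new_patient.items():
        counts[arm][factor][level] = counts[arm][factor].get(level, 0) + 1
    return arm

counts = {a: {"sex": {}, "age_group": {}} for a in ("A", "B")}
for pt in [{"sex": "F", "age_group": "<50"}, {"sex": "F", "age_group": ">=50"},
           {"sex": "M", "age_group": "<50"}]:
    print(minimize_assign(pt, counts))
```

    With p_best below 1 the assignment stays random enough to resist selection bias while still balancing the factor margins.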

  6. Interpreting Metonymy.

    ERIC Educational Resources Information Center

    Pankhurst, Anne

    1994-01-01

    This paper examines some of the problems associated with interpreting metonymy, a figure of speech in which an attribute or commonly associated feature is used to name or designate something. After defining metonymy and outlining the principles of metonymy, the paper explains the differences between metonymy, synecdoche, and metaphor. It is…

  7. Performing Interpretation

    ERIC Educational Resources Information Center

    Kothe, Elsa Lenz; Berard, Marie-France

    2013-01-01

    Utilizing a/r/tographic methodology to interrogate interpretive acts in museums, multiple areas of inquiry are raised in this paper, including: which knowledge is assigned the greatest value when preparing a gallery talk; what lies outside of disciplinary knowledge; how invitations to participate invite and disinvite in the same gesture; and what…

  8. Interpreting Evidence.

    ERIC Educational Resources Information Center

    Munsart, Craig A.

    1993-01-01

    Presents an activity that allows students to experience the type of discovery process that paleontologists necessarily followed during the early dinosaur explorations. Students are read parts of a story taken from the "American Journal of Science" and interpret the evidence leading to the discovery of Triceratops and Stegosaurus. (PR)

  9. CAinterprTools: An R package to help interpreting Correspondence Analysis' results

    NASA Astrophysics Data System (ADS)

    Alberti, Gianmarco

    2015-09-01

    Correspondence Analysis (CA) is a statistical exploratory technique frequently used in many research fields to graphically visualize the structure of contingency tables. Many programs, both commercial and free, perform CA, but none of them as yet provides a visual aid to the interpretation of the results. The 'CAinterprTools' package, designed to be used in the free R statistical environment, aims at filling that gap. A novice-to-intermediate R user has been considered as the target. Fifteen commands enable the user to easily obtain charts that help with (and are relevant to) the interpretation of the CA results, freeing the user from the need to inspect and scrutinize tabular CA outputs, and to look up values and statistics on which further calculations are necessary. The package also implements tests to assess the significance of the input table's total inertia and of the individual dimensions.
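    The quantities such tools help interpret, the total inertia of the table and each dimension's share of it, come from an SVD of the standardized residuals of the contingency table. A minimal sketch (in Python rather than R, with an invented table):

```python
# Hedged sketch: the core CA computation. The SVD of the standardized
# residuals of a contingency table yields principal inertias whose shares
# per dimension are what interpretation aids summarize. Table is invented.
import numpy as np

N = np.array([[30.0, 10.0, 5.0],
              [10.0, 20.0, 10.0],
              [ 5.0, 10.0, 25.0]])
P = N / N.sum()                            # correspondence matrix
r, c = P.sum(axis=1), P.sum(axis=0)        # row and column masses
# Standardized residuals: (observed - expected) / sqrt(expected)
S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))
U, sv, Vt = np.linalg.svd(S)
inertias = sv**2                           # principal inertias per dimension
print("total inertia:", round(float(inertias.sum()), 4))
print("share of dim 1:", round(float(inertias[0] / inertias.sum()), 3))
```

    The total inertia equals the table's chi-square statistic divided by the grand total, which is what significance tests on the inertia are built around.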

  10. Interpretive Medicine

    PubMed Central

    Reeve, Joanne

    2010-01-01

    Patient-centredness is a core value of general practice; it is defined as the interpersonal processes that support the holistic care of individuals. To date, efforts to demonstrate their relationship to patient outcomes have been disappointing, whilst some studies suggest values may be more rhetoric than reality. Contextual issues influence the quality of patient-centred consultations, impacting on outcomes. The legitimate use of knowledge, or evidence, is a defining aspect of modern practice, and has implications for patient-centredness. Based on a critical review of the literature, on my own empirical research, and on reflections from my clinical practice, I critique current models of the use of knowledge in supporting individualised care. Evidence-Based Medicine (EBM), and its implementation within health policy as Scientific Bureaucratic Medicine (SBM), define best evidence in terms of an epistemological emphasis on scientific knowledge over clinical experience. It provides objective knowledge of disease, including quantitative estimates of the certainty of that knowledge. Whilst arguably appropriate for secondary care, involving episodic care of selected populations referred in for specialist diagnosis and treatment of disease, application to general practice can be questioned given the complex, dynamic and uncertain nature of much of the illness that is treated. I propose that general practice is better described by a model of Interpretive Medicine (IM): the critical, thoughtful, professional use of an appropriate range of knowledges in the dynamic, shared exploration and interpretation of individual illness experience, in order to support the creative capacity of individuals in maintaining their daily lives. Whilst the generation of interpreted knowledge is an essential part of daily general practice, the profession does not have an adequate framework by which this activity can be externally judged to have been done well. 
Drawing on theory related to the recognition of quality in interpretation and knowledge generation within the qualitative research field, I propose a framework by which to evaluate the quality of knowledge generated within generalist, interpretive clinical practice. I describe three priorities for research in developing this model further, which will strengthen and preserve core elements of the discipline of general practice, and thus promote and support the health needs of the public. PMID:21805819

  11. SLAR image interpretation keys for geographic analysis

    NASA Technical Reports Server (NTRS)

    Coiner, J. C.

    1972-01-01

    Interpretation keys, offered as an easily implemented interpretation model, are suggested as a means by which side-looking airborne radar (SLAR) imagery can become a more widely used data source in geoscience and agriculture. Interpretation problems faced by the researcher wishing to employ SLAR are specifically described, and the use of various types of image interpretation keys to overcome these problems is suggested. With examples drawn from agriculture and vegetation mapping, direct and associate dichotomous image interpretation keys are discussed and methods of constructing keys are outlined. Initial testing of the keys, key-based automated decision rules, and the role of the keys in an information system for agriculture are developed.

  12. CT Colonography: Pitfalls in Interpretation

    PubMed Central

    Pickhardt, Perry J.; Kim, David H.

    2012-01-01

    Synopsis As with any radiologic imaging test, there are a number of potential interpretive pitfalls at CT colonography (CTC) that need to be recognized and handled appropriately. Perhaps the single most important step in learning to avoid most of these diagnostic traps is simply to be aware of their existence. With a little experience, most of these potential pitfalls will be easily recognized. This review will systematically cover the key pitfalls confronting the radiologist at CTC interpretation, primarily dividing them into those related to technique and those related to underlying anatomy. Tips and pointers for how to effectively handle these potential pitfalls are included. PMID:23182508

  13. An Easily Constructed Model of Twin Octahedrons Having a Common Line.

    ERIC Educational Resources Information Center

    Yamana, Shukichi; Kawaguchi, Makoto

    1984-01-01

    A model of twin octahedrons having a common line which is useful for teaching stereochemistry (especially that of complex ions) can be made easily by using a sealed, empty envelope. The steps necessary to accomplish this task are presented. (JN)

  14. LED champing: statistically blessed?

    PubMed

    Wang, Zhuo

    2015-06-10

    LED champing (the smart mixing of individual LEDs to match a desired color and lumen output) and color-mixing strategies have been widely used to maintain the color consistency of light engines. Light engines built from champed LEDs can easily achieve color consistency within a couple of MacAdam steps, even when starting from widely distributed LEDs. From a statistical point of view, the distributions of the color coordinates and of the flux after champing are studied. The related statistical parameters are derived, which facilitate process improvements such as Six Sigma and are instrumental for statistical quality control in mass production. PMID:26192863
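    The distributional point can be illustrated with a quick Monte Carlo sketch. The bin parameters below are invented, and flux-weighted averaging of chromaticity is only an approximation of additive color mixing, so this is a hedged illustration of why champing tightens the color distribution, not a reproduction of the paper's analytical derivation.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Hypothetical LED bin: chromaticity x ~ N(0.450, 0.004) and
# luminous flux ~ N(100 lm, 5 lm); numbers are illustrative only.
x = rng.normal(0.450, 0.004, size=(n, 2))
flux = rng.normal(100.0, 5.0, size=(n, 2))

# Champing a pair: fluxes add, and the mixture chromaticity is
# approximately the flux-weighted mean of the individual values.
mixed_flux = flux.sum(axis=1)
mixed_x = (x * flux).sum(axis=1) / mixed_flux
```

    The spread of mixed_x comes out close to 1/sqrt(2) of the single-LED spread, which is the sense in which champing is "statistically blessed".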

  15. Making On-Line Science Course Materials Easily Translatable and Accessible Worldwide: Challenges and Solutions

    ERIC Educational Resources Information Center

    Adams, Wendy K.; Alhadlaq, Hisham; Malley, Christopher V.; Perkins, Katherine K.; Olson, Jonathan; Alshaya, Fahad; Alabdulkareem, Saleh; Wieman, Carl E.

    2012-01-01

    The PhET Interactive Simulations Project partnered with the Excellence Research Center of Science and Mathematics Education at King Saud University with the joint goal of making simulations useable worldwide. One of the main challenges of this partnership is to make PhET simulations and the website easily translatable into any language. The PhET…

  16. INTERPRETING INDICATORS OF RANGELAND HEALTH, VERSION 4

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Land managers are in need of an assessment tool that provides a preliminary evaluation of rangeland health. Interpreting Indicators of Rangeland Health, Version 4 is the second published version of a protocol that uses 17 easily observed indicators summarized as three rangeland health attributes (s...

  17. Summary and interpretive synthesis

    SciTech Connect

    1995-05-01

    This chapter summarizes the major advances made through our integrated geological studies of the Lisburne Group in northern Alaska. The depositional history of the Lisburne Group is discussed in a framework of depositional sequence stratigraphy. Although individual parasequences (small-scale carbonate cycles) of the Wahoo Limestone cannot be correlated with certainty, parasequence sets can be interpreted as different systems tracts within the large-scale depositional sequences, providing insights on the paleoenvironments, paleogeography and platform geometry. Conodont biostratigraphy precisely established the position of the Mississippian-Pennsylvanian boundary within an important reference section, where established foraminiferal biostratigraphy is inconsistent with respect to conodont-based time-rock boundaries. However, existing Carboniferous conodont zonations are not readily applicable because most zonal indicators are absent, so a local zonation scheme was developed. Diagenetic studies of the Lisburne Group recognized nineteen subaerial exposure surfaces and developed a cement stratigraphy that includes: early cements associated with subaerial exposure surfaces in the Lisburne Group; cements associated with the sub-Permian unconformity; and later burial cements. Subaerial exposure surfaces in the Alapah Limestone are easily explained, being associated with peritidal environments at the boundaries of Sequence A. The Lisburne exposed in ANWR is generally tightly cemented and supermature, but could still be a good reservoir target in the adjacent subsurface of ANWR given the appropriate diagenetic, deformational and thermal history. Our ongoing research on the Lisburne Group will hopefully provide additional insights in future publications.

  18. Synthesis, Characterization, to application of water soluble and easily removable cationic pressure sensitive adhesives

    SciTech Connect

    Institute of Paper Science and Technology

    2004-01-30

    In recent years, the world has expressed an increasing interest in the recycling of waste paper to supplement the use of virgin fiber as a way to protect the environment. Statistics show that major countries are increasing their use of recycled paper. For example, from 1991 to 1996, the U.S. increased its recovered paper utilization rate from 31% to 39%, Germany went from 50% to 60%, the UK went from 60% to 70%, France increased from 46% to 49%, and China went from 32% to 35% [1]. As recycled fiber levels and water system closures both increase, recycled product quality will need to improve in order for recycled products to compete with products made from virgin fiber [2]. The use of recycled fiber has introduced an increasing level of metal, plastic, and adhesive contamination into the papermaking process, which has added to the complexity of the already overwhelming task of providing a uniform and clean recycled furnish. The most harmful of these contaminants is a mixture of adhesives and polymeric substances commonly known as stickies. Stickies, which enter the mill with the pulp furnish, are not easily removed at the repulper and become more difficult to remove the further down the system they travel. This can be detrimental to the final product quality. Stickies are hydrophobic, tacky, polymeric materials that are introduced into the papermaking system from a mixture of recycled fiber sources. The properties of stickies are very similar to those of the fibers used in papermaking, viz. size, density, hydrophobicity, and electrokinetic charge. This reduces the probability of their removal by conventional separation processes, such as screening and cleaning, which are based on such properties. Their physical and chemical structure also allows them to extrude through screens and attach to fibers, process equipment, wires and felts. Stickies can break down, reagglomerate, and appear at seemingly any place in the mill.
    When subjected to a number of factors including changes in pH, temperature, concentration, charge, and shear forces, stickies can deposit [3]. These deposits can lead to decreased runnability, lower productivity and expensive downtime. If the stickie remains in the stock, machine breaks can be common. Finally, if the stickie is not removed or deposited, it will either leave with the final product, causing converting and printing problems, or recirculate within the mill. It has been estimated that stickies cost the paper industry between $600 and $700 million a year due to the cost of control methods and lost production attributed to stickies [3]. Also, of the seven recycling mills opened in the United States between 1994 and 1997, four have closed citing stickies as the main reason for the closure [4]. Adhesives are widely used throughout the paper and paperboard industry and are subsequently found in the recycled pulp furnish. Hodgson stated that even the best stock preparation process can only remove 99% of the contaminants, the remaining 1% usually being adhesives of various types, typically 10-150 microns in effective diameter [5]. The large particles are removed by mechanical means such as cleaners and screens, and the smaller, colloidal particles can be removed by washing. The stickies that pass through the cleaning and screening processes cause 95% of the problems associated with recycling [6]. The cleaners will remove most of the stickies whose density differs from that of the pulp slurry (≈1.0 g/cm3), but will accept stickies with densities in the range 0.95-1.05 g/cm3 [2]. The hydrophobicity of the material is also an important characteristic of the stickie [7]; it causes stickies to agglomerate with other hydrophobic materials such as other stickies, lignin, and even pitch.
    The tacky and viscous nature of stickies contributes to many product and process problems, negatively affecting the practicality of recycled fiber use. The sources of stickies that evade conventional removal techniques are usually synthetic polymers, including acrylates, styrene butadiene rubber, vinyl acetates, and polypropylene [5,6,8-12]. Sources of these adhesives are usually broken down into categories based on application.

  19. Statistical Reform in School Psychology Research: A Synthesis

    ERIC Educational Resources Information Center

    Swaminathan, Hariharan; Rogers, H. Jane

    2007-01-01

    Statistical reform in school psychology research is discussed in terms of research designs, measurement issues, statistical modeling and analysis procedures, interpretation and reporting of statistical results, and finally statistics education.

  20. Revisiting the statistical analysis of pyroclast density and porosity data

    NASA Astrophysics Data System (ADS)

    Bernard, B.; Kueppers, U.; Ortiz, H.

    2015-07-01

    Explosive volcanic eruptions are commonly characterized based on a thorough analysis of the generated deposits. Amongst other characteristics in physical volcanology, density and porosity of juvenile clasts are some of the most frequently used characteristics to constrain eruptive dynamics. In this study, we evaluate the sensitivity of density and porosity data to statistical methods and introduce a weighting parameter to correct issues raised by the use of frequency analysis. Results of textural investigation can be biased by clast selection. Using statistical tools as presented here, the meaningfulness of a conclusion can be checked for any data set easily. This is necessary to define whether or not a sample has met the requirements for statistical relevance, i.e. whether a data set is large enough to allow for reproducible results. Graphical statistics are used to describe density and porosity distributions, similar to those used for grain-size analysis. This approach helps with the interpretation of volcanic deposits. To illustrate this methodology, we chose two large data sets: (1) directed blast deposits of the 3640-3510 BC eruption of Chachimbiro volcano (Ecuador) and (2) block-and-ash-flow deposits of the 1990-1995 eruption of Unzen volcano (Japan). We propose the incorporation of this analysis into future investigations to check the objectivity of results achieved by different working groups and guarantee the meaningfulness of the interpretation.
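    The weighting parameter itself is defined in the paper; as a generic, hedged sketch of the graphical statistics the authors borrow from grain-size analysis, the snippet below computes percentiles of a weighted density sample from its empirical CDF. The densities and the uniform weights are made up for illustration.

```python
import numpy as np

def weighted_percentile(values, weights, q):
    """q-th percentile of a weighted sample, by linear interpolation
    of the weighted empirical CDF."""
    v = np.asarray(values, dtype=float)
    w = np.asarray(weights, dtype=float)
    order = np.argsort(v)
    v, w = v[order], w[order]
    cdf = (np.cumsum(w) - 0.5 * w) / w.sum()
    return np.interp(q / 100.0, cdf, v)

# Illustrative clast densities (kg/m^3); a study-specific weighting
# parameter would replace the uniform weights used here.
rho = [800, 950, 1020, 1100, 1150, 1300, 1450, 1600]
w = np.ones(len(rho))

median = weighted_percentile(rho, w, 50)
# graphical "sorting" coefficient, analogous to grain-size analysis
sorting = (weighted_percentile(rho, w, 84) - weighted_percentile(rho, w, 16)) / 2
```

    Reporting such graphical statistics together with the sample size makes it easy to check whether a data set is large enough for reproducible results, which is the statistical-relevance criterion the abstract emphasizes.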

  1. Spider phobics more easily see a spider in morphed schematic pictures

    PubMed Central

    Kolassa, Iris-Tatjana; Buchmann, Arlette; Lauche, Romy; Kolassa, Stephan; Partchev, Ivailo; Miltner, Wolfgang HR; Musial, Frauke

    2007-01-01

    Background Individuals with social phobia are more likely to misinterpret ambiguous social situations as more threatening, i.e. they show an interpretive bias. This study investigated whether such a bias also exists in specific phobia. Methods Individuals with spider phobia or social phobia, spider aficionados and non-phobic controls saw morphed stimuli that gradually transformed from a schematic picture of a flower into a schematic picture of a spider by shifting the outlines of the petals until they turned into spider legs. Participants' task was to decide whether each stimulus was more similar to a spider, a flower or to neither object while EEG was recorded. Results An interpretive bias was found in spider phobia on a behavioral level: with the first opening of the petals of the flower anchor, spider phobics rated the stimuli as more unpleasant and arousing than the control groups and showed an elevated latent trait to classify a stimulus as a spider and a response-time advantage for spider-like stimuli. No cortical correlates on the level of ERPs of this interpretive bias could be identified. However, consistent with previous studies, social and spider phobic persons exhibited generally enhanced visual P1 amplitudes indicative of hypervigilance in phobia. Conclusion Results suggest an interpretive bias and generalization of phobia-specific responses in specific phobia. Similar effects have been observed in other anxiety disorders, such as social phobia and posttraumatic stress disorder. PMID:18021433

  2. Standardization of electrocardiographic interpretive statements: a menu for word processing.

    PubMed Central

    Dower, G. E.; Osborne, J. A.; Machado, H. B.; Stewart, D. E.

    1979-01-01

    Standardization of electrocardiographic interpretive statements is a goal of various coding systems, but word processing has not usually been considered. A simple, easily memorized system for clinical electrocardiography has been developed and used for approximately 60 000 interpretations. It takes the form of a "menu", in which boxes stand for various interpretive statements; the boxes are identified by mnemonics and marked by the interpreter when appropriate. The results provide better standardization, significant decreases in the numbers of descriptive statements and words per interpretation and considerable saving in typing time. Acceptance by the interpreters has been good. Features of the system allow for word processing as part of a polarcardiography computing system. PMID:427688
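    The menu described above maps marked boxes, identified by mnemonics, to standardized interpretive statements. The mnemonics and wording in the sketch below are hypothetical (the abstract does not list them); it only illustrates how such a menu lends itself to word processing.

```python
# Hypothetical mnemonic-to-statement menu; the actual mnemonics and
# wording of the published ECG menu are not reproduced here.
MENU = {
    "NSR": "Normal sinus rhythm.",
    "LVH": "Voltage criteria for left ventricular hypertrophy.",
    "AFIB": "Atrial fibrillation with variable ventricular response.",
}

def expand(marked_boxes):
    """Assemble the interpreter's marked boxes into report text."""
    return " ".join(MENU[m] for m in marked_boxes)

report = expand(["NSR", "LVH"])
```

    Because every interpreter draws from the same fixed statements, typing time drops and the wording of reports is standardized automatically, which is the gain the article reports.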

  3. Making On-line Science Course Materials Easily Translatable and Accessible Worldwide: Challenges and Solutions

    NASA Astrophysics Data System (ADS)

    Adams, Wendy K.; Alhadlaq, Hisham; Malley, Christopher V.; Perkins, Katherine K.; Olson, Jonathan; Alshaya, Fahad; Alabdulkareem, Saleh; Wieman, Carl E.

    2012-02-01

    The PhET Interactive Simulations Project partnered with the Excellence Research Center of Science and Mathematics Education at King Saud University with the joint goal of making simulations useable worldwide. One of the main challenges of this partnership is to make PhET simulations and the website easily translatable into any language. The PhET project team overcame this challenge by creating the Translation Utility. This tool allows a person fluent in both English and another language to easily translate any of the PhET simulations and requires minimal computer expertise. In this paper we discuss the technical issues involved in this software solution, as well as the issues involved in obtaining accurate translations. We share our solutions to many of the unexpected problems we encountered that would apply generally to making on-line scientific course materials available in many different languages, including working with: languages written right-to-left, different character sets, and different conventions for expressing equations, variables, units and scientific notation.

  4. Safe, Effective and Easily Reproducible Fusion Technique for CV Junction Instability

    PubMed Central

    Sannegowda, Raghavendra Bakki

    2015-01-01

    Introduction: The craniovertebral junction (CVJ) refers to the bony enclosure where the occipital bone surrounds the foramen magnum, together with the atlas and axis vertebrae. Because of the complexity of these structures, CVJ instability is associated with diagnostic and therapeutic problems. Posterior CV fusion procedures have evolved considerably over the last couple of decades, and there has been a search for a surgical procedure that is inherently safe, simple, easily reproducible and biomechanically sound. In this study, we present our initial experience with cases of CV junction instrumentation using an O-C1-C2 screw and rod construct operated on by the author. Aims and Objectives: The current study is a descriptive analysis of the cases of CVJ instability that we treated with instrumentation using the O-C1-C2 screw and rod construct fusion technique. Materials and Methods: This is a retrospective, analytical study in which cases of CV junction instability operated on by the author between January 2010 and March 2014 were analysed using various clinical, radiological and outcome parameters. Conclusion: CV junction instrumentation using the O-C1-C2 screw and rod construct fusion technique proved to be a safe, effective, easily reproducible and biomechanically sound technique that can be adopted by surgeons at any stage of their learning curve. PMID:25954660

  5. High-power CO2 electric discharge laser with easily ionized substances added

    NASA Astrophysics Data System (ADS)

    Apollonov, V. V.; Vaskovskiy, Y. M.; Zhavoronkov, M. I.; Prokhorov, A. M.; Rovinskiy, R. Y.; Rogalin, V. Y.; Ustinov, N. D.; Firsov, K. N.; Tsenina, I. S.; Yamshchikov, V. A.

    1986-02-01

    Optimization of the parameters of a transverse-discharge CO2 laser described in a previous study by the authors is investigated. The output characteristics of the laser are optimized by investigating the radiated energy as a function of the length of the active medium as well as the reflection coefficient of the exit mirror. It is found that by selecting the easily ionized substances and the pumping mode properly, and by optimizing the cavity, it is possible to obtain efficiencies and unit energy yields from an externally driven CO2 laser that are as good as those of corresponding electroionization systems. The laser used in the study employs an extremely compact electrode design and requires no low-inductance capacitors in the pumping circuit. A specific output energy of 51 J/l and an electric-to-light energy conversion efficiency of 22% are achieved.

  6. The study on development of easily chewable and swallowable foods for elderly

    PubMed Central

    Kim, Soojeong

    2015-01-01

    BACKGROUND/OBJECTIVES When the functions involved in the ingestion of food fail, not only is the enjoyment of eating lost, but protein-energy malnutrition can follow. Difficulty in chewing (dysmasesis) and swallowing occurs in various diseases, but aging may be a major cause, and the number of elderly people with chewing and swallowing difficulties is expected to increase rapidly in an aging society. SUBJECTS/METHODS In this study, we surveyed nutritionists who work in elderly care facilities, examining the characteristics of the foods offered to the elderly and the degree of demand for the development of easily chewable and swallowable foods for elderly people who can crush foods with their own tongues but sometimes have difficulty in drinking water and tea. RESULTS In elderly care facilities, elderly people with dysmasesis were found to be given finely chopped food or food ground with water in a blender. Elderly satisfaction with the foods provided appeared low overall. When we investigated the applicability of foods for the elderly and the willingness to reflect them in menus, a gelification method from molecular gastronomy showed the highest response rate; among the foods frequently served to the elderly (representative menus of beef, pork, white fish, anchovies and spinach), Korean barbecue beef, hot-pepper-paste stir-fried pork, pan-fried white fish, stir-fried anchovy and seasoned spinach were offered most frequently. CONCLUSIONS This study provides a basis for the development of easily chewable and swallowable foods, via gelification, for the elderly. It also suggests that, in the elderly, food that has undergone gelification will reduce the risk of swallowing down the wrong pipe and improve overall food preference. PMID:26244082

  7. Easily processable multimodal spectral converters based on metal oxide/organic-inorganic hybrid nanocomposites.

    PubMed

    Julián-López, Beatriz; Gonell, Francisco; Lima, Patricia P; Freitas, Vânia T; André, Paulo S; Carlos, Luis D; Ferreira, Rute A S

    2015-10-01

    This manuscript reports the synthesis and characterization of the first organic-inorganic hybrid material exhibiting efficient multimodal spectral converting properties. The nanocomposite, made of Er(3+), Yb(3+) codoped zirconia nanoparticles (NPs) entrapped in a di-ureasil d-U(600) hybrid matrix, is prepared by an easy two-step sol-gel synthesis leading to homogeneous and transparent materials that can be very easily processed as monolith or film. Extensive structural characterization reveals that zirconia nanocrystals of 10-20 nm in size are efficiently dispersed into the hybrid matrix and that the local structure of the di-ureasil is not affected by the presence of the NPs. A significant enhancement in the refractive index of the di-ureasil matrix with the incorporation of the ZrO2 nanocrystals is observed. The optical study demonstrates that luminescent properties of both constituents are perfectly preserved in the final hybrid. Thus, the material displays a white-light photoluminescence from the di-ureasil component upon excitation at UV/visible radiation and also intense green and red emissions from the Er(3+)- and Yb(3+)-doped NPs after NIR excitation. The dynamics of the optical processes were also studied as a function of the lanthanide content and the thickness of the films. Our results indicate that these luminescent hybrids represent a low-cost, environmentally friendly, size-controlled, easily processed and chemically stable alternative material to be used in light harvesting devices such as luminescent solar concentrators, optical fibres and sensors. Furthermore, this synthetic approach can be extended to a wide variety of luminescent NPs entrapped in hybrid matrices, thus leading to multifunctional and versatile materials for efficient tuneable nonlinear optical nanodevices. PMID:26374133

  8. Easily processable multimodal spectral converters based on metal oxide/organic-inorganic hybrid nanocomposites

    NASA Astrophysics Data System (ADS)

    Julián-López, Beatriz; Gonell, Francisco; Lima, Patricia P.; Freitas, Vânia T.; André, Paulo S.; Carlos, Luis D.; Ferreira, Rute A. S.

    2015-10-01

    This manuscript reports the synthesis and characterization of the first organic-inorganic hybrid material exhibiting efficient multimodal spectral converting properties. The nanocomposite, made of Er3+, Yb3+ codoped zirconia nanoparticles (NPs) entrapped in a di-ureasil d-U(600) hybrid matrix, is prepared by an easy two-step sol-gel synthesis leading to homogeneous and transparent materials that can be very easily processed as monolith or film. Extensive structural characterization reveals that zirconia nanocrystals of 10-20 nm in size are efficiently dispersed into the hybrid matrix and that the local structure of the di-ureasil is not affected by the presence of the NPs. A significant enhancement in the refractive index of the di-ureasil matrix with the incorporation of the ZrO2 nanocrystals is observed. The optical study demonstrates that luminescent properties of both constituents are perfectly preserved in the final hybrid. Thus, the material displays a white-light photoluminescence from the di-ureasil component upon excitation at UV/visible radiation and also intense green and red emissions from the Er3+- and Yb3+-doped NPs after NIR excitation. The dynamics of the optical processes were also studied as a function of the lanthanide content and the thickness of the films. Our results indicate that these luminescent hybrids represent a low-cost, environmentally friendly, size-controlled, easily processed and chemically stable alternative material to be used in light harvesting devices such as luminescent solar concentrators, optical fibres and sensors. Furthermore, this synthetic approach can be extended to a wide variety of luminescent NPs entrapped in hybrid matrices, thus leading to multifunctional and versatile materials for efficient tuneable nonlinear optical nanodevices.

  9. Interpretation training influences memory for prior interpretations.

    PubMed

    Salemink, Elske; Hertel, Paula; Mackintosh, Bundy

    2010-12-01

    Anxiety is associated with memory biases when the initial interpretation of the event is taken into account. This experiment examined whether modification of interpretive bias retroactively affects memory for prior events and their initial interpretation. Before training, participants imagined themselves in emotionally ambiguous scenarios to which they provided endings that often revealed their interpretations. Then they were trained to resolve the ambiguity in other situations in a consistently positive (n = 37) or negative way (n = 38) before they tried to recall the initial scenarios and endings. Results indicated that memory for the endings was imbued with the emotional tone of the training, whereas memory for the scenarios was unaffected. PMID:21171760

  10. GoCxx: a tool to easily leverage C++ legacy code for multicore-friendly Go libraries and frameworks

    NASA Astrophysics Data System (ADS)

    Binet, Sébastien

    2012-12-01

    Current HENP libraries and frameworks were written before multicore systems became widely deployed and used. From this environment, a ‘single-thread’ processing model naturally emerged, but the implicit assumptions it encouraged are greatly impairing our ability to scale in a multicore/manycore world. Writing scalable code in C++ for multicore architectures, while doable, is no panacea. Sure, C++11 will improve on the current situation (by standardizing on std::thread, introducing lambda functions and defining a memory model), but it will do so at the price of complicating further an already quite sophisticated language. This level of sophistication has probably already strongly motivated analysis groups to migrate to CPython, hoping for its current limitations with respect to multicore scalability to be either lifted (Global Interpreter Lock removal) or addressed by the advent of a new Python VM better tailored for this kind of environment (PyPy, Jython, …). Could HENP migrate to a language with none of the deficiencies of C++ (build time, deployment, low-level tools for concurrency) and with the fast turn-around time, simplicity and ease of coding of Python? This paper will try to make the case for Go - a young open source language with built-in facilities to easily express and expose concurrency - being such a language. We introduce GoCxx, a tool leveraging gcc-xml's output to automate the tedious work of creating Go wrappers for foreign languages, a critical task for any language wishing to leverage legacy and field-tested code. We will conclude with the first results of applying GoCxx to real C++ code.

  11. Statistical Software.

    ERIC Educational Resources Information Center

    Callamaras, Peter

    1983-01-01

    This buyer's guide to seven major types of statistics software packages for microcomputers reviews Edu-Ware Statistics 3.0; Financial Planning; Speed Stat; Statistics with DAISY; Human Systems Dynamics package of Stats Plus, ANOVA II, and REGRESS II; Maxistat; and Moore-Barnes' MBC Test Construction and MBC Correlation. (MBR)

  12. Cancer Statistics

    MedlinePLUS

    Cancer has a major impact on society in the United States and across the world. ... makers, health professionals, and researchers to understand the impact of ... poses to the society at large. Statistical trends are also important for ...

  13. Statistical Diversions

    ERIC Educational Resources Information Center

    Petocz, Peter; Sowey, Eric

    2008-01-01

    As a branch of knowledge, Statistics is ubiquitous and its applications can be found in (almost) every field of human endeavour. In this article, the authors track down the possible source of the link between the "Siren song" and applications of Statistics. Answers to their previous five questions and five new questions on Statistics are presented.

  14. Motivating Play Using Statistical Reasoning

    ERIC Educational Resources Information Center

    Cross Francis, Dionne I.; Hudson, Rick A.; Lee, Mi Yeon; Rapacki, Lauren; Vesperman, Crystal Marie

    2014-01-01

    Statistical literacy is essential in everyone's personal lives as consumers, citizens, and professionals. To make informed life and professional decisions, students are required to read, understand, and interpret vast amounts of information, much of which is quantitative. To develop statistical literacy so students are able to make sense of…

  15. SOCR: Statistics Online Computational Resource

    ERIC Educational Resources Information Center

    Dinov, Ivo D.

    2006-01-01

    The need for hands-on computer laboratory experience in undergraduate and graduate statistics education has been firmly established in the past decade. As a result a number of attempts have been undertaken to develop novel approaches for problem-driven statistical thinking, data analysis and result interpretation. In this paper we describe an…

  16. Revisiting the statistical analysis of pyroclast density and porosity data

    NASA Astrophysics Data System (ADS)

    Bernard, B.; Kueppers, U.; Ortiz, H.

    2015-03-01

    Explosive volcanic eruptions are commonly characterized based on a thorough analysis of the generated deposits. Amongst other characteristics in physical volcanology, density and porosity of juvenile clasts are some of the most frequently used characteristics to constrain eruptive dynamics. In this study, we evaluate the sensitivity of density and porosity data and introduce a weighting parameter to correct issues raised by the use of frequency analysis. Results of textural investigation can be biased by clast selection. Using the statistical tools presented here, the meaningfulness of a conclusion can easily be checked for any dataset. This is necessary to define whether or not a sample has met the requirements for statistical relevance, i.e. whether a dataset is large enough to allow for reproducible results. Graphical statistics are used to describe density and porosity distributions, similar to those used for grain-size analysis. This approach helps with the interpretation of volcanic deposits. To illustrate this methodology we chose two large datasets: (1) directed blast deposits of the 3640-3510 BC eruption of Chachimbiro volcano (Ecuador) and (2) block-and-ash-flow deposits of the 1990-1995 eruption of Unzen volcano (Japan). We propose the use of this analysis in future investigations to check the objectivity of results achieved by different working groups and to guarantee the meaningfulness of the interpretation.
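    The percentile-based "graphical statistics" mentioned above can be sketched as follows. This is a minimal illustration in the spirit of grain-size analysis; the Folk & Ward-style formulas (graphical mean, inclusive graphical sorting) and the sample density values are this example's assumptions, not taken from the paper.

```python
def percentile(data, p):
    """Linear-interpolation percentile, p in [0, 100]."""
    s = sorted(data)
    k = (len(s) - 1) * p / 100.0
    lo, hi = int(k), min(int(k) + 1, len(s) - 1)
    return s[lo] + (s[hi] - s[lo]) * (k - lo)

def graphical_stats(densities):
    """Percentile-based descriptors of a density distribution."""
    d16, d50, d84 = (percentile(densities, p) for p in (16, 50, 84))
    d5, d95 = percentile(densities, 5), percentile(densities, 95)
    median = d50
    mean = (d16 + d50 + d84) / 3.0                    # graphical mean
    sorting = (d84 - d16) / 4.0 + (d95 - d5) / 6.6    # inclusive graphical sorting
    return median, mean, sorting

# Hypothetical pyroclast densities in kg/m^3:
sample = [950, 1020, 1100, 1150, 1210, 1280, 1350, 1430, 1520, 1700]
print(graphical_stats(sample))
```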

  17. Statistics 101 for Radiologists.

    PubMed

    Anvari, Arash; Halpern, Elkan F; Samir, Anthony E

    2015-10-01

    Diagnostic tests have wide clinical applications, including screening, diagnosis, measuring treatment effect, and determining prognosis. Interpreting diagnostic test results requires an understanding of key statistical concepts used to evaluate test efficacy. This review explains descriptive statistics and discusses probability, including mutually exclusive and independent events and conditional probability. In the inferential statistics section, a statistical perspective on study design is provided, together with an explanation of how to select appropriate statistical tests. Key concepts in recruiting study samples are discussed, including representativeness and random sampling. Variable types are defined, including predictor, outcome, and covariate variables, and the relationship of these variables to one another. In the hypothesis testing section, we explain how to determine if observed differences between groups are likely to be due to chance. We explain type I and II errors, statistical significance, and study power, followed by an explanation of effect sizes and how confidence intervals can be used to generalize observed effect sizes to the larger population. Statistical tests are explained in four categories: t tests and analysis of variance, proportion analysis tests, nonparametric tests, and regression techniques. We discuss sensitivity, specificity, accuracy, receiver operating characteristic analysis, and likelihood ratios. Measures of reliability and agreement, including κ statistics, intraclass correlation coefficients, and Bland-Altman graphs and analysis, are introduced. (©)RSNA, 2015. PMID:26466186
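    The test-efficacy measures discussed in this review (sensitivity, specificity, accuracy, likelihood ratios) follow directly from a 2x2 confusion matrix. A minimal sketch; the counts are invented for demonstration and are not from the article.

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Basic test-efficacy statistics from confusion-matrix counts."""
    sensitivity = tp / (tp + fn)                # true-positive rate
    specificity = tn / (tn + fp)                # true-negative rate
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    lr_pos = sensitivity / (1 - specificity)    # positive likelihood ratio
    lr_neg = (1 - sensitivity) / specificity    # negative likelihood ratio
    return sensitivity, specificity, accuracy, lr_pos, lr_neg

sens, spec, acc, lrp, lrn = diagnostic_metrics(tp=90, fp=20, fn=10, tn=80)
print(f"sensitivity={sens:.2f} specificity={spec:.2f} "
      f"accuracy={acc:.2f} LR+={lrp:.2f} LR-={lrn:.2f}")
```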

  18. How to limit clinical errors in interpretation of data.

    PubMed

    Wright, P; Jansen, C; Wyatt, J C

    1998-11-01

    We all assume that we can understand and correctly interpret what we read. However, interpretation is a collection of subtle processes that are easily influenced by poor presentation or wording of information. This article examines how evidence-based principles of information design can be applied to medical records to enhance clinical understanding and accuracy in interpretation of the detailed data that they contain. PMID:9820319

  19. An easily-achieved time-domain beamformer for ultrafast ultrasound imaging based on compressive sensing.

    PubMed

    Congzhi Wang; Xi Peng; Dong Liang; Yang Xiao; Weibao Qiu; Ming Qian; Hairong Zheng

    2015-08-01

    In the ultrafast ultrasound imaging technique, how to maintain the high frame rate while improving the image quality as far as possible has become a significant issue. Several novel beamforming methods based on compressive sensing (CS) theory have been proposed in the previous literature, but all have their own limitations, such as excessively large memory consumption and the errors caused by the short-time discrete Fourier transform (STDFT). In this study, a novel CS-based time-domain beamformer for plane-wave ultrasound imaging is proposed, and its image quality has been verified to be better than that of the traditional delay-and-sum (DAS) method and even the popular coherent compounding method on several simulated phantoms. Compared to the existing CS methods, the memory consumption of our method is significantly reduced since the encoding matrix can be sparsely expressed. In addition, the time-delay calculations of the echo signals are accomplished directly in the time domain with a dictionary concept, avoiding the errors induced by the short-time Fourier transform calculation in the frequency-domain methods. The proposed method can be easily implemented on low-cost hardware platforms and can obtain ultrasound images with both a high frame rate and good image quality, which gives it great potential for clinical application. PMID:26738024

  20. Cholesteryl ester storage disease: an easily missed diagnosis in oligosymptomatic children.

    PubMed

    Freudenberg, F; Bufler, P; Ensenauer, R; Lohse, P; Koletzko, S

    2013-10-01

    Cholesteryl ester storage disease (CESD) is a rare, autosomal recessively inherited disorder resulting from deficient activity of lysosomal acid lipase (LAL). LAL is the key enzyme hydrolyzing cholesteryl esters and triglycerides stored in lysosomes after LDL receptor-mediated endocytosis. Mutations within the LIPA gene locus on chromosome 10q23.2-q23.3 may result either in the always fatal Wolman disease, in which no LAL activity is found, or in the more benign disorder CESD, in which a reduced enzymatic activity leads to massive accumulation of cholesteryl esters and triglycerides in many body tissues. CESD mostly affects the liver, with a spectrum ranging from isolated hepatomegaly to liver cirrhosis. Chronic diarrhea has been reported in some pediatric cases, while calcifications of the adrenal glands, the hallmark of Wolman disease, are rarely observed. Hypercholesterolemia and premature atherosclerosis are other typical disease manifestations. Hepatomegaly as a key finding has been reported in all 71 pediatric patients and in 134 of 135 adult cases in the literature. We present a 13-year-old boy with mildly elevated liver enzymes in the absence of hepatomegaly, finally diagnosed with CESD. Under pravastatin treatment, the patient has normal laboratory findings and has remained clinically unremarkable over 5 years of follow-up. To our knowledge, this is the first pediatric case of genetically and biopsy-confirmed CESD without hepatomegaly, suggesting that this diagnosis can be easily missed. It further raises the question of the natural course of, and the therapy required for, this oligosymptomatic form. PMID:24122380

  1. An easily reversible structural change underlies mechanisms enabling desert crust cyanobacteria to survive desiccation.

    PubMed

    Bar-Eyal, Leeat; Eisenberg, Ido; Faust, Adam; Raanan, Hagai; Nevo, Reinat; Rappaport, Fabrice; Krieger-Liszkay, Anja; Sétif, Pierre; Thurotte, Adrien; Reich, Ziv; Kaplan, Aaron; Ohad, Itzhak; Paltiel, Yossi; Keren, Nir

    2015-10-01

    Biological desert sand crusts are the foundation of desert ecosystems, stabilizing the sands and allowing colonization by higher order organisms. The first colonizers of the desert sands are cyanobacteria. Facing the harsh conditions of the desert, these organisms must withstand frequent desiccation-hydration cycles, combined with high light intensities. Here, we characterize structural and functional modifications to the photosynthetic apparatus that enable a cyanobacterium, Leptolyngbya sp., to thrive under these conditions. Using multiple in vivo spectroscopic and imaging techniques, we identified two complementary mechanisms for dissipating absorbed energy in the desiccated state. The first mechanism involves the reorganization of the phycobilisome antenna system, increasing excitonic coupling between antenna components. This provides better energy dissipation in the antenna rather than directed exciton transfer to the reaction center. The second mechanism is driven by constriction of the thylakoid lumen which limits diffusion of plastocyanin to P700. The accumulation of P700(+) not only prevents light-induced charge separation but also efficiently quenches excitation energy. These protection mechanisms employ existing components of the photosynthetic apparatus, forming two distinct functional modes. Small changes in the structure of the thylakoid membranes are sufficient for quenching of all absorbed energy in the desiccated state, protecting the photosynthetic apparatus from photoinhibitory damage. These changes can be easily reversed upon rehydration, returning the system to its high photosynthetic quantum efficiency. PMID:26188375

  2. Easily separated silver nanoparticle-decorated magnetic graphene oxide: Synthesis and high antibacterial activity.

    PubMed

    Zhang, Huai-Zhi; Zhang, Chang; Zeng, Guang-Ming; Gong, Ji-Lai; Ou, Xiao-Ming; Huan, Shuang-Yan

    2016-06-01

    Silver nanoparticle-decorated magnetic graphene oxide (MGO-Ag) was synthesized by doping silver and Fe3O4 nanoparticles on the surface of GO and was used as an antibacterial agent. MGO-Ag was characterized by scanning electron microscopy (SEM), transmission electron microscopy (TEM), energy-dispersive X-ray spectroscopy (EDS), X-ray diffraction (XRD), Raman spectroscopy and magnetic property tests. The magnetic iron oxide nanoparticles and nano-Ag were found to be well dispersed on the graphene oxide, and MGO-Ag exhibited excellent antibacterial activity against Escherichia coli and Staphylococcus aureus. Several factors affecting the antibacterial performance of MGO-Ag were investigated, such as temperature, time, pH and bacterial concentration. We also found that MGO-Ag maintained high inactivation rates after six cycles of use and could be separated easily after the antibacterial process. Moreover, the antibacterial mechanism is discussed, and the synergistic effect of GO, Fe3O4 nanoparticles and nano-Ag accounts for the high inactivation efficiency of MGO-Ag. PMID:26994349

  3. Shaft seals with an easily removable cylinder holder for low-pressure steam turbines

    NASA Astrophysics Data System (ADS)

    Zakharov, A. E.; Rodionov, D. A.; Pimenov, E. V.; Sobolev, A. S.

    2016-01-01

    The article is devoted to the problems that occur during the operation of low-pressure cylinder (LPC) shaft seals (SS) of turbines, particularly their bearings. The problems arising from the deterioration of the oil-protecting rings of the SS and bearings, and the consequences they can lead to, are considered. The existing types of SS housing construction are reviewed and their operational features specified. A new SS construction with an easily removable holder is presented, and the design of its main elements is described. The sequence of operations to be performed by repair personnel when restoring the spacings of the new SS type is proposed. A comparative analysis of the new and existing SS construction types is carried out, and the efficiency, operational convenience, and economic effect of installing the new type of seals are assessed. Conclusions about the prospects of the proposed construction are drawn from the results of this comparative analysis and assessment. The main advantage of the design is that the spacings, both in the SS and in the oil-protecting rings, can be restored during a short-term stop of a turbine, even without cooling it. The construction was successfully tested on a working K-300-23.5 LMP turbine, and its adaptation to other turbines is quite possible.

  4. Why can organic liquids move easily on smooth alkyl-terminated surfaces?

    PubMed

    Urata, Chihiro; Masheder, Benjamin; Cheng, Dalton F; Miranda, Daniel F; Dunderdale, Gary J; Miyamae, Takayuki; Hozumi, Atsushi

    2014-04-15

    The dynamic dewettability of a smooth alkyl-terminated sol-gel hybrid film surface against 17 probe liquids (polar and nonpolar, with high and low surface tensions) was systematically investigated using contact angle (CA) hysteresis and substrate tilt angle (TA) measurements, in terms of their physicochemical properties such as surface tension, molecular weight/volume, dielectric constant, density, and viscosity. We found that the dynamic dewettability of the hybrid film markedly depended not on the surface tensions but on the dielectric constants of the probe liquids, displaying lower resistance to liquid drop movement with decreasing dielectric constant (ε < 30). Interfacial analysis using the sum-frequency generation (SFG) technique confirmed that the conformation of surface-tethered alkyl chains was markedly altered before and after contact with the different types of probe liquids. When probe liquids with low dielectric constants were in contact with our surface, CH3 groups were preferentially exposed at the solid/liquid interface, leading to a reduction in surface energy. Because of such local changes in surface energy at the three-phase contact line of the probe liquid, the contact line can move continuously from low-surface-energy (solid/liquid) areas to surrounding high-surface-energy (solid/air) areas without pinning. Consequently, the organic probe liquids with low dielectric constants can move easily and roll off when tilted only slightly, independent of the magnitude of CAs, without relying on conventional surface roughening and perfluorination. PMID:24660770

  5. Open Window: When Easily Identifiable Genomes and Traits Are in the Public Domain

    PubMed Central

    Angrist, Misha

    2014-01-01

    “One can't be of an enquiring and experimental nature, and still be very sensible.” - Charles Fort [1] As the costs of personal genetic testing (“self-quantification”) fall, publicly accessible databases housing people's genotypic and phenotypic information are gradually increasing in number and scope. The latest entrant is openSNP, which allows participants to upload their personal genetic/genomic and self-reported phenotypic data. I believe the emergence of such open repositories of human biological data is a natural reflection of inquisitive and digitally literate people's desire to make genomic and phenotypic information more easily available to a community beyond the research establishment. Such unfettered databases hold the promise of contributing mightily to science, science education and medicine. That said, in an age of increasingly widespread governmental and corporate surveillance, we would do well to be mindful that genomic DNA is uniquely identifying. Participants in open biological databases are engaged in a real-time experiment whose outcome is unknown. PMID:24647311

  6. Easily Regenerable Solid Adsorbents Based on Polyamines for Carbon Dioxide Capture from the Air

    SciTech Connect

    Goeppert, A; Zhang, H; Czaun, M; May, RB; Prakash, GKS; Olah, GA; Narayanan, SR

    2014-03-18

    Adsorbents prepared easily by impregnation of fumed silica with polyethylenimine (PEI) are promising candidates for the capture of CO2 directly from the air. These inexpensive adsorbents have high CO2 adsorption capacity at ambient temperature and can be regenerated in repeated cycles under mild conditions. Despite the very low CO2 concentration, they are able to scrub efficiently all CO2 out of the air in the initial hours of the experiments. The influence of parameters such as PEI loading, adsorption and desorption temperature, particle size, and PEI molecular weight on the adsorption behavior were investigated. The mild regeneration temperatures required could allow the use of waste heat available in many industrial processes as well as solar heat. CO2 adsorption from the air has a number of applications. Removal of CO2 from a closed environment, such as a submarine or space vehicles, is essential for life support. The supply of CO2-free air is also critical for alkaline fuel cells and batteries. Direct air capture of CO2 could also help mitigate the rising concerns about atmospheric CO2 concentration and associated climatic changes, while, at the same time, provide the first step for an anthropogenic carbon cycle.

  7. Predicting protein interface residues using easily accessible on-line resources.

    PubMed

    Maheshwari, Surabhi; Brylinski, Michal

    2015-11-01

    It has been more than a decade since the completion of the Human Genome Project that provided us with a complete list of human proteins. The next obvious task is to figure out how various parts interact with each other. On that account, we review 10 methods for protein interface prediction, which are freely available as web servers. In addition, we comparatively evaluate their performance on a common data set comprising different quality target structures. We find that using experimental structures and high-quality homology models, structure-based methods outperform those using only protein sequences, with global template-based approaches providing the best performance. For moderate-quality models, sequence-based methods often perform better than those structure-based techniques that rely on fine atomic details. We note that post-processing protocols implemented in several methods quantitatively improve the results only for experimental structures, suggesting that these procedures should be tuned up for computer-generated models. Finally, we anticipate that advanced meta-prediction protocols are likely to enhance interface residue prediction. Notwithstanding further improvements, easily accessible web servers already provide the scientific community with convenient resources for the identification of protein-protein interaction sites. PMID:25797794

  8. Sexual dimorphism in venom chemistry in Tetragnatha spiders is not easily explained by adult niche differences.

    PubMed

    Binford, Greta J; Gillespie, Rosemary G; Maddison, Wayne P

    2016-05-01

    Spider venom composition typically differs between sexes. This pattern is anecdotally thought to reflect differences in adult feeding biology. We used a phylogenetic approach to compare intersexual venom dimorphism between species that differ in adult niche dimorphism. Male and female venoms were compared within and between related species of Hawaiian Tetragnatha, a mainland congener, and outgroups. In some species of Hawaiian Tetragnatha adult females spin orb-webs and adult males capture prey while wandering, while in other species both males and females capture prey by wandering. We predicted that, if venom sexual dimorphism is primarily explained by differences in adult feeding biology, species in which both sexes forage by wandering would have monomorphic venoms or venoms with reduced dimorphism relative to species with different adult feeding biology. However, we found striking sexual dimorphism in venoms of both wandering and orb-weaving Tetragnatha species with males having high molecular weight components in their venoms that were absent in females, and a reduced concentration of low molecular weight components relative to females. Intersexual differences in venom composition within Tetragnatha were significantly larger than in non-Tetragnatha species. Diet composition was not different between sexes. This striking venom dimorphism is not easily explained by differences in feeding ecology or behavior. Rather, we hypothesize that the dimorphism reflects male-specific components that play a role in mating biology possibly in sexual stimulation, nuptial gifts and/or mate recognition. PMID:26908290

  9. [Is an optimistic memory less easily influenced by negative than by positive emotions?].

    PubMed

    Beneyto Molina, Vicent Blai; Fernández-Abascal, Enrique García

    2012-05-01

    This work examines whether a positive personality trait, such as optimism, can reduce bias in differential words recalled after inducing a certain emotion. After showing a list of words with various emotional valences to a group of 59 subjects, a specific emotional state was induced. Subsequently, the subjects were asked to recall the list of words. The results obtained indicated that less optimistic subjects had a tendency to recall and recognize a greater number of negative words when in a negative emotional condition. Statistical significance was reached in the female group's negative word recognition when experiencing negative emotion. PMID:22420345

  10. Development of a numerical atlas of the easily flooded zones by marine immersions of the sandy littoral of Languedoc Roussillon (France)

    NASA Astrophysics Data System (ADS)

    Christophe, Esposito

    2010-05-01

    The Regional Directorate of Infrastructure (France) entrusted the Technical Studies Center for Infrastructure (CETE Méditerranée) with the development of a numerical atlas of the zones of the sandy Languedoc-Roussillon littoral that are easily flooded by marine immersion. The objective of this paper is to present the methodological results. To map the zones easily flooded by marine immersion (storm surge), we used several numerical databases, for example the "BD Topo Pays" and the aerial photography of the National Geographical Institute (IGN), and the geological mapping of the Geological and Mining Research Department (BRGM). To complete these data, we carried out a geomorphological interpretation of the littoral from the aerial photography. This naturalist approach identifies the geomorphological objects (beach, sand dune, ...) of the sandy littoral. Our objective was to determine the limit between the coastal plain (flooded by storm surge) and the alluvial plain (flooded by river overflow), as well as the areas not liable to flooding. In the first phase of the study, a progressive methodology was used to develop a version of the numerical atlas based on the available geographical data of geomorphological, historical and topographic nature. During the second phase, we applied this approach to the four French departments concerned (Pyrénées-Orientales, Aude, Hérault and Gard). The result is a map of the zones easily flooded by marine immersion along 230 km of sandy littoral. This mapping defines the geomorphological factors of the littoral and thus provides a qualitative assessment of the marine-immersion hazard. Keywords: Storm, Marine immersions, Atlas of the easily flooded zones, Languedoc-Roussillon, France

  11. Superomniphobic and easily repairable coatings on copper substrates based on simple immersion or spray processes.

    PubMed

    Rangel, Thomaz C; Michels, Alexandre F; Horowitz, Flávio; Weibel, Daniel E

    2015-03-24

    Textures that resemble typical fern or bracken plant species (dendrite structures) were fabricated for liquid repellency by dipping copper substrates in a single-step process in solutions containing AgNO3 or by a simple spray liquid application. Superhydrophobic surfaces were produced using a solution containing AgNO3 and trimethoxypropylsilane (TMPSi), and superomniphobic surfaces were produced by a two-step procedure, immersing the copper substrate in a AgNO3 solution and, after that, in a solution containing 1H,1H,2H,2H-perfluorodecyltriethoxysilane (PFDTES). The simple functionalization processes can also be used when the superomniphobic surfaces were destroyed by mechanical stress. By immersion of the wrecked surfaces in the above solutions or by the spray method and soft heating, the copper substrates could be easily repaired, regenerating the surfaces' superrepellency to liquids. The micro- and nanoroughness structures generated on copper surfaces by the deposition of silver dendrites functionalized with TMPSi presented apparent contact angles greater than 150° with a contact angle hysteresis lower than 10° when water was used as the test liquid. To avoid total wettability with very low surface tension liquids, such as rapeseed oil and hexadecane, a thin perfluorinated coating of poly(tetrafluoroethylene) (PTFE), produced by physical vapor deposition, was used. A more efficient perfluorinated coating was obtained when PFDTES was used. The superomniphobic surfaces produced apparent contact angles above 150° with all of the tested liquids, including hexadecane, although the contact angle hysteresis with this liquid was above 10°. The coupling of dendritic structures with TMPSi/PTFE or directly by PFDTES coatings was responsible for the superrepellency of the as-prepared surfaces. 
These simple, fast, and reliable procedures allow the large-area, cost-effective fabrication of superrepellent surfaces on copper substrates for various industrial applications, with the advantage of easy recovery of the surface repellency after damage. PMID:25714008

  12. Easily Removable Ureteral Catheters for Internal Drainage in Children: A Preliminary Report

    PubMed Central

    Park, Kyung Kgi; Kim, Myung Up; Chung, Mun Su; Lee, Dong Hoon

    2013-01-01

    Purpose We review our experience using a new and easily removable ureteral catheter in patients who underwent complicated ureteral reimplantation. Our goal was to shorten hospital stay and lower anxiety during catheter removal without fear of postoperative ureteral obstruction. Materials and Methods Between April 2009 and September 2010, nine patients who underwent our new method of catheter removal after ureteral reimplantation were enrolled. Patients who underwent simple ureteral reimplantation were excluded from the study. Following ureteral reimplantation, a combined drainage system consisting of a suprapubic cystostomy catheter and a ureteral catheter was installed. Proximal external tubing was clamped with a Hem-o-lok clamp and the rest of the external tubing was eliminated. Data concerning the age and sex of each patient, reason for operation, method of ureteral reimplantation, and postoperative parameters such as length of hospital stay and complications were recorded. Results Of the nine patients, four had refluxing megaureter, four had a solitary or non-functional contralateral kidney and one had ureteral stricture due to a previous anti-reflux operation. The catheter was removed at postoperative week one. The mean postoperative hospital stay was 2.4 days (range 1-4 days), and the mean follow-up was 9.8 months. None of the patients had postoperative ureteral obstructions, and there were no cases of migration or dislodgement of the catheter. Conclusion Our new method for removing the ureteral catheter would shorten hospital stays and lower levels of anxiety when removing ureteral catheters in patients with a high risk of postoperative ureteral obstruction. PMID:23364982

  13. Easily-handled method to isolate mesenchymal stem cells from coagulated human bone marrow samples

    PubMed Central

    Wang, Heng-Xiang; Li, Zhi-Yong; Guo, Zhi-Kun; Guo, Zi-Kuan

    2015-01-01

    AIM: To establish an easily-handled method to isolate mesenchymal stem cells (MSCs) from coagulated human bone marrow samples. METHODS: Thrombin was added to aliquots of seven heparinized human bone marrow samples to mimic marrow coagulation. The clots were untreated, treated with urokinase or mechanically cut into pieces before culture for MSCs. The un-coagulated samples and the clots were also stored at 4 °C for 8 or 16 h before the treatment. The numbers of colony-forming unit-fibroblast (CFU-F) in the different samples were determined. The adherent cells from different groups were passaged and their surface profile was analyzed with flow cytometry. Their capacities of in vitro osteogenesis and adipogenesis were observed after the cells were exposed to specific inductive agents. RESULTS: The average CFU-F number of urokinase-treated samples (16.85 ± 11.77/10⁶) was comparable to that of un-coagulated control samples (20.22 ± 10.65/10⁶, P = 0.293), which was significantly higher than those of mechanically-cut clots (6.5 ± 5.32/10⁶, P < 0.01) and untreated clots (1.95 ± 1.86/10⁶, P < 0.01). The CFU-F numbers decreased after samples were stored, but those of control and urokinase-treated clots remained higher than the other two groups. Consistently, the numbers of the attached cells at passage 0 were higher in control and urokinase-treated clots than those of mechanically-cut clots and untreated clots. The attached cells were fibroblast-like in morphology and homogenously positive for CD44, CD73 and CD90, and negative for CD31 and CD45. Also, they could be induced to differentiate into osteoblasts and adipocytes in vitro. CONCLUSION: Urokinase pretreatment is an optimal strategy to isolate MSCs from human bone marrow samples that are poorly aspirated and clotted. PMID:26435773

  14. Clearly written, easily comprehended? The readability of websites providing information on epilepsy.

    PubMed

    Brigo, Francesco; Otte, Willem M; Igwe, Stanley C; Tezzon, Frediano; Nardone, Raffaele

    2015-03-01

    There is a general need for high-quality, easily accessible, and comprehensive health-care information on epilepsy to better inform the general population about this highly stigmatized neurological disorder. The aim of this study was to evaluate the health literacy level of eight popular English-language websites that provide information on epilepsy in quantitative terms of readability. Educational epilepsy material on these websites, including 41 Wikipedia articles, was analyzed for its overall level of readability and the corresponding academic grade level needed to comprehend the published texts on the first reading. The Flesch Reading Ease (FRE) was used to assess ease of comprehension, while the Gunning Fog Index, Coleman-Liau Index, Flesch-Kincaid Grade Level, Automated Readability Index, and Simple Measure of Gobbledygook scales estimated the corresponding academic grade level needed for comprehension. The average readability of the websites yielded results indicative of a difficult-to-fairly-difficult readability level (FRE results: 44.0±8.2), with text readability corresponding to an 11th academic grade level (11.3±1.9). The average FRE score of the Wikipedia articles was indicative of a difficult readability level (25.6±9.5), with the other readability scales yielding results corresponding to a 14th grade level (14.3±1.7). Popular websites providing information on epilepsy, including Wikipedia, often demonstrate a low level of readability. This can be ameliorated by increasing access to clear and concise online information on epilepsy and health in general. Short "basic" summaries targeted to patients and nonmedical users should be added to articles published in specialist websites and Wikipedia to ease readability. PMID:25601720
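    Two of the scales used in this study follow standard published formulas: Flesch Reading Ease (206.835 − 1.015·words/sentence − 84.6·syllables/word) and Flesch-Kincaid Grade Level (0.39·words/sentence + 11.8·syllables/word − 15.59). A minimal sketch; the vowel-group syllable counter is a deliberately rough heuristic (real readability tools use dictionaries or better rules).

```python
import re

def count_syllables(word):
    """Rough heuristic: count groups of consecutive vowels."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def readability(text):
    """Flesch Reading Ease and Flesch-Kincaid Grade Level."""
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    wps = len(words) / sentences   # words per sentence
    spw = syllables / len(words)   # syllables per word
    fre = 206.835 - 1.015 * wps - 84.6 * spw
    fkgl = 0.39 * wps + 11.8 * spw - 15.59
    return fre, fkgl

print(readability("The cat sat on the mat."))
```

    Simple text scores high on FRE (easy) and low on FKGL; dense polysyllabic text does the opposite.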

  15. The Sclerotic Scatter Limbal Arc Is More Easily Elicited under Mesopic Rather Than Photopic Conditions

    PubMed Central

    Denion, Eric; Lux, Anne-Laure; Mouriaux, Frédéric; Béraud, Guillaume

    2016-01-01

    Introduction We aimed to determine the limbal lighting illuminance thresholds (LLITs) required to trigger perception of sclerotic scatter at the opposite non-illuminated limbus (i.e. perception of a light limbal scleral arc) under different levels of ambient lighting illuminance (ALI). Material and Methods Twenty healthy volunteers were enrolled. The iris shade (light or dark) was graded by retrieving the median value of the pixels of a pre-determined zone of a gray-level iris photograph. Mean keratometry and central corneal pachymetry were recorded. Each subject was asked to lie down, and the ALI at eye level was set to mesopic values (10, 20, 40 lux), then photopic values (60, 80, 100, 150, 200 lux). For each ALI level, a light beam of gradually increasing illuminance was applied to the right temporal limbus until the LLIT was reached, i.e. the level required to produce the faint light arc that is characteristic of sclerotic scatter at the nasal limbus. Results After log-log transformation, a linear relationship between the logarithm of ALI and the logarithm of the LLIT was found (p<0.001), a 10% increase in ALI being associated with an average increase in the LLIT of 28.9%. Higher keratometry values were associated with higher LLIT values (p = 0.008) under low ALI levels, but the coefficient of the interaction was very small, representing a very limited effect. Iris shade and central corneal thickness values were not significantly associated with the LLIT. We also developed a censored linear model for ALI values ≤ 40 lux, showing a linear relationship between ALI and the LLIT, in which the LLIT value was 34.4 times greater than the ALI value. Conclusion Sclerotic scatter is more easily elicited under mesopic conditions than under photopic conditions and requires the LLIT value to be much higher than the ALI value, i.e. it requires extreme contrast. PMID:26964096
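    The log-log relationship reported here is an elasticity: if a 10% increase in ALI is associated with a 28.9% increase in the LLIT, the implied slope on log-log axes is log(1.289)/log(1.10), roughly 2.66. A small sketch (using illustrative, exactly power-law data, not the study's measurements) shows how such a slope is recovered by linear regression on logarithms:

```python
import math
import numpy as np

# Elasticity implied by "10% ALI increase -> 28.9% LLIT increase"
slope = math.log(1.289) / math.log(1.10)
print(round(slope, 2))  # roughly 2.66

# Illustrative power-law data: LLIT = c * ALI^slope (hypothetical values)
ali = np.array([10.0, 20.0, 40.0, 60.0, 80.0, 100.0, 150.0, 200.0])
llit = 5.0 * ali ** slope

# Fitting a line to log(LLIT) vs log(ALI) recovers the exponent
b, log_c = np.polyfit(np.log(ali), np.log(llit), 1)
print(round(b, 2))
```

    The censored model for ALI ≤ 40 lux (LLIT = 34.4 × ALI) is a separate, purely linear description of the mesopic range; the log-log fit above covers the full range of ambient levels.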

  16. Interpretation in Sweden.

    ERIC Educational Resources Information Center

    Hultman, Sven-G.

    1987-01-01

    Describes some of the interpretive developments underway in Sweden. Discusses some programs in both natural and cultural interpretation. Calls for better conveying the purpose and content of heritage preservation and conservation to the general public. (TW)

  17. Journalists as Interpretive Communities.

    ERIC Educational Resources Information Center

    Zelizer, Barbie

    1993-01-01

    Proposes viewing journalists as members of an interpretive community (not a profession) united by its shared discourse and collective interpretations of key public events. Applies the frame of the interpretive community to journalistic discourse about two events central for American journalists--Watergate and McCarthyism. (SR)

  18. Translation and Interpretation.

    ERIC Educational Resources Information Center

    Nicholson, Nancy Schweda

    1995-01-01

    Examines recent trends in the fields of translation and interpretation, focusing on translation and interpretation theory and practice, language-specific challenges, computer-assisted translation, machine translation, subtitling, and translator and interpreter training. An annotated bibliography discusses seven important works in the field. (112…

  19. Interpreting. PEPNet Tipsheet

    ERIC Educational Resources Information Center

    Darroch, Kathy; Marshall, Liza

    1998-01-01

    An interpreter's role is to facilitate communication and convey all auditory and signed information so that both hearing and deaf individuals may fully interact. The common types of services provided by interpreters are: (1) American Sign Language (ASL) Interpretation--a visual-gestural language with its own linguistic features; (2) Sign Language…

  20. Interpreting. PEPNet Tipsheet

    ERIC Educational Resources Information Center

    Darroch, Kathleen

    2010-01-01

    An interpreter's role is to facilitate communication and convey all auditory and signed information so that both hearing and deaf individuals may fully interact. The common types of services provided by interpreters are: (1) American Sign Language (ASL) Interpretation--a visual-gestural language with its own linguistic features; (2) Sign Language…

  1. Interpretation of cancer prevention trials.

    PubMed

    Moon, T E

    1989-09-01

    Principles and methods to guide interpretation require a different emphasis for cancer trials. Assumptions used to design a trial must be validated and modified during the trial to avoid limitations. To maximize information from such trials, recruitment strategies for commonly free-living subjects, measurements of safety and compliance, and ascertainment with pathologic review of endpoints must be obtained. Consideration of multiple endpoints may provide a better interpretation of cancer prevention for skin, colon, and transient occurrences illustrated by cervical dysplasia or biochemical precursors. A careful definition of the limitations of preventive trials is required. These include the actual size of the intervention groups, completeness and duration of follow-up, and comparison between trial participants and a defined source population. To obtain a valid interpretation with adequate precision of intervention effectiveness, time to endpoints should be evaluated using statistical multivariate methods such as Cox proportional hazard or relative risk models. These permit adjustment for important confounding and risk modifiers such as compliance, dietary intake, and drift in control group. The magnitude of the intervention efficacy and the generalizability of results of the trial will be negatively impacted if the intervention has a delayed (latent) effect. Such delay in intervention effect requires added considerations with possible extension of trial duration. Use of confidence limits for intervention effectiveness provides added insight and improved interpretation of prevention trials. The final component of a cancer prevention trial, as with any study, is to interpret and report its results. Providing a valid interpretation with adequate precision to hypotheses of a cancer prevention trial requires added emphasis on the accuracy of the assumptions made to design the trial and the duration of the trial. 
Design assumptions regarding compliance with the prescribed interventions, time until the experimental intervention achieves full effect, and the frequency of endpoints directly impact the number of endpoints observed. Ongoing vigilance is required to avoid terminating a cancer prevention trial before adequate information is obtained, which would severely flaw its interpretation. Interpretation of a cancer prevention trial should include several added steps: first, investigators should critically review the actual manner in which the trial was conducted; second, they should carry out an appropriate analysis of the data; and third, they should review the results and note exceptions or limitations in the data. The results of the trial should be contrasted with previous studies, and the implications of the results for future trials should be considered. Finally, these interpretations should be documented in a written report made available to the scientific community. PMID:2694166

  2. Enhancing Table Interpretation Skills via Training in Table Creation

    ERIC Educational Resources Information Center

    Karazsia, Bryan T.

    2013-01-01

    Quantitative and statistical literacy are core domains in the undergraduate psychology curriculum. An important component of such literacy includes interpretation of visual aids, such as tables containing results from statistical analyses. This article presents a new technique for enhancing student interpretation of American Psychological…

  3. Health Statistics

    MedlinePLUS

    ... them all the time in the news - the number of people who were in the hospital last year, the ... all types of health statistics. Health statistics are numbers about some ... of diseases in groups of people. This can help in figuring out who is ...

  4. Statistical Diversions

    ERIC Educational Resources Information Center

    Petocz, Peter; Sowey, Eric

    2012-01-01

    The term "data snooping" refers to the practice of choosing which statistical analyses to apply to a set of data after having first looked at those data. Data snooping contradicts a fundamental precept of applied statistics, that the scheme of analysis is to be planned in advance. In this column, the authors shall elucidate the statistical…

  5. Interpreting Abstract Interpretations in Membership Equational Logic

    NASA Technical Reports Server (NTRS)

    Fischer, Bernd; Rosu, Grigore

    2001-01-01

    We present a logical framework in which abstract interpretations can be naturally specified and then verified. Our approach is based on membership equational logic, which extends equational logics by membership axioms, asserting that a term has a certain sort. We represent an abstract interpretation as a membership equational logic specification, usually as an overloaded order-sorted signature with membership axioms. It turns out that, for any term, its least sort over this specification corresponds to its most concrete abstract value. Maude implements membership equational logic and provides mechanisms to calculate the least sort of a term efficiently. We first show how Maude can be used to get prototyping of abstract interpretations "for free." Building on the meta-logic facilities of Maude, we further develop a tool that automatically checks an abstract interpretation against a set of user-defined properties. This can be used to select an appropriate abstract interpretation, to characterize the specified loss of information during abstraction, and to compare different abstractions with each other.

  6. Parametric trial-by-trial prediction of pain by easily available physiological measures.

    PubMed

    Geuter, Stephan; Gamer, Matthias; Onat, Selim; Büchel, Christian

    2014-05-01

    Pain is commonly assessed by subjective reports on rating scales. However, in many experimental and clinical settings, an additional, objective indicator of pain is desirable. In order to identify an objective, parametric signature of pain intensity that is predictive at the individual stimulus level across subjects, we recorded skin conductance and pupil diameter responses to heat pain stimuli of different durations and temperatures in 34 healthy subjects. The temporal profiles of trial-wise physiological responses were characterized by component scores obtained from principal component analysis. These component scores were then used as predictors in a linear regression analysis, resulting in accurate pain predictions for individual trials. Using the temporal information encoded in the principal component scores explained the data better than prediction by a single summary statistic (i.e., maximum amplitude). These results indicate that perceived pain is best reflected by the temporal dynamics of autonomic responses. Application of the regression model to an independent data set of 20 subjects resulted in a very good prediction of the pain ratings demonstrating the generalizability of the identified temporal pattern. Utilizing the readily available temporal information from skin conductance and pupil diameter responses thus allows parametric prediction of pain in human subjects. PMID:24525275
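    The pipeline described (trial-wise physiological response profiles → principal component scores → linear regression onto ratings) can be sketched in plain NumPy. Everything below is synthetic and illustrative; the two-component temporal structure, the variable names, and the noise-free ratings are assumptions, not the authors' data:

```python
import numpy as np

rng = np.random.default_rng(0)
n_trials, n_timepoints = 60, 50
t = np.linspace(0, 1, n_timepoints)

# Two hypothetical temporal basis shapes (e.g. an early peak and a late recovery)
basis = np.vstack([np.exp(-((t - 0.3) ** 2) / 0.01),
                   np.exp(-((t - 0.7) ** 2) / 0.02)])
weights = rng.uniform(0.5, 2.0, size=(n_trials, 2))
responses = weights @ basis                          # trial x time "physiological" traces
ratings = 3.0 * weights[:, 0] + 1.5 * weights[:, 1]  # synthetic "pain ratings"

# PCA via SVD of the mean-centered trial matrix; scores are trial-wise projections
centered = responses - responses.mean(axis=0)
u, s, vt = np.linalg.svd(centered, full_matrices=False)
scores = centered @ vt[:2].T

# Linear regression of ratings on the component scores (with an intercept)
design = np.column_stack([np.ones(n_trials), scores])
coef, *_ = np.linalg.lstsq(design, ratings, rcond=None)
pred = design @ coef
r2 = 1 - np.sum((ratings - pred) ** 2) / np.sum((ratings - ratings.mean()) ** 2)
print(round(r2, 3))
```

    Because the synthetic traces lie exactly in the span of the two basis shapes, the component scores carry all the rating-relevant information, which is the point the abstract makes: temporal profiles predict better than a single summary statistic such as the maximum amplitude.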

  7. Preparation and Use of an Easily Constructed, Inexpensive Chamber for Viewing Courtship Behaviors of Fruit Flies, Drosophila sp.

    ERIC Educational Resources Information Center

    Christensen, Timothy J.; Labov, Jay B.

    1997-01-01

    Details the construction of a viewing chamber for fruit flies that connects to a dissecting microscope and features a design that enables students to easily move fruit flies in and out of the chamber. (DDR)

  8. Basic statistics in cell biology.

    PubMed

    Vaux, David L

    2014-01-01

    The physicist Ernest Rutherford said, "If your experiment needs statistics, you ought to have done a better experiment." Although this aphorism remains true for much of today's research in cell biology, a basic understanding of statistics can be useful to cell biologists to help in monitoring the conduct of their experiments, in interpreting the results, in presenting them in publications, and when critically evaluating research by others. However, training in statistics is often focused on the sophisticated needs of clinical researchers, psychologists, and epidemiologists, whose conclusions depend wholly on statistics, rather than the practical needs of cell biologists, whose experiments often provide evidence that is not statistical in nature. This review describes some of the basic statistical principles that may be of use to experimental biologists, but it does not cover the sophisticated statistics needed for papers that contain evidence of no other kind. PMID:25000992

  9. Perception in statistical graphics

    NASA Astrophysics Data System (ADS)

    VanderPlas, Susan Ruth

    There has been quite a bit of research on statistical graphics and visualization, generally focused on new types of graphics, new software to create graphics, interactivity, and usability studies. Our ability to interpret and use statistical graphics hinges on the interface between the graph itself and the brain that perceives and interprets it, and there is substantially less research on the interplay between graph, eye, brain, and mind than is sufficient to understand the nature of these relationships. The goal of the work presented here is to further explore the interplay between a static graph, the translation of that graph from paper to mental representation (the journey from eye to brain), and the mental processes that operate on that graph once it is transferred into memory (mind). Understanding the perception of statistical graphics should allow researchers to create more effective graphs which produce fewer distortions and viewer errors while reducing the cognitive load necessary to understand the information presented in the graph. Taken together, these experiments should lay a foundation for exploring the perception of statistical graphics. There has been considerable research into the accuracy of numerical judgments viewers make from graphs, and these studies are useful, but it is more effective to understand how errors in these judgments occur so that the root cause of the error can be addressed directly. Understanding how visual reasoning relates to the ability to make judgments from graphs allows us to tailor graphics to particular target audiences. In addition, understanding the hierarchy of salient features in statistical graphics allows us to clearly communicate the important message from data or statistical models by constructing graphics which are designed specifically for the perceptual system.

  10. Interpretation biases in paranoia.

    PubMed

    Savulich, George; Freeman, Daniel; Shergill, Sukhi; Yiend, Jenny

    2015-01-01

    Information in the environment is frequently ambiguous in meaning. Emotional ambiguity, such as the stare of a stranger, or the scream of a child, encompasses possible good or bad emotional consequences. Those with elevated vulnerability to affective disorders tend to interpret such material more negatively than those without, a phenomenon known as "negative interpretation bias." In this study we examined the relationship between vulnerability to psychosis, measured by trait paranoia, and interpretation bias. One set of material permitted broadly positive/negative (valenced) interpretations, while another allowed more or less paranoid interpretations, allowing us to also investigate the content specificity of interpretation biases associated with paranoia. Regression analyses (n=70) revealed that trait paranoia, trait anxiety, and cognitive inflexibility predicted paranoid interpretation bias, whereas trait anxiety and cognitive inflexibility predicted negative interpretation bias. In a group comparison those with high levels of trait paranoia were negatively biased in their interpretations of ambiguous information relative to those with low trait paranoia, and this effect was most pronounced for material directly related to paranoid concerns. Together these data suggest that a negative interpretation bias occurs in those with elevated vulnerability to paranoia, and that this bias may be strongest for material matching paranoid beliefs. We conclude that content-specific biases may be important in the cause and maintenance of paranoid symptoms. PMID:25526839

  11. SEER Statistics

    Cancer.gov

    The Surveillance, Epidemiology, and End Results (SEER) Program of the National Cancer Institute works to provide information on cancer statistics in an effort to reduce the burden of cancer among the U.S. population.

  12. Quick Statistics

    MedlinePLUS

    ... population, or about 25 million Americans, has experienced tinnitus lasting at least five minutes in the past ... by NIDCD Epidemiology and Statistics Program staff: (1) tinnitus prevalence was obtained from the 2008 National Health ...

  13. Statistical Physics

    NASA Astrophysics Data System (ADS)

    Hermann, Claudine

    Statistical Physics bridges the properties of a macroscopic system and the microscopic behavior of its constituent particles, a connection otherwise impossible to establish because of the enormous magnitude of Avogadro's number. Numerous systems of today's key technologies - such as semiconductors or lasers - are macroscopic quantum objects; only statistical physics allows for understanding their fundamentals. Therefore, this graduate text also focuses on particular applications, such as the properties of electrons in solids, and on radiation thermodynamics and the greenhouse effect.

  14. The impact of easily oxidized material (EOM) on the meiobenthos: foraminifera abnormalities in shrimp ponds of New Caledonia; implications for environment and paleoenvironment survey.

    PubMed

    Debenay, J-P; Della Patrona, L; Herbland, A; Goguenheim, H

    2009-01-01

    This study was carried out in shrimp ponds from New Caledonia, in order to determine the cause of the exceptional proportion of abnormal tests (FAI) (often >50%, sometimes >80%). FAI was positively correlated to the quantity of easily oxidized material (EOM) deposited on the bottom of the ponds and to the sediment oxygen demand, and negatively correlated to redox. These results suggest that a very high FAI is a potential indicator for great accumulations of native organic matter, leading to a high sediment oxygen demand. When studying ancient sediments in core samples, exceptional abundances of abnormal tests may indicate periods of high accumulation of EOM, and therefore of oxygen depletion. This finding should help in better management of aquaculture ponds, but should also allow new insight into the interpretation of sedimentary records, providing a useful proxy for paleoenvironmental reconstructions. PMID:19735926

  15. Interpretation domestic and foreign.

    PubMed

    Vega, Jason A Wheeler

    2012-10-01

    Verbal and nonverbal behavior are on all fours when it comes to interpretation. This idea runs counter to an intuition that, to borrow a phrase, speech is cooked but action is raw. The author discusses some of the most compelling psychoanalytic work on the interpretation of action and presents empirical and philosophical findings about understanding speech. These concepts generate reciprocal implications about the possibility of interpreting the exotics of action and the necessity of interpreting the domestics of speech, treating both as equally dignified aspects of human behavior. The author presents a number of clinical examples to further illustrate these ideas. PMID:23326999

  16. Statistical phylogeography.

    PubMed

    Knowles, L Lacey; Maddison, Wayne P

    2002-12-01

    While studies of phylogeography and speciation in the past have largely focused on the documentation or detection of significant patterns of population genetic structure, the emerging field of statistical phylogeography aims to infer the history and processes underlying that structure, and to provide objective, rather than ad hoc, explanations. Methods for parameter estimation are now commonly used to make inferences about the demographic past. Although these approaches are well developed statistically, they typically pay little attention to geographical history. In contrast, methods that seek to reconstruct phylogeographic history are able to consider many alternative geographical scenarios, but are primarily nonstatistical, making inferences about particular biological processes without explicit reference to stochastically derived expectations. We advocate the merging of these two traditions so that statistical phylogeographic methods can provide an accurate representation of the past, consider a diverse array of processes, and yet yield a statistical estimate of that history. We discuss various conceptual issues associated with statistical phylogeographic inferences, considering especially the stochasticity of population genetic processes and assessing the confidence of phylogeographic conclusions. To this end, we present some empirical examples that utilize a statistical phylogeographic approach, and then, by contrasting results from a coalescent-based approach with those from Templeton's nested cladistic analysis (NCA), we illustrate the importance of assessing error. Because NCA does not assess error in its inferences about historical processes or contemporary gene flow, we performed a small-scale study using simulated data to examine how our conclusions might be affected by such unconsidered errors. NCA did not identify the processes used to simulate the data, confusing deterministic processes with the stochastic sorting of gene lineages. There is as yet insufficient justification of NCA's ability to accurately infer or distinguish among alternative processes. We close with a discussion of some unresolved problems of current statistical phylogeographic methods to propose areas in need of future development. PMID:12453245

  17. Crying without a cause and being easily upset in two-year-olds: heritability and predictive power of behavioral problems.

    PubMed

    Groen-Blokhuis, Maria M; Middeldorp, Christel M; M van Beijsterveldt, Catharina E; Boomsma, Dorret I

    2011-10-01

    In order to estimate the influence of genetic and environmental factors on 'crying without a cause' and 'being easily upset' in 2-year-old children, a large twin study was carried out. Prospective data were available for ~18,000 2-year-old twin pairs from the Netherlands Twin Register. A bivariate genetic analysis was performed using structural equation modeling in the Mx software package. The influence of maternal personality characteristics and demographic and lifestyle factors was tested to identify specific risk factors that may underlie the shared environment of twins. Furthermore, it was tested whether crying without a cause and being easily upset were predictive of later internalizing, externalizing and attention problems. Crying without a cause yielded a heritability estimate of 60% in boys and girls. For easily upset, the heritability was estimated at 43% in boys and 31% in girls. The variance explained by shared environment varied between 35% and 63%. The correlation between crying without a cause and easily upset (r = .36) was explained both by genetic and shared environmental factors. Birth cohort, gestational age, socioeconomic status, parental age, parental smoking behavior and alcohol use during pregnancy did not explain the shared environmental component. Neuroticism of the mother explained a small proportion of the additive genetic, but not of the shared environmental effects for easily upset. Crying without a cause and being easily upset at age 2 were predictive of internalizing, externalizing and attention problems at age 7, with effect sizes of .28-.42. A large influence of shared environmental factors on crying without a cause and easily upset was detected. Although these effects could be specific to these items, we could not explain them by personality characteristics of the mother or by demographic and lifestyle factors, and we recognize that these effects may reflect other maternal characteristics. A substantial influence of genetic factors was found for the two items, which are predictive of later behavioral problems. PMID:21962130
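    The study estimated heritability with structural equation modeling in Mx; as a much cruder classical approximation, Falconer's formula decomposes variance from monozygotic (MZ) and dizygotic (DZ) twin correlations. The correlations below are invented for illustration only (chosen so the genetic share matches the 60% heritability reported for crying without a cause), not values from the paper:

```python
def falconer(r_mz, r_dz):
    """Classical ACE decomposition from twin correlations (Falconer's formula).

    h2: additive genetic variance share, c2: shared environment, e2: unique environment.
    """
    h2 = 2 * (r_mz - r_dz)   # MZ twins share ~2x the additive genetic overlap of DZ twins
    c2 = 2 * r_dz - r_mz     # what remains of the MZ correlation after genetics
    e2 = 1 - r_mz            # everything MZ twins do NOT share
    return h2, c2, e2

# Hypothetical twin correlations, for illustration only
h2, c2, e2 = falconer(r_mz=0.78, r_dz=0.48)
print(h2, c2, e2)
```

    Structural equation modeling generalizes this idea, fitting the A, C, and E variance components jointly (and bivariately, as in the study) with proper standard errors rather than this two-line arithmetic.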

  18. Interpretation of psychophysics response curves using statistical physics.

    PubMed

    Knani, S; Khalfaoui, M; Hachicha, M A; Mathlouthi, M; Ben Lamine, A

    2014-05-15

    Experimental gustatory curves have been fitted for four sugars (sucrose, fructose, glucose and maltitol), using a double layer adsorption model. Three parameters of the model are fitted, namely the number of molecules per site n, the maximum response RM and the concentration at half saturation C1/2. The behaviours of these parameters are discussed in relation to each molecule's characteristics. Starting from the double layer adsorption model, we additionally determined the adsorption energy of each molecule on taste receptor sites. The use of the threshold expression allowed us to gain information about the adsorption occupation rate of a receptor site which fires a minimal response at a gustatory nerve. Finally, by means of this model we could calculate the configurational entropy of the adsorption system, which can describe the order and disorder of the adsorbent surface. PMID:24423561
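    The paper's double-layer adsorption model is more elaborate than can be reproduced here, but its three fitted parameters (n, RM, C1/2) play the same roles as in a generic Hill-type saturation curve, which makes a convenient stand-in for illustration. This sketch is not the authors' model, and the parameter values are arbitrary:

```python
def response(c, n, r_max, c_half):
    """Hill-type saturation response: r_max at saturation, half-maximal at c == c_half,
    with n controlling steepness (molecules per site, in the adsorption analogy)."""
    x = (c / c_half) ** n
    return r_max * x / (1 + x)

# At the half-saturation concentration the response is r_max / 2 by construction
print(response(2.0, n=1.6, r_max=100.0, c_half=2.0))  # -> 50.0
```

    Fitting such a curve to measured gustatory responses (e.g. with nonlinear least squares) yields the saturation level, the half-saturation concentration, and the cooperativity exponent directly, which is why these three parameters summarize each sugar so compactly.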

  19. Prosody and Interpretation

    ERIC Educational Resources Information Center

    Erekson, James A.

    2010-01-01

    Prosody is a means for "reading with expression" and is one aspect of oral reading competence. This theoretical inquiry asserts that prosody is central to interpreting text, and draws distinctions between "syntactic" prosody (for phrasing) and "emphatic" prosody (for interpretation). While reading with expression appears as a criterion in major…

  20. Using M&Ms to Develop Statistical Literacy

    ERIC Educational Resources Information Center

    Marshall, Linda; Swan, Paul

    2006-01-01

    Statistical literacy is defined as "the ability to read and interpret data: the ability to use statistics as evidence in arguments. Statistical literacy is a competency: the ability to think critically about statistics" (Schield, p. 2). When a definition of statistical literacy is considered it can be seen that all students can manage a level of…

  1. Statistical Fun

    ERIC Educational Resources Information Center

    Catley, Alan

    2007-01-01

    Following the announcement last year that there will be no more math coursework assessment at General Certificate of Secondary Education (GCSE), teachers will in the future be able to devote more time to preparing learners for formal examinations. One of the key things that the author has learned when teaching statistics is that it makes for far…

  2. Educational Statistics.

    ERIC Educational Resources Information Center

    Penfield, Douglas A.

    The 30 papers in the area of educational statistics that were presented at the 1972 AERA Conference are reviewed. The papers are categorized into five broad areas of interest: (1) theory of univariate analysis, (2) nonparametric methods, (3) regression-prediction theory, (4) multivariable methods, and (5) factor analysis. A list of the papers…

  3. Statistics Revelations

    ERIC Educational Resources Information Center

    Chicot, Katie; Holmes, Hilary

    2012-01-01

    The use, and misuse, of statistics is commonplace, yet in the printed format data representations can be either over simplified, supposedly for impact, or so complex as to lead to boredom, supposedly for completeness and accuracy. In this article the link to the video clip shows how dynamic visual representations can enliven and enhance the…

  4. Theory Interpretations in PVS

    NASA Technical Reports Server (NTRS)

    Owre, Sam; Shankar, Natarajan; Butler, Ricky W. (Technical Monitor)

    2001-01-01

    The purpose of this task was to provide a mechanism for theory interpretations in a prototype verification system (PVS) so that it is possible to demonstrate the consistency of a theory by exhibiting an interpretation that validates the axioms. The mechanization makes it possible to show that one collection of theories is correctly interpreted by another collection of theories under a user-specified interpretation for the uninterpreted types and constants. A theory instance is generated and imported, while the axiom instances are generated as proof obligations to ensure that the interpretation is valid. Interpretations can be used to show that an implementation is a correct refinement of a specification, that an axiomatically defined specification is consistent, or that an axiomatically defined specification captures its intended models. In addition, the theory parameter mechanism has been extended with a notion of theory as parameter so that a theory instance can be given as an actual parameter to an imported theory. Theory interpretations can thus be used to refine an abstract specification or to demonstrate the consistency of an axiomatic theory. In this report we describe the mechanism in detail. This extension is a part of PVS version 3.0, which will be publicly released in mid-2001.

  5. Conventional statistics and useful statistics.

    PubMed

    Rahlfs, V W

    1995-02-01

    Differences between conventional statistical methods and more useful, modern methods are demonstrated using a statistical analysis of data from therapeutic research in rheumatology. The conventional methods (the t-test, together with graphs of mean values and the boxplot) detect almost no differences between treatment groups. A more recent procedure for analysing group differences is the Wilcoxon-Mann-Whitney test. The associated graphs are based on the cumulative distribution functions of the two treatment groups and the synthetic Receiver Operating Characteristic (ROC) curve. Special differences, namely baseline dependencies, can be visualized in this way. PMID:7710451
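    The Wilcoxon-Mann-Whitney test and the synthetic ROC mentioned here are tightly linked: the U statistic divided by n1 x n2 is exactly the area under that ROC curve, i.e. the probability that a randomly chosen value from one group exceeds one from the other. A minimal sketch with made-up scores (the group values are invented for illustration):

```python
def mann_whitney_u(x, y):
    """U statistic for group x vs group y; ties are counted as half."""
    u = 0.0
    for xi in x:
        for yj in y:
            if xi > yj:
                u += 1.0
            elif xi == yj:
                u += 0.5
    return u

# Hypothetical treatment scores: group b tends to score higher than group a
a = [2.1, 3.0, 3.4, 4.2, 5.0]
b = [3.9, 4.8, 5.5, 6.1, 7.0]
u = mann_whitney_u(b, a)        # pairs where a b-value exceeds an a-value
auc = u / (len(a) * len(b))     # area under the synthetic ROC
print(u, auc)
```

    An AUC near 0.5 means the groups overlap completely; values near 1.0 mean nearly full separation, which is why the ROC view conveys group differences that mean-value plots can hide.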

  6. Smartphones for post-event analysis: a low-cost and easily accessible approach for mapping natural hazards

    NASA Astrophysics Data System (ADS)

    Tarolli, Paolo; Prosdocimi, Massimo; Sofia, Giulia; Dalla Fontana, Giancarlo

    2015-04-01

    A real opportunity and challenge for the hazard mapping is offered by the use of smartphones and low-cost and flexible photogrammetric technique (i.e. 'Structure-from-Motion'-SfM-). Differently from the other traditional photogrammetric methods, the SfM allows to reconstitute three-dimensional geometries (Digital Surface Models, DSMs) from randomly acquired images. The images can be acquired by standalone digital cameras (compact or reflex), or even by smartphones built-in cameras. This represents a "revolutionary" advance compared with more expensive technologies and applications (e.g. Terrestrial Laser Scanner TLS, airborne lidar) (Tarolli, 2014). Through fast, simple and consecutive field surveys, anyone with a smartphone can take a lot of pictures of the same study area. This way, high-resolution and multi-temporal DSMs may be obtained and used to better monitor and understand erosion and deposition processes. Furthermore, these topographic data can also facilitate to quantify volumes of eroded materials due to landslides and recognize the major critical issues that usually occur during a natural hazard (e.g. river bank erosion and/or collapse due to floods). In this work we considered different case studies located in different environmental contexts of Italy, where extensive photosets were obtained using smartphones. TLS data were also considered in the analysis as benchmark to compare with SfM data. Digital Surface Models (DSMs) derived from SfM at centimeter grid-cell resolution revealed to be effective to automatically recognize areas subject to surface instabilities, and estimate quantitatively erosion and deposition volumes, for example. Morphometric indexes such as landform curvature and surface roughness, and statistical thresholds (e.g. standard deviation) of these indices, served as the basis for the proposed analyses. 
The results indicate that the SfM technique, applied through smartphones, offers a fast, simple and affordable alternative to lidar technology. Anyone with a good smartphone (including farmers, technicians, or Civil Protection staff) can take photographs and easily obtain high-resolution DSMs from them. SfM combined with smartphones can therefore be a very strategic tool for post-event field surveys, to increase the existing knowledge of such events, and to provide fast technical solutions for risk mitigation (e.g. landslide and flood risk management). The future challenge is to use only a smartphone for local-scale post-event analyses. This can be further enhanced by the development of dedicated apps able to quickly build a 3D view of the study area and prepare a preliminary quantitative analysis of the process involved, ready to be sent to Civil Protection for further elaboration. Tarolli, P. (2014). High-resolution topography for understanding Earth surface processes: opportunities and challenges. Geomorphology, 216, 295-312, doi:10.1016/j.geomorph.2014.03.008.
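The abstract above mentions surface roughness and statistical thresholds (e.g. standard deviation) as the basis for detecting surface instabilities in a DSM. As a rough illustration of that idea only (the function names and the specific roughness definition are our assumptions, not the authors' implementation), a minimal NumPy sketch might look like:

```python
import numpy as np

def surface_roughness(dsm, window=3):
    """Local surface roughness as the standard deviation of elevation
    within a square moving window -- one common definition, used here
    purely for illustration."""
    pad = window // 2
    padded = np.pad(dsm, pad, mode="edge")
    rough = np.empty_like(dsm, dtype=float)
    for i in range(dsm.shape[0]):
        for j in range(dsm.shape[1]):
            rough[i, j] = padded[i:i + window, j:j + window].std()
    return rough

def instability_mask(rough, k=1.0):
    """Flag cells whose roughness exceeds mean + k * SD, mirroring the
    statistical-threshold idea in the abstract (k is a free choice)."""
    return rough > rough.mean() + k * rough.std()
```

On a flat DSM with a single bump, only the cells around the bump are flagged; on real centimeter-resolution SfM data the threshold `k` would need calibration against field evidence.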

  7. Double copper sheath multiconductor instrumentation cable is durable and easily installed in high thermal or nuclear radiation area

    NASA Technical Reports Server (NTRS)

    Mc Crae, A. W., Jr.

    1967-01-01

    Multiconductor instrumentation cable in which the conducting wires are routed through two concentric copper tube sheaths, with a compressed insulator between the conductors and between the inner and outer sheaths, is durable and easily installed in areas of high thermal or nuclear radiation. The double sheath is a barrier against moisture, abrasion, and vibration.

  8. Statistics Poster Challenge for Schools

    ERIC Educational Resources Information Center

    Payne, Brad; Freeman, Jenny; Stillman, Eleanor

    2013-01-01

    The analysis and interpretation of data are important life skills. A poster challenge for schoolchildren provides an innovative outlet for these skills and demonstrates their relevance to daily life. We discuss our Statistics Poster Challenge and the lessons we have learned.

  9. Women in Academic Medicine: Statistics.

    ERIC Educational Resources Information Center

    Bickel, Janet; And Others

    This document consists of an interpretive overview and statistical data about women in medicine. Nine tables and three figures are presented. The tables are organized as follows: (1) Women Applicants, Enrollees and Graduates--Selected Years, 1949-50 through 1993-94; (2) Comparative Acceptance Data for Men and Women Applicants, 1974-75 through…

  10. Singular statistics.

    PubMed

    Bogomolny, E; Gerland, U; Schmit, C

    2001-03-01

    We consider the statistical distribution of zeros of random meromorphic functions whose poles are independent random variables. It is demonstrated that correlation functions of these zeros can be computed analytically, and explicit calculations are performed for the two-point correlation function. This problem naturally appears in, e.g., rank-1 perturbation of an integrable Hamiltonian and, in particular, when a delta-function potential is added to an integrable billiard. PMID:11308740

  11. Interpreting Weather Maps.

    ERIC Educational Resources Information Center

    Smith, P. Sean; Ford, Brent A.

    1994-01-01

    Presents a brief introduction of our atmosphere, a guide to reading and interpreting weather maps, and a set of activities to facilitate teachers in helping to enhance student understanding of the Earth's atmosphere. (ZWH)

  12. Interpretation of Biosphere Reserves.

    ERIC Educational Resources Information Center

    Merriman, Tim

    1994-01-01

    Introduces the Man and the Biosphere Programme (MAB) to monitor the 193 biogeographical provinces of the Earth and the creation of biosphere reserves. Highlights the need for interpreters to become familiar or involved with MAB program activities. (LZ)

  13. Interpretation of Bernoulli's Equation.

    ERIC Educational Resources Information Center

    Bauman, Robert P.; Schwaneberg, Rolf

    1994-01-01

    Discusses Bernoulli's equation with regards to: horizontal flow of incompressible fluids, change of height of incompressible fluids, gases, liquids and gases, and viscous fluids. Provides an interpretation, properties, terminology, and applications of Bernoulli's equation. (MVL)

  14. Easily Accessible Camera Mount

    NASA Technical Reports Server (NTRS)

    Chalson, H. E.

    1986-01-01

    Modified mount enables fast alignment of movie cameras in explosion-proof housings. Screw on side is readily reached through side door of housing. Mount includes right-angle drive mechanism containing two miter gears that turn threaded shaft. Shaft drives movable dovetail clamping jaw that engages fixed dovetail plate on camera. Mechanism aligns camera in housing and secures it. Reduces installation time by 80 percent.

  15. Easily repairable networks

    NASA Astrophysics Data System (ADS)

    Fink, Thomas

    2015-03-01

    We introduce a simple class of distribution networks which withstand damage by being repairable instead of redundant. Instead of asking how hard it is to disconnect nodes through damage, we ask how easy it is to reconnect nodes after damage. We prove that optimal networks on regular lattices have an expected cost of reconnection proportional to the lattice length, and that such networks have exactly three levels of structural hierarchy. We extend our results to networks subject to repeated attacks, in which the repairs themselves must be repairable. We find that, in exchange for a modest increase in repair cost, such networks are able to withstand any number of attacks. We acknowledge support from the Defense Threat Reduction Agency, BCG and EU FP7 (Growthcom).

  16. Customizable tool for ecological data entry, assessment, monitoring, and interpretation

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The Database for Inventory, Monitoring and Assessment (DIMA) is a highly customizable tool for data entry, assessment, monitoring, and interpretation. DIMA is a Microsoft Access database that can easily be used without Access knowledge and is available at no cost. Data can be entered for common, nat...

  17. Interpreter-mediated dentistry.

    PubMed

    Bridges, Susan; Drew, Paul; Zayts, Olga; McGrath, Colman; Yiu, Cynthia K Y; Wong, H M; Au, T K F

    2015-05-01

    The global movements of healthcare professionals and patient populations have increased the complexities of medical interactions at the point of service. This study examines interpreter mediated talk in cross-cultural general dentistry in Hong Kong where assisting para-professionals, in this case bilingual or multilingual Dental Surgery Assistants (DSAs), perform the dual capabilities of clinical assistant and interpreter. An initial language use survey was conducted with Polyclinic DSAs (n = 41) using a logbook approach to provide self-report data on language use in clinics. Frequencies of mean scores using a 10-point visual analogue scale (VAS) indicated that the majority of DSAs spoke mainly Cantonese in clinics and interpreted for postgraduates and professors. Conversation Analysis (CA) examined recipient design across a corpus (n = 23) of video-recorded review consultations between non-Cantonese speaking expatriate dentists and their Cantonese L1 patients. Three patterns of mediated interpreting indicated were: dentist designated expansions; dentist initiated interpretations; and assistant initiated interpretations to both the dentist and patient. The third, rather than being perceived as negative, was found to be framed either in response to patient difficulties or within the specific task routines of general dentistry. The findings illustrate trends in dentistry towards personalized care and patient empowerment as a reaction to product delivery approaches to patient management. Implications are indicated for both treatment adherence and the education of dental professionals. PMID:25828074

  18. Interpreting the radon transform using Prolog

    NASA Astrophysics Data System (ADS)

    Batchelor, Bruce G.

    1992-03-01

    The Radon transform is an important method for identifying linear features in a digital image. However, the images which the Radon transform generates are complex and require intelligent interpretation, to identify lines in the input image correctly. This article describes how the images can be pre-processed to make the spots in the Radon transform image more easily identified and describes Prolog programs which can recognize constellations of points in the Radon transform image and thereby identify geometric figures within the input image.
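The core idea in the abstract above — that a line in the input image becomes a spot in the Radon transform image — can be sketched with a naive discrete accumulation. This is a generic illustration under our own assumptions, not the Prolog system the paper describes:

```python
import numpy as np

def radon_line(image, n_angles=180):
    """Naive discrete Radon/Hough accumulation: each nonzero pixel
    votes into (angle, offset) bins; the brightest bin identifies the
    dominant line x*cos(t) + y*sin(t) = offset."""
    ys, xs = np.nonzero(image)
    diag = int(np.ceil(np.hypot(*image.shape)))  # max possible |offset|
    acc = np.zeros((n_angles, 2 * diag + 1))
    for k in range(n_angles):
        t = np.deg2rad(k)
        offs = np.round(xs * np.cos(t) + ys * np.sin(t)).astype(int) + diag
        np.add.at(acc[k], offs, 1)  # unbuffered accumulation of votes
    k, s = np.unravel_index(acc.argmax(), acc.shape)
    return int(k), int(s) - diag  # peak angle in degrees, signed offset
```

For a vertical line at column 5, the peak lands at angle 0 with offset 5; recognizing *constellations* of such spots (the paper's contribution) is what lets whole geometric figures be identified.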

  19. Nationally consistent and easily-implemented approach to evaluate littoral-riparian habitat quality in lakes and reservoirs

    EPA Science Inventory

    The National Lakes Assessment (NLA) and other lake survey and monitoring efforts increasingly rely upon biological assemblage data to define lake condition. Information concerning the multiple dimensions of physical and chemical habitat is necessary to interpret this biological ...

  20. Modification of codes NUALGAM and BREMRAD. Volume 3: Statistical considerations of the Monte Carlo method

    NASA Technical Reports Server (NTRS)

    Firstenberg, H.

    1971-01-01

    The statistics of the Monte Carlo method are considered relative to the interpretation of the NUGAM2 and NUGAM3 computer code results. A numerical experiment using the NUGAM2 code is presented, and the results are statistically interpreted.
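Statistically interpreting Monte Carlo output generally means reporting an estimate together with its standard error. As a minimal stand-alone sketch of that idea (not the NUGAM codes themselves), estimating pi from random points in the unit quarter-circle:

```python
import math
import random

def mc_pi(n, seed=1):
    """Monte Carlo estimate of pi with its standard error: 4 times the
    fraction of points falling inside the unit quarter-circle.  The
    standard error 4*sqrt(p*(1-p)/n) follows from the binomial
    variance of the hit count."""
    rng = random.Random(seed)
    hits = sum(rng.random() ** 2 + rng.random() ** 2 <= 1.0 for _ in range(n))
    p = hits / n
    return 4.0 * p, 4.0 * math.sqrt(p * (1.0 - p) / n)
```

The standard error shrinks like 1/sqrt(n), which is exactly the kind of convergence statement a statistical interpretation of Monte Carlo results rests on.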

  1. Geological interpretation of a Gemini photo

    USGS Publications Warehouse

    Hemphill, William R.; Danilchik, Walter

    1968-01-01

    Study of the Gemini V photograph of the Salt Range and Potwar Plateau, West Pakistan, indicates that small-scale orbital photographs permit recognition of the regional continuity of some geologic features, particularly faults and folds that could be easily overlooked on conventional air photographs of larger scale. Some stratigraphic relationships can also be recognized on the orbital photograph, but with only minimal previous geologic knowledge of the area, these interpretations are less conclusive or reliable than the interpretation of structure. It is suggested that improved atmospheric penetration could be achieved through the use of color infrared film. Photographic expression of topography could also be improved by deliberately photographing some areas during periods of low sun angle.

  2. Interpreting wireline measurements in coal beds

    SciTech Connect

    Johnston, D.J.

    1991-06-01

    When logging coal seams with wireline tools, the interpretation method needed to evaluate the coals is different from that used for conventional oil and gas reservoirs. Wireline logs identify coals easily. For an evaluation, the contribution of each coal component on the raw measurements must be considered. This paper will discuss how each log measurement is affected by each component. The components of a coal will be identified as the mineral matter, macerals, moisture content, rank, gas content, and cleat porosity. The measurements illustrated are from the resistivity, litho-density, neutron, sonic, dielectric, and geochemical tools. Once the coal component effects have been determined, an interpretation of the logs can be made. This paper will illustrate how to use these corrected logs in a coal evaluation.

  3. Highly concentrated synthesis of copper-zinc-tin-sulfide nanocrystals with easily decomposable capping molecules for printed photovoltaic applications.

    PubMed

    Kim, Youngwoo; Woo, Kyoohee; Kim, Inhyuk; Cho, Yong Soo; Jeong, Sunho; Moon, Jooho

    2013-11-01

    Among various candidate materials, Cu2ZnSnS4 (CZTS) is a promising earth-abundant semiconductor for low-cost thin film solar cells. We report a facile, less toxic, highly concentrated synthetic method utilizing the heretofore unrecognized, easily decomposable capping ligand of triphenylphosphate, where phase-pure, single-crystalline, and well-dispersed colloidal CZTS nanocrystals were obtained. The favorable influence of the easily decomposable capping ligand on the microstructural evolution of device-quality CZTS absorber layers was clarified based on a comparative study with commonly used oleylamine-capped CZTS nanoparticles. The resulting CZTS nanoparticles enabled us to produce a dense and crack-free absorbing layer through annealing under a N2 + H2S (4%) atmosphere, demonstrating a solar cell with an efficiency of 3.6% under AM 1.5 illumination. PMID:24057000

  4. [Effect of diets with easily assimilated carbohydrates on the lipid composition of tissues of young rats with alloxan diabetes].

    PubMed

    Pogorelova, T N; Dluzhevskaia, T S; Drukker, N A; Ostashevskaia, M I; Afanas'eva, N B

    1987-01-01

    The blood serum, liver, cerebral and pancreatic tissues of 250 rats aged 1 to 1.5 months were investigated. Alloxan diabetes caused profound changes in the lipid composition of various tissues, the most noticeable ones in the pancreas. Of all the easily assimilable carbohydrates used as admixtures to a common ration, fructose, followed by xylitol and sorbitol, produced the most unfavorable effect on lipid metabolism. The best results were obtained by adding glucose. PMID:3658954

  5. SOCR: Statistics Online Computational Resource

    PubMed Central

    Dinov, Ivo D.

    2011-01-01

    The need for hands-on computer laboratory experience in undergraduate and graduate statistics education has been firmly established in the past decade. As a result a number of attempts have been undertaken to develop novel approaches for problem-driven statistical thinking, data analysis and result interpretation. In this paper we describe an integrated educational web-based framework for: interactive distribution modeling, virtual online probability experimentation, statistical data analysis, visualization and integration. Following years of experience in statistical teaching at all college levels using established licensed statistical software packages, like STATA, S-PLUS, R, SPSS, SAS, Systat, etc., we have attempted to engineer a new statistics education environment, the Statistics Online Computational Resource (SOCR). This resource performs many of the standard types of statistical analysis, much like other classical tools. In addition, it is designed in a plug-in object-oriented architecture and is completely platform independent, web-based, interactive, extensible and secure. Over the past 4 years we have tested, fine-tuned and reanalyzed the SOCR framework in many of our undergraduate and graduate probability and statistics courses and have evidence that SOCR resources build student’s intuition and enhance their learning. PMID:21451741

  6. SOCR: Statistics Online Computational Resource.

    PubMed

    Dinov, Ivo D

    2006-10-01

    The need for hands-on computer laboratory experience in undergraduate and graduate statistics education has been firmly established in the past decade. As a result a number of attempts have been undertaken to develop novel approaches for problem-driven statistical thinking, data analysis and result interpretation. In this paper we describe an integrated educational web-based framework for: interactive distribution modeling, virtual online probability experimentation, statistical data analysis, visualization and integration. Following years of experience in statistical teaching at all college levels using established licensed statistical software packages, like STATA, S-PLUS, R, SPSS, SAS, Systat, etc., we have attempted to engineer a new statistics education environment, the Statistics Online Computational Resource (SOCR). This resource performs many of the standard types of statistical analysis, much like other classical tools. In addition, it is designed in a plug-in object-oriented architecture and is completely platform independent, web-based, interactive, extensible and secure. Over the past 4 years we have tested, fine-tuned and reanalyzed the SOCR framework in many of our undergraduate and graduate probability and statistics courses and have evidence that SOCR resources build student's intuition and enhance their learning. PMID:21451741

  7. Ta3N5-Pt nonwoven cloth with hierarchical nanopores as efficient and easily recyclable macroscale photocatalysts.

    PubMed

    Li, Shijie; Zhang, Lisha; Wang, Huanli; Chen, Zhigang; Hu, Junqing; Xu, Kaibing; Liu, Jianshe

    2014-01-01

    Traditional nanosized photocatalysts usually have high photocatalytic activity but can not be efficiently recycled. Film-shaped photocatalysts on the substrates can be easily recycled, but they have low surface area and/or high production cost. To solve these problems, we report on the design and preparation of efficient and easily recyclable macroscale photocatalysts with nanostructure by using Ta3N5 as a model semiconductor. Ta3N5-Pt nonwoven cloth has been prepared by an electrospinning-calcination-nitridation-wet impregnation method, and it is composed of Ta3N5 fibers with diameter of 150-200 nm and hierarchical pores. Furthermore, these fibers are constructed from Ta3N5 nanoparticles with diameter of ~25 nm which are decorated with Pt nanoparticles with diameter of ~2.5 nm. Importantly, Ta3N5-Pt cloth can be used as an efficient and easily recyclable macroscale photocatalyst with wide visible-light response, for the degradation of methylene blue and parachlorophenol, probably resulting in a very promising application as "photocatalyst dam" for the polluted river. PMID:24496147

  8. Ta3N5-Pt nonwoven cloth with hierarchical nanopores as efficient and easily recyclable macroscale photocatalysts

    NASA Astrophysics Data System (ADS)

    Li, Shijie; Zhang, Lisha; Wang, Huanli; Chen, Zhigang; Hu, Junqing; Xu, Kaibing; Liu, Jianshe

    2014-02-01

    Traditional nanosized photocatalysts usually have high photocatalytic activity but can not be efficiently recycled. Film-shaped photocatalysts on the substrates can be easily recycled, but they have low surface area and/or high production cost. To solve these problems, we report on the design and preparation of efficient and easily recyclable macroscale photocatalysts with nanostructure by using Ta3N5 as a model semiconductor. Ta3N5-Pt nonwoven cloth has been prepared by an electrospinning-calcination-nitridation-wet impregnation method, and it is composed of Ta3N5 fibers with diameter of 150-200 nm and hierarchical pores. Furthermore, these fibers are constructed from Ta3N5 nanoparticles with diameter of ~25 nm which are decorated with Pt nanoparticles with diameter of ~2.5 nm. Importantly, Ta3N5-Pt cloth can be used as an efficient and easily recyclable macroscale photocatalyst with wide visible-light response, for the degradation of methylene blue and parachlorophenol, probably resulting in a very promising application as ``photocatalyst dam'' for the polluted river.

  9. Ta3N5-Pt nonwoven cloth with hierarchical nanopores as efficient and easily recyclable macroscale photocatalysts

    PubMed Central

    Li, Shijie; Zhang, Lisha; Wang, Huanli; Chen, Zhigang; Hu, Junqing; Xu, Kaibing; Liu, Jianshe

    2014-01-01

    Traditional nanosized photocatalysts usually have high photocatalytic activity but can not be efficiently recycled. Film-shaped photocatalysts on the substrates can be easily recycled, but they have low surface area and/or high production cost. To solve these problems, we report on the design and preparation of efficient and easily recyclable macroscale photocatalysts with nanostructure by using Ta3N5 as a model semiconductor. Ta3N5-Pt nonwoven cloth has been prepared by an electrospinning-calcination-nitridation-wet impregnation method, and it is composed of Ta3N5 fibers with diameter of 150–200 nm and hierarchical pores. Furthermore, these fibers are constructed from Ta3N5 nanoparticles with diameter of ~25 nm which are decorated with Pt nanoparticles with diameter of ~2.5 nm. Importantly, Ta3N5-Pt cloth can be used as an efficient and easily recyclable macroscale photocatalyst with wide visible-light response, for the degradation of methylene blue and parachlorophenol, probably resulting in a very promising application as “photocatalyst dam” for the polluted river. PMID:24496147

  10. Hold My Calls: An Activity for Introducing the Statistical Process

    ERIC Educational Resources Information Center

    Abel, Todd; Poling, Lisa

    2015-01-01

    Working with practicing teachers, this article demonstrates, through the facilitation of a statistical activity, how to introduce and investigate the unique qualities of the statistical process including: formulate a question, collect data, analyze data, and interpret data.

  11. Copenhagen and Transactional Interpretations

    NASA Astrophysics Data System (ADS)

    Görnitz, Th.; von Weizsäcker, C. F.

    1988-02-01

    The Copenhagen interpretation (CI) never received an authoritative codification. It was a “minimum semantics” of quantum mechanics. We assume that it expresses a theory identical with the Transactional Interpretation (TI) when the observer is included into the system described by the theory. A theory consists of a mathematical structure with a physical semantics. Now, CI rests on an implicit description of the modes of time which is also presupposed by the Second Law of Thermodynamics. Essential is the futuric meaning of probability as a prediction of a relative frequency. CI can be shown to be fully consistent on this basis. The TI and CI can be translated into each other by a simple “dictionary.” The TI describes all events as CI describes past events; CI calls future events possibilities, which TI treats like facts. All predictions of both interpretations agree; we suppose the difference to be linguistic.

  12. The ADAMS interactive interpreter

    SciTech Connect

    Rietscha, E.R.

    1990-12-17

    The ADAMS (Advanced DAta Management System) project is exploring next generation database technology. Database management does not follow the usual programming paradigm. Instead, the database dictionary provides an additional name space environment that should be interactively created and tested before writing application code. This document describes the implementation and operation of the ADAMS Interpreter, an interactive interface to the ADAMS data dictionary and runtime system. The Interpreter executes individual statements of the ADAMS Interface Language, providing a fast, interactive mechanism to define and access persistent databases. 5 refs.

  13. Extended example of microcomputer-aided interpretation

    SciTech Connect

    Powell, J.A.

    1984-04-01

    Computers are useful to the explorationist not only because they can easily perform computations too lengthy to be performed routinely by hand (e.g., forming a synthetic seismic section for a moderately complex earth model), but also because they can reduce the tedium, time, and error rate of tasks the interpreter now performs (e.g., interpolating and rescaling displays). This paper represents a single example of computer-aided interpretation that illustrates the versatility, speed, and operational simplicity of a stand-alone microcomputer work station. Seiscom Delta's Microseis system performs various tasks required to go from a seismic time section to a final depth map. The system permits picking the section at irregular intervals; thus the user can space data points widely in uninteresting areas, and closely in complex areas. Comments, for example, noting surface features or subtle changes in the character of data, can be entered along with numeric data. The data need not be seismic; the system works equally well with well depths, geochemical data, and radiometric readings. Line typing, isovalue generation, and conversion routines can be used for both seismic and nonseismic data. Data can be entered from displays having various scales and can be easily corrected or supplemented. The user has a wide variety of display options in data selection and format, display scales, and presence of supplemental information.

  14. Interpreting the Constitution.

    ERIC Educational Resources Information Center

    Brennan, William J., Jr.

    1987-01-01

    Discusses constitutional interpretations relating to capital punishment and protection of human dignity. Points out the document's effectiveness in creating a new society by adapting its principles to current problems and needs. Considers two views of the Constitution that lead to controversy over the legitimacy of judicial decisions. (PS)

  15. Deafness and Interpreting.

    ERIC Educational Resources Information Center

    New Jersey State Dept. of Labor, Trenton. Div. of the Deaf.

    This paper explains how the hearing loss of deaf persons affects communication, describes methods deaf individuals use to communicate, and addresses the role of interpreters in the communication process. The volume covers: communication methods such as speechreading or lipreading, written notes, gestures, or sign language (American Sign Language,…

  16. Abstract Interpreters for Free

    NASA Astrophysics Data System (ADS)

    Might, Matthew

    In small-step abstract interpretations, the concrete and abstract semantics bear an uncanny resemblance. In this work, we present an analysis-design methodology that both explains and exploits that resemblance. Specifically, we present a two-step method to convert a small-step concrete semantics into a family of sound, computable abstract interpretations. The first step re-factors the concrete state-space to eliminate recursive structure; this refactoring of the state-space simultaneously determines a store-passing-style transformation on the underlying concrete semantics. The second step uses inference rules to generate an abstract state-space and a Galois connection simultaneously. The Galois connection allows the calculation of the "optimal" abstract interpretation. The two-step process is unambiguous, but nondeterministic: at each step, analysis designers face choices. Some of these choices ultimately influence properties such as flow-, field- and context-sensitivity. Thus, under the method, we can give the emergence of these properties a graph-theoretic characterization. To illustrate the method, we systematically abstract the continuation-passing style lambda calculus to arrive at two distinct families of analyses. The first is the well-known k-CFA family of analyses. The second consists of novel "environment-centric" abstract interpretations, none of which appear in the literature on static analysis of higher-order programs.
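To make the notion of a sound abstract interpretation concrete for readers new to the area, here is the classic textbook sign-domain example — a generic illustration only, not the CPS/k-CFA construction described in the abstract, and all names are ours:

```python
# Abstract interpretation over the sign domain: concrete integers are
# abstracted to 'neg' / 'zero' / 'pos', and multiplication is
# over-approximated soundly (here, exactly) on the abstract side.

def alpha(n):
    """Abstraction function: map a concrete integer to its sign."""
    return 'zero' if n == 0 else ('pos' if n > 0 else 'neg')

def abs_mul(a, b):
    """Abstract multiplication, sound with respect to alpha:
    abs_mul(alpha(x), alpha(y)) == alpha(x * y) for all integers."""
    if 'zero' in (a, b):
        return 'zero'
    return 'pos' if a == b else 'neg'
```

The soundness equation in the docstring is the degenerate (exact) case of the Galois-connection condition that the paper's method generates systematically for much richer state-spaces.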

  17. The interpretation of fuzziness.

    PubMed

    Wang, P

    1996-01-01

    By analyzing related issues in psychology and linguistics, two basic types of fuzziness can be attributed to similarity and relativity, respectively. In both cases, it is possible to interpret grade of membership as the proportion of positive evidence, so as to treat fuzziness and randomness uniformly. PMID:18263034
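The abstract's interpretation of grade of membership as the proportion of positive evidence can be written down directly. A minimal sketch, with our own variable names and our own convention of 0.5 for the no-evidence case:

```python
def membership(positive, negative):
    """Grade of membership as the proportion of positive evidence:
    w+ / (w+ + w-).  Returning 0.5 when there is no evidence at all
    (maximal uncertainty) is our assumption, not Wang's."""
    total = positive + negative
    return positive / total if total else 0.5
```

Because the grade is a proportion of observed evidence, fuzziness and randomness can be handled by one uniform mechanism, which is the paper's point.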

  18. Tokens: Facts and Interpretation.

    ERIC Educational Resources Information Center

    Schmandt-Besserat, Denise

    1986-01-01

    Summarizes some of the major pieces of evidence concerning the archeological clay tokens, specifically the technique for their manufacture, their geographic distribution, chronology, and the context in which they are found. Discusses the interpretation of tokens as the first example of visible language, particularly as an antecedent of Sumerian…

  19. Interpretations of Literacy

    ERIC Educational Resources Information Center

    Layton, Lyn; Miller, Carol

    2004-01-01

    The National Literacy Strategy (NLS) was introduced into schools in England in 1998 with the aim of raising the literacy attainments of primary-aged children. The Framework for Teaching the Literacy Hour, a key component of the NLS, proposes an interpretation of literacy that emphasises reading, writing and spelling skills. An investigation of the…

  20. Fractal interpretation of intermittency

    SciTech Connect

    Hwa, R.C.

    1991-12-01

    Implication of intermittency in high-energy collisions is first discussed. Then follows a description of the fractal interpretation of intermittency. A basic quantity with asymptotic fractal behavior is introduced. It is then shown how the factorial moments and the G moments can be expressed in terms of it. The relationship between the intermittency indices and the fractal indices is made explicit.

  1. Psychosemantics and Simultaneous Interpretation.

    ERIC Educational Resources Information Center

    Le Ny, Jean-Francois

    A comprehension model of simultaneous interpretation activity raises three types of problems: structure of semantic information stored in long-term memory, modalities of input processing and specific restrictions due to situation. A useful concept of semantic mnesic structures includes: (1) a componential-predicative lexicon; (2) a propositional…

  2. Interpreting & Biomechanics. PEPNet Tipsheet

    ERIC Educational Resources Information Center

    PEPNet-Northeast, 2001

    2001-01-01

    Cumulative trauma disorder (CTD) refers to a collection of disorders associated with nerves, muscles, tendons, bones, and the neurovascular (nerves and related blood vessels) system. CTD symptoms may involve the neck, back, shoulders, arms, wrists, or hands. Interpreters with CTD may experience a variety of symptoms including: pain, joint…

  3. INCREASING SCIENTIFIC POWER WITH STATISTICAL POWER

    EPA Science Inventory

    A brief survey of basic ideas in statistical power analysis demonstrates the advantages and ease of using power analysis throughout the design, analysis, and interpretation of research. he power of a statistical test is the probability of rejecting the null hypothesis of the test...
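The definition in the abstract — power as the probability of rejecting the null hypothesis — has a closed form for the simplest case, the one-sided one-sample z-test. A minimal sketch using only the standard library (the textbook formula, not the survey's own examples):

```python
import math

def normal_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def z_test_power(effect_size, n):
    """Power of a one-sided one-sample z-test at alpha = 0.05:
    probability of rejecting H0 when the true standardized effect
    is effect_size, i.e. Phi(d * sqrt(n) - z_crit)."""
    z_crit = 1.6448536269514722  # Phi^{-1}(0.95)
    return normal_cdf(effect_size * math.sqrt(n) - z_crit)
```

For a medium effect (d = 0.5) and n = 25 the power is about 0.80, the conventional design target; running such a calculation before collecting data is exactly the design-stage use of power analysis the abstract advocates.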

  4. Canadian Population Statistics: A Conversion Guide.

    ERIC Educational Resources Information Center

    Fuller, Michael; McLean, Harvard

    1984-01-01

    Interpreting statistical data from the past is difficult. Because of dissimilarities between past and present, numbers are often difficult to compare. Discusses how teachers can use ratios to convert population statistics to equivalent values so that a comparison can be made. (RM)
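The ratio conversion the abstract describes is simple enough to state as a one-line function. A sketch under our own naming and example figures (hypothetical, not from the guide):

```python
def per_capita_rate(count, population, per=1000):
    """Convert a raw count into a rate per `per` people, so that
    statistics from periods with very different population sizes
    become directly comparable."""
    return count * per / population
```

For example, 50 events in a town of 10,000 and 500 events in a city of 100,000 are the same rate (5 per 1,000), even though the raw counts differ tenfold.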

  5. Synthesis, characterization and application of water-soluble and easily removable cationic pressure-sensitive adhesives. Quarterly technical report

    SciTech Connect

    1999-09-30

    The Institute studied the adsorption of cationic pressure-sensitive adhesive (PSA) on wood fiber, and the buildup of PSA in a closed water system during paper recycling; the results are presented. Georgia Tech worked to develop an environmentally friendly polymerization process to synthesize a novel re-dispersible PSA by co-polymerizing an oil-soluble monomer (butyl acrylate) and a cationic monomer MAEPTAC; results are presented. At the University of Georgia at Athens the project focused on the synthesis of water-soluble and easily removable cationic polymer PSAs.

  6. Highly concentrated synthesis of copper-zinc-tin-sulfide nanocrystals with easily decomposable capping molecules for printed photovoltaic applications

    NASA Astrophysics Data System (ADS)

    Kim, Youngwoo; Woo, Kyoohee; Kim, Inhyuk; Cho, Yong Soo; Jeong, Sunho; Moon, Jooho

    2013-10-01

    Among various candidate materials, Cu2ZnSnS4 (CZTS) is a promising earth-abundant semiconductor for low-cost thin film solar cells. We report a facile, less toxic, highly concentrated synthetic method utilizing the heretofore unrecognized, easily decomposable capping ligand of triphenylphosphate, where phase-pure, single-crystalline, and well-dispersed colloidal CZTS nanocrystals were obtained. The favorable influence of the easily decomposable capping ligand on the microstructural evolution of device-quality CZTS absorber layers was clarified based on a comparative study with commonly used oleylamine-capped CZTS nanoparticles. The resulting CZTS nanoparticles enabled us to produce a dense and crack-free absorbing layer through annealing under a N2 + H2S (4%) atmosphere, demonstrating a solar cell with an efficiency of 3.6% under AM 1.5 illumination. 
Electronic supplementary information (ESI) available: Experimental methods for CZTS nanocrystal synthesis, device fabrication, and characterization; the size distribution and energy dispersive X-ray (EDX) spectra of the synthesized CZTS nanoparticles; UV-vis spectra of the CZTS films; isothermal analysis of triphenylphosphate (TPP) and oleylamine (OLA); microstructural SEM images of annealed CZTS nanocrystal films. See DOI: 10.1039/c3nr03104g

  7. Statistical ecology comes of age.

    PubMed

    Gimenez, Olivier; Buckland, Stephen T; Morgan, Byron J T; Bez, Nicolas; Bertrand, Sophie; Choquet, Rémi; Dray, Stéphane; Etienne, Marie-Pierre; Fewster, Rachel; Gosselin, Frédéric; Mérigot, Bastien; Monestiez, Pascal; Morales, Juan M; Mortier, Frédéric; Munoz, François; Ovaskainen, Otso; Pavoine, Sandrine; Pradel, Roger; Schurr, Frank M; Thomas, Len; Thuiller, Wilfried; Trenkel, Verena; de Valpine, Perry; Rexstad, Eric

    2014-12-01

    The desire to predict the consequences of global environmental change has been the driver towards more realistic models embracing the variability and uncertainties inherent in ecology. Statistical ecology has gelled over the past decade as a discipline that moves away from describing patterns towards modelling the ecological processes that generate these patterns. Following the fourth International Statistical Ecology Conference (1-4 July 2014) in Montpellier, France, we analyse current trends in statistical ecology. Important advances in the analysis of individual movement, and in the modelling of population dynamics and species distributions, are made possible by the increasing use of hierarchical and hidden process models. Exciting research perspectives include the development of methods to interpret citizen science data and of efficient, flexible computational algorithms for model fitting. Statistical ecology has come of age: it now provides a general and mathematically rigorous framework linking ecological theory and empirical data. PMID:25540151

  8. Statistical ecology comes of age

    PubMed Central

    Gimenez, Olivier; Buckland, Stephen T.; Morgan, Byron J. T.; Bez, Nicolas; Bertrand, Sophie; Choquet, Rémi; Dray, Stéphane; Etienne, Marie-Pierre; Fewster, Rachel; Gosselin, Frédéric; Mérigot, Bastien; Monestiez, Pascal; Morales, Juan M.; Mortier, Frédéric; Munoz, François; Ovaskainen, Otso; Pavoine, Sandrine; Pradel, Roger; Schurr, Frank M.; Thomas, Len; Thuiller, Wilfried; Trenkel, Verena; de Valpine, Perry; Rexstad, Eric

    2014-01-01

    The desire to predict the consequences of global environmental change has been the driver towards more realistic models embracing the variability and uncertainties inherent in ecology. Statistical ecology has gelled over the past decade as a discipline that moves away from describing patterns towards modelling the ecological processes that generate these patterns. Following the fourth International Statistical Ecology Conference (1–4 July 2014) in Montpellier, France, we analyse current trends in statistical ecology. Important advances in the analysis of individual movement, and in the modelling of population dynamics and species distributions, are made possible by the increasing use of hierarchical and hidden process models. Exciting research perspectives include the development of methods to interpret citizen science data and of efficient, flexible computational algorithms for model fitting. Statistical ecology has come of age: it now provides a general and mathematically rigorous framework linking ecological theory and empirical data. PMID:25540151

  9. Linking numbers, spin, and statistics of solitons

    NASA Technical Reports Server (NTRS)

    Wilczek, F.; Zee, A.

    1983-01-01

    The spin and statistics of solitons in the (2 + 1)- and (3 + 1)-dimensional nonlinear sigma models is considered. For the (2 + 1)-dimensional case, there is the possibility of fractional spin and exotic statistics; for 3 + 1 dimensions, the usual spin-statistics relation is demonstrated. The linking-number interpretation of the Hopf invariant and the use of suspension considerably simplify the analysis.

  10. A new interpretation of the σA parameter.

    PubMed

    Carrozzini, B; Cascarano, G L; Giacovazzo, C; Mazzone, A

    2013-07-01

    A new study of the σA parameter has been undertaken to understand its behaviour when the diffraction amplitude distributions are far from the standard Wilson distributions. The study has led to the formulation of a new statistical interpretation of σA, expressed in terms of a correlation factor. The new formulas allow a more accurate use of σA in electron-density modification procedures. PMID:23778097

  11. Semantic interpretation of nominalizations

    SciTech Connect

    Hull, R.D.; Gomez, F.

    1996-12-31

    A computational approach to the semantic interpretation of nominalizations is described. Interpretation of nominalizations involves three tasks: deciding whether the nominalization is being used in a verbal or non-verbal sense; disambiguating the nominalized verb when a verbal sense is used; and determining the fillers of the thematic roles of the verbal concept or predicate of the nominalization. A verbal sense can be recognized by the presence of modifiers that represent the arguments of the verbal concept. It is these same modifiers which provide the semantic clues to disambiguate the nominalized verb. In the absence of explicit modifiers, heuristics are used to discriminate between verbal and non-verbal senses. A correspondence between verbs and their nominalizations is exploited so that only a small amount of additional knowledge is needed to handle the nominal form. These methods are tested in the domain of encyclopedic texts and the results are shown.
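
    The verbal/non-verbal decision described above can be caricatured in a few lines. This is only a toy sketch, not the authors' system: the `ARG_CUES` lexicon and the `sense` function are hypothetical stand-ins for the paper's knowledge base and heuristics.

```python
# Toy lexicon: nominalized verb -> modifiers that mark its arguments.
# Both the lexicon and the function are hypothetical illustrations.
ARG_CUES = {"destruction": {"of", "by"}, "interpretation": {"of"}}

def sense(nominalization, modifiers):
    """Read a nominalization as verbal when a modifier supplies one of
    the underlying verb's arguments; otherwise fall back to non-verbal."""
    cues = ARG_CUES.get(nominalization, set())
    return "verbal" if any(m in cues for m in modifiers) else "non-verbal"

s1 = sense("destruction", ["of", "the", "city"])  # argument-marking "of"
s2 = sense("destruction", ["wanton"])             # bare adjective only
```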

  12. Development and validation of a quick easily used biochemical assay for evaluating the viability of small immobile arthropods.

    PubMed

    Phillips, Craig B; Iline, Ilia I; Richards, Nicola K; Novoselov, Max; McNeill, Mark R

    2013-10-01

    Quickly, accurately, and easily assessing the efficacy of treatments to control sessile arthropods (e.g., scale insects) and stationary immature life stages (e.g., eggs and pupae) is problematic because it is difficult to tell whether treated organisms are alive or dead. Current approaches usually involve either maintaining organisms in the laboratory to observe them for development, gauging their response to physical stimulation, or assessing morphological characters such as turgidity and color. These can be slow, technically difficult, or subjective, and the validity of methods other than laboratory rearing has seldom been tested. Here, we describe development and validation of a quick easily used biochemical colorimetric assay for measuring the viability of arthropods that is sufficiently sensitive to test even very small organisms such as white fly eggs. The assay was adapted from a technique for staining the enzyme hexokinase to signal the presence of adenosine triphosphate in viable specimens by reducing a tetrazolium salt to formazan. Basic laboratory facilities and skills are required for production of the stain, but no specialist equipment, expertise, or facilities are needed for its use. PMID:24224241

  13. Hemoglobin levels and circulating blasts are two easily evaluable diagnostic parameters highly predictive of leukemic transformation in primary myelofibrosis.

    PubMed

    Rago, Angela; Latagliata, Roberto; Montanaro, Marco; Montefusco, Enrico; Andriani, Alessandro; Crescenzi, Sabrina Leonetti; Mecarocci, Sergio; Spirito, Francesca; Spadea, Antonio; Recine, Umberto; Cicconi, Laura; Avvisati, Giuseppe; Cedrone, Michele; Breccia, Massimo; Porrini, Raffaele; Villivà, Nicoletta; De Gregoris, Cinzia; Alimena, Giuliana; D'Arcangelo, Enzo; Guglielmelli, Paola; Lo-Coco, Francesco; Vannucchi, Alessandro; Cimino, Giuseppe

    2015-03-01

    To predict leukemic transformation (LT), we evaluated easily detectable diagnostic parameters in 338 patients with primary myelofibrosis (PMF) followed in the Latium region (Italy) between 1981 and 2010. Forty patients (11.8%) progressed to leukemia, with a resulting 10-year leukemia-free survival (LFS) rate of 72%. Hb (<10 g/dL) and circulating blasts (≥1%) were the only two independent prognostic factors for LT in the multivariate analysis. Two hundred fifty patients with both parameters available were grouped as follows: low risk (no or one factor), 216 patients; high risk (both factors), 31 patients. The median LFS times were 269 and 45 months for the low- and high-risk groups, respectively (P<.0001). The LT predictive power of these two parameters was confirmed in an external series of 270 PMF patients from Tuscany, in whom the median LFS was not reached and was 61 months for the low- and high-risk groups, respectively (P<.0001). These results establish anemia and circulating blasts, two easily and universally available parameters, as strong predictors of LT in PMF, and may help to improve the prognostic stratification of these patients, particularly in countries with low resources where more sophisticated molecular testing is unavailable. PMID:25636356
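
    The two-factor stratification reported above is simple enough to sketch directly. A minimal illustration assuming the published cut-offs (Hb < 10 g/dL, circulating blasts ≥ 1%); the function name and example values are hypothetical.

```python
def lt_risk_group(hb_g_dl, blast_pct):
    """Group a PMF patient by the two diagnostic factors reported to
    predict leukemic transformation: Hb < 10 g/dL and blasts >= 1%.
    High risk requires BOTH adverse factors."""
    factors = int(hb_g_dl < 10.0) + int(blast_pct >= 1.0)
    return "high" if factors == 2 else "low"

groups = [lt_risk_group(hb, bl)
          for hb, bl in [(12.1, 0.0), (9.2, 0.5), (8.7, 2.0)]]
```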

  14. Interpreting uncertainty terms.

    PubMed

    Holtgraves, Thomas

    2014-08-01

    Uncertainty terms (e.g., some, possible, good, etc.) are words that do not have a fixed referent and hence are relatively ambiguous. A model is proposed that specifies how, from the hearer's perspective, recognition of facework as a potential motive for the use of an uncertainty term results in a calibration of the intended meaning of that term. Four experiments are reported that examine the impact of face threat, and the variables that affect it (e.g., power), on the manner in which a variety of uncertainty terms (probability terms, quantifiers, frequency terms, etc.) are interpreted. Overall, the results demonstrate that increased face threat in a situation will result in a more negative interpretation of an utterance containing an uncertainty term. That the interpretation of so many different types of uncertainty terms is affected in the same way suggests the operation of a fundamental principle of language use, one with important implications for the communication of risk, subjective experience, and so on. PMID:25090127

  15. Tips for Mental Health Interpretation

    ERIC Educational Resources Information Center

    Whitsett, Margaret

    2008-01-01

    This paper offers tips for working with interpreters in mental health settings. These tips include: (1) Using trained interpreters, not bilingual staff or community members; (2) Explaining "interpreting procedures" to the providers and clients; (3) Addressing the stigma associated with mental health that may influence interpreters; (4) Defining…

  16. Data Interpretation: Using Probability

    ERIC Educational Resources Information Center

    Drummond, Gordon B.; Vowler, Sarah L.

    2011-01-01

    Experimental data are analysed statistically to allow researchers to draw conclusions from a limited set of measurements. The hard fact is that researchers can never be certain that measurements from a sample will exactly reflect the properties of the entire group of possible candidates available to be studied (although using a sample is often the…

  17. FIDEA: a server for the functional interpretation of differential expression analysis

    PubMed Central

    D’Andrea, Daniel; Grassi, Luigi; Mazzapioda, Mariagiovanna; Tramontano, Anna

    2013-01-01

    The results of differential expression analyses provide scientists with hundreds to thousands of differentially expressed genes that need to be interpreted in light of the biology of the specific system under study. This requires mapping the genes to functional classifications that can be, for example, the KEGG pathways or InterPro families they belong to, their GO Molecular Function, Biological Process or Cellular Component. A statistically significant overrepresentation of one or more category terms in the set of differentially expressed genes is an essential step for the interpretation of the biological significance of the results. Ideally, the analysis should be performed by scientists who are well acquainted with the biological problem, as they have a wealth of knowledge about the system and can, more easily than a bioinformatician, discover less obvious and, therefore, more interesting relationships. To allow experimentalists to explore their data in an easy and at the same time exhaustive fashion within a single tool and to test their hypothesis quickly and effortlessly, we developed FIDEA. The FIDEA server is located at http://www.biocomputing.it/fidea; it is free and open to all users, and there is no login requirement. PMID:23754850
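
    The overrepresentation test at the heart of such servers is typically a one-sided hypergeometric (Fisher) test on each category. A minimal stdlib sketch, not FIDEA's actual implementation; the gene counts are invented for illustration.

```python
from math import comb

def enrichment_p(k, n, K, N):
    """One-sided hypergeometric p-value: probability of seeing at least
    k genes from a functional category of size K when drawing n
    differentially expressed genes from a universe of N genes."""
    total = comb(N, n)
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(n, K) + 1)) / total

# e.g. 5 of 20 DE genes land in a 40-gene pathway out of 1000 genes
p = enrichment_p(5, 20, 40, 1000)
```

    In practice such a p-value would also be corrected for testing many categories at once (e.g. Benjamini-Hochberg).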

  18. How to use and interpret hormone ratios.

    PubMed

    Sollberger, Silja; Ehlert, Ulrike

    2016-01-01

    Hormone ratios have become increasingly popular throughout the neuroendocrine literature since they offer a straightforward way to simultaneously analyze the effects of two interdependent hormones. However, the analysis of ratios is associated with statistical and interpretational concerns which have not been sufficiently considered in the context of endocrine research. The aim of this article, therefore, is to demonstrate and discuss these issues, and to suggest suitable ways to address them. In a first step, we use exemplary testosterone and cortisol data to illustrate that one major concern of ratios lies in their distribution and inherent asymmetry. As a consequence, results of parametric statistical analyses are affected by the ultimately arbitrary decision of which way around the ratio is computed (i.e., A/B or B/A). We suggest the use of non-parametric methods as well as the log-transformation of hormone ratios as appropriate methods to deal with these statistical problems. However, in a second step, we also discuss the complicated interpretation of ratios, and propose moderation analysis as an alternative and oftentimes more insightful approach to ratio analysis. In conclusion, we suggest that researchers carefully consider which statistical approach is best suited to investigate reciprocal hormone effects. With regard to the hormone ratio method, further research is needed to specify what exactly this index reflects on the biological level and in which cases it is a meaningful variable to analyze. PMID:26521052
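
    The distributional asymmetry and the log-ratio remedy discussed above can be demonstrated in a few lines. The hormone values below are invented for illustration; the point is that log(A/B) = -log(B/A), so after log-transformation the arbitrary orientation of the ratio no longer changes the analysis.

```python
import math

# Invented testosterone/cortisol pairs (arbitrary units)
t = [4.2, 5.1, 3.8]
c = [12.0, 9.5, 14.2]

ratio_tc = [a / b for a, b in zip(t, c)]   # T/C
ratio_ct = [b / a for a, b in zip(t, c)]   # C/T

# Raw ratios are asymmetric: mean(T/C) is not the reciprocal of
# mean(C/T), so results depend on which way around the ratio is taken.
mean_tc = sum(ratio_tc) / len(ratio_tc)
mean_ct = sum(ratio_ct) / len(ratio_ct)

# Log-transformed ratios differ only in sign between the two
# orientations, removing that arbitrariness.
log_tc = [math.log(r) for r in ratio_tc]
log_ct = [math.log(r) for r in ratio_ct]
```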

  19. Interpreting TCLP results -- A simplified approach

    SciTech Connect

    Rigo, H.G.

    1996-12-31

    On May 2, 1994, the Supreme Court decided in the City of Chicago v. EDF that residues generated at municipal waste combustors [MWC] were not exempt from the requirements of RCRA Section 3001(i). The ruling effectively states that, contrary to previous EPA guidance and determinations, combustion residues are only exempt from regulation if they do not exhibit hazardous characteristics. Correctly concluding that a residue is RCRA nonhazardous requires a finding that the 90% upper confidence limit [UCL] for the average TCLP contaminant concentration is less than the regulatory threshold. A correct characterization requires proper analysis of representative samples and statistically valid TCLP data interpretation. Statistically valid TCLP data interpretation is very tedious when the data are not normally distributed, as assumed in EPA's draft guidance for Sampling and Analysis of Municipal Refuse Incinerator Ash (1994) but warned against in SW-846, Test Methods for Evaluating Solid Waste (1986). Tremendous statistical simplification can be achieved if the rank-ordered result which bounds the UCL is used to make an initial determination. If this value is less than the regulatory threshold, the residue is non-hazardous. If it is larger, then a representative distribution must be found and the inherently tighter UCL calculated using distributional assumptions, in accordance with good statistical practice and the SW-846 methods referenced in 40 CFR 261.
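
    One reading of the rank-order screen described above can be sketched as follows. This is an illustrative interpretation, not the author's exact procedure; the sample concentrations are synthetic, and 5.0 mg/L is the TCLP regulatory level for lead under 40 CFR 261.24.

```python
import math

def rank_screen(results, threshold, conf=0.90):
    """Initial determination: compare the rank-ordered result bounding
    the UCL to the regulatory threshold. If even this distribution-free
    bound is below the threshold, the residue can be declared
    non-hazardous without fitting any distribution."""
    x = sorted(results)
    k = math.ceil(conf * len(x)) - 1   # 0-based rank bounding the UCL
    return x[k] < threshold

# Synthetic TCLP lead concentrations (mg/L)
lead = [0.4, 1.1, 0.7, 2.0, 0.9, 1.6, 0.5, 3.2, 0.8, 1.2]
ok = rank_screen(lead, threshold=5.0)
```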

  20. A comprehensive risk assessment for tephra accumulation using easily accessible data: the example of Cotopaxi volcano (Ecuador)

    NASA Astrophysics Data System (ADS)

    Biass, Sébastien; Frischknecht, Corine; Dell'Oro, Luca; Senegas, Olivier; Bonadonna, Costanza

    2010-05-01

    In order to answer the needs of contingency planning, we present a GIS-based method for the risk assessment of tephra deposits that is flexible enough to work with datasets of variable precision and resolution, depending on data availability. Due to the constant increase of population density around volcanoes and the wide dispersal of tephra from volcanic plumes, a wide range of threats such as roof collapse, destruction of crops, blockage of vital lifelines and health problems concern even remote communities. In the field of disaster management, there is general agreement that a global but incomplete method, subject to revision and improvement, is better than no information at all. In this framework, our method provides fast, first-order insights into possible eruptive scenarios and their potential consequences for surrounding populations from only a few available data, and can easily be refined later. Knowledge of both the expected hazard (frequency and magnitude) and the vulnerability of the elements at risk is required by planners in order to produce efficient emergency planning prior to a crisis. Cotopaxi volcano, one of Ecuador's most active volcanoes, was used to develop and test this method; it is located 60 km south of Quito and threatens a highly populated valley. Based on field data, historical reports and the Smithsonian catalogue, our hazard assessment was carried out using the numerical model TEPHRA2. We first applied a deterministic approach that evolved towards a fully probabilistic method in order to account for the most likely eruptive scenarios as well as the variability of atmospheric conditions. In parallel, we carried out a vulnerability assessment of the physical (crops and roofs), social (population) and systemic elements at risk, using mainly free and easily accessible data. 
Both hazard and vulnerability assessments were compiled with GIS tools to draw comprehensive and tangible thematic risk maps, thus providing the first necessary step for efficient preparedness planning.

  1. LACIE analyst interpretation keys

    NASA Technical Reports Server (NTRS)

    Baron, J. G.; Payne, R. W.; Palmer, W. F. (principal investigators)

    1979-01-01

    Two interpretation aids, 'The Image Analysis Guide for Wheat/Small Grains Inventories' and 'The United States and Canadian Great Plains Regional Keys', were developed during LACIE phase 2 and implemented during phase 3 in order to provide analysts with a better understanding of the expected ranges in color variation of signatures for individual biostages and of the temporal sequences of LANDSAT signatures. The keys were tested using operational LACIE data, and the results demonstrate that their use provides improved labeling accuracy in all analyst experience groupings, in all geographic areas within the U.S. Great Plains, and during all periods of crop development.

  2. Cosmetic Plastic Surgery Statistics

    MedlinePLUS

    2014 Cosmetic Plastic Surgery Statistics Cosmetic Procedure Trends 2014 Plastic Surgery Statistics Report Please credit the AMERICAN SOCIETY OF PLASTIC SURGEONS when citing statistical data or using ...

  3. Analyzing spike trains with circular statistics

    NASA Astrophysics Data System (ADS)

    Takeshita, Daisuke; Gale, John T.; Montgomery, Erwin B.; Bahar, Sonya; Moss, Frank

    2009-05-01

    In neuroscience, specifically electrophysiology, it is common to replace a measured sequence of action potentials or spike trains with delta functions prior to analysis. We apply a method called circular statistics to a time series of delta functions and show that the method is equivalent to the power spectrum. This technique allows us to easily visualize the idea of the power spectrum of spike trains and easily reveals oscillatory and stochastic behavior. We provide several illustrations of the method and an example suitable for students, and suggest that the method might be useful for courses in introductory biophysics and neuroscience.
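
    The claimed equivalence is easy to verify numerically: the circular resultant length (vector strength) of the spike phases at frequency f and the power of the delta-function train at f differ only by a normalization of n squared. A small sketch with invented spike times phase-locked near 20 Hz; the function names are illustrative, not from the paper.

```python
import cmath

def vector_strength(spikes, f):
    """Circular statistic: resultant length of the spike phases at
    frequency f (1 = perfectly phase-locked, 0 = uniform phases)."""
    s = sum(cmath.exp(2j * cmath.pi * f * t) for t in spikes)
    return abs(s) / len(spikes)

def delta_train_power(spikes, f):
    """Power of the delta-function spike train at frequency f."""
    s = sum(cmath.exp(-2j * cmath.pi * f * t) for t in spikes)
    return abs(s) ** 2

spikes = [0.001, 0.052, 0.103, 0.148, 0.201, 0.252]  # s, roughly 20 Hz
n = len(spikes)
r20 = vector_strength(spikes, 20.0)
# identical up to the normalization n**2
same = abs(delta_train_power(spikes, 20.0) - (n * r20) ** 2) < 1e-9
```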

  4. Statistical Modeling of SAR Images: A Survey

    PubMed Central

    Gao, Gui

    2010-01-01

    Statistical modeling is essential to SAR (Synthetic Aperture Radar) image interpretation. It aims to describe SAR images through statistical methods and reveal the characteristics of these images. Moreover, statistical modeling can provide technical support for a comprehensive understanding of terrain scattering mechanisms, which helps to develop algorithms for effective image interpretation and credible image simulation. Numerous statistical models have been developed to describe SAR image data, and the purpose of this paper is to categorize and evaluate these models. We first summarize the development history and the current state of research in statistical modeling; the different SAR image models developed from the product model are then discussed in detail. Relevant issues are also addressed. Finally, several promising directions for future research are identified. PMID:22315568
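
    The product model mentioned above represents an observed intensity as texture times speckle; when both factors are gamma-distributed, the product follows the classic K distribution. A minimal stdlib sampling sketch; the shape parameters and sample size are invented for illustration.

```python
import random

random.seed(0)

def k_distributed_intensity(n, nu=4.0, looks=1):
    """Sample intensities under the multiplicative (product) model:
    intensity = texture * speckle, with gamma texture (shape nu) and
    gamma speckle (shape = number of looks), each normalized to unit
    mean; the product is then K-distributed."""
    out = []
    for _ in range(n):
        texture = random.gammavariate(nu, 1.0 / nu)        # mean 1
        speckle = random.gammavariate(looks, 1.0 / looks)  # mean 1
        out.append(texture * speckle)
    return out

samples = k_distributed_intensity(20000)
mean_i = sum(samples) / len(samples)   # should sit near 1
```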

  5. Physical interpretation of antigravity

    NASA Astrophysics Data System (ADS)

    Bars, Itzhak; James, Albin

    2016-02-01

    Geodesic incompleteness is a problem in both general relativity and string theory. The Weyl-invariant Standard Model coupled to general relativity (SM +GR ), and a similar treatment of string theory, are improved theories that are geodesically complete. A notable prediction of this approach is that there must be antigravity regions of spacetime connected to gravity regions through gravitational singularities such as those that occur in black holes and cosmological bang/crunch. Antigravity regions introduce apparent problems of ghosts that raise several questions of physical interpretation. It was shown that unitarity is not violated, but there may be an instability associated with negative kinetic energies in the antigravity regions. In this paper we show that the apparent problems can be resolved with the interpretation of the theory from the perspective of observers strictly in the gravity region. Such observers cannot experience the negative kinetic energy in antigravity directly, but can only detect in and out signals that interact with the antigravity region. This is no different from a spacetime black box for which the information about its interior is encoded in scattering amplitudes for in/out states at its exterior. Through examples we show that negative kinetic energy in antigravity presents no problems of principles but is an interesting topic for physical investigations of fundamental significance.

  6. Structural interpretation of seismic data and inherent uncertainties

    NASA Astrophysics Data System (ADS)

    Bond, Clare

    2013-04-01

    Geoscience is perhaps unique in its reliance on incomplete datasets and building knowledge from their interpretation. This interpretation basis for the science is fundamental at all levels; from creation of a geological map to interpretation of remotely sensed data. To teach and understand better the uncertainties in dealing with incomplete data we need to understand the strategies individual practitioners deploy that make them effective interpreters. The nature of interpretation is such that the interpreter needs to use their cognitive ability in the analysis of the data to propose a sensible solution in their final output that is consistent not only with the original data but also with other knowledge and understanding. In a series of experiments Bond et al. (2007, 2008, 2011, 2012) investigated the strategies and pitfalls of expert and non-expert interpretation of seismic images. These studies focused on large numbers of participants to provide a statistically sound basis for analysis of the results. The outcome of these experiments showed that a wide variety of conceptual models were applied to single seismic datasets, highlighting not only spatial variations in fault placement but also disagreement over whether faults existed at all or shared the same sense of movement. Further, statistical analysis suggests that the strategies an interpreter employs are more important than expert knowledge per se in developing successful interpretations. Experts are successful because of their application of these techniques. A new set of experiments focuses on a small number of experts to determine how they use their cognitive and reasoning skills in the interpretation of 2D seismic profiles. Live video and practitioner commentary were used to track the evolving interpretation and to gain insight on their decision processes. 
The outputs of the study allow us to create an educational resource of expert interpretation through online video footage and commentary with associated further interpretation and analysis of the techniques and strategies employed. This resource will be of use to undergraduate, post-graduate, industry and academic professionals seeking to improve their seismic interpretation skills, develop reasoning strategies for dealing with incomplete datasets, and for assessing the uncertainty in these interpretations. Bond, C.E. et al. (2012). 'What makes an expert effective at interpreting seismic images?' Geology, 40, 75-78. Bond, C. E. et al. (2011). 'When there isn't a right answer: interpretation and reasoning, key skills for 21st century geoscience'. International Journal of Science Education, 33, 629-652. Bond, C. E. et al. (2008). 'Structural models: Optimizing risk analysis by understanding conceptual uncertainty'. First Break, 26, 65-71. Bond, C. E. et al., (2007). 'What do you think this is?: "Conceptual uncertainty" In geoscience interpretation'. GSA Today, 17, 4-10.

  7. Model averaging methods to merge operational statistical and dynamic seasonal streamflow forecasts in Australia

    NASA Astrophysics Data System (ADS)

    Schepen, Andrew; Wang, Q. J.

    2015-03-01

    The Australian Bureau of Meteorology produces statistical and dynamic seasonal streamflow forecasts. The statistical and dynamic forecasts are similarly reliable in ensemble spread; however, skill varies by catchment and season. Therefore, it may be possible to optimize forecasting skill by weighting and merging statistical and dynamic forecasts. Two model averaging methods are evaluated for merging forecasts for 12 locations. The first method, Bayesian model averaging (BMA), applies averaging to forecast probability densities (and thus cumulative probabilities) for a given forecast variable value. The second method, quantile model averaging (QMA), applies averaging to forecast variable values (quantiles) for a given cumulative probability (quantile fraction). BMA and QMA are found to perform similarly in terms of overall skill scores and reliability in ensemble spread. Both methods improve forecast skill across catchments and seasons. However, when both the statistical and dynamical forecasting approaches are skillful but produce, on special occasions, very different event forecasts, the BMA merged forecasts for these events can have unusually wide and bimodal distributions. In contrast, the distributions of the QMA merged forecasts for these events are narrower, unimodal and generally more smoothly shaped, and are potentially more easily communicated to and interpreted by the forecast users. Such special occasions are found to be rare. However, every forecast counts in an operational service, and therefore the occasional contrast in merged forecasts between the two methods may be more significant than the indifference shown by the overall skill and reliability performance.
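
    The difference between the two averaging rules can be reproduced with two deliberately conflicting Gaussian forecasts. This is a sketch only, not the Bureau's implementation: the means, spreads, and equal weights are invented to mimic a "special occasion" where the two approaches disagree strongly.

```python
from statistics import NormalDist

# Two component forecasts that disagree strongly (invented values)
stat_fc = NormalDist(mu=100.0, sigma=10.0)  # statistical forecast
dyn_fc = NormalDist(mu=200.0, sigma=10.0)   # dynamic forecast
w = 0.5                                     # equal skill -> equal weight

qs = [i / 100 for i in range(1, 100)]       # quantile fractions 1%..99%

# QMA: average the quantiles at each quantile fraction
qma = [w * stat_fc.inv_cdf(q) + (1 - w) * dyn_fc.inv_cdf(q) for q in qs]

# BMA: average the CDFs, then invert numerically to recover quantiles
def bma_quantile(q, lo=0.0, hi=300.0):
    for _ in range(60):  # bisection on the mixture CDF
        mid = (lo + hi) / 2
        if w * stat_fc.cdf(mid) + (1 - w) * dyn_fc.cdf(mid) < q:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

bma = [bma_quantile(q) for q in qs]

# QMA stays narrow and unimodal; BMA spans both modes and is far wider
qma_width = qma[-1] - qma[0]   # 1st-to-99th percentile range
bma_width = bma[-1] - bma[0]
```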

  8. Statistical analysis of arthroplasty data

    PubMed Central

    2011-01-01

    It is envisaged that guidelines for statistical analysis and presentation of results will improve the quality and value of research. The Nordic Arthroplasty Register Association (NARA) has therefore developed guidelines for the statistical analysis of arthroplasty register data. The guidelines are divided into two parts, one with an introduction and a discussion of the background to the guidelines (Ranstam et al. 2011a, see pages x-y in this issue), and this one with a more technical statistical discussion on how specific problems can be handled. This second part contains (1) recommendations for the interpretation of methods used to calculate survival, (2) recommendations on how to deal with bilateral observations, and (3) a discussion of problems and pitfalls associated with analysis of factors that influence survival or comparisons between outcomes extracted from different hospitals. PMID:21619500
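
    For the survival calculations referred to in (1), the standard nonparametric choice is the Kaplan-Meier estimator. A minimal sketch with hypothetical follow-up data (1 = revision, 0 = censored); register analyses raise further issues, such as competing risks and bilateral observations, that this toy version ignores.

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survivorship: `events` flags revision (1) vs
    censoring (0). Returns (time, S(t)) steps at each revision time;
    censored cases leave the risk set without dropping the curve."""
    data = sorted(zip(times, events))
    at_risk = len(data)
    s, steps, i = 1.0, [], 0
    while i < len(data):
        t = data[i][0]
        d = n_t = 0
        while i < len(data) and data[i][0] == t:
            d += data[i][1]   # revisions at time t
            n_t += 1          # everyone leaving the risk set at t
            i += 1
        if d:
            s *= 1.0 - d / at_risk
            steps.append((t, s))
        at_risk -= n_t
    return steps

# hypothetical follow-up times (years); 1 = revision, 0 = censored
steps = kaplan_meier([1, 2, 2, 3, 4, 5], [1, 1, 0, 0, 1, 0])
```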

  9. Enhancing the Teaching of Statistics: Portfolio Theory, an Application of Statistics in Finance

    ERIC Educational Resources Information Center

    Christou, Nicolas

    2008-01-01

    In this paper we present an application of statistics using real stock market data. Most, if not all, students have some familiarity with the stock market (or at least they have heard about it) and therefore can understand the problem easily. It is the real data analysis that students find interesting. Here we explore the building of efficient…
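
    For the two-asset case, the efficient-frontier exercise described above reduces to a closed-form minimum-variance weight. A short sketch with invented variances and covariance, not the stock data used in the paper.

```python
def min_variance_weight(var1, var2, cov12):
    """Closed-form weight on asset 1 in the two-asset
    minimum-variance portfolio."""
    return (var2 - cov12) / (var1 + var2 - 2 * cov12)

# invented annualized variances and covariance of two stocks' returns
var1, var2, cov12 = 0.04, 0.09, 0.006
w1 = min_variance_weight(var1, var2, cov12)
w2 = 1.0 - w1

# portfolio variance at the optimum: below either asset's own variance
port_var = w1**2 * var1 + w2**2 * var2 + 2 * w1 * w2 * cov12
```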

  11. Easily accessible polymer additives for tuning the crystal-growth of perovskite thin-films for highly efficient solar cells

    NASA Astrophysics Data System (ADS)

    Dong, Qingqing; Wang, Zhaowei; Zhang, Kaicheng; Yu, Hao; Huang, Peng; Liu, Xiaodong; Zhou, Yi; Chen, Ning; Song, Bo

    2016-03-01

    For perovskite solar cells (Pero-SCs), one of the key issues with respect to the power conversion efficiency (PCE) is the morphology control of the perovskite thin-films. In this study, an easily-accessible additive polyethylenimine (PEI) is utilized to tune the morphology of CH3NH3PbI3-xClx. With addition of 1.00 wt% of PEI, the smoothness and crystallinity of the perovskite were greatly improved, which were characterized by scanning electron microscopy (SEM) and X-ray diffraction (XRD). A summit PCE of 14.07% was achieved for the p-i-n type Pero-SC, indicating a 26% increase compared to those of the devices without the additive. Both photoluminescence (PL) and alternating current impedance spectroscopy (ACIS) analyses confirm the efficiency results after the addition of PEI. This study provides a low-cost polymer additive candidate for tuning the morphology of perovskite thin-films, and might be a new clue for the mass production of Pero-SCs. 
Electronic supplementary information (ESI) available: J-V curves & characteristics of Pero-SCs, UV-vis spectra and AFM images. See DOI: 10.1039/c6nr00206d

  12. Reverse Causation and the Transactional Interpretation of Quantum Mechanics

    NASA Astrophysics Data System (ADS)

    Cramer, John G.

    2006-10-01

    In the first part of the paper we present the transactional interpretation of quantum mechanics, a method of viewing the formalism of quantum mechanics that provides a way of visualizing quantum events and experiments. In the second part, we present an EPR gedankenexperiment that appears to lead to observer-level reverse causation. A transactional analysis of the experiment is presented. It easily accounts for the reported observations but does not reveal any barriers to its modification for reverse causation.

  13. A Method to Estimate the Bureau of Labor Statistics Family Budgets for All Standard Metropolitan Statistical Areas.

    ERIC Educational Resources Information Center

    Cobas, Jose A.

    1978-01-01

    Since the Bureau of Labor Statistics' cost-of-living figures are available for only 40 standard metropolitan statistical areas (SMSAs) in four regions, the author presents a method to estimate family budgets for all SMSAs. Data for the calculations are easily accessible in Census and other government publications. (Author/KC)

  14. Impact of Statistical Learning Methods on the Predictive Power of Multivariate Normal Tissue Complication Probability Models

    SciTech Connect

    Xu Chengjian; Schaaf, Arjen van der; Schilstra, Cornelis; Langendijk, Johannes A.; Veld, Aart A. van't

    2012-03-15

    Purpose: To study the impact of different statistical learning methods on the prediction performance of multivariate normal tissue complication probability (NTCP) models. Methods and Materials: In this study, three learning methods, stepwise selection, least absolute shrinkage and selection operator (LASSO), and Bayesian model averaging (BMA), were used to build NTCP models of xerostomia following radiotherapy treatment for head and neck cancer. Performance of each learning method was evaluated by a repeated cross-validation scheme in order to obtain a fair comparison among methods. Results: It was found that the LASSO and BMA methods produced models with significantly better predictive power than that of the stepwise selection method. Furthermore, the LASSO method yields an easily interpretable model as the stepwise method does, in contrast to the less intuitive BMA method. Conclusions: The commonly used stepwise selection method, which is simple to execute, may be insufficient for NTCP modeling. The LASSO method is recommended.
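The interpretability advantage the abstract credits to LASSO comes from the L1 penalty zeroing out uninformative predictors. A minimal sketch of that mechanism, using a plain proximal-gradient (ISTA) solver on a linear surrogate rather than the paper's logistic NTCP form; the data, feature count, and penalty strength are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 200, 10
X = rng.normal(size=(n, p))              # candidate predictors (e.g. dose-volume metrics)
beta_true = np.zeros(p)
beta_true[:2] = [1.5, -1.0]              # only two truly predictive features
y = X @ beta_true + 0.3 * rng.normal(size=n)

lam = 0.3                                # L1 penalty weight (illustrative)
step = n / np.linalg.norm(X, 2) ** 2     # 1/L for the least-squares gradient
beta = np.zeros(p)
for _ in range(500):                     # ISTA: gradient step, then soft-threshold
    grad = X.T @ (X @ beta - y) / n
    beta = beta - step * grad
    beta = np.sign(beta) * np.maximum(np.abs(beta) - step * lam, 0.0)

print("selected features:", np.flatnonzero(beta))
```

The soft-threshold step sets small coefficients to exactly zero, which is what yields the sparse, readable models the abstract contrasts with BMA; for a real binary complication endpoint one would use an L1-penalized logistic fit instead of this linear surrogate.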

  15. Improve MWD data interpretation

    SciTech Connect

    Santley, D.J.; Ardrey, W.E.

    1987-01-01

    This article reports that measurement-while-drilling (MWD) technology is being used today in a broad range of real-time drilling applications. In its infancy, MWD was limited to providing directional survey and steering information. Today, the addition of formation sensors (resistivity, gamma) and drilling efficiency sensors (WOB, torque) has made MWD a much more useful drilling decision tool. In the process, the desirability of combining downhole MWD data with powerful analytical software and interpretive techniques has been recognized by both operators and service companies. However, the usual form in which MWD and wellsite analytical capabilities are combined leaves much to be desired. The most common approach is to incorporate MWD with large-scale computerized mud logging (CML) systems. Essentially, MWD decoding and display equipment is added to existing full-blown CML surface units.

  16. Interpretation and application of borehole televiewer surveys

    SciTech Connect

    Taylor, T.J.

    1983-01-01

    A borehole televiewer log is comparable to a picture of a continuous core and may yield even more information since it is a picture of the core's host environment; i.e., the inside of the borehole as it exists in the subsurface. Important relationships are preserved which can be lost when cores are brought to the surface. Fractures, bedding planes, vugs, and lithology changes are identifiable on borehole televiewer logs. The travel time of the signal from the sonde to the borehole wall and back to the sonde recently has been used to form a second log: the transit time log. Interpretation problems due to noncircular boreholes and eccentered logging sondes are easily overcome using the combination of amplitude and transit time logs. Examples are given to demonstrate potential use.

  17. Generic interpreters and microprocessor verification

    NASA Technical Reports Server (NTRS)

    Windley, Phillip J.

    1990-01-01

    The following topics are covered in viewgraph form: (1) generic interpreters; (2) Viper microprocessors; (3) microprocessor verification; (4) determining correctness; (5) hierarchical decomposition; (6) interpreter theory; (7) AVM-1; (8) phase-level specification; and future work.

  18. Easily accessible polymer additives for tuning the crystal-growth of perovskite thin-films for highly efficient solar cells.

    PubMed

    Dong, Qingqing; Wang, Zhaowei; Zhang, Kaicheng; Yu, Hao; Huang, Peng; Liu, Xiaodong; Zhou, Yi; Chen, Ning; Song, Bo

    2016-03-01

    For perovskite solar cells (Pero-SCs), one of the key issues with respect to the power conversion efficiency (PCE) is the morphology control of the perovskite thin-films. In this study, an easily-accessible additive polyethylenimine (PEI) is utilized to tune the morphology of CH3NH3PbI3-xClx. With addition of 1.00 wt% of PEI, the smoothness and crystallinity of the perovskite were greatly improved, which were characterized by scanning electron microscopy (SEM) and X-ray diffraction (XRD). A summit PCE of 14.07% was achieved for the p-i-n type Pero-SC, indicating a 26% increase compared to those of the devices without the additive. Both photoluminescence (PL) and alternating current impedance spectroscopy (ACIS) analyses confirm the efficiency results after the addition of PEI. This study provides a low-cost polymer additive candidate for tuning the morphology of perovskite thin-films, and might be a new clue for the mass production of Pero-SCs. PMID:26887633

  19. Impact of an easily reducible disulfide bond on the oxidative folding rate of multi-disulfide-containing proteins.

    PubMed

    Leung, H J; Xu, G; Narayan, M; Scheraga, H A

    2005-01-01

    The burial of native disulfide bonds, formed within stable structure in the regeneration of multi-disulfide-containing proteins from their fully reduced states, is a key step in the folding process, as the burial greatly accelerates the oxidative folding rate of the protein by sequestering the native disulfide bonds from thiol-disulfide exchange reactions. Nevertheless, several proteins retain solvent-exposed disulfide bonds in their native structures. Here, we have examined the impact of an easily reducible native disulfide bond on the oxidative folding rate of a protein. Our studies reveal that the susceptibility of the (40-95) disulfide bond of Y92G bovine pancreatic ribonuclease A (RNase A) to reduction results in a reduced rate of oxidative regeneration, compared with wild-type RNase A. In the native state of RNase A, Tyr 92 lies atop its (40-95) disulfide bond, effectively shielding this bond from the reducing agent, thereby promoting protein oxidative regeneration. Our work sheds light on the unique contribution of a local structural element in promoting the oxidative folding of a multi-disulfide-containing protein. PMID:15686534

  20. Changing Preservice Science Teachers' Views of Nature of Science: Why Some Conceptions May be More Easily Altered than Others

    NASA Astrophysics Data System (ADS)

    Mesci, Gunkut; Schwartz, Renee S.

    2016-02-01

    The purpose of this study was to assess preservice teachers' views of Nature of Science (NOS), identify aspects that were challenging for conceptual change, and explore reasons why. This study particularly focused on why and how some concepts of NOS may be more easily altered than others. Fourteen preservice science teachers enrolled in a NOS and Science Inquiry course participated in this study. Data were collected by using a pre/post format with the Views of Nature of Science questionnaire (VNOS-270), the Views of Scientific Inquiry questionnaire (VOSI-270), follow-up interviews, and classroom artifacts. The results indicated that most students initially held naïve views about certain aspects of NOS like tentativeness and subjectivity. By the end of the semester, almost all students dramatically improved their understanding about almost all aspects of NOS. However, several students still struggled with certain aspects like the differences between scientific theory and law, tentativeness, and socio-cultural embeddedness. Results suggested that instructional, motivational, and socio-cultural factors may influence if and how students changed their views about targeted NOS aspects. Students thought that classroom activities, discussions, and readings were most helpful to improve their views about NOS. The findings from the research have the potential to translate as practical advice for teachers, science educators, and future researchers.

  1. Easily implementable field programmable gate array-based adaptive optics system with state-space multichannel control.

    PubMed

    Chang, Chia-Yuan; Ke, Bo-Ting; Su, Hung-Wei; Yen, Wei-Chung; Chen, Shean-Jen

    2013-09-01

    In this paper, an easily implementable adaptive optics system (AOS) based on a real-time field programmable gate array (FPGA) platform with state-space multichannel control programmed by LabVIEW has been developed, and also integrated into a laser focusing system successfully. To meet the requirements of simple programming configuration and easy integration with other devices, the FPGA-based AOS introduces a standard operation procedure including AOS identification, computation, and operation. The overall system with a 32-channel driving signal for a deformable mirror (DM) as input and a Zernike polynomial via a lab-made Shack-Hartmann wavefront sensor (SHWS) as output is optimally identified to construct a multichannel state-space model off-line. In real-time operation, the FPGA platform first calculates the Zernike polynomial of the optical wavefront measured from the SHWS as the feedback signal. Then, a state-space multichannel controller according to the feedback signal and the identified model is designed and implemented in the FPGA to drive the DM for phase distortion compensation. The current FPGA-based AOS is capable of suppressing low-frequency thermal disturbances with a steady-state phase error of less than 0.1 ? within less than 10 time steps when the control loop is operated at a frequency of 30 Hz. PMID:24089871
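The closed loop described above can be caricatured with a toy discrete state-space model: the plant maps deformable-mirror commands to wavefront states, and a feedback gain drives the measured distortion toward zero. All matrices, gains, and dimensions below are invented stand-ins, not the paper's identified 32-channel FPGA model:

```python
import numpy as np

A = np.array([[0.9, 0.1], [0.0, 0.8]])   # identified plant dynamics (toy)
B = np.array([[0.5], [1.0]])             # DM command influence (toy)
C = np.array([[1.0, 0.0]])               # measured Zernike coefficient (toy)
K = np.array([[0.6, 0.4]])               # stabilizing feedback gain (toy)

x = np.array([[1.0], [0.5]])             # initial phase-distortion state
errors = []
for _ in range(30):                      # 30 control steps, e.g. at 30 Hz
    u = -K @ x                           # state feedback (observer omitted for brevity)
    x = A @ x + B @ u                    # plant update under the control command
    errors.append(abs((C @ x).item()))   # residual measured distortion

print(f"final residual phase error: {errors[-1]:.4f}")
```

Because the closed-loop matrix A - BK has eigenvalues inside the unit circle, the residual decays over a handful of steps, which is the qualitative behavior (convergence within ~10 time steps) the abstract reports.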

  2. Interpreting Title I Evaluation Results.

    ERIC Educational Resources Information Center

    Davis, Ann E.; And Others

    This is a workshop simulation and interpretation guide designed for Title I teachers and district personnel. The participants should have some experience with the norm-referenced evaluation model. They learn to interpret normal curve equivalents (NCE) and NCE gains. Participants are led through an interpretive hierarchy from simple, descriptive…

  3. A History of Oral Interpretation.

    ERIC Educational Resources Information Center

    Bahn, Eugene; Bahn, Margaret L.

    This historical account of the oral interpretation of literature establishes a chain of events comprehending 25 centuries of verbal tradition from the Homeric Age through 20th Century America. It deals in each era with the viewpoints and contributions of major historical figures to oral interpretation, as well as with oral interpretation's…

  4. How To Calculate Statistics. Program Evaluation Kit, 7.

    ERIC Educational Resources Information Center

    Fitz-Gibbon, Carol Taylor; Morris, Lynn Lyons

    The statistical methods presented in this workbook focus on the display of central tendency, statistical tests, and correlation. Chapters 2, 3, and 4 consist of worksheets, each of which provides step-by-step instructions for calculating and interpreting a particular statistical method. The introduction to each worksheet explains how and when to…
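In the same spirit as the worksheets (central tendency, a statistical test, correlation), a small worked example with invented scores:

```python
import numpy as np
from scipy import stats

scores = np.array([72, 85, 78, 90, 66, 81, 77, 88])
print("mean:", scores.mean(), "median:", np.median(scores))

# One-sample t-test: is the mean score plausibly 75?
t, p = stats.ttest_1samp(scores, 75.0)
print(f"t = {t:.2f}, p = {p:.3f}")

# Pearson correlation between hours studied and score (invented pairs).
hours = np.array([5, 9, 7, 10, 3, 8, 6, 9])
r, p_r = stats.pearsonr(hours, scores)
print(f"r = {r:.2f}")
```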

  5. Quantifying the power of multiple event interpretations

    NASA Astrophysics Data System (ADS)

    Chien, Yang-Ting; Farhi, David; Krohn, David; Marantan, Andrew; Mateos, David Lopez; Schwartz, Matthew

    2014-12-01

    A number of methods have been proposed recently which exploit multiple highly-correlated interpretations of events, or of jets within an event. For example, Qjets reclusters a jet multiple times and telescoping jets uses multiple cone sizes. Previous work has employed these methods in pseudo-experimental analyses and found that, with a simplified statistical treatment, they give sizable improvements over traditional methods. In this paper, the improvement gained from multiple event interpretations is explored with methods much closer to those used in real experiments. To this end, we derive and study a generalized extended maximum likelihood procedure, and find that using multiple jet radii can provide substantial benefit over a single radius in fitting procedures. Another major concern we address is that multiple event interpretations might be exploiting similar information to that already present in the standard kinematic variables. We perform multivariate analyses (boosted decision trees) on a set of standard kinematic variables, a single observable computed with several different cone sizes, and both sets combined. We find that using multiple radii is still helpful even on top of standard kinematic variables (providing a 12% improvement at low pT and 20% at high pT). These results suggest that including multiple event interpretations in a realistic search for the Higgs would give additional sensitivity over traditional approaches.
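A toy illustration of why several correlated interpretations of the same event can still add information: averaging noisy measurements reduces variance whenever their correlation is below one. The numbers are invented and stand in for, say, one jet observable computed at several cone radii:

```python
import numpy as np

rng = np.random.default_rng(1)
truth = 125.0                       # underlying quantity (arbitrary units)
n_events, n_interp = 10000, 5
rho = 0.7                           # correlation between interpretations (toy)

# Correlated Gaussian noise across the five interpretations of each event.
cov = rho * np.ones((n_interp, n_interp)) + (1 - rho) * np.eye(n_interp)
noise = rng.multivariate_normal(np.zeros(n_interp), cov, size=n_events)

single = truth + noise[:, 0]            # one interpretation per event
combined = truth + noise.mean(axis=1)   # average over all interpretations

print(f"std single:   {single.std():.3f}")
print(f"std combined: {combined.std():.3f}")
```

With rho = 0.7 the combined variance is rho + (1 - rho)/5 = 0.76 of the single-interpretation variance; the residual gain shrinks as the interpretations become more correlated, which is exactly the overlap-with-kinematic-variables concern the abstract tests.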

  6. Transport of sewage molecular markers through saturated soil column and effect of easily biodegradable primary substrate on their removal.

    PubMed

    Foolad, Mahsa; Ong, Say Leong; Hu, Jiangyong

    2015-11-01

    Pharmaceutical and personal care products (PPCPs) and artificial sweeteners (ASs) are emerging organic contaminants (EOCs) in the aquatic environment. The presence of PPCPs and ASs in water bodies poses a potential ecological risk and health concern. It is therefore necessary to identify pollution sources by understanding the transport behavior of sewage molecular markers in the subsurface. The aim of this study was to evaluate transport of nine selected molecular markers through saturated soil column experiments. The selected sewage molecular markers in this study were six PPCPs including acetaminophen (ACT), carbamazepine (CBZ), caffeine (CF), crotamiton (CTMT), diethyltoluamide (DEET), salicylic acid (SA) and three ASs including acesulfame (ACF), cyclamate (CYC), and saccharine (SAC). Results confirmed that ACF, CBZ, CTMT, CYC and SAC were suitable to be used as sewage molecular markers since they were almost stable against sorption and biodegradation processes during the soil column experiments. In contrast, transport of ACT, CF and DEET was limited by both sorption and biodegradation processes, and 100% removal efficiency was achieved in the biotic column. Moreover, the effect of different acetate concentrations (0-100 mg/L) as an easily biodegradable primary substrate on the removal of PPCPs and ASs was also studied. Results showed a negative correlation (r(2)>0.75) between the removal of several selected sewage chemical markers (ACF, CF, ACT, CYC, SAC) and acetate concentration. CTMT removal also decreased with the addition of acetate, but further increases in acetate concentration did not affect its removal. CBZ and DEET removal did not depend on the presence of acetate. PMID:26210019

  7. Input of easily available organic C and N stimulates microbial decomposition of soil organic matter in arctic permafrost soil.

    PubMed

    Wild, Birgit; Schnecker, Jörg; Alves, Ricardo J Eloy; Barsukov, Pavel; Bárta, Jiří; Capek, Petr; Gentsch, Norman; Gittel, Antje; Guggenberger, Georg; Lashchinskiy, Nikolay; Mikutta, Robert; Rusalimova, Olga; Santrůčková, Hana; Shibistova, Olga; Urich, Tim; Watzka, Margarete; Zrazhevskaya, Galina; Richter, Andreas

    2014-08-01

    Rising temperatures in the Arctic can affect soil organic matter (SOM) decomposition directly and indirectly, by increasing plant primary production and thus the allocation of plant-derived organic compounds into the soil. Such compounds, for example root exudates or decaying fine roots, are easily available for microorganisms, and can alter the decomposition of older SOM ("priming effect"). We here report on a SOM priming experiment in the active layer of a permafrost soil from the central Siberian Arctic, comparing responses of organic topsoil, mineral subsoil, and cryoturbated subsoil material (i.e., poorly decomposed topsoil material subducted into the subsoil by freeze-thaw processes) to additions of (13)C-labeled glucose, cellulose, a mixture of amino acids, and protein (added at levels corresponding to approximately 1% of soil organic carbon). SOM decomposition in the topsoil was barely affected by higher availability of organic compounds, whereas SOM decomposition in both subsoil horizons responded strongly. In the mineral subsoil, SOM decomposition increased by a factor of two to three after any substrate addition (glucose, cellulose, amino acids, protein), suggesting that the microbial decomposer community was limited in energy to break down more complex components of SOM. In the cryoturbated horizon, SOM decomposition increased by a factor of two after addition of amino acids or protein, but was not significantly affected by glucose or cellulose, indicating nitrogen rather than energy limitation. Since the stimulation of SOM decomposition in cryoturbated material was not connected to microbial growth or to a change in microbial community composition, the additional nitrogen was likely invested in the production of extracellular enzymes required for SOM decomposition. 
Our findings provide a first mechanistic understanding of priming in permafrost soils and suggest that an increase in the availability of organic carbon or nitrogen, e.g., by increased plant productivity, can change the decomposition of SOM stored in deeper layers of permafrost soils, with possible repercussions on the global climate. PMID:25089062

  8. Input of easily available organic C and N stimulates microbial decomposition of soil organic matter in arctic permafrost soil

    PubMed Central

    Wild, Birgit; Schnecker, Jörg; Alves, Ricardo J. Eloy; Barsukov, Pavel; Bárta, Jiří; Čapek, Petr; Gentsch, Norman; Gittel, Antje; Guggenberger, Georg; Lashchinskiy, Nikolay; Mikutta, Robert; Rusalimova, Olga; Šantrůčková, Hana; Shibistova, Olga; Urich, Tim; Watzka, Margarete; Zrazhevskaya, Galina; Richter, Andreas

    2014-01-01

    Rising temperatures in the Arctic can affect soil organic matter (SOM) decomposition directly and indirectly, by increasing plant primary production and thus the allocation of plant-derived organic compounds into the soil. Such compounds, for example root exudates or decaying fine roots, are easily available for microorganisms, and can alter the decomposition of older SOM (“priming effect”). We here report on a SOM priming experiment in the active layer of a permafrost soil from the central Siberian Arctic, comparing responses of organic topsoil, mineral subsoil, and cryoturbated subsoil material (i.e., poorly decomposed topsoil material subducted into the subsoil by freeze–thaw processes) to additions of 13C-labeled glucose, cellulose, a mixture of amino acids, and protein (added at levels corresponding to approximately 1% of soil organic carbon). SOM decomposition in the topsoil was barely affected by higher availability of organic compounds, whereas SOM decomposition in both subsoil horizons responded strongly. In the mineral subsoil, SOM decomposition increased by a factor of two to three after any substrate addition (glucose, cellulose, amino acids, protein), suggesting that the microbial decomposer community was limited in energy to break down more complex components of SOM. In the cryoturbated horizon, SOM decomposition increased by a factor of two after addition of amino acids or protein, but was not significantly affected by glucose or cellulose, indicating nitrogen rather than energy limitation. Since the stimulation of SOM decomposition in cryoturbated material was not connected to microbial growth or to a change in microbial community composition, the additional nitrogen was likely invested in the production of extracellular enzymes required for SOM decomposition. 
Our findings provide a first mechanistic understanding of priming in permafrost soils and suggest that an increase in the availability of organic carbon or nitrogen, e.g., by increased plant productivity, can change the decomposition of SOM stored in deeper layers of permafrost soils, with possible repercussions on the global climate. PMID:25089062

  9. Statistics Poker: Reinforcing Basic Statistical Concepts

    ERIC Educational Resources Information Center

    Leech, Nancy L.

    2008-01-01

    Learning basic statistical concepts does not need to be tedious or dry; it can be fun and interesting through cooperative learning in the small-group activity of Statistics Poker. This article describes a teaching approach for reinforcing basic statistical concepts that can help students who have high anxiety and makes learning and reinforcing…

  10. Neurological imaging: statistics behind the pictures

    PubMed Central

    Dinov, Ivo D

    2011-01-01

    Neurological imaging represents a powerful paradigm for investigation of brain structure, physiology and function across different scales. The diverse phenotypes and significant normal and pathological brain variability demand reliable and efficient statistical methodologies to model, analyze and interpret raw neurological images and derived geometric information from these images. The validity, reproducibility and power of any statistical brain map require appropriate inference on large cohorts, significant community validation, and multidisciplinary collaborations between physicians, engineers and statisticians. PMID:22180753

  11. Applications of Statistical Tests in Hand Surgery

    PubMed Central

    Song, Jae W.; Haas, Ann; Chung, Kevin C.

    2015-01-01

    During the nineteenth century, with the emergence of public health as a goal to improve hygiene and conditions of the poor, statistics established itself as a distinct scientific field important for critically interpreting studies of public health concerns. During the twentieth century, statistics began to evolve mathematically and methodologically with hypothesis testing and experimental design. Today, the design of medical experiments centers around clinical trials and observational studies, and with the use of statistics, the collected data are summarized, weighed, and presented to direct both physicians and the public towards Evidence-Based Medicine. Having a basic understanding of statistics is mandatory in evaluating the validity of published literature and applying it to patient care. In this review, we aim to apply a practical approach in discussing basic statistical tests by providing a guide to choosing the correct statistical test along with examples relevant to hand surgery research. PMID:19969193
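As a sketch of the test-selection guidance the review describes, a hypothetical comparison of grip strength between two treatment groups, showing a parametric test alongside its nonparametric alternative (data invented):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
group_a = rng.normal(30.0, 4.0, size=40)   # e.g. post-op grip strength (kg)
group_b = rng.normal(26.0, 4.0, size=40)

# Parametric comparison of means (assumes roughly normal data).
t, p_t = stats.ttest_ind(group_a, group_b)

# Nonparametric alternative when normality is doubtful.
u, p_u = stats.mannwhitneyu(group_a, group_b)

print(f"t-test p = {p_t:.4f}, Mann-Whitney p = {p_u:.4f}")
```

The choice between the two rests on the distributional assumptions discussed in the review, not on which p-value happens to be smaller.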

  12. Mental Illness Statistics

    MedlinePLUS

    ... Understanding the scope of mental illnesses and their ... those affected receive treatment. The information on these statistics pages includes the best statistics currently available on ...

  13. Pornography and rape: theory and practice? Evidence from crime data in four countries where pornography is easily available.

    PubMed

    Kutchinsky, B

    1991-01-01

    We have looked at the empirical evidence of the well-known feminist dictum: "pornography is the theory--rape is the practice" (Morgan, 1980). While earlier research, notably that generated by the U.S. Commission on Obscenity and Pornography (1970) had found no evidence of a causal link between pornography and rape, a new generation of behavioral scientists have, for more than a decade, made considerable effort to prove such a connection, especially as far as "aggressive pornography" is concerned. The first part of the article examines and discusses the findings of this new research. A number of laboratory experiments have been conducted, much akin to the types of experiments developed by researchers of the effects of nonsexual media violence. As in the latter, a certain degree of increased "aggressiveness" has been found under certain circumstances, but to extrapolate from such laboratory effects to the commission of rape in real life is dubious. Studies of rapists' and nonrapists' immediate sexual reactions to presentations of pornography showed generally greater arousal to non-violent scenes, and no difference can be found in this regard between convicted rapists, nonsexual criminals and noncriminal males. In the second part of the paper an attempt was made to study the necessary precondition for a substantial causal relationship between the availability of pornography, including aggressive pornography, and rape--namely, that obviously increased availability of such material was followed by an increase in cases of reported rape. The development of rape and attempted rape during the period 1964-1984 was studied in four countries: the U.S.A., Denmark, Sweden and West Germany. 
In all four countries there is clear and undisputed evidence that during this period the availability of various forms of pictorial pornography including violent/dominant varieties (in the form of picture magazines, and films/videos used at home or shown in arcades or cinemas) has developed from extreme scarcity to relative abundance. If (violent) pornography causes rape, this exceptional development in the availability of (violent) pornography should definitely somehow influence the rape statistics. Since, however, the rape figures could not simply be expected to remain steady during the period in question (when it is well known that most other crimes increased considerably), the development of rape rates was compared with that of non-sexual violent offences and nonviolent sexual offences (in so far as available statistics permitted). The results showed that in none of the countries did rape increase more than nonsexual violent crimes. This finding in itself would seem sufficient to discard the hypothesis that pornography causes rape.(ABSTRACT TRUNCATED AT 400 WORDS) PMID:2032762

  14. Fundamentals of interpretation in echocardiography

    SciTech Connect

    Harrigan, P.; Lee, R.M.

    1985-01-01

    This illustrated book provides familiarity with the many clinical, physical, and electronic factors that bear on echocardiographic interpretation. Physical and clinical principles are integrated with considerations of anatomy and physiology to address interpretive problems. This approach yields, for example, sections on the physics and electronics of M-mode, cross-sectional, and Doppler systems which are informal, full of echocardiograms, virtually devoid of mathematics, and rigorously related to common issues faced by echocardiograph interpreters.

  15. Statistics From Whom?

    ERIC Educational Resources Information Center

    Caine, Robert; And Others

    1978-01-01

    Presents arguments for offering introductory statistics courses to undergraduate sociology majors taught within departments of sociology rather than using statistics courses taught by other departments. (Author)

  16. EXPERIMENTAL DESIGN: STATISTICAL CONSIDERATIONS AND ANALYSIS

    Technology Transfer Automated Retrieval System (TEKTRAN)

    In this book chapter, information on how field experiments in invertebrate pathology are designed and the data collected, analyzed, and interpreted is presented. The practical and statistical issues that need to be considered and the rationale and assumptions behind different designs or procedures ...

  17. Statistics of sexual size dimorphism.

    PubMed

    Smith, R J

    1999-04-01

    In comparative studies of sexual size dimorphism (SSD), the methods used to quantify dimorphism are controversial. SSD is commonly expressed as a ratio between species mean values of males and females, such as M/F or (M-F)/([M+F]/2), but a number of investigators have suggested that ratios should not be used, mainly because their distributions usually violate the assumptions of parametric statistical tests, or because they lead to spurious relationships that invalidate the interpretation and statistical significance of regressions and correlations. As an alternative to ratios, the comparative study of SSD can be conducted by a combination of regression with sex-specific data and residuals from this regression. Twenty-five data sets were selected from the literature and used to duplicate a variety of statistical procedures commonly employed in studies of SSD. All analyses were repeated with five different ratios and with methods that avoid the calculation of any ratios. These data and a review of the statistical properties of ratios and residuals indicate that: (1) most of the ratios used in the SSD literature are unnecessary, and several commonly used ratios are statistically inferior to others. Only two ratios are needed, one on a logarithmic scale and one on a linear scale; (2) there is no problem with spurious correlation or non-normality when ratios are used in several types of statistical procedures commonly employed in studies of SSD; (3) residuals cannot replace ratios for the evaluation of many questions regarding the pattern of SSD among species; and (4) residuals usually are used incorrectly, leading to misspecified regression equations. Most of the questions for which residuals are used should be addressed by multiple regression. These results apply to studies using comparative methods with or without adjustments for phylogenetic effects. PMID:10208795
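The two quantification routes the paper compares, ratios (on linear and log scales) and residuals from a sex-specific regression, can be sketched directly; the species values below are invented:

```python
import numpy as np

male = np.array([52.0, 61.0, 48.0, 75.0, 66.0])    # species mean male sizes (invented)
female = np.array([45.0, 50.0, 47.0, 60.0, 58.0])  # species mean female sizes (invented)

ratio = male / female                      # linear-scale SSD ratio M/F
log_ratio = np.log(male) - np.log(female)  # log-scale SSD, symmetric about zero

# Residual route: regress log male size on log female size across species,
# then take each species' departure from the fitted least-squares line.
slope, intercept = np.polyfit(np.log(female), np.log(male), 1)
residuals = np.log(male) - (slope * np.log(female) + intercept)

print("M/F ratios:", np.round(ratio, 3))
print("residual SSD:", np.round(residuals, 3))
```

Note the paper's caution: residuals like these answer a different question than ratios (departure from the cross-species allometry, not the magnitude of dimorphism itself), so one cannot simply substitute for the other.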

  18. Museum Docents' Understanding of Interpretation

    ERIC Educational Resources Information Center

    Neill, Amanda C.

    2010-01-01

    The purpose of this qualitative research study was to explore docents' perceptions of their interpretive role in art museums and determine how those perceptions shape docents' practice. The objective was to better understand how docents conceive of their role and what shapes the interpretation they give on tours to the public. The conceptual…

  19. Interpreting Recoil for Undergraduate Students

    ERIC Educational Resources Information Center

    Elsayed, Tarek A.

    2012-01-01

    The phenomenon of recoil is usually explained to students in the context of Newton's third law. Typically, when a projectile is fired, the recoil of the launch mechanism is interpreted as a reaction to the ejection of the smaller projectile. The same phenomenon is also interpreted in the context of the conservation of linear momentum, which is…

  1. Remote sensing and image interpretation

    NASA Technical Reports Server (NTRS)

    Lillesand, T. M.; Kiefer, R. W. (Principal Investigator)

    1979-01-01

    A textbook prepared primarily for use in introductory courses in remote sensing is presented. Topics covered include concepts and foundations of remote sensing; elements of photographic systems; introduction to airphoto interpretation; airphoto interpretation for terrain evaluation; photogrammetry; radiometric characteristics of aerial photographs; aerial thermography; multispectral scanning and spectral pattern recognition; microwave sensing; and remote sensing from space.

  2. The Medical Interpreter Training Project.

    ERIC Educational Resources Information Center

    Avery, Maria-Paz Beltran

    The Medical Interpreter Training Project, created as a collaborative effort of Northern Essex Community College (Massachusetts), private businesses, and medical care providers in Massachusetts, developed a 28-credit, competency-based certificate program to prepare bilingual adults to work as medical interpreters in a range of health care settings.…

  3. Interpretation Tasks for Grammar Teaching.

    ERIC Educational Resources Information Center

    Ellis, Rod

    1995-01-01

    The traditional approach to grammar teaching provides learners with opportunities to produce specific grammatical structures. This article explores an alternative approach, one based on interpreting input. The rationale for the approach is discussed, as are the principles for designing interpretation tasks for grammar teaching. (Contains 35…

  4. Geological interpretation of potential fields

    NASA Astrophysics Data System (ADS)

    Starostenko, V. I.

This volume contains papers from the Third All-Union School-Seminar on the Geological Interpretation of Gravitational and Magnetic Fields (Yalta, December 1980). Particular consideration is given to such topics as a method for constructing density models of the tectonosphere of platform and active regions; the interpretation of the gravitational field of the basic structures of the world ocean; the current status of gravitational surveying; and an algorithm for the regional interpretation of gravimetry data. Also considered are the inverse problem of magnetic surveying; the role of viscous magnetization in the formation of magnetic anomalies of the continental crust; calculation of mechanical stresses in the lithosphere on the basis of gravitational data; the deep structure of the Siberian platform as interpreted on the basis of gravimeter and magnetometer data; the equivalence of density models of deep structures; and a systems approach to the interpretation of gravimetry data. No individual items are abstracted in this volume.

  5. Statistical Reference Datasets

    National Institute of Standards and Technology Data Gateway

    Statistical Reference Datasets (Web, free access)   The Statistical Reference Datasets project is also supported by the Standard Reference Data Program. The purpose of this project is to improve the accuracy of statistical software by providing reference datasets with certified computational results that enable the objective evaluation of statistical software.

  6. Statistical templates for visual search

    PubMed Central

    Ackermann, John F.; Landy, Michael S.

    2014-01-01

    How do we find a target embedded in a scene? Within the framework of signal detection theory, this task is carried out by comparing each region of the scene with a “template,” i.e., an internal representation of the search target. Here we ask what form this representation takes when the search target is a complex image with uncertain orientation. We examine three possible representations. The first is the matched filter. Such a representation cannot account for the ease with which humans can find a complex search target that is rotated relative to the template. A second representation attempts to deal with this by estimating the relative orientation of target and match and rotating the intensity-based template. No intensity-based template, however, can account for the ability to easily locate targets that are defined categorically and not in terms of a specific arrangement of pixels. Thus, we define a third template that represents the target in terms of image statistics rather than pixel intensities. Subjects performed a two-alternative, forced-choice search task in which they had to localize an image that matched a previously viewed target. Target images were texture patches. In one condition, match images were the same image as the target and distractors were a different image of the same textured material. In the second condition, the match image was of the same texture as the target (but different pixels) and the distractor was an image of a different texture. Match and distractor stimuli were randomly rotated relative to the target. We compared human performance to pixel-based, pixel-based with rotation, and statistic-based search models. The statistic-based search model was most successful at matching human performance. We conclude that humans use summary statistics to search for complex visual targets. PMID:24627458
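The contrast between pixel-based and statistic-based templates can be illustrated with a toy sketch: summary statistics (here just mean, standard deviation, and mean absolute gradient, simple stand-ins for the richer texture statistics the study models) are largely invariant to rotation, so a statistic-based matcher still finds a rotated target where a raw pixel template would fail. The data and statistic choices below are illustrative assumptions, not the paper's model:

```python
import numpy as np

rng = np.random.default_rng(1)

def summary_stats(patch):
    """Toy summary statistics of an image patch: mean, std,
    and mean absolute gradient magnitude along both axes."""
    gy, gx = np.gradient(patch)
    return np.array([patch.mean(), patch.std(),
                     np.abs(gx).mean() + np.abs(gy).mean()])

def statistic_match(target, candidates):
    """Return the index of the candidate whose summary statistics
    are nearest (Euclidean distance) to the target's."""
    t = summary_stats(target)
    dists = [np.linalg.norm(summary_stats(c) - t) for c in candidates]
    return int(np.argmin(dists))

# Two made-up 'textures': smooth (random walk) vs. high-frequency noise
smooth = rng.normal(0, 0.1, (32, 32)).cumsum(axis=1)
rough = rng.normal(0, 1.0, (32, 32))

# The target is a rotated view of the smooth texture; these statistics
# are unchanged by a 90-degree rotation, unlike a pixel template.
target = np.rot90(smooth)
print(statistic_match(target, [rough, smooth]))  # prints 1
```

Because mean, std, and total gradient energy are preserved under rotation, the statistic distance to the correct texture is exactly zero here, mirroring the abstract's point that summary statistics support categorical, orientation-tolerant search.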

  7. Crunching Numbers: What Cancer Screening Statistics Really Tell Us

    Cancer.gov

    Cancer screening studies have shown that more screening does not necessarily translate into fewer cancer deaths. This article explains how to interpret the statistics used to describe the results of screening studies.

  8. Statistics and Probability for Gifted Middle School Students.

    ERIC Educational Resources Information Center

    Shulte, Albert P.

    1984-01-01

    Topics from statistics and probability are presented as appropriate for gifted middle school students. Topics stress simple ways to display, compare, and interpret data. Probability methods include examining a game for fairness. (Author/CL)

  9. USER'S GUIDE: CHROMOSOMAL ABERRATION DATA ANALYSIS AND INTERPRETATION SYSTEM

    EPA Science Inventory

    This user's manual provides guidance to researchers and the regulatory community for interacting with a data analysis and statistical interpretation system, designated as CA. The CA system is dedicated to the in vivo chromosome aberration assay, a routinely used genetic toxicology assay for ...

  10. Interpreter services in pediatric nursing.

    PubMed

    Lehna, Carlee

    2005-01-01

    A critical part of every encounter between a pediatric nurse and a patient is obtaining accurate patient information. Unique obstacles are encountered when patients and their families have little or no understanding of the English language. Federal and state laws require health care systems that receive governmental funds to provide full language access to services. Both legal and ethical issues can arise when caring for non-English-speaking patients. Often, obtaining accurate patient information and a fully informed consent cannot be done without the use of an interpreter. The interpreter informs the family of all the risks and benefits of a specific avenue of care. When inappropriate interpreter services are used, such as when children in the family or other family members act as interpreters, concerns about accuracy, confidentiality, cultural congruency, and other issues may arise. The purpose of this article is to: (a) explore principles related to the use of medical interpreters, (b) examine different models of interpreter services, and (c) identify available resources to assist providers in accessing interpreter services (e.g., books, online resources, articles, and videos). The case study format will be used to illustrate key points. PMID:16229125

  11. Philosophical perspectives on quantum chaos: Models and interpretations

    NASA Astrophysics Data System (ADS)

    Bokulich, Alisa Nicole

    2001-09-01

    The problem of quantum chaos is a special case of the larger problem of understanding how the classical world emerges from quantum mechanics. While we have learned that chaos is pervasive in classical systems, it appears to be almost entirely absent in quantum systems. The aim of this dissertation is to determine what implications the interpretation of quantum mechanics has for attempts to explain the emergence of classical chaos. There are three interpretations of quantum mechanics that have set out programs for solving the problem of quantum chaos: the standard interpretation, the statistical interpretation, and the de Broglie-Bohm causal interpretation. One of the main conclusions of this dissertation is that an interpretation alone is insufficient for solving the problem of quantum chaos and that the phenomenon of decoherence must be taken into account. Although a completely satisfactory solution of the problem of quantum chaos is still outstanding, I argue that the de Broglie-Bohm interpretation with the help of decoherence outlines the most promising research program to pursue. In addition to making a contribution to the debate in the philosophy of physics concerning the interpretation of quantum mechanics, this dissertation reveals two important methodological lessons for the philosophy of science. First, issues of reductionism and intertheoretic relations cannot be divorced from questions concerning the interpretation of the theories involved. Not only is the exploration of intertheoretic relations a central part of the articulation and interpretation of an individual theory, but the very terms used to discuss intertheoretic relations, such as `state' and `classical limit', are themselves defined by particular interpretations of the theory. 
The second lesson that emerges is that, when it comes to characterizing the relationship between classical chaos and quantum mechanics, the traditional approaches to intertheoretic relations, namely reductionism and theoretical pluralism, are inadequate. The fruitful ways in which models have been used in quantum chaos research point to the need for a new framework for addressing intertheoretic relations that focuses on models rather than laws.

  12. ENVIRONMENTAL PHOTOGRAPHIC INTERPRETATION CENTER (EPIC)

    EPA Science Inventory

    The Environmental Sciences Division (ESD) in the National Exposure Research Laboratory (NERL) of the Office of Research and Development provides remote sensing technical support including aerial photograph acquisition and interpretation to the EPA Program Offices, ORD Laboratorie...

  13. INTERPRETATION OF ENVIRONMENTAL ASSESSMENT DATA

    EPA Science Inventory

    The report describes preliminary attempts to formulate viable models for interpreting environmental assessment data. The models are evaluated using data from the four most comprehensive environmental assessments. A format for entering environmental assessment results on FORTRAN c...

  14. Car Troubles: An Interpretive Approach.

    ERIC Educational Resources Information Center

    Dawson, Leslie

    1995-01-01

    The growing amount of U.S. surface area being paved increases interpretive opportunities for teaching about the environmental impacts of automobiles. Provides methods and suggestions for educating high school students. Provides several computer graphics. (LZ)

  15. QUANTIFICATION AND INTERPRETATION OF TOTAL PETROLEUM HYDROCARBONS IN SEDIMENT SAMPLES BY A GC/MS METHOD AND COMPARISON WITH EPA 418.1 AND A RAPID FIELD METHOD

    EPA Science Inventory

    ABSTRACT: Total petroleum hydrocarbons (TPH) as a lumped parameter can be easily and rapidly measured or monitored. Despite interpretational problems, it has become an accepted regulatory benchmark used widely to evaluate the extent of petroleum product contamination. Three cu...

  16. Students' Interpretation of a Function Associated with a Real-Life Problem from Its Graph

    ERIC Educational Resources Information Center

    Mahir, Nevin

    2010-01-01

    The properties of a function such as limit, continuity, derivative, growth, or concavity can be determined more easily from its graph than by doing any algebraic operation. For this reason, it is important for students of mathematics to interpret some of the properties of a function from its graph. In this study, we investigated the competence of…

  17. Variability of Interpretive Accuracy Among Diagnostic Mammography Facilities

    PubMed Central

    Taplin, Stephen H.; Sickles, Edward A.; Abraham, Linn; Barlow, William E.; Carney, Patricia A.; Geller, Berta; Berns, Eric A.; Cutter, Gary R.; Elmore, Joann G.

    2009-01-01

    Background Interpretive performance of screening mammography varies substantially by facility, but performance of diagnostic interpretation has not been studied. Methods Facilities performing diagnostic mammography within three registries of the Breast Cancer Surveillance Consortium were surveyed about their structure, organization, and interpretive processes. Performance measurements (false-positive rate, sensitivity, and likelihood of cancer among women referred for biopsy [positive predictive value of biopsy recommendation {PPV2}]) from January 1, 1998, through December 31, 2005, were prospectively measured. Logistic regression and receiver operating characteristic (ROC) curve analyses, adjusted for patient and radiologist characteristics, were used to assess the association between facility characteristics and interpretive performance. All statistical tests were two-sided. Results Forty-five of the 53 facilities completed a facility survey (85% response rate), and 32 of the 45 facilities performed diagnostic mammography. The analyses included 28,100 diagnostic mammograms performed as an evaluation of a breast problem, and data were available for 118 radiologists who interpreted diagnostic mammograms at the facilities. Performance measurements demonstrated statistically significant interpretive variability among facilities (sensitivity, P = .006; false-positive rate, P < .001; and PPV2, P < .001) in unadjusted analyses. However, after adjustment for patient and radiologist characteristics, only false-positive rate variation remained statistically significant and facility traits associated with performance measures changed (false-positive rate = 6.5%, 95% confidence interval [CI] = 5.5% to 7.4%; sensitivity = 73.5%, 95% CI = 67.1% to 79.9%; and PPV2 = 33.8%, 95% CI = 29.1% to 38.5%). 
Facilities reporting that concern about malpractice had moderately or greatly increased diagnostic examination recommendations at the facility had a higher false-positive rate (odds ratio [OR] = 1.48, 95% CI = 1.09 to 2.01) and a non–statistically significantly higher sensitivity (OR = 1.74, 95% CI = 0.94 to 3.23). Facilities offering specialized interventional services had a non–statistically significantly higher false-positive rate (OR = 1.97, 95% CI = 0.94 to 4.1). No characteristics were associated with overall accuracy by ROC curve analyses. Conclusions Variation in diagnostic mammography interpretation exists across facilities. Failure to adjust for patient characteristics when comparing facility performance could lead to erroneous conclusions. Malpractice concerns are associated with interpretive performance. PMID:19470953

  18. NF Facts and Statistics

    MedlinePLUS

    NF has been classified into three distinct types; ...

  19. Environmental statistics and optimal regulation

    NASA Astrophysics Data System (ADS)

    Sivak, David; Thomson, Matt

    2015-03-01

    The precision with which an organism can detect its environment, and the timescale for and statistics of environmental change, will affect the suitability of different strategies for regulating protein levels in response to environmental inputs. We propose a general framework--here applied to the enzymatic regulation of metabolism in response to changing nutrient concentrations--to predict the optimal regulatory strategy given the statistics of fluctuations in the environment and measurement apparatus, and the costs associated with enzyme production. We find: (i) relative convexity of enzyme expression cost and benefit influences the fitness of thresholding or graded responses; (ii) intermediate levels of measurement uncertainty call for a sophisticated Bayesian decision rule; and (iii) in dynamic contexts, intermediate levels of uncertainty call for retaining memory of the past. Statistical properties of the environment, such as variability and correlation times, set optimal biochemical parameters, such as thresholds and decay rates in signaling pathways. Our framework provides a theoretical basis for interpreting molecular signal processing algorithms and a classification scheme that organizes known regulatory strategies and may help conceptualize heretofore unknown ones.

  20. Statistical analysis of planetary surfaces

    NASA Astrophysics Data System (ADS)

    Schmidt, Frederic; Landais, Francois; Lovejoy, Shaun

    2015-04-01

    In the last decades, a huge amount of topographic data has been obtained by several techniques (laser and radar altimetry, DTM…) for different bodies in the solar system, including the Earth, Mars, and the Moon. In each case, topographic fields exhibit extremely high variability, with details at every scale from millimetres to thousands of kilometres. This complexity seems to prohibit global descriptions or global topography models. Nevertheless, topographic complexity is well known to exhibit scaling laws that establish a similarity between scales and permit simpler descriptions and models. Indeed, efficient simulations can be made using the statistical properties of scaling fields (fractals). But realistic simulations of global topographic fields must exhibit multi- (not mono-) scaling behaviour, reflecting the extreme variability and intermittency observed in real fields, which cannot be generated by simple scaling models. A multiscaling theory has been developed in order to model high variability and intermittency. This theory is a good statistical candidate to model the topography field with a limited number of parameters (the multifractal parameters). In our study, we show that the statistical properties of the Martian topography are accurately reproduced by this model, leading to new interpretations of geomorphological processes.

  1. The Easily Learned, Easily Remembered Heuristic in Children

    ERIC Educational Resources Information Center

    Koriat, Asher; Ackerman, Rakefet; Lockl, Kathrin; Schneider, Wolfgang

    2009-01-01

    A previous study with adults [Koriat, A. (2008a). "Easy comes, easy goes? The link between learning and remembering and its exploitation in metacognition." "Memory & Cognition," 36, 416-428] established a correlation between learning and remembering: items requiring more trials to acquisition (TTA) were less likely to be recalled than those…

  2. Library Statistics Cooperative Program.

    ERIC Educational Resources Information Center

    National Commission on Libraries and Information Science, Washington, DC.

    The Library Statistics Cooperative Program collects statistics about all types of libraries--academic libraries, public libraries, school library media centers, state library agencies, federal libraries and information centers, and library cooperatives. The Library Statistics Cooperative Program depends on collaboration with all types of libraries…

  3. Minnesota Health Statistics 1988.

    ERIC Educational Resources Information Center

    Minnesota State Dept. of Health, St. Paul.

    This document comprises the 1988 annual statistical report of the Minnesota Center for Health Statistics. After introductory technical notes on changes in format, sources of data, and geographic allocation of vital events, an overview is provided of vital health statistics in all areas. Thereafter, separate sections of the report provide tables…

  5. Uterine Cancer Statistics

    MedlinePLUS

    ... comparing incidence and death counts. †Source: U.S. Cancer Statistics Working Group. United States Cancer Statistics: 1999–2012 ...

  6. Avoiding Statistical Mistakes

    ERIC Educational Resources Information Center

    Strasser, Nora

    2007-01-01

    Avoiding statistical mistakes is important for educators at all levels. Basic concepts will help you to avoid making mistakes using statistics and to look at data with a critical eye. Statistical data is used at educational institutions for many purposes. It can be used to support budget requests, changes in educational philosophy, changes to…

  7. On suicide statistics.

    PubMed

    Thorslund, J; Misfeldt, J

    1989-07-01

    The classical methodological problem of suicidology is the reliability of official statistics. In this article, some recent contributions to the debate, particularly concerning the growing problem of suicide among the Inuit, are reviewed. Secondly, the suicide statistics of Greenland are analyzed, with the conclusion that the official statistics, as published by the Danish Board of Health, are generally reliable for Greenland. PMID:2789569

  8. Statistical quality management

    NASA Astrophysics Data System (ADS)

    Vanderlaan, Paul

    1992-10-01

    Some aspects of statistical quality management are discussed. Quality has to be defined as a concrete, measurable quantity. The concepts of Total Quality Management (TQM), Statistical Process Control (SPC), and inspection are explained. In most cases SPC is better than inspection. It can be concluded that statistics has great possibilities in the field of TQM.

  9. Securing wide appreciation of health statistics

    PubMed Central

    Pyrrait, A. M. do Amaral; Aubenque, M. J.; Benjamin, B.; de Groot, Meindert J. W.; Kohn, R.

    1954-01-01

    All the authors are agreed on the need for a certain publicizing of health statistics, but do Amaral Pyrrait points out that the medical profession prefers to convince itself rather than to be convinced. While there is great utility in articles and reviews in the professional press (especially for paramedical personnel) Aubenque, de Groot, and Kohn show how appreciation can effectively be secured by making statistics more easily understandable to the non-expert by, for instance, including readable commentaries in official publications, simplifying charts and tables, and preparing simple manuals on statistical methods. Aubenque and Kohn also stress the importance of linking health statistics to other economic and social information. Benjamin suggests that the principles of market research could to advantage be applied to health statistics to determine the precise needs of the “consumers”. At the same time, Aubenque points out that the value of the ultimate results must be clear to those who provide the data; for this, Kohn suggests that the enumerators must know exactly what is wanted and why. There is general agreement that some explanation of statistical methods and their uses should be given in the curricula of medical schools and that lectures and postgraduate courses should be arranged for practising physicians. PMID:13199668

  10. The Effect Size Statistic: Overview of Various Choices.

    ERIC Educational Resources Information Center

    Mahadevan, Lakshmi

    Over the years, methodologists have been recommending that researchers use magnitude of effect estimates in result interpretation to highlight the distinction between statistical and practical significance (cf. R. Kirk, 1996). A magnitude of effect statistic (i.e., effect size) tells to what degree the dependent variable can be controlled,…

  11. Faculty Salary Equity Cases: Combining Statistics with the Law

    ERIC Educational Resources Information Center

    Luna, Andrew L.

    2006-01-01

    Researchers have used many statistical models to determine whether an institution's faculty pay structure is equitable, with varying degrees of success. Little attention, however, has been given to court interpretations of statistical significance or to what variables courts have acknowledged should be used in an equity model. This article…

  12. ALISE Library and Information Science Education Statistical Report, 1999.

    ERIC Educational Resources Information Center

    Daniel, Evelyn H., Ed.; Saye, Jerry D., Ed.

    This volume is the twentieth annual statistical report on library and information science (LIS) education published by the Association for Library and Information Science Education (ALISE). Its purpose is to compile, analyze, interpret, and report statistical (and other descriptive) information about library/information science programs offered by…

  13. An Experimental Approach to Teaching and Learning Elementary Statistical Mechanics

    ERIC Educational Resources Information Center

    Ellis, Frank B.; Ellis, David C.

    2008-01-01

    Introductory statistical mechanics is studied for a simple two-state system using an inexpensive and easily built apparatus. A large variety of demonstrations, suitable for students in high school and introductory university chemistry courses, are possible. This article details demonstrations for exothermic and endothermic reactions, the dynamic…

  15. Intelligent Collection Environment for an Interpretation System

    SciTech Connect

    Maurer, W J

    2001-07-19

    An Intelligent Collection Environment for a data interpretation system is described. The environment accepts two inputs: a data model and a number between 0.0 and 1.0. The data model is as simple as a single word or as complex as a multi-level/multidimensional model. The number between 0.0 and 1.0 is a control knob to indicate the user's desire to allow loose matching of the data (things are ambiguous and unknown) versus strict matching of the data (things are precise and known). The environment produces a set of possible interpretations, a set of requirements to further strengthen or differentiate a particular subset of the possible interpretations from the others, a set of inconsistencies, and a logic map that graphically shows the lines of reasoning used to derive the above output. The environment comprises a knowledge editor, model explorer, expertise server, and the World Wide Web. The Knowledge Editor is used by a subject matter expert to define Linguistic Types, Term Sets, detailed explanations, and dynamically created URIs, and to create rule bases using a straightforward hyper-matrix representation. The Model Explorer allows rapid construction and browsing of multi-level models. A multi-level model is a model whose elements may also be models themselves. The Expertise Server is an inference engine used to interpret the data submitted. It incorporates a semantic network knowledge representation, an assumption-based truth maintenance system, and a fuzzy logic calculus. It can be extended by employing any classifier (e.g. statistical/neural networks) of complex data types. The World Wide Web is an unstructured data space accessed by the URIs supplied as part of the output of the environment. By recognizing the input data model as a query, the environment serves as a deductive search engine. 
Applications include (but are not limited to) interpretation of geophysical phenomena, a navigation aid for very large web sites, monitoring of computer or sensor networks, customer support, trouble shooting, and searching complex digital libraries (e.g. genome libraries).

  16. Interpretational Confounding or Confounded Interpretations of Causal Indicators?

    PubMed Central

    Bainter, Sierra A.; Bollen, Kenneth A.

    2014-01-01

    In measurement theory causal indicators are controversial and little-understood. Methodological disagreement concerning causal indicators has centered on the question of whether causal indicators are inherently sensitive to interpretational confounding, which occurs when the empirical meaning of a latent construct departs from the meaning intended by a researcher. This article questions the validity of evidence used to claim that causal indicators are inherently susceptible to interpretational confounding. Further, a simulation study demonstrates that causal indicator coefficients are stable across correctly-specified models. Determining the suitability of causal indicators has implications for the way we conceptualize measurement and build and evaluate measurement models. PMID:25530730

  17. A genealogical interpretation of principal components analysis.

    PubMed

    McVean, Gil

    2009-10-01

    Principal components analysis, PCA, is a statistical method commonly used in population genetics to identify structure in the distribution of genetic variation across geographical location and ethnic background. However, while the method is often used to inform about historical demographic processes, little is known about the relationship between fundamental demographic parameters and the projection of samples onto the primary axes. Here I show that for SNP data the projection of samples onto the principal components can be obtained directly from considering the average coalescent times between pairs of haploid genomes. The result provides a framework for interpreting PCA projections in terms of underlying processes, including migration, geographical isolation, and admixture. I also demonstrate a link between PCA and Wright's f(st) and show that SNP ascertainment has a largely simple and predictable effect on the projection of samples. Using examples from human genetics, I discuss the application of these results to empirical data and the implications for inference. PMID:19834557
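As background for the abstract's result, the following is a minimal sketch of the standard PCA projection it refers to, applied to a made-up 0/1 SNP matrix. The toy data and column-centering convention are assumptions; the paper's link between these projections and average coalescent times is not implemented here:

```python
import numpy as np

# Hypothetical SNP matrix: rows = haploid samples, columns = SNPs,
# entries 0/1 indicating the derived allele (toy data, not real genotypes)
rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(20, 100)).astype(float)

# Center each SNP (column), the usual preprocessing before genotype PCA
Xc = X - X.mean(axis=0)

# Principal components via SVD of the centered matrix
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
projections = U * S  # sample coordinates on the principal axes

print(projections[:, :2])  # each sample's position on PC1 and PC2
```

In population-genetic use, structure (migration, isolation, admixture) shows up as clustering of samples along the leading axes; the abstract's contribution is to express these coordinates in terms of pairwise coalescent times.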

  18. Water isotope systematics: Improving our palaeoclimate interpretations

    NASA Astrophysics Data System (ADS)

    Jones, M. D.; Dee, S.; Anderson, L.; Baker, A.; Bowen, G.; Noone, D. C.

    2016-01-01

    The stable isotopes of oxygen and hydrogen, measured in a variety of archives, are widely used proxies in Quaternary Science. Understanding the processes that control δ18O change has long been a focus of research (e.g. Shackleton and Opdyke, 1973; Talbot, 1990; Leng, 2006). Both the dynamics of water isotope cycling and the appropriate interpretation of geological water-isotope proxy time series remain subjects of active research and debate. It is clear that achieving a complete understanding of the isotope systematics for any given archive type, and ideally each individual archive, is vital if these palaeo-data are to be used to their full potential, including comparison with climate model experiments of the past. Combining information from modern monitoring and process studies, climate models, and proxy data is crucial for improving our statistical constraints on reconstructions of past climate variability.

  19. Artificial intelligence and statistics

    SciTech Connect

    Gale, W.A.

    1987-01-01

    This book explores the possible applications of artificial intelligence in statistics and, conversely, of statistics in artificial intelligence. It is a collection of seventeen papers written by leaders in the field. Most of the papers were prepared for the Workshop on Artificial Intelligence and Statistics held in April 1985 and sponsored by AT&T Bell Laboratories. The book is divided into six parts: uncertainty propagation, clustering and learning, expert systems, environments for supporting statistical strategy, knowledge acquisition, and strategy. The editor ties the collection together in the first chapter by providing an overview of AI and statistics, discussing the Workshop, and exploring future research in the field.

  20. Discovery of novel non-peptidic beta-alanine piperazine amide derivatives and their optimization to achiral, easily accessible, potent and selective somatostatin sst1 receptor antagonists.

    PubMed

    Troxler, Thomas; Hurth, Konstanze; Mattes, Henri; Prashad, Mahavir; Schoeffter, Philippe; Langenegger, Daniel; Enz, Albert; Hoyer, Daniel

    2009-03-01

    Structural simplification of the core moieties of obeline and ergoline somatostatin sst(1) receptor antagonists, followed by systematic optimization, led to the identification of novel, highly potent and selective sst(1) receptor antagonists. These achiral, non-peptidic compounds are easily prepared and show promising PK properties in rodents. PMID:19208473

  1. Uses and Abuses of Statistical Significance Tests and Other Statistical Resources: A Comparative Study

    ERIC Educational Resources Information Center

    Monterde-i-Bort, Hector; Frias-Navarro, Dolores; Pascual-Llobell, Juan

    2010-01-01

    The empirical study we present here deals with a pedagogical issue that has not been thoroughly explored up until now in our field. Previous empirical studies in other sectors have identified the opinions of researchers about this topic, showing that completely unacceptable interpretations have been made of significance tests and other statistical…

  2. A Local Interpretation of Quantum Mechanics

    NASA Astrophysics Data System (ADS)

    Lopez, Carlos

    2015-12-01

    A local interpretation of quantum mechanics is presented. Its main ingredients are: first, a label attached to one of the "virtual" paths in the path integral formalism, determining the output for measurement of position or momentum; second, a mathematical model for spin states, equivalent to the path integral formalism for point particles in space-time, with the corresponding label. The mathematical machinery of orthodox quantum mechanics is maintained, in particular amplitudes of probability and Born's rule; therefore, Bell-type inequality theorems do not apply. It is shown that statistical correlations for pairs of particles with entangled spins have a description completely equivalent to the two-slit experiment, that is, interference (wave-like behaviour) rather than non-locality accounts for the process. The interpretation is grounded in the experimental evidence of a point-like character of electrons, and in the hypothetical existence of a wave-like companion system (the de Broglie wave). A correspondence between the extended Hilbert spaces of hidden physical states and the orthodox quantum mechanical Hilbert space shows the mathematical equivalence of both theories. Paradoxical behaviour with respect to the action-reaction principle is analysed, and an experimental setup, a modified two-slit experiment, is proposed to look for the companion system.

  3. Learning Interpretable SVMs for Biological Sequence Classification

    PubMed Central

    Rätsch, Gunnar; Sonnenburg, Sören; Schäfer, Christin

    2006-01-01

    Background Support Vector Machines (SVMs) – using a variety of string kernels – have been successfully applied to biological sequence classification problems. While SVMs achieve high classification accuracy they lack interpretability. In many applications, it does not suffice that an algorithm just detects a biological signal in the sequence, but it should also provide means to interpret its solution in order to gain biological insight. Results We propose novel and efficient algorithms for solving the so-called Support Vector Multiple Kernel Learning problem. The developed techniques can be used to understand the obtained support vector decision function in order to extract biologically relevant knowledge about the sequence analysis problem at hand. We apply the proposed methods to the task of acceptor splice site prediction and to the problem of recognizing alternatively spliced exons. Our algorithms compute sparse weightings of substring locations, highlighting which parts of the sequence are important for discrimination. Conclusion The proposed method is able to deal with thousands of examples while combining hundreds of kernels within reasonable time, and reliably identifies a few statistically significant positions. PMID:16723012
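To illustrate the idea of sparse weightings over substring locations, here is a toy stand-in. Note the assumptions: the sequences and labels are invented, and the label-alignment heuristic used to weight the kernels is my own simplification, not the paper's Support Vector Multiple Kernel Learning optimization.

```python
# Illustrative stand-in for multiple kernel learning on sequences:
# build one "position kernel" per sequence position (1 if two sequences
# share the same character there), then weight kernels by how well they
# align with the labels. This alignment weighting is a simple heuristic,
# not the paper's actual MKL algorithm.
import numpy as np

seqs = ["ACGT", "ACGA", "TCGT", "TGCA", "AGGT", "TGCT"]
labels = np.array([1, 1, 1, -1, 1, -1])   # toy class labels

L = len(seqs[0])
kernels = []
for pos in range(L):
    chars = np.array([s[pos] for s in seqs])
    K = (chars[:, None] == chars[None, :]).astype(float)
    kernels.append(K)

# Weight each position kernel by its (clipped) alignment with y y^T.
yyT = np.outer(labels, labels)
weights = np.array([max((K * yyT).sum(), 0.0) for K in kernels])
weights /= weights.sum()

# The weighting highlights which positions discriminate the classes.
print(np.round(weights, 3))   # → [0.227 0.227 0.455 0.091]
```

Position 2 gets the largest weight because the character there (G vs. C) perfectly separates the two toy classes, mirroring how the paper's method highlights discriminative sequence positions.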

  4. An Interpretation of Banded Magnetospheric Radio Emissions

    NASA Technical Reports Server (NTRS)

    Benson, Robert F.; Osherovich, V. A.; Fainberg, J.; Vinas, A. F.; Ruppert, D. R.; Vondrak, Richard R. (Technical Monitor)

    2000-01-01

    Recently published Active Magnetospheric Particle Tracer Explorer/Ion Release Module (AMPTE/IRM) banded magnetospheric emissions, commonly referred to as '(n + 1/2)f(sub ce)' emissions where f(sub ce) is the electron gyrofrequency, are analyzed by treating them as analogous to sounder-stimulated ionospheric emissions. We show that both individual AMPTE/IRM spectra of magnetospheric banded emissions, and statistically derived spectra observed over the two-year lifetime of the mission, can be interpreted in a self-consistent manner. The analysis, which predicts all spectral peaks within 4% of the observed peaks, interprets the higher-frequency emissions as due to low group-velocity Bernstein-mode waves and the lower-frequency emissions as eigenmodes of cylindrical electromagnetic plasma oscillations. The demarcation between these two classes of emissions is the electron plasma frequency f(sub pe), where an emission is often observed. This f(sub pe) emission is not necessarily the strongest. None of the observed banded emissions were attributed to the upper-hybrid frequency. We present Alouette-2 and ISIS-1 plasma-resonance data, and model electron temperature (T(sub e)) values, to support the argument that the frequency spectrum of ionospheric sounder-stimulated emissions is not strongly temperature dependent and thus that the interpretation of these emissions in the ionosphere is relevant to other plasmas (such as the magnetosphere) where N(sub e) and T(sub e) can be quite different but where the ratio f(sub pe)/f(sub ce) is identical.

  5. Easily optimize batch pressure filtration

    SciTech Connect

    Brown, T.R.

    1998-02-01

    Several years ago, the author wrote an article describing a design method for batch pressure filtration systems. The method maximizes the production capability of a filtration system by selecting the best operating conditions--filtration time, initial mass flux, operating temperature, and pressure drop. Most often, maximizing production will also optimize total system costs, both capital/investment and operating expenses. The method requires a large number of calculations, and is time-consuming and cumbersome, even with a personal computer. This article presents a simplified and fast calculation technique. It's based upon several dimensionless numbers and two graphs that relate the dimensionless numbers to each other.

  6. Easily forgotten: elderly female prisoners.

    PubMed

    Handtke, Violet; Bretschneider, Wiebke; Elger, Bernice; Wangmo, Tenzin

    2015-01-01

    Women form a growing minority within the worldwide prison population and have special needs and distinct characteristics. Within this group exists a smaller sub-group: elderly female prisoners (EFPs) who require tailored social and health interventions that address their unique needs. Data collected from two prisons in Switzerland housing women prisoners were studied. Overall 26 medical records were analyzed, 13 from EFPs (50+ years) and for comparison 13 from young female prisoners (YFPs, 49 years and younger). Additionally, five semi-structured interviews were conducted with EFPs. Using the layer model of vulnerability, three layers of vulnerability were identified: the "prisoner" layer; followed by the layer of "woman"; both of which are encompassed by the layer of "old age." The analysis of these layers resulted in three main areas where EFPs are particularly vulnerable: their status of "double-minority," health and health-care access, and their social relations. Prison administration and policy-makers need to be more sensitive to gender and age related issues in order to remedy these vulnerabilities. PMID:25661851

  7. A Road More Easily Traveled

    ERIC Educational Resources Information Center

    Stanly, Pat

    2009-01-01

    Rough patches occur at both ends of the education pipeline, as students enter community colleges and move on to work or enrollment in four-year institutions. Career pathways--sequences of coherent, articulated, and rigorous career and academic courses that lead to an industry-recognized certificate or a college degree--are a promising approach to…

  9. Interpretational Confounding or Confounded Interpretations of Causal Indicators?

    ERIC Educational Resources Information Center

    Bainter, Sierra A.; Bollen, Kenneth A.

    2014-01-01

    In measurement theory, causal indicators are controversial and little understood. Methodological disagreement concerning causal indicators has centered on the question of whether causal indicators are inherently sensitive to interpretational confounding, which occurs when the empirical meaning of a latent construct departs from the meaning…

  10. The Interpretive Approach to Religious Education: Challenging Thompson's Interpretation

    ERIC Educational Resources Information Center

    Jackson, Robert

    2012-01-01

    In a recent book chapter, Matthew Thompson makes some criticisms of my work, including the interpretive approach to religious education and the research and activity of Warwick Religions and Education Research Unit. Against the background of a discussion of religious education in the public sphere, my response challenges Thompson's account,…

  11. Default Sarcastic Interpretations: On the Priority of Nonsalient Interpretations

    ERIC Educational Resources Information Center

    Giora, Rachel; Drucker, Ari; Fein, Ofer; Mendelson, Itamar

    2015-01-01

    Findings from five experiments support the view that negation generates sarcastic utterance-interpretations by default. When presented in isolation, novel negative constructions ("Punctuality is not his forte," "Thoroughness is not her most distinctive feature"), free of semantic anomaly or internal incongruity, were…

  14. Teaching Business Statistics with Real Data to Undergraduates and the Use of Technology in the Class Room

    ERIC Educational Resources Information Center

    Singamsetti, Rao

    2007-01-01

    In this paper an attempt is made to highlight some issues in the interpretation of statistical concepts and of results as taught in undergraduate Business statistics courses. The use of modern technology in the classroom is shown to have increased the efficiency and the ease of learning and teaching in statistics. The importance of…

  15. Tractography atlas-based spatial statistics: Statistical analysis of diffusion tensor image along fiber pathways.

    PubMed

    Wang, Defeng; Luo, Yishan; Mok, Vincent C T; Chu, Winnie C W; Shi, Lin

    2016-01-15

    The quantitative analysis of diffusion tensor image (DTI) data has attracted increasing attention in recent decades for studying white matter (WM) integrity and development. Among the current DTI analysis methods, tract-based spatial statistics (TBSS), as a pioneering approach for the voxelwise analysis of DTI data, has gained a lot of popularity due to its user-friendly framework. However, in recent years, the reliability and interpretability of TBSS have been challenged by several works, and several improvements over the original TBSS pipeline have been suggested. In this paper, we propose a new DTI statistical analysis method, named tractography atlas-based spatial statistics (TABSS). It doesn't rely on the accurate alignment of fractional anisotropy (FA) images for population analysis and gets rid of the skeletonization procedures of TBSS, which have been indicated as the major sources of error. Furthermore, TABSS improves the interpretability of results by directly reporting the resulting statistics on WM tracts, obviating the need for a WM atlas in the interpretation of the results. The feasibility of TABSS was evaluated in an example study of the age-related FA alteration pattern of the healthy human brain. Through this preliminary study, it is validated that TABSS can provide detailed statistical results in a comprehensive and easy-to-understand way. PMID:26481677

  16. Calibrated Peer Review for Interpreting Linear Regression Parameters: Results from a Graduate Course

    ERIC Educational Resources Information Center

    Enders, Felicity B.; Jenkins, Sarah; Hoverman, Verna

    2010-01-01

    Biostatistics is traditionally a difficult subject for students to learn. While the mathematical aspects are challenging, it can also be demanding for students to learn the exact language to use to correctly interpret statistical results. In particular, correctly interpreting the parameters from linear regression is both a vital tool and a…

  17. Is statistical significance always significant?

    PubMed

    Koretz, Ronald L

    2005-06-01

    One way in which we learn new information is to read the medical literature. Whether or not we do primary research, it is important to be able to read literature in a critical fashion. A seemingly simple concept in reading is to interpret p values. For most of us, if we find a p value that is <.05, we take the conclusion to heart and quote it at every opportunity. If the p value is >.05, we discard the paper and look elsewhere for useful information. Unfortunately, this is too simplistic an approach. The real utility of p values is to consider them within the context of the experiment being performed. Defects in study design can make an interpretation of a p value useless. One has to be wary of type I (seeing a "statistically significant" difference just because of chance) and type II (failing to see a difference that really exists) errors. Examples of the former are publication bias and the performance of multiple analyses; the latter refers to a trial that is too small to demonstrate the difference. Finding significant differences in surrogate or intermediate endpoints may not help us. We need to know if those endpoints reflect the behavior of clinical endpoints. Selectively citing significant differences and disregarding studies that do not find them is inappropriate. Small differences, even if they are statistically significant, may require too much resource expenditure to be clinically useful. This article explores these problems in depth and attempts to put p values in the context of studies. PMID:16207667
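The multiple-analyses problem the abstract warns about is easy to demonstrate by simulation. The sketch below is my own illustration (not from the article): it runs 100 two-group comparisons on pure noise and counts how many reach p < .05 by chance alone.

```python
# Sketch of the multiple-analyses problem: when many independent tests
# are run on pure noise, roughly 5% reach p < .05 by chance (type I
# errors). All data here are simulated noise.
import random

random.seed(1)

def two_group_p(n=30, trials=200):
    """Crude permutation p-value comparing two samples of pure noise."""
    a = [random.gauss(0, 1) for _ in range(n)]
    b = [random.gauss(0, 1) for _ in range(n)]
    observed = abs(sum(a) / n - sum(b) / n)
    pooled = a + b
    hits = 0
    for _ in range(trials):
        random.shuffle(pooled)
        diff = abs(sum(pooled[:n]) / n - sum(pooled[n:]) / n)
        if diff >= observed:
            hits += 1
    return hits / trials

results = [two_group_p() for _ in range(100)]
false_positives = sum(p < 0.05 for p in results)
print(f"{false_positives} of 100 null comparisons were 'significant'")
```

Even though no real difference exists in any comparison, a handful of "significant" results still appear, which is exactly why selectively citing significant differences is inappropriate.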

  18. Integrative Interpretation of Vocational Interest Inventory Results.

    ERIC Educational Resources Information Center

    Rubinstein, Malcolm R.

    1978-01-01

    Examined effectiveness of interpretation of vocational interest inventory results. Data provide limited support for the hypotheses that integrative interpretation is most effective. Significant interactions exist between counselors and interpretation procedures. Failure to find significant differences between traditional-individual and…

  19. Using Playing Cards to Differentiate Probability Interpretations

    ERIC Educational Resources Information Center

    López Puga, Jorge

    2014-01-01

    The aprioristic (classical, naïve and symmetric) and frequentist interpretations of probability are commonly known. Bayesian or subjective interpretation of probability is receiving increasing attention. This paper describes an activity to help students differentiate between the three types of probability interpretations.

  20. Evaluation of Computer Simulated Baseline Statistics for Use in Item Bias Studies.

    ERIC Educational Resources Information Center

    Rogers, H. Jane; Hambleton, Ronald K.

    Though item bias statistics are widely recommended for use in test development and analysis, problems arise in their interpretation. This research evaluates logistic test models and computer simulation methods for providing a frame of reference for interpreting item bias statistics. Specifically, the intent was to produce simulated sampling…

  1. Statistical mechanics of community detection.

    PubMed

    Reichardt, Jörg; Bornholdt, Stefan

    2006-07-01

    Starting from a general ansatz, we show how community detection can be interpreted as finding the ground state of an infinite range spin glass. Our approach applies to weighted and directed networks alike. It contains the ad hoc introduced quality function from [J. Reichardt and S. Bornholdt, Phys. Rev. Lett. 93, 218701 (2004)] and the modularity Q as defined by Newman and Girvan [Phys. Rev. E 69, 026113 (2004)] as special cases. The community structure of the network is interpreted as the spin configuration that minimizes the energy of the spin glass with the spin states being the community indices. We elucidate the properties of the ground state configuration to give a concise definition of communities as cohesive subgroups in networks that is adaptive to the specific class of network under study. Further, we show how hierarchies and overlap in the community structure can be detected. Computationally efficient local update rules for optimization procedures to find the ground state are given. We show how the ansatz may be used to discover the community around a given node without detecting all communities in the full network and we give benchmarks for the performance of this extension. Finally, we give expectation values for the modularity of random graphs, which can be used in the assessment of statistical significance of community structure. PMID:16907154
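As a small illustration of the quantities involved (a sketch, not the paper's implementation), the following code evaluates Newman-Girvan modularity Q, which the abstract identifies as a special case of the negative spin-glass energy, treating each node's community index as its "spin". The toy graph is two triangles joined by a single edge.

```python
# Spin-glass view of community detection: each node carries a "spin"
# (community index); the quality of a partition is Newman-Girvan
# modularity Q, a special case of the negative spin-glass energy.
# Toy graph: two triangles joined by one edge.
import numpy as np

edges = [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5), (2, 3)]
n = 6
A = np.zeros((n, n))
for i, j in edges:
    A[i, j] = A[j, i] = 1.0
k = A.sum(axis=1)      # node degrees
m = A.sum() / 2        # number of edges

def modularity(spins):
    """Q = (1/2m) * sum_ij (A_ij - k_i k_j / 2m) * delta(s_i, s_j)."""
    same = spins[:, None] == spins[None, :]
    return ((A - np.outer(k, k) / (2 * m)) * same).sum() / (2 * m)

good = np.array([0, 0, 0, 1, 1, 1])   # the two triangles
bad = np.array([0, 1, 0, 1, 0, 1])    # an arbitrary split
print(modularity(good), modularity(bad))   # → ~0.357 and ~-0.214
```

The partition matching the two triangles has the higher Q (lower spin-glass energy), which is the ground-state configuration in the paper's language.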

  2. Statistical mechanics of community detection

    NASA Astrophysics Data System (ADS)

    Reichardt, Jörg; Bornholdt, Stefan

    2006-07-01

    Starting from a general ansatz, we show how community detection can be interpreted as finding the ground state of an infinite range spin glass. Our approach applies to weighted and directed networks alike. It contains the ad hoc introduced quality function from [J. Reichardt and S. Bornholdt, Phys. Rev. Lett. 93, 218701 (2004)] and the modularity Q as defined by Newman and Girvan [Phys. Rev. E 69, 026113 (2004)] as special cases. The community structure of the network is interpreted as the spin configuration that minimizes the energy of the spin glass with the spin states being the community indices. We elucidate the properties of the ground state configuration to give a concise definition of communities as cohesive subgroups in networks that is adaptive to the specific class of network under study. Further, we show how hierarchies and overlap in the community structure can be detected. Computationally efficient local update rules for optimization procedures to find the ground state are given. We show how the ansatz may be used to discover the community around a given node without detecting all communities in the full network and we give benchmarks for the performance of this extension. Finally, we give expectation values for the modularity of random graphs, which can be used in the assessment of statistical significance of community structure.

  3. Interpreting Hymns for Deaf Worshippers.

    ERIC Educational Resources Information Center

    Maxwell, Madeline M.; Boster, Shirley

    1982-01-01

    Discusses the special problems of interpreting hymns written in archaic English and then matching words of a translation to music. Addresses the question of whether competence in ASL and knowledge of signs for religious terms are sufficient for hymns to be of value to deaf worshippers. (EKN)

  4. IT1: An Interpretive Tutor.

    ERIC Educational Resources Information Center

    Melton, T. R.

    A computer-assisted instruction system, called IT1 (Interpretive Tutor), is described which is intended to assist a student's efforts to learn the content of textual material and to evaluate his efforts toward that goal. The text is represented internally in the form of semantic networks with auxiliary structures which relate network nodes to…

  5. Interpreter Training Program: Program Review.

    ERIC Educational Resources Information Center

    Massoud, LindaLee

    This report describes in detail the deaf interpreter training program offered at Mott Community College (Flint, Michigan). The program features field-based learning experiences, internships, team teaching, a field practicum, the goal of having students meet certification standards, and proficiency examinations. The program has special…

  6. Interpreting Data: The Hybrid Mind

    ERIC Educational Resources Information Center

    Heisterkamp, Kimberly; Talanquer, Vicente

    2015-01-01

    The central goal of this study was to characterize major patterns of reasoning exhibited by college chemistry students when analyzing and interpreting chemical data. Using a case study approach, we investigated how a representative student used chemical models to explain patterns in the data based on structure-property relationships. Our results…

  7. Interpretation of grease test results

    SciTech Connect

    Rush, R.E.

    1995-09-01

    Standard ASTM tests, their typical results and how those results may be interpreted by the practicing lubrication engineer or specialist in the field will be discussed. Some field experiences and examples will be given. In addition, examples of inventive non-standard field tests will be shown and described. Illustrations from the old and revised lubrication engineers handbook will be used.

  8. Smartberries: Interpreting Erdrich's Love Medicine

    ERIC Educational Resources Information Center

    Treuer, David

    2005-01-01

    The structure of "Love Medicine" is interpreted by Hertha D. Sweet Wong, who claims that the book's "multiple narrators confound conventional Western expectations of an autonomous protagonist, a dominant narrative voice, and a consistently chronological narrative". "Love Medicine" is a brilliant use of the Western literary tactics that create the…

  10. Design Document. EKG Interpretation Program.

    ERIC Educational Resources Information Center

    Webb, Sandra M.

    This teaching plan is designed to assist nursing instructors assigned to advanced medical surgical nursing courses in acquainting students with the basic skills needed to perform electrocardiographic (ECG or EKG) interpretations. The first part of the teaching plan contains a statement of purpose; audience recommendations; a flow chart detailing…

  12. EKG Interpretation Program. Trainers Manual.

    ERIC Educational Resources Information Center

    Webb, Sandra M.

    This trainer's manual is designed to assist nursing instructors assigned to advanced medical surgical nursing courses in teaching students how to make basic interpretations of their patients' electrocardiographic (EKG) strips. Included in the manual are pre- and posttests and instructional units dealing with the following topics: EKG indicators,…

  13. Interpretive Reproduction in Children's Play

    ERIC Educational Resources Information Center

    Corsaro, William A.

    2012-01-01

    The author looks at children's play from the perspective of interpretive reproduction, emphasizing the way children create their own unique peer cultures, which he defines as a set of routines, artifacts, values, and concerns that children engage in with their playmates. The article focuses on two types of routines in the peer culture of preschool…

  14. Focus: Oral Interpretation and Drama.

    ERIC Educational Resources Information Center

    Mullican, James S., Ed.

    1976-01-01

    The 12 articles in this issue of "Indiana English Journal" are concerned with drama and oral interpretation in the classroom. Titles of articles are: "Up in the Tree, Down in the Cave, and Back to Reading: Creative Dramatics"; "Pantomime: The Stepping Stone to Drama"; "The Living Literature of Readers' Theatre"; "Do-It-Yourself Drama"; "Drama for…

  15. Interpreting chromosomal abnormalities using Prolog.

    PubMed

    Cooper, G; Friedman, J M

    1990-04-01

    This paper describes an expert system for interpreting the standard notation used to represent human chromosomal abnormalities, namely, the International System for Human Cytogenetic Nomenclature. Written in Prolog, this program is very powerful, easy to maintain, and portable. The system can be used as a front end to any database that employs cytogenetic notation, such as a patient registry. PMID:2185921
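The paper's system is written in Prolog; purely as an illustration of the task (not the paper's program, and handling only a tiny invented subset of ISCN), here is a Python sketch that interprets simplified karyotype strings such as "47,XY,+21".

```python
# Illustrative only: interpret a simplified subset of ISCN karyotype
# notation (chromosome count, sex chromosomes, whole-chromosome gains
# and losses). Real ISCN is far richer than this sketch handles.
import re

def interpret_karyotype(iscn):
    fields = iscn.split(",")
    count, sex = int(fields[0]), fields[1]
    findings = []
    for item in fields[2:]:
        gain = re.fullmatch(r"\+(\w+)", item)
        loss = re.fullmatch(r"-(\w+)", item)
        if gain:
            findings.append(f"extra copy of chromosome {gain.group(1)}")
        elif loss:
            findings.append(f"missing copy of chromosome {loss.group(1)}")
        else:
            findings.append(f"unparsed abnormality: {item}")
    return {"chromosomes": count, "sex": sex, "findings": findings}

print(interpret_karyotype("47,XY,+21"))
```

A declarative rule base like the paper's Prolog program would express each nomenclature construct as a clause; the dictionary returned here stands in for that interpreted structure.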

  16. Test Development, Interpretation, and Use.

    ERIC Educational Resources Information Center

    Ebel, Robert L.

    The 54 papers related to test development, interpretation, and use that were presented at the 1972 AERA Conference are reviewed. The papers were classified into 11 categories, as follows: A. What to measure--educational objectives; attitude measurement; and creativity; B. How to measure--item types; test development; response modifications;…

  17. Remote sensing: Principles and interpretation

    SciTech Connect

    Sabins, F.F. Jr.

    1986-01-01

    This book includes explanations of modern remote sensing systems and the skills needed to interpret imaging technology. Examples are provided of imaging systems such as Landsat Thematic Mapper, Seasat, Heat Capacity Mapping Mission, Space Shuttle Imaging Radar, Large Format Camera, Advanced Very High Resolution Radiometer, Coastal Zone Scanner, and Thermal Infrared Multispectral Scanner.

  18. Plague Maps and Statistics

    MedlinePLUS

    Plague in the United States: Plague was first introduced into the United States ...

  19. Clustering statistics in cosmology

    NASA Astrophysics Data System (ADS)

    Martinez, Vicent; Saar, Enn

    2002-12-01

    The main tools in cosmology for comparing theoretical models with the observations of the galaxy distribution are statistical. We will review the applications of spatial statistics to the description of the large-scale structure of the universe. Special topics discussed in this talk will be: description of the galaxy samples, selection effects and biases, correlation functions, Fourier analysis, nearest neighbor statistics, Minkowski functionals and structure statistics. Special attention will be devoted to scaling laws and the use of the lacunarity measures in the description of the cosmic texture.
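As a minimal sketch of one statistic listed above (my own illustration, not from the talk), the following estimates a two-point correlation via the simple natural estimator, xi(r) ≈ DD(r)/RR(r) - 1, comparing simulated clustered "galaxies" with uniform random points in 2-D.

```python
# Two-point correlation sketch: compare the fraction of close pairs in
# clustered "galaxy" positions (DD) against uniform randoms (RR).
# All positions are simulated; the estimator xi ~ DD/RR - 1 is the
# simplest of the pair-count estimators used in cosmology.
import numpy as np

rng = np.random.default_rng(2)

def pair_fraction(points, r_max):
    """Fraction of distinct pairs separated by less than r_max."""
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    iu = np.triu_indices(len(points), k=1)
    return (d[iu] < r_max).mean()

# "Galaxies": clumps around a few centres; "randoms": uniform points.
centres = rng.uniform(0, 1, (10, 2))
data = centres[rng.integers(0, 10, 300)] + rng.normal(0, 0.02, (300, 2))
randoms = rng.uniform(0, 1, (300, 2))

dd = pair_fraction(data, 0.05)
rr = pair_fraction(randoms, 0.05)
print(f"xi(<0.05) ≈ {dd / rr - 1:.1f}")   # positive → clustering
```

Real analyses use the more robust Landy-Szalay estimator and must handle the selection effects and biases the abstract mentions; this sketch only shows the basic pair-counting idea.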

  20. A Positive Interpretation of Apparent "Cumulative Deficit."

    ERIC Educational Resources Information Center

    Kamin, Leon J.

    1978-01-01

    Suggests an alternate, and optimistic, interpretation of developmental data that has been interpreted as indicating cumulative deficit in IQ among socioeconomically deprived Black children. (Author/SS)

  1. Interpreting Sky-Averaged 21-cm Measurements

    NASA Astrophysics Data System (ADS)

    Mirocha, Jordan

    2015-01-01

    Within the first ~billion years after the Big Bang, the intergalactic medium (IGM) underwent a remarkable transformation, from a uniform sea of cold neutral hydrogen gas to a fully ionized, metal-enriched plasma. Three milestones during this epoch of reionization -- the emergence of the first stars, black holes (BHs), and full-fledged galaxies -- are expected to manifest themselves as extrema in sky-averaged ("global") measurements of the redshifted 21-cm background. However, interpreting these measurements will be complicated by the presence of strong foregrounds and non-trivialities in the radiative transfer (RT) modeling required to make robust predictions.I have developed numerical models that efficiently solve the frequency-dependent radiative transfer equation, which has led to two advances in studies of the global 21-cm signal. First, frequency-dependent solutions facilitate studies of how the global 21-cm signal may be used to constrain the detailed spectral properties of the first stars, BHs, and galaxies, rather than just the timing of their formation. And second, the speed of these calculations allows one to search vast expanses of a currently unconstrained parameter space, while simultaneously characterizing the degeneracies between parameters of interest. I find principally that (1) physical properties of the IGM, such as its temperature and ionization state, can be constrained robustly from observations of the global 21-cm signal without invoking models for the astrophysical sources themselves, (2) translating IGM properties to galaxy properties is challenging, in large part due to frequency-dependent effects. For instance, evolution in the characteristic spectrum of accreting BHs can modify the 21-cm absorption signal at levels accessible to first generation instruments, but could easily be confused with evolution in the X-ray luminosity star-formation rate relation. 
Finally, (3) the independent constraints most likely to aid in the interpretation of global 21-cm signal measurements are detections of Lyman Alpha Emitters at high redshifts and constraints on the midpoint of reionization, both of which are among the primary science objectives of ongoing or near-future experiments.

  2. Multidimensional Visual Statistical Learning

    ERIC Educational Resources Information Center

    Turk-Browne, Nicholas B.; Isola, Phillip J.; Scholl, Brian J.; Treat, Teresa A.

    2008-01-01

    Recent studies of visual statistical learning (VSL) have demonstrated that statistical regularities in sequences of visual stimuli can be automatically extracted, even without intent or awareness. Despite much work on this topic, however, several fundamental questions remain about the nature of VSL. In particular, previous experiments have not…

  3. Reform in Statistical Education

    ERIC Educational Resources Information Center

    Huck, Schuyler W.

    2007-01-01

    Two questions are considered in this article: (a) What should professionals in school psychology do in an effort to stay current with developments in applied statistics? (b) What should they do with their existing knowledge to move from surface understanding of statistics to deep understanding? Written for school psychologists who have completed…

  4. DISABILITY STATISTICS CENTER

    EPA Science Inventory

    The purpose of the Disability Statistics Center is to produce and disseminate statistical information on disability and the status of people with disabilities in American society and to establish and monitor indicators of how conditions are changing over time to meet their health...

  5. Water Quality Statistics

    ERIC Educational Resources Information Center

    Hodgson, Ted; Andersen, Lyle; Robison-Cox, Jim; Jones, Clain

    2004-01-01

    Water quality experiments, especially the use of macroinvertebrates as indicators of water quality, offer an ideal context for connecting statistics and science. In the STAR program for secondary students and teachers, water quality experiments were also used as a context for teaching statistics. In this article, we trace one activity that uses…

  6. Adaptive training class statistics.

    NASA Technical Reports Server (NTRS)

    Kan, E. P. F.

    1973-01-01

    Formulas are derived for updating the mean vector and covariance matrix of a training class as new training fields are included and old training fields deleted from the class. These statistics of the class are expressed in terms of the already available statistics of the fields.
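
    The abstract's update formulas are not reproduced here, but the standard pooled-statistics identity behind them can be sketched in NumPy (a minimal illustration under assumed notation, not the paper's actual derivation; deleting a field is the same identity solved in reverse):

```python
import numpy as np

def merge_stats(n1, mean1, S1, n2, mean2, S2):
    """Combine the mean vector and scatter matrix (sum of outer products of
    deviations) of two sample groups without revisiting the raw samples."""
    n = n1 + n2
    d = mean2 - mean1
    mean = mean1 + (n2 / n) * d
    S = S1 + S2 + (n1 * n2 / n) * np.outer(d, d)
    return n, mean, S

def stats(x):
    """Per-group sample count, mean vector, and scatter matrix."""
    m = x.mean(axis=0)
    dev = x - m
    return len(x), m, dev.T @ dev

rng = np.random.default_rng(0)
a = rng.normal(size=(40, 3))    # existing training fields (3 channels)
b = rng.normal(size=(25, 3))    # newly added training field

n, mean, S = merge_stats(*stats(a), *stats(b))
cov = S / (n - 1)               # updated unbiased class covariance
```

    Because only (n, mean, S) are kept per field, the class statistics can be updated in O(d^2) work per insertion or deletion, independent of the number of samples already absorbed.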

  7. On Statistical Testing.

    ERIC Educational Resources Information Center

    Huberty, Carl J.

    An approach to statistical testing, which combines Neyman-Pearson hypothesis testing and Fisher significance testing, is recommended. The use of P-values in this approach is discussed in some detail. The author also discusses some problems which are often found in introductory statistics textbooks. The problems involve the definitions of…

  8. Statistical Mapping by Computer.

    ERIC Educational Resources Information Center

    Utano, Jack J.

    The function of a statistical map is to provide readers with a visual impression of the data so that they may be able to identify any geographic characteristics of the displayed phenomena. The increasingly important role played by the computer in the production of statistical maps is manifested by the varied examples of computer maps in recent…

  9. [Statistics quantum satis].

    PubMed

    Pestana, Dinis

    2013-01-01

    Statistics is a privileged tool in building knowledge from information, since its purpose is to extract, from the limited information in a sample, conclusions about the whole population. The pervasive use of statistical software (which always provides an answer, whether the question is adequate or not), and the abuse of statistics to confer a scientific flavour on so much bad science, have had a pernicious effect, fostering some disbelief in statistical research. Were Lord Rutherford alive today, it is almost certain that he would not condemn the use of statistics in research, as he did at the dawn of the 20th century. But he would indeed urge everyone to use statistics quantum satis, since using bad data, too many data, and statistics to enquire into irrelevant questions is a source of bad science, namely because with too many data we can establish the statistical significance of irrelevant results. This is an important point that addicts of evidence-based medicine should be aware of, since the meta-analysis of too many data will inevitably establish senseless results. PMID:24192087
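
    The warning that too many data make irrelevant results "statistically significant" is easy to demonstrate numerically; the following sketch (illustrative numbers, not from the paper) applies a one-sample z-test to a practically negligible effect:

```python
import numpy as np

# With enough data, a practically irrelevant effect becomes "significant":
# one-sample z-test of H0: mu = 0 with known sigma = 1, true mean only 0.01.
rng = np.random.default_rng(3)
n = 4_000_000
x = rng.normal(loc=0.01, scale=1.0, size=n)
z = x.mean() * np.sqrt(n)    # |z| > 1.96 rejects H0 at the 5% level
```

    Here |z| lands far above 1.96, so the null is rejected decisively, even though a mean shift of 0.01 standard deviations is scientifically meaningless.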

  10. Application Statistics 1987.

    ERIC Educational Resources Information Center

    Council of Ontario Universities, Toronto.

    Summary statistics on application and registration patterns of applicants wishing to pursue full-time study in first-year places in Ontario universities (for the fall of 1987) are given. Data on registrations were received indirectly from the universities as part of their annual submission of USIS/UAR enrollment data to Statistics Canada and MCU.…

  11. Deconstructing Statistical Analysis

    ERIC Educational Resources Information Center

    Snell, Joel

    2014-01-01

    The use of very complex statistical analyses and research methods for the sake of enhancing the prestige of an article, or of making a new product or service appear legitimate, needs to be monitored and questioned for accuracy. 1) The more complicated the statistical analysis and research, the fewer learned readers can understand it. This adds a…

  13. Applied Statistics with SPSS

    ERIC Educational Resources Information Center

    Huizingh, Eelko K. R. E.

    2007-01-01

    Accessibly written and easy to use, "Applied Statistics Using SPSS" is an all-in-one self-study guide to SPSS and do-it-yourself guide to statistics. What is unique about Eelko Huizingh's approach is that this book is based around the needs of undergraduate students embarking on their own research project, and its self-help style is designed to…

  16. Explorations in Statistics: Power

    ERIC Educational Resources Information Center

    Curran-Everett, Douglas

    2010-01-01

    Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This fifth installment of "Explorations in Statistics" revisits power, a concept fundamental to the test of a null hypothesis. Power is the probability that we reject the null hypothesis when it is false. Four things affect…
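
    The definition of power quoted above ("the probability that we reject the null hypothesis when it is false") can be made concrete with a short Monte Carlo sketch (illustrative parameters, not from the article):

```python
import numpy as np

# Estimate power by simulation for a two-sided one-sample z-test
# (H0: mu = 0, sigma = 1 known) when the true mean is 0.5 and n = 25.
rng = np.random.default_rng(1)
n, mu_true, reps = 25, 0.5, 20000
crit = 1.96                          # z critical value at alpha = 0.05
samples = rng.normal(mu_true, 1.0, size=(reps, n))
z = samples.mean(axis=1) * np.sqrt(n)
power = np.mean(np.abs(z) > crit)    # fraction of correct rejections
```

    Varying n, mu_true, or crit in this sketch shows directly how sample size, effect size, and significance level each change the estimated power.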

  17. Overhead Image Statistics

    SciTech Connect

    Vijayaraj, Veeraraghavan; Cheriyadat, Anil M; Bhaduri, Budhendra L; Vatsavai, Raju; Bright, Eddie A

    2008-01-01

    Statistical properties of high-resolution overhead images representing different land use categories are analyzed using various local and global statistical image properties based on the shape of the power spectrum, image gradient distributions, edge co-occurrence, and inter-scale wavelet coefficient distributions. The analysis was performed on a database of high-resolution (1 meter) overhead images representing a multitude of different downtown, suburban, commercial, agricultural and wooded exemplars. Various statistical properties relating to these image categories and their relationship are discussed. The categorical variations in power spectrum contour shapes, the unique gradient distribution characteristics of wooded categories, the similarity in edge co-occurrence statistics for overhead and natural images, and the unique edge co-occurrence statistics of downtown categories are presented in this work. Though previous work on natural image statistics has shown some of the unique characteristics for different categories, the relationships for overhead images are not well understood. The statistical properties of natural images were used in previous studies to develop prior image models, to predict and index objects in a scene and to improve computer vision models. The results from our research findings can be used to augment and adapt computer vision algorithms that rely on prior image statistics to process overhead images, calibrate the performance of overhead image analysis algorithms, and derive features for better discrimination of overhead image categories.

  19. Explorations in Statistics: Regression

    ERIC Educational Resources Information Center

    Curran-Everett, Douglas

    2011-01-01

    Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This seventh installment of "Explorations in Statistics" explores regression, a technique that estimates the nature of the relationship between two things for which we may only surmise a mechanistic or predictive connection.…

  20. Explorations in Statistics: Correlation

    ERIC Educational Resources Information Center

    Curran-Everett, Douglas

    2010-01-01

    Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This sixth installment of "Explorations in Statistics" explores correlation, a familiar technique that estimates the magnitude of a straight-line relationship between two variables. Correlation is meaningful only when the…
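
    As a minimal companion to this description, the correlation coefficient can be computed directly from its definition (illustrative simulated data, not from the article):

```python
import numpy as np

rng = np.random.default_rng(7)
x = rng.normal(size=200)
y = 2.0 * x + rng.normal(size=200)   # straight-line relation plus noise

r = np.corrcoef(x, y)[0, 1]          # Pearson correlation coefficient
# the same number from the definition: cov(x, y) / (sd_x * sd_y)
r_manual = np.cov(x, y)[0, 1] / (x.std(ddof=1) * y.std(ddof=1))
```

    For this simulated straight-line relationship, r comes out close to its theoretical value of 2/sqrt(5), about 0.894.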

  1. Interpretation of fluorescence correlation spectra of biopolymer solutions.

    PubMed

    Phillies, George D J

    2016-05-01

    Fluorescence correlation spectroscopy (FCS) is regularly used to study diffusion in non-dilute "crowded" biopolymer solutions, including the interior of living cells. For fluorophores in dilute solution, the relationship between the FCS spectrum G(t) and the diffusion coefficient D is well-established. However, the dilute-solution relationship between G(t) and D has sometimes been used to interpret FCS spectra of fluorophores in non-dilute solutions. Unfortunately, the relationship used to interpret FCS spectra in dilute solutions relies on an assumption that is not always correct in non-dilute solutions. This paper obtains the correct form for interpreting FCS spectra of non-dilute solutions, writing G(t) in terms of the statistical properties of the fluorophore motions. Approaches for applying this form are discussed. © 2016 Wiley Periodicals, Inc. Biopolymers 105: 260-266, 2016. PMID:26756528

  2. Stratigraphic statistical curvature analysis techniques

    SciTech Connect

    Bengtson, C.A.; Ziagos, J.P.

    1987-05-01

    SCAT applies statistical techniques to dipmeter data to identify patterns of bulk curvature, determine transverse and longitudinal structural directions, and reconstruct cross sections and contour maps. STRAT-SCAT applies the same concepts to geometric interpretation of multistoried unimodal, bimodal, or trough-type cross-bedding and also to seismic stratigraphy-scale stratigraphic structures. Structural dip, which comprises the bulk of dipmeter data, is related to beds that (statistically) were deposited with horizontal attitudes; stratigraphic dip is related to beds that were deposited with preferentially oriented nonhorizontal attitudes or to beds that assumed such attitudes because of differential compaction. Stratigraphic dip generates local zones of departure from structural dip on special SCAT plots. The RMS (root-mean-square) of apparent structural dip is greatest in the (structural) T-direction and least in the perpendicular L-direction; the RMS of stratigraphic dip (measured with respect to structural dip) is greatest in the stratigraphic T*-direction and least in the stratigraphic L*-direction. Multistoried cross-bedding appears on T*-plots as local zones of either greater scatter or statistically significant departure of stratigraphic median dip from structural dip. In contrast, the L*-plot (except for trough-type cross-bedding) is insensitive to cross-bedding. Seismic stratigraphy-scale depositional sequences are identified on Mercator dip versus azimuth plots and polar tangent plots as secondary cylindrical-fold patterns imposed on global structural patterns. Progradational sequences generate local cycloid-type patterns on T*-plots, and compactional sequences generate local half-cusp patterns. Both features, however, show only structural dip on L*-plots.

  3. Environmental Statistics and Optimal Regulation

    PubMed Central

    2014-01-01

    Any organism is embedded in an environment that changes over time. The timescale for and statistics of environmental change, the precision with which the organism can detect its environment, and the costs and benefits of particular protein expression levels all will affect the suitability of different strategies–such as constitutive expression or graded response–for regulating protein levels in response to environmental inputs. We propose a general framework–here specifically applied to the enzymatic regulation of metabolism in response to changing concentrations of a basic nutrient–to predict the optimal regulatory strategy given the statistics of fluctuations in the environment and measurement apparatus, respectively, and the costs associated with enzyme production. We use this framework to address three fundamental questions: (i) when a cell should prefer thresholding to a graded response; (ii) when there is a fitness advantage to implementing a Bayesian decision rule; and (iii) when retaining memory of the past provides a selective advantage. We specifically find that: (i) relative convexity of enzyme expression cost and benefit influences the fitness of thresholding or graded responses; (ii) intermediate levels of measurement uncertainty call for a sophisticated Bayesian decision rule; and (iii) in dynamic contexts, intermediate levels of uncertainty call for retaining memory of the past. Statistical properties of the environment, such as variability and correlation times, set optimal biochemical parameters, such as thresholds and decay rates in signaling pathways. Our framework provides a theoretical basis for interpreting molecular signal processing algorithms and a classification scheme that organizes known regulatory strategies and may help conceptualize heretofore unknown ones. PMID:25254493

  4. Environmental statistics and optimal regulation.

    PubMed

    Sivak, David A; Thomson, Matt

    2014-09-01

    Any organism is embedded in an environment that changes over time. The timescale for and statistics of environmental change, the precision with which the organism can detect its environment, and the costs and benefits of particular protein expression levels all will affect the suitability of different strategies--such as constitutive expression or graded response--for regulating protein levels in response to environmental inputs. We propose a general framework-here specifically applied to the enzymatic regulation of metabolism in response to changing concentrations of a basic nutrient-to predict the optimal regulatory strategy given the statistics of fluctuations in the environment and measurement apparatus, respectively, and the costs associated with enzyme production. We use this framework to address three fundamental questions: (i) when a cell should prefer thresholding to a graded response; (ii) when there is a fitness advantage to implementing a Bayesian decision rule; and (iii) when retaining memory of the past provides a selective advantage. We specifically find that: (i) relative convexity of enzyme expression cost and benefit influences the fitness of thresholding or graded responses; (ii) intermediate levels of measurement uncertainty call for a sophisticated Bayesian decision rule; and (iii) in dynamic contexts, intermediate levels of uncertainty call for retaining memory of the past. Statistical properties of the environment, such as variability and correlation times, set optimal biochemical parameters, such as thresholds and decay rates in signaling pathways. Our framework provides a theoretical basis for interpreting molecular signal processing algorithms and a classification scheme that organizes known regulatory strategies and may help conceptualize heretofore unknown ones. PMID:25254493

  5. College Students' Interpretation of Research Reports on Group Differences: The Tall-Tale Effect

    ERIC Educational Resources Information Center

    Hogan, Thomas P.; Zaboski, Brian A.; Perry, Tiffany R.

    2015-01-01

    How does the student untrained in advanced statistics interpret results of research that reports a group difference? In two studies, statistically untrained college students were presented with abstracts or professional associations' reports and asked for estimates of scores obtained by the original participants in the studies. These estimates…

  6. Modeling and interpretation of images

    NASA Astrophysics Data System (ADS)

    Min, Michiel

    2015-09-01

    Imaging protoplanetary disks is a challenging but rewarding task. It is challenging because of the glare of the central star outshining the weak signal from the disk at shorter wavelengths and because of the limited spatial resolution at longer wavelengths. It is rewarding because it contains a wealth of information on the structure of the disks and can (directly) probe things like gaps and spiral structure. Because it is so challenging, telescopes are often pushed to their limitations to get a signal. Proper interpretation of these images therefore requires intimate knowledge of the instrumentation, the detection method, and the image processing steps. In this chapter I will give some examples and stress some issues that are important when interpreting images from protoplanetary disks. 15th Lecture from Summer School "Protoplanetary Disks: Theory and Modelling Meet Observations"

  7. Interpreting Recoil for Undergraduate Students

    NASA Astrophysics Data System (ADS)

    Elsayed, Tarek A.

    2012-04-01

    The phenomenon of recoil is usually explained to students in the context of Newton's third law. Typically, when a projectile is fired, the recoil of the launch mechanism is interpreted as a reaction to the ejection of the smaller projectile. The same phenomenon is also interpreted in the context of the conservation of linear momentum, which is closely related to Newton's third law. Since the actual microscopic causes of recoil differ from one problem to another, some students (and teachers) may not be satisfied with understanding recoil through the principles of conservation of linear momentum and Newton's third law. For these students, the origin of the recoil motion should be presented in more depth.

  8. Modelling Metamorphism by Abstract Interpretation

    NASA Astrophysics Data System (ADS)

    Dalla Preda, Mila; Giacobazzi, Roberto; Debray, Saumya; Coogan, Kevin; Townsend, Gregg M.

    Metamorphic malware apply semantics-preserving transformations to their own code in order to foil detection systems based on signature matching. In this paper we consider the problem of automatically extracting metamorphic signatures from these malware. We introduce a semantics for self-modifying code, later called phase semantics, and prove its correctness by showing that it is an abstract interpretation of the standard trace semantics. Phase semantics precisely models the metamorphic code behavior by providing a set of traces of programs which correspond to the possible evolutions of the metamorphic code during execution. We show that metamorphic signatures can be automatically extracted by abstract interpretation of the phase semantics, and that regular metamorphism can be modelled as a finite-state automata abstraction of the phase semantics.

  9. Computer Interpretations of ECGs in Rural Hospitals

    PubMed Central

    Thompson, James M.

    1992-01-01

    Computer-assisted interpretation of electrocardiograms offers theoretical benefits to rural physicians. This study compared computer-assisted interpretations by a rural physician certified to read ECGs with interpretations by the computer alone. The computer interpretation alone could have led to major errors in patient management, but was correct sufficiently often to warrant purchase by small rural hospitals. PMID:21221365

  10. 8 CFR 1240.5 - Interpreter.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... PROCEEDINGS TO DETERMINE REMOVABILITY OF ALIENS IN THE UNITED STATES Removal Proceedings § 1240.5 Interpreter. Any person acting as an interpreter in a hearing before an immigration judge under this part shall be sworn to interpret and translate accurately, unless the interpreter is an employee of the United...

  11. 8 CFR 1240.5 - Interpreter.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... PROCEEDINGS TO DETERMINE REMOVABILITY OF ALIENS IN THE UNITED STATES Removal Proceedings § 1240.5 Interpreter. Any person acting as an interpreter in a hearing before an immigration judge under this part shall be sworn to interpret and translate accurately, unless the interpreter is an employee of the United...

  12. 8 CFR 1240.5 - Interpreter.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... PROCEEDINGS TO DETERMINE REMOVABILITY OF ALIENS IN THE UNITED STATES Removal Proceedings § 1240.5 Interpreter. Any person acting as an interpreter in a hearing before an immigration judge under this part shall be sworn to interpret and translate accurately, unless the interpreter is an employee of the United...

  13. Statistical methods for environmental pollution monitoring

    SciTech Connect

    Gilbert, R.O.

    1986-01-01

    This volume covers planning, design, and data analysis. It offers statistical methods for designing environmental sampling and monitoring programs as well as analyzing the resulting data. Statistical sample survey methods to problems of estimating average and total amounts of environmental pollution are presented in detail. The book also provides a broad array of statistical analysis methods for many purposes...numerous examples...three case studies...end-of-chapter questions...computer codes (showing what output looks like along with its interpretation)...a discussion of Kriging methods for estimating pollution concentration contours over space and/or time...nomographs for determining the number of samples required to detect hot spots with specified confidence...and a description and tables for conducting Rosner's test to identify outlaying (usually large) pollution measurements in a data set.

  14. Statistics: A Brief Overview

    PubMed Central

    Winters, Ryan; Winters, Andrew; Amedee, Ronald G.

    2010-01-01

    The Accreditation Council for Graduate Medical Education sets forth a number of required educational topics that must be addressed in residency and fellowship programs. We sought to provide a primer on some of the important basic statistical concepts to consider when examining the medical literature. It is not essential to understand the exact workings and methodology of every statistical test encountered, but it is necessary to understand selected concepts such as parametric and nonparametric tests, correlation, and numerical versus categorical data. This working knowledge will allow you to spot obvious irregularities in statistical analyses that you encounter. PMID:21603381

  15. R.A. Fisher's contributions to genetical statistics.

    PubMed

    Thompson, E A

    1990-12-01

    R. A. Fisher (1890-1962) was a professor of genetics, and many of his statistical innovations found expression in the development of methodology in statistical genetics. However, whereas his contributions in mathematical statistics are easily identified, in population genetics he shares his preeminence with Sewall Wright (1889-1988) and J. B. S. Haldane (1892-1965). This paper traces some of Fisher's major contributions to the foundations of statistical genetics, and his interactions with Wright and with Haldane which contributed to the development of the subject. With modern technology, both statistical methodology and genetic data are changing. Nonetheless much of Fisher's work remains relevant, and may even serve as a foundation for future research in the statistical analysis of DNA data. For Fisher's work reflects his view of the role of statistics in scientific inference, expressed in 1949: There is no wide or urgent demand for people who will define methods of proof in set theory in the name of improving mathematical statistics. There is a widespread and urgent demand for mathematicians who understand that branch of mathematics known as theoretical statistics, but who are capable also of recognising situations in the real world to which such mathematics is applicable. In recognising features of the real world to which his models and analyses should be applicable, Fisher laid a lasting foundation for statistical inference in genetic analyses. PMID:2085639

  16. Understanding Solar Flare Statistics

    NASA Astrophysics Data System (ADS)

    Wheatland, M. S.

    2005-12-01

    A review is presented of work aimed at understanding solar flare statistics, with emphasis on the well known flare power-law size distribution. Although avalanche models are perhaps the favoured model to describe flare statistics, their physical basis is unclear, and they are divorced from developing ideas in large-scale reconnection theory. An alternative model, aimed at reconciling large-scale reconnection models with solar flare statistics, is revisited. The solar flare waiting-time distribution has also attracted recent attention. Observed waiting-time distributions are described, together with what they might tell us about the flare phenomenon. Finally, a practical application of flare statistics to flare prediction is described in detail, including the results of a year of automated (web-based) predictions from the method.

  17. Teaching Statistics with Minitab.

    ERIC Educational Resources Information Center

    Hubbard, Ruth

    1992-01-01

    Discusses the use of the computer software MINITAB in teaching statistics to explore concepts, simulate games of chance, transform the normal variable into a z-score, and stimulate small and large group discussions. (MDH)

  18. Titanic: A Statistical Exploration.

    ERIC Educational Resources Information Center

    Takis, Sandra L.

    1999-01-01

    Uses the available data about the Titanic's passengers to interest students in exploring categorical data and the chi-square distribution. Describes activities incorporated into a statistics class and gives additional resources for collecting information about the Titanic. (ASK)
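
    A chi-square exploration of categorical data like the one described can be sketched as follows; the 2x2 survival-by-sex table uses hypothetical counts for illustration, not the actual Titanic passenger data:

```python
import numpy as np

# Hypothetical 2x2 contingency table: rows = survived/died, cols = female/male
obs = np.array([[300, 100],
                [120, 480]], dtype=float)

row = obs.sum(axis=1, keepdims=True)
col = obs.sum(axis=0, keepdims=True)
exp = row @ col / obs.sum()               # expected counts under independence
chi2 = ((obs - exp) ** 2 / exp).sum()     # chi-square statistic, df = 1

reject = chi2 > 3.841                     # 5% critical value for df = 1
```

    A large chi2 relative to the critical value says survival and sex are not independent in this (made-up) table, which is the kind of question the activity has students pose of the real passenger data.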

  19. Playing at Statistical Mechanics

    ERIC Educational Resources Information Center

    Clark, Paul M.; And Others

    1974-01-01

    Discussed are the applications of counting techniques of a sorting game to distributions and concepts in statistical mechanics. Included are the following distributions: Fermi-Dirac, Bose-Einstein, and most probable. (RH)

  20. Cooperative Learning in Statistics.

    ERIC Educational Resources Information Center

    Keeler, Carolyn M.; And Others

    1994-01-01

    Formal use of cooperative learning techniques proved effective in improving student performance and retention in a freshman level statistics course. Lectures interspersed with group activities proved effective in increasing conceptual understanding and overall class performance. (11 references) (Author)

  1. Interpretation of a compositional time series

    NASA Astrophysics Data System (ADS)

    Tolosana-Delgado, R.; van den Boogaart, K. G.

    2012-04-01

    Common methods for multivariate time series analysis use linear operations, from the definition of a time-lagged covariance/correlation to the prediction of new outcomes. However, when the time series response is a composition (a vector of positive components showing the relative importance of a set of parts in a total, like percentages and proportions), then linear operations are afflicted by several problems. For instance, it has long been recognised that (auto/cross-)correlations between raw percentages are spurious, more dependent on which other components are being considered than on any natural link between the components of interest. Also, a long-term forecast of a composition in models with a linear trend will ultimately predict negative components. In general terms, compositional data should not be treated in a raw scale, but after a log-ratio transformation (Aitchison, 1986: The statistical analysis of compositional data. Chapman and Hall). This is so because the information conveyed by compositional data is relative, as stated in their definition. The principle of working in coordinates allows one to apply any sort of multivariate analysis to a log-ratio transformed composition, as long as this transformation is invertible. This principle is of full application to time series analysis. We will discuss how results (both auto/cross-correlation functions and predictions) can be back-transformed, viewed and interpreted in a meaningful way. One view is to use the exhaustive set of all possible pairwise log-ratios, which allows one to express the results as D(D - 1)/2 separate, interpretable sets of one-dimensional models showing the behaviour of each possible pairwise log-ratio. Another view is the interpretation of estimated coefficients or correlations back-transformed in terms of compositions. These two views are compatible and complementary.
These issues are illustrated with time series of seasonal precipitation patterns at different rain gauges of the USA. In this data set, the proportion of annual precipitation falling in winter, spring, summer and autumn is considered a 4-component time series. Three invertible log-ratios are defined for calculations, balancing rainfall in autumn vs. winter, in summer vs. spring, and in autumn-winter vs. spring-summer. Results suggest a 2-year correlation range, and certain oscillatory behaviour in the last balance, which does not occur in the other two.
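
    The principle of working in coordinates can be sketched with the centred log-ratio (clr) transform, one invertible choice in the Aitchison framework (a minimal NumPy illustration with made-up seasonal shares, not the balances or gauges of the study):

```python
import numpy as np

def clr(x):
    """Centred log-ratio transform of compositions (rows sum to 1)."""
    logx = np.log(x)
    return logx - logx.mean(axis=1, keepdims=True)

def clr_inv(y):
    """Back-transform clr coordinates to compositions."""
    e = np.exp(y)
    return e / e.sum(axis=1, keepdims=True)

# two years of seasonal precipitation shares: winter, spring, summer, autumn
comp = np.array([[0.35, 0.30, 0.15, 0.20],
                 [0.40, 0.25, 0.10, 0.25]])

coords = clr(comp)       # model/correlate in unconstrained coordinate space
back = clr_inv(coords)   # invert after modelling to interpret as shares
```

    Linear time-series machinery applied to coords avoids the spurious-correlation and negative-forecast problems described above, and every result maps back to a valid composition through the inverse transform.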

  2. A new statistical tool for NOAA local climate studies

    NASA Astrophysics Data System (ADS)

    Timofeyeva, M. M.; Meyers, J. C.; Hollingshead, A.

    2011-12-01

    The National Weather Service (NWS) Local Climate Analysis Tool (LCAT) is evolving out of a need to support and enhance the ability of National Oceanic and Atmospheric Administration (NOAA) NWS field offices to efficiently access, manipulate, and interpret local climate data and to characterize climate variability and change impacts. LCAT will enable NOAA's staff to conduct regional and local climate studies using state-of-the-art station and reanalysis gridded data and various statistical techniques for climate analysis. The analysis results will be used for climate services to guide local decision makers in weather- and climate-sensitive actions and to deliver information to the general public. LCAT will augment current climate reference materials with information pertinent to the local and regional levels as they apply to diverse variables appropriate to each locality. LCAT's main emphasis is to enable studies of extreme meteorological and hydrological events such as tornadoes, floods, droughts, and severe storms. LCAT will close a critical gap in NWS local climate services because it will allow offices to address climate variables beyond average temperature and total precipitation. NWS external partners and government agencies will benefit from LCAT outputs that can be easily incorporated into their own analysis and/or delivery systems. To date, we have identified five existing requirements for local climate studies: (1) Local impacts of climate change; (2) Local impacts of climate variability; (3) Drought studies; (4) Attribution of severe meteorological and hydrological events; and (5) Climate studies for water resources. The methodologies for the first three requirements will be included in the first-phase LCAT implementation. 
The local rate of climate change is defined as the slope of the mean trend estimated from an ensemble of three trend techniques: (1) hinge, (2) Optimal Climate Normals (a running mean over optimal time periods), and (3) exponentially weighted moving average. Root-mean-squared error is used to determine which trend fits the observations with the least error. Studies of climate variability impacts on local extremes use composite techniques applied to various definitions of local variables, from specified percentiles to critical thresholds. Drought studies combine the visual capabilities of Google Maps with statistical estimates of drought severity indices. The development process will be linked to local office interactions with users, to ensure that the tool meets their needs and that adequate training is provided. A rigorous internal and tiered peer-review process will be implemented to ensure the studies are scientifically sound before they are published and submitted to the local studies catalog (database) and eventually to external sources, such as the Climate Portal.
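
To make the trend-ensemble idea concrete, here is a minimal sketch of the third estimator, an exponentially weighted moving average scored by RMSE against the observations. The synthetic anomaly series and the smoothing constant are assumptions for illustration, not LCAT's actual configuration.

```python
import numpy as np

# Synthetic annual anomalies: a weak warming trend plus noise (invented data).
rng = np.random.default_rng(0)
years = np.arange(1980, 2011)
obs = 0.03 * (years - 1980) + rng.normal(0.0, 0.3, len(years))

def ewma(x, alpha=0.2):
    """Exponentially weighted moving average; alpha is chosen arbitrarily here."""
    out = np.empty_like(x)
    out[0] = x[0]
    for t in range(1, len(x)):
        out[t] = alpha * x[t] + (1 - alpha) * out[t - 1]
    return out

def rmse(fit, x):
    """Root-mean-squared error, used to rank candidate trend fits."""
    return float(np.sqrt(np.mean((fit - x) ** 2)))

fit = ewma(obs)
print(rmse(fit, obs))
```

In the full ensemble, the hinge and Optimal Climate Normals fits would be scored the same way and the lowest-RMSE trend retained.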

  3. Presenting the statistical results.

    PubMed

    Ng, K H; Peh, W C G

    2009-01-01

    Statistical methods are reported in a scientific paper to summarise the data that has been collected for a study and to enable its analysis. These methods should be described with enough detail to allow a knowledgeable reader who has access to the original data to verify the reported results. This article provides basic guidelines to aid authors in reporting the statistical aspects of the results of their studies clearly and accurately. PMID:19224078

  4. Transportation Statistics Annual Report 1997

    SciTech Connect

    Fenn, M.

    1997-01-01

    This document is the fourth Transportation Statistics Annual Report (TSAR) prepared by the Bureau of Transportation Statistics (BTS) for the President and Congress. As in previous years, it reports on the state of the U.S. transportation system at two levels. First, in Part I, it provides a statistical and interpretive survey of the system—its physical characteristics, its economic attributes, aspects of its use and performance, and the scale and severity of unintended consequences of transportation, such as fatalities and injuries, oil import dependency, and environmental impacts. Part I also explores the state of transportation statistics and the new needs of the rapidly changing world of transportation. Second, Part II of the report, as in prior years, explores in detail the performance of the U.S. transportation system from the perspective of desired social outcomes or strategic goals. This year, the performance aspect of transportation chosen for thematic treatment is “Mobility and Access,” which complements past TSAR theme sections on “The Economic Performance of Transportation” (1995) and “Transportation and the Environment” (1996). Mobility and access are at the heart of the transportation system’s performance from the user’s perspective. In what ways and to what extent does the geographic freedom provided by transportation enhance personal fulfillment of the nation’s residents and contribute to economic advancement of people and businesses? This broad question underlies many of the topics examined in Part II: What is the current level of personal mobility in the United States, and how does it vary by sex, age, income level, urban or rural location, and over time? What factors explain variations? Has transportation helped improve people’s access to work, shopping, recreational facilities, and medical services, and in what ways and in what locations? How have barriers, such as age, disabilities, or lack of an automobile, affected these accessibility patterns? 
How are commodity flows and transportation services responding to global competition, deregulation, economic restructuring, and new information technologies? How do U.S. patterns of personal mobility and freight movement compare with other advanced industrialized countries, formerly centrally planned economies, and major newly industrializing countries? Finally, how is the rapid adoption of new information technologies influencing the patterns of transportation demand and the supply of new transportation services? Indeed, how are information technologies affecting the nature and organization of transportation services used by individuals and firms?

  5. Interpreting Arterial Blood Gases Successfully.

    PubMed

    Larkin, Brenda G; Zimmanck, Robert J

    2015-10-01

    Arterial blood gas (ABG) analysis is a crucial skill for perioperative nurses, in particular the RN circulator. This article provides the physiological basis for assessing ABGs perioperatively and presents a systematic approach to blood gas analysis using the Romanski method. Blood gas sample data allow the reader to practice ABG interpretation. In addition, four case studies are presented that give the reader the opportunity to analyze ABGs within the context of surgical patient scenarios. The ability to accurately assess ABGs allows the perioperative nurse to assist surgical team members in restoring a patient's acid-base balance. PMID:26411819

  6. Genetics in geographically structured populations: defining, estimating and interpreting FST

    PubMed Central

    Holsinger, Kent E.; Weir, Bruce S.

    2015-01-01

    Wright’s F-statistics, and especially FST, provide important insights into the evolutionary processes that influence the structure of genetic variation within and among populations, and they are among the most widely used descriptive statistics in population and evolutionary genetics. Estimates of FST can identify regions of the genome that have been the target of selection, and comparisons of FST from different parts of the genome can provide insights into the demographic history of populations. For these reasons and others, FST has a central role in population and evolutionary genetics and has wide applications in fields that range from disease association mapping to forensic science. This Review clarifies how FST is defined, how it should be estimated, how it is related to similar statistics and how estimates of FST should be interpreted. PMID:19687804
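
For the definitional starting point, here is a minimal sketch of Wright's F_ST at a single biallelic locus, F_ST = (H_T - H_S) / H_T. The subpopulation allele frequencies are hypothetical, and the Review's point is precisely that practical estimators (e.g. Weir and Cockerham's) must go beyond this naive calculation to account for sampling.

```python
import numpy as np

# Hypothetical allele frequencies in three subpopulations at one biallelic locus.
p = np.array([0.2, 0.5, 0.8])

H_S = float(np.mean(2 * p * (1 - p)))  # mean within-subpopulation heterozygosity
p_bar = float(np.mean(p))              # allele frequency in the pooled population
H_T = 2 * p_bar * (1 - p_bar)          # total expected heterozygosity

F_ST = (H_T - H_S) / H_T               # fraction of diversity between subpops
print(round(F_ST, 2))  # 0.24
```

The more the subpopulation frequencies diverge, the larger the gap between H_T and H_S, and hence the larger F_ST.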

  7. Catalytic Nonoxidation Dehydrogenation of Ethane Over Fe-Ni Catalysts Supported on Mg (Al)O to Produce Hydrogen and Easily Purified Carbon Nanotubes

    SciTech Connect

    Shen,W.; Wang, Y.; Shi, X.; Shah, N.; Huggins, F.; Bollineni, S.; Seehra, M.; Huffman, G.

    2007-01-01

    Nonoxidative decomposition of ethane was conducted over monometallic Ni and bimetallic Fe-Ni catalysts on basic Mg(Al)O support to produce H2 free of CO and CO2 and easily purified carbon nanotubes, a potentially valuable byproduct. The Mg(Al)O support was prepared by calcination of synthetic MgAl-hydrotalcite with a Mg to Al ratio of 5. The catalysts were prepared by incipient wetness with total metal loadings of 5 wt %. The dehydrogenation of undiluted ethane was conducted at temperatures of 500, 650, and 700 °C. At 500 °C, the Ni/Mg(Al)O catalyst was highly active and very stable with 100% conversion of ethane to 20 vol % H2 and 80 vol % CH4. However, the bimetallic Fe-Ni/Mg(Al)O exhibited its best performance at 650 °C, yielding 65 vol % H2, 10 vol % CH4, and 25 vol % unreacted ethane. The product carbon was in the form of carbon nanotubes (CNT) at all three reaction temperatures, but the morphology of the CNT depended on both the catalyst composition and reaction temperature. The CNTs were formed by a tip-growth mechanism over the Mg(Al)O supported catalysts and were easily purified by a one-step dilute nitric acid treatment. Mössbauer spectroscopy, X-ray absorption fine structure spectroscopy, N2 adsorption-desorption isotherms, TEM, STEM, TGA, and XRD were used to characterize the catalysts and the CNT, revealing the catalytic mechanisms.

  8. Surveys Assessing Students' Attitudes toward Statistics: A Systematic Review of Validity and Reliability

    ERIC Educational Resources Information Center

    Nolan, Meaghan M.; Beran, Tanya; Hecker, Kent G.

    2012-01-01

    Students with positive attitudes toward statistics are likely to show strong academic performance in statistics courses. Multiple surveys measuring students' attitudes toward statistics exist; however, a comparison of the validity and reliability of interpretations based on their scores is needed. A systematic review of relevant electronic…

  10. Classification methods for computerized interpretation of the electrocardiogram.

    PubMed

    Kors, J A; van Bemmel, J H

    1990-09-01

    Two methods for diagnostic classification of the electrocardiogram are described: a heuristic one and a statistical one. In the heuristic approach, the cardiologist provides the knowledge to construct a classifier, usually a decision tree. In the statistical approach, probability densities of diagnostic features are estimated from a learning set of ECGs and multivariate techniques are used to attain diagnostic classification. The relative merits of both approaches with respect to criteria selection, comprehensibility, flexibility, combined diseases, and performance are described. Optimization of heuristic classifiers is discussed. It is concluded that heuristic classifiers are more comprehensible than statistical ones, encounter fewer difficulties in dealing with combined categories, and are flexible in the sense that new categories may readily be added or existing ones refined stepwise. Statistical classifiers, on the other hand, are more easily adapted to another operating environment and require less involvement of cardiologists. Further research is needed to establish differences in performance between the two methods. In relation to performance testing, the issue is raised whether the ECG should be classified using as much prior information as possible, or on its own, explicitly discarding information other than age and sex, with other information used only afterwards to reach a final diagnosis. The consequences of taking either position are discussed. PMID:2233379

  11. Directionality Effects in Simultaneous Language Interpreting: The Case of Sign Language Interpreters in the Netherlands

    ERIC Educational Resources Information Center

    van Dijk, Rick; Boers, Eveline; Christoffels, Ingrid; Hermans, Daan

    2011-01-01

    The quality of interpretations produced by sign language interpreters was investigated. Twenty-five experienced interpreters were instructed to interpret narratives from (a) spoken Dutch to Sign Language of the Netherlands (SLN), (b) spoken Dutch to Sign Supported Dutch (SSD), and (c) SLN to spoken Dutch. The quality of the interpreted narratives…

  12. Biostratinomic utility of Archimedes in environmental interpretation

    SciTech Connect

    Wulff, J.I. )

    1990-04-01

    Biostratinomic information from the bryozoan Archimedes can be used to infer paleocurrent senses when other, more traditional sedimentary structures are lacking. As with other elongate particles, Archimedes zoaria become oriented in the current and, upon settling, preserve a sense of the flow direction. Orientations and lengths were measured on over 200 individuals from bedding plane exposures in the Upper Mississippian Union Limestone (Greenbrier Group) of West Virginia. These were separated into long and short populations and plotted on rose diagrams. The results show that long and short segments become preferentially oriented in the current, and the bimodally distributed long segments can be used to infer the current sense. The current sense is defined by the line that bisects the obtuse angle created by the two maxima in the rose diagram for long segments. Statistical evaluation of the long and short populations indicates that they are significant at the 99.9 percent level. Elongate fossils such as Archimedes can be used in paleocurrent evaluations and can add more detail to the interpretation of paleodepositional conditions.
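
The bisector rule described above can be sketched as a small helper, assuming both rose-diagram maxima are given as undirected axes in the 0-180 degree range; the two modal orientations below are hypothetical, not the West Virginia measurements.

```python
def obtuse_bisector(axis1_deg, axis2_deg):
    """Axis (0-180 deg) bisecting the obtuse angle between two orientation
    axes, as used to define the current sense from the two rose-diagram
    maxima of the long-segment population."""
    d = abs(axis1_deg - axis2_deg)
    mid = (axis1_deg + axis2_deg) / 2.0
    if d <= 90:          # mid bisects the acute angle; rotate by 90 degrees
        mid += 90.0
    return mid % 180.0

# Two hypothetical modal orientations of long Archimedes segments:
print(obtuse_bisector(30.0, 70.0))  # -> 140.0
```

The bisector is itself an axis; which of its two directions is the flow direction must still be decided from independent evidence, since orientations alone give only a sense.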

  13. Statistical Physics of Fracture

    SciTech Connect

    Alava, Mikko; Nukala, Phani K; Zapperi, Stefano

    2006-05-01

    Disorder and long-range interactions are two of the key components that make material failure an interesting playfield for the application of statistical mechanics. The cornerstone in this respect has been lattice models of fracture in which a network of elastic beams, bonds, or electrical fuses with random failure thresholds is subjected to an increasing external load. These models describe on a qualitative level the failure processes of real, brittle, or quasi-brittle materials. This has been particularly important in solving the classical engineering problems of material strength: the size dependence of maximum stress and its sample-to-sample statistical fluctuations. At the same time, lattice models pose many new fundamental questions in statistical physics, such as the relation between fracture and phase transitions. Experimental results point to the existence of an intriguing crackling noise in the acoustic emission and of self-affine fractals in the crack surface morphology. Recent advances in computer power have enabled considerable progress in the understanding of such models. Among these partly still controversial issues are the scaling and size effects in material strength and accumulated damage, the statistics of avalanches or bursts of microfailures, and the morphology of the crack surface. Here we present an overview of the results obtained with lattice models for fracture, highlighting the relations with statistical physics theories and more conventional fracture mechanics approaches.

  14. Statistical Downscaling: Lessons Learned

    NASA Astrophysics Data System (ADS)

    Walton, D.; Hall, A. D.; Sun, F.

    2013-12-01

    In this study, we examine ways to improve statistical downscaling of general circulation model (GCM) output. Why do we downscale GCM output? GCMs have low resolution, so they cannot represent local dynamics and topographic effects that cause spatial heterogeneity in the regional climate change signal. Statistical downscaling recovers fine-scale information by utilizing relationships between the large-scale and fine-scale signals to bridge this gap. In theory, the downscaled climate change signal is more credible and accurate than its GCM counterpart, but in practice, there may be little improvement. Here, we tackle the practical problems that arise in statistical downscaling, using temperature change over the Los Angeles region as a test case. This region is an ideal place to apply downscaling since its complex topography and shoreline are poorly simulated by GCMs. By comparing two popular statistical downscaling methods and one dynamical downscaling method, we identify issues with statistically downscaled climate change signals and develop ways to fix them. We focus on scale mismatch, domain of influence, and other problems - many of which users may be unaware of - and discuss practical solutions.

  15. Data interpretation in breath biomarker research: pitfalls and directions.

    PubMed

    Miekisch, Wolfram; Herbig, Jens; Schubert, Jochen K

    2012-09-01

    Most--if not all--potential diagnostic applications in breath research involve different marker concentrations rather than unique breath markers which only occur in the diseased state. Hence, data interpretation is a crucial step in breath analysis. To avoid artificial significance in breath testing, every effort should be made to implement method validation, data cross-testing and statistical validation along this process. The most common data-analysis-related problems can be classified into three groups: confounding variables (CVs), which have a real correlation with both the diseased state and a breath marker but lead to the erroneous conclusion that disease and breath are in a causal relationship; voodoo correlations (VCs), which can be understood as statistically true correlations that arise coincidentally in the vast number of measured variables; and statistical misconceptions in the study design (SMSD). CV: Typical confounding variables are environmental and medical history, host factors such as gender, age, weight, etc., and parameters that could affect the quality of breath data, such as the subject's breathing mode, effects of breath sampling and effects of the analytical technique itself. VC: The number of measured variables quickly overwhelms the number of samples that can feasibly be taken. As a consequence, the chances of finding coincidental 'voodoo' correlations grow proportionally. VCs can typically be expected in the following scenarios: an insufficient number of patients, (too) many measurement variables, the use of advanced statistical data mining methods, and non-independent data for validation. SMSD: Non-prospective, non-blinded and non-randomized trials, a priori biased study populations, or group selection with unrealistically high disease prevalence typically represent misconceptions of study design. 
In this paper important data interpretation issues are discussed, common pitfalls are addressed and directions for sound data processing and interpretation are proposed. PMID:22854185
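
The 'voodoo correlation' scenario is easy to reproduce numerically: with 20 subjects and 500 random 'marker' variables that have no relationship whatsoever to the (here arbitrary) disease labels, some variable will correlate strongly with the labels by chance alone. All data below are synthetic.

```python
import numpy as np

rng = np.random.default_rng(42)
n_subjects, n_variables = 20, 500

X = rng.normal(size=(n_subjects, n_variables))  # random 'marker' levels
y = np.repeat([0, 1], n_subjects // 2)          # arbitrary disease labels

# Pearson correlation of every variable with the labels, vectorized.
yc = (y - y.mean()) / y.std()
Xc = (X - X.mean(axis=0)) / X.std(axis=0)
r = (Xc * yc[:, None]).mean(axis=0)

print(np.abs(r).max())  # large despite zero real association
```

Independent validation data or a multiple-testing correction is what keeps such chance hits from being reported as biomarkers.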

  16. Statistical models of flares

    NASA Astrophysics Data System (ADS)

    Isliker, H.

    By 'statistical' models of flares we denote global stochastic models of the dynamics of the energy-release process and its associated phenomena, which consider flares to consist of a large number of constituent small-scale processes. The observations strongly support this kind of model: a) Radio and HXR emission of flares is highly fragmented in space and time, suggesting that the flare process itself is spatially and temporally fragmented (De Jager and De Jonge 1978, Benz 1985, Aschwanden et al. 1990). b) The temporal dynamics of flares has been shown to be 'complex' (relatively high-dimensional chaotic or stochastic) through time-series analysis of radio emission (dimension estimates and power spectra: Isliker and Benz (1994), Isliker (1996), Ryabov et al. (1997); wavelet transform: Aschwanden et al. 1998, Schwarz et al. 1998). c) Spatially, there are only weak and local correlations between neighbouring burst sites, reminiscent of a chain reaction (analysis of nb-spike spectrograms with symbolic dynamics: Schwarz et al. 1993). The most prominent global dynamical models of the energy-release process which comprise entire flares are Cellular Automata (CA) models (Lu and Hamilton 1991, Lu et al. 1993; extended to model nano-flares: Vlahos et al. 1995, Georgoulis and Vlahos 1996; including non-local communications: MacKinnon et al. 1996; an analytic approach: MacKinnon and MacPherson 1997). In these models, the local processes (reconnection) are modeled in a strongly simplified way, by simple evolution rules, so that inhomogeneous active regions can be modeled in their entirety. Alternatively, Isliker (1996) proposed a shot-noise model for flares. This model is able to explain the temporal characteristics of the flare process; however, it is formal and has so far not been tied to physics. 
A different class of stochastic models has been proposed to explain the dynamics of the corona as a whole, with randomly occurring flares (Rosner and Vaiana 1978, criticized in Lu 1995b; Litvinenko 1996; a new approach (a master equation for the flare occurrence probability): Wheatland and Glukhov 1998). In this approach, structures within a flare are not resolved; the aim is to explain the occurrence rate and total sizes of flares. The CA models are successful in explaining the distributions of the peak fluxes, total fluxes, and durations of HXR emission, which are all power laws (see references in Aschwanden et al. 1998). In the radio range, peak-flux distributions of generalized power-law and exponential shape are observed, which generally are steeper than in the HXR (type I: Mercier and Trottet (1997); type III, decim. pulsations, nb-spikes: Aschwanden et al. 1998; type III: Isliker and Vlahos 1998; nb-spikes: Isliker and Benz 1998). Since radio waves can be emitted in low-energy events, the steep distributions might be a hint that small flares (micro-flares) also have a steep distribution and might therewith contribute substantially to coronal heating. It must be noted, however, that poor time or frequency resolution can lead to a steepening of the peak-flux distributions (Isliker and Benz 1998), an effect whose influence on the published events still has to be assessed. Originally, the evolution rules of the CAs were only loosely motivated by physical considerations and basically taken from the 'sand-pile' paradigm; above all, the connection between CAs and MHD (the local theory of magnetic reconnection) was missing. Recently, Isliker et al. (1998) have shown that the evolution rules of the CAs correspond to localized, threshold-dependent diffusion, directly implementing the solution of a diffusion equation with unknown diffusivity and scales. 
Thus, CAs can be interpreted as an implementation of the (simplified) induction equation in a large, inhomogeneous medium. A complete flare model needs to incorporate not just the energy-release process, but also the acceleration and transport of particles, as well as the generation of EM emission. First steps in this direction have been taken: Anastasiadis et al. 1997 studied acce

  17. Statistical electron densities

    SciTech Connect

    Pipek, J.; Varga, I.

    1996-12-31

    It is known that in numerous interesting systems one-electron states appear with multifractal internal structure. Physical intuition suggests, however, that electron densities should be smooth both at atomic distances and close to the macroscopic limit. Multifractal behavior is expected at intermediate length scales, with observable non-trivial statistical properties in considerably sized, but far from macroscopic, clusters. We have demonstrated that differences of generalized Renyi entropies serve as relevant quantities for the global characterization of the statistical nature of such electron densities. Asymptotic expansion formulas are elaborated for these values as functions of the length scale of observation. The transition from deterministic electron densities to statistical ones across various lengths of resolution is traced both theoretically and by numerical calculations.
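
A minimal sketch of the quantities involved: the generalized Renyi entropies S_q = ln(sum_i p_i^q) / (1 - q) of a normalized density, whose differences across q are the characterization used above. The toy densities below are invented; for a uniform density all S_q coincide, so the differences vanish, while a localized density gives S_1 > S_2.

```python
import numpy as np

def renyi_entropy(p, q):
    """Generalized Renyi entropy S_q = ln(sum p_i^q) / (1 - q); q = 1 is
    the Shannon limit. p must be a normalized probability (density) vector."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    if q == 1:
        return float(-np.sum(p * np.log(p)))
    return float(np.log(np.sum(p ** q)) / (1.0 - q))

uniform = np.full(8, 1 / 8)              # featureless density: S_1 = S_2
peaked = np.array([0.7, 0.1, 0.1, 0.1])  # localized density:   S_1 > S_2
print(renyi_entropy(uniform, 1) - renyi_entropy(uniform, 2))
print(renyi_entropy(peaked, 1) - renyi_entropy(peaked, 2))
```

Multifractal states are those for which such differences remain non-trivial as the system size grows, instead of tending to the uniform or fully localized limits.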

  18. Suite versus composite statistics

    USGS Publications Warehouse

    Balsillie, J.H.; Tanner, W.F.

    1999-01-01

    Suite and composite methodologies, two statistically valid approaches for producing statistical descriptive measures, are investigated for sample groups representing a probability distribution where, in addition, each sample is itself a probability distribution. Suite and composite means (first moment measures) are always equivalent. Composite standard deviations (second moment measures) are always larger than suite standard deviations. Suite and composite values for higher moment measures have more complex relationships. Very seldom, however, are they equivalent, and they normally yield statistically significant but different results. Multiple samples are preferable to single samples (including composites) because they permit the investigator to examine sample-to-sample variability. These and other relationships for suite and composite probability distribution analyses are investigated and reported using granulometric data.
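
The first two relationships are easy to verify numerically. In the sketch below, three synthetic equal-sized samples stand in for granulometric distributions: the suite mean (mean of the sample means) matches the composite mean (mean of the pooled data), while the composite standard deviation exceeds the suite value because pooling adds the between-sample spread.

```python
import numpy as np

rng = np.random.default_rng(1)
# Three equal-sized synthetic samples with different means (between-sample spread).
samples = [rng.normal(loc=mu, scale=1.0, size=100) for mu in (0.0, 1.0, 2.0)]

suite_mean = np.mean([s.mean() for s in samples])  # mean of sample means
suite_std = np.mean([s.std() for s in samples])    # mean of sample std devs

pooled = np.concatenate(samples)                   # composite: pool the raw data
composite_mean = pooled.mean()
composite_std = pooled.std()

print(bool(np.isclose(suite_mean, composite_mean)))  # True (equal sample sizes)
print(bool(composite_std > suite_std))               # True
```

With unequal sample sizes the suite mean would have to be weighted by sample size to recover the composite mean exactly.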

  19. Candidate Assembly Statistical Evaluation

    SciTech Connect

    1998-07-15

    The Savannah River Site (SRS) receives aluminum clad spent Material Test Reactor (MTR) fuel from all over the world for storage and eventual reprocessing. There are hundreds of different kinds of MTR fuels and these fuels will continue to be received at SRS for approximately ten more years. SRS's current criticality evaluation methodology requires the modeling of all MTR fuels utilizing Monte Carlo codes, which is extremely time consuming and resource intensive. Now that a significant number of MTR calculations have been conducted it is feasible to consider building statistical models that will provide reasonable estimations of MTR behavior. These statistical models can be incorporated into a standardized model homogenization spreadsheet package to provide analysts with a means of performing routine MTR fuel analyses with a minimal commitment of time and resources. This became the purpose for development of the Candidate Assembly Statistical Evaluation (CASE) program at SRS.

  20. Candidate Assembly Statistical Evaluation

    Energy Science and Technology Software Center (ESTSC)

    1998-07-15

    The Savannah River Site (SRS) receives aluminum clad spent Material Test Reactor (MTR) fuel from all over the world for storage and eventual reprocessing. There are hundreds of different kinds of MTR fuels and these fuels will continue to be received at SRS for approximately ten more years. SRS's current criticality evaluation methodology requires the modeling of all MTR fuels utilizing Monte Carlo codes, which is extremely time consuming and resource intensive. Now that a significant number of MTR calculations have been conducted it is feasible to consider building statistical models that will provide reasonable estimations of MTR behavior. These statistical models can be incorporated into a standardized model homogenization spreadsheet package to provide analysts with a means of performing routine MTR fuel analyses with a minimal commitment of time and resources. This became the purpose for development of the Candidate Assembly Statistical Evaluation (CASE) program at SRS.

  1. Comparison of a Novel Computerized Analysis Program and Visual Interpretation of Cardiotocography

    PubMed Central

    Chen, Chen-Yu; Yu, Chun; Chang, Chia-Chen; Lin, Chii-Wann

    2014-01-01

    Objective To compare a novel computerized analysis program with visual cardiotocography (CTG) interpretation results. Methods Sixty-two intrapartum CTG tracings with 20- to 30-minute sections were independently interpreted using a novel computerized analysis program, as well as the visual interpretations of eight obstetricians, to evaluate the baseline fetal heart rate (FHR), baseline FHR variability, number of accelerations, number/type of decelerations, uterine contraction (UC) frequency, and the National Institute of Child Health and Human Development (NICHD) 3-Tier FHR classification system. Results There was no significant difference in interobserver variation after adding the components of computerized analysis to results from the obstetricians' visual interpretations, with excellent agreement for the baseline FHR (ICC 0.91), the number of accelerations (ICC 0.85), UC frequency (ICC 0.97), and NICHD category I (kappa statistic 0.91); good agreement for baseline variability (kappa statistic 0.68), the numbers of early decelerations (ICC 0.78) and late decelerations (ICC 0.67), category II (kappa statistic 0.78), and overall categories (kappa statistic 0.80); and moderate agreement for the number of variable decelerations (ICC 0.60), and category III (kappa statistic 0.50). Conclusions This computerized analysis program is not inferior to visual interpretation, may improve interobserver variations, and could play a vital role in prenatal telemedicine. PMID:25437442
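
A minimal sketch of the kappa statistic reported above, which corrects the observed agreement between two raters for the agreement expected by chance; the two raters' category calls below are invented for illustration, not the study's data.

```python
import numpy as np

def cohens_kappa(r1, r2):
    """Cohen's kappa: (p_o - p_e) / (1 - p_e), where p_o is the observed
    agreement and p_e the chance agreement implied by the raters' marginals."""
    r1, r2 = np.asarray(r1), np.asarray(r2)
    cats = np.union1d(r1, r2)
    p_o = float(np.mean(r1 == r2))
    p_e = sum(float(np.mean(r1 == c)) * float(np.mean(r2 == c)) for c in cats)
    return (p_o - p_e) / (1.0 - p_e)

# Invented NICHD-style category calls from two hypothetical raters:
a = ["I", "I", "II", "II", "I", "III", "II", "I"]
b = ["I", "I", "II", "I",  "I", "III", "II", "I"]
print(round(cohens_kappa(a, b), 2))  # 0.78
```

Kappa applies to categorical calls (the NICHD tiers); the continuous measures in the study (baseline FHR, counts) are instead compared with the intraclass correlation coefficient.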

  2. Statistical Learning without Attention.

    PubMed

    Yang, Feitong; Flombaum, Jonathan

    2015-09-01

    We sought to investigate the role of attention in statistical learning, an area where current results conflict. Given a stream of shapes in two different colors, and instructed to attend one of the colors (via a cover task), will observers learn statistical regularities associated with the unattended color? Following previous studies, we employed a reaction time (RT) test following encoding. Speeded responses are made to a target shape (on each trial) embedded within an RSVP stream, with learning demonstrated as an RT benefit for second and third triplet items. However, typical procedures repeat the same triplets as fillers and test items, making learning during testing possible. We therefore conducted an experiment with only a test phase (i.e., no incidental exposure), and we found significant RT benefits consistent with statistical learning, even in the first 48 of 96 test trials. These results demonstrate that statistical learning can take place rapidly during the course of procedures that are at times employed to diagnose prior learning. We thus returned to the question of attention with a modified test procedure. In addition to a set of eight triplets shown during incidental exposure, we generated a place-holder set of 12 additional shapes. Each test trial then included a target shape from one of the learning triplets. It appeared embedded appropriately within its triplet, but with that triplet embedded within a larger set including nine of the 12 place-holder items. After confirming a lack of statistical learning during the test phase (i.e., without pre-exposure), we used it as the test component for the attended and unattended color experiment described initially. We found significant learning effects for attended and unattended shapes. In addition to furnishing an updated RT test, these results demonstrate the robustness of statistical learning, which arose rapidly and for unattended stimuli. Meeting abstract presented at VSS 2015. PMID:26326580

  3. Interpreting neurodynamics: concepts and facts

    PubMed Central

    Rotter, Stefan

    2008-01-01

    The dynamics of neuronal systems, briefly neurodynamics, has developed into an attractive and influential research branch within neuroscience. In this paper, we discuss a number of conceptual issues in neurodynamics that are important for an appropriate interpretation and evaluation of its results. We demonstrate their relevance for selected topics of theoretical and empirical work. In particular, we refer to the notions of determinacy and stochasticity in neurodynamics across levels of microscopic, mesoscopic and macroscopic descriptions. The issue of correlations between neural, mental and behavioral states is also addressed in some detail. We propose an informed discussion of conceptual foundations with respect to neurobiological results as a viable step to a fruitful future philosophy of neuroscience. PMID:19003452

  4. Interpretation of rapidly rotating pulsars

    SciTech Connect

    Weber, F. (Inst. fuer Theoretische Physik); Glendenning, N.K.

    1992-08-05

    The minimum possible rotational period of pulsars, which are interpreted as rotating neutron stars, is determined by applying a representative collection of realistic nuclear equations of state. It is found that none of the selected equations of state allows for neutron star rotation at periods below 0.8--0.9 ms. Thus, this work strongly supports the suggestion that if pulsars with shorter rotational periods were found, these are likely to be strange-quark-matter stars. The conclusion that the confined hadronic phase of nucleons and nuclei is only metastable would then be almost inescapable, and the plausible ground-state in that event is the deconfined phase of (3-flavor) strange-quark-matter.

  5. Glaciation of northwestern Wyoming interpreted from ERTS-1

    NASA Technical Reports Server (NTRS)

    Breckenridge, R. M.

    1973-01-01

    Analysis of ERTS imagery has shown that a number of alpine glacial features can be recognized and mapped successfully. Although the Wyoming mountains are generally regarded as the type locality for Rocky Mountain glaciation, some areas have not been studied from a glacial standpoint because of inaccessibility or lack of topographic control. ERTS imagery provides an excellent base for this type of regional geomorphic study. A map of the maximum extent of Wisconsin ice, flow directions and major glacial features was compiled from interpretation of the ERTS imagery. Features which can be mapped are large moraines, outwash fans and terraces. Present-day glaciers and snowfields are easily discriminated and mapped. Glaciers and glacial deposits which serve as aquifers play a significant role in the hydrologic cycle and are important because of the increasing demand placed on our water resources. ERTS provides a quick and effective method for change detection and inventory of these vital resources.

  6. QUALITATIVE INTERPRETATION OF GALAXY SPECTRA

    SciTech Connect

    Sanchez Almeida, J.; Morales-Luis, A. B.; Terlevich, R.; Terlevich, E.; Cid Fernandes, R.

    2012-09-10

    We describe a simple step-by-step guide to qualitative interpretation of galaxy spectra. Rather than an alternative to existing automated tools, it is put forward as an instrument for quick-look analysis and for gaining physical insight when interpreting the outputs provided by automated tools. Though the recipe is for general application, it was developed for understanding the nature of the Automatic Spectroscopic K-means-based (ASK) template spectra. They resulted from the classification of all the galaxy spectra in the Sloan Digital Sky Survey data release 7, thus being a comprehensive representation of the galaxy spectra in the local universe. Using the recipe, we give a description of the properties of the gas and the stars that characterize the ASK classes, from those corresponding to passively evolving galaxies, to H II galaxies undergoing a galaxy-wide starburst. The qualitative analysis is found to be in excellent agreement with quantitative analyses of the same spectra. We compare the mean ages of the stellar populations with those inferred using the code STARLIGHT. We also examine the estimated gas-phase metallicity with the metallicities obtained using electron-temperature-based methods. A number of byproducts follow from the analysis. There is a tight correlation between the age of the stellar population and the metallicity of the gas, which is stronger than the correlations between galaxy mass and stellar age, and galaxy mass and gas metallicity. The galaxy spectra are known to follow a one-dimensional sequence, and we identify the luminosity-weighted mean stellar age as the affine parameter that describes the sequence. All ASK classes happen to have a significant fraction of old stars, although spectrum-wise they are outshined by the youngest populations. Old stars are metal-rich or metal-poor depending on whether they reside in passive galaxies or in star-forming galaxies.

  7. CONTEMPORARY ENVIRONMENTAL APPLICATIONS OF PHOTOGRAPHIC INTERPRETATION

    EPA Science Inventory

    Aerial photographic interpretation is a time-tested technique for extracting landscape-level information from aerial photographs and other types of remotely sensed images. The U.S. Environmental Protection Agency's Environmental Photographic Interpretation Center (EPIC) has a 2...

  8. Weighted order statistic classifiers with large rank-order margin.

    SciTech Connect

    Porter, R. B.; Hush, D. R.; Theiler, J. P.; Gokhale, M.

    2003-01-01

    We describe how Stack Filter and Weighted Order Statistic function classes can be used for classification problems. This leads to a new design criterion for linear classifiers when inputs are binary-valued and weights are positive. We present a rank-based measure of margin that can be directly optimized as a standard linear program and investigate its effect on generalization error experimentally. Our approach can robustly combine large numbers of base hypotheses and easily implement known priors through regularization.
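    The binary-input, positive-weight case has a simple equivalent form: duplicating each input w_i times and taking an order statistic of the expanded list is the same as thresholding the weighted sum. A minimal sketch of that equivalence (illustrative only; `wos_classify`, its weights, and threshold are hypothetical, not the paper's trained classifiers or its linear-program training procedure):

    ```python
    import numpy as np

    def wos_classify(x, weights, threshold):
        """Weighted order statistic (stack-filter-style) decision.

        x: binary 0/1 input vector; weights: positive integer weights.
        Duplicating each input w_i times and taking the t-th largest
        value of the expanded list equals thresholding the weighted sum,
        so the decision can be computed either way.
        Returns (order-statistic decision, linear-threshold decision).
        """
        x = np.asarray(x)
        w = np.asarray(weights)
        expanded = np.repeat(x, w)                # each x_i duplicated w_i times
        order_stat = np.sort(expanded)[::-1][threshold - 1]  # t-th largest
        linear = int(w @ x >= threshold)          # equivalent linear threshold
        return int(order_stat), linear
    ```

    For binary data the t-th largest element of the expanded list is 1 exactly when the number of ones, i.e. the weighted sum, reaches t, so the two decisions always agree.
    
    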

  9. The Extended Statistical Analysis of Toxicity Tests Using Standardised Effect Sizes (SESs): A Comparison of Nine Published Papers

    PubMed Central

    Festing, Michael F. W.

    2014-01-01

    The safety of chemicals, drugs, novel foods and genetically modified crops is often tested using repeat-dose sub-acute toxicity tests in rats or mice. It is important to avoid misinterpretations of the results as these tests are used to help determine safe exposure levels in humans. Treated and control groups are compared for a range of haematological, biochemical and other biomarkers which may indicate tissue damage or other adverse effects. However, the statistical analysis and presentation of such data poses problems due to the large number of statistical tests which are involved. Often, it is not clear whether a “statistically significant” effect is real or a false positive (type I error) due to sampling variation. The authors' conclusions appear to be reached somewhat subjectively from the pattern of statistical significances, discounting those which they judge to be type I errors and ignoring any biomarker where the p-value is greater than p = 0.05. However, by using standardised effect sizes (SESs) a range of graphical methods and an overall assessment of the mean absolute response can be made. The approach is an extension, not a replacement, of existing methods. It is intended to assist toxicologists and regulators in the interpretation of the results. Here, the SES analysis has been applied to data from nine published sub-acute toxicity tests in order to compare the findings with those of the original authors. Line plots, box plots and bar plots show the pattern of response. Dose-response relationships are easily seen. A “bootstrap” test compares the mean absolute differences across dose groups. In four out of seven papers where the no observed adverse effect level (NOAEL) was estimated by the authors, it was set too high according to the bootstrap test, suggesting that possible toxicity is under-estimated. PMID:25426843
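    The core quantities are easy to state: for each biomarker, SES = (treated mean − control mean) / pooled SD, and an overall resampling test compares the mean absolute SES across biomarkers against chance. A hedged sketch under assumed group sizes and a permutation-style resampling (one plausible reading of the "bootstrap" comparison, not the paper's exact published procedure):

    ```python
    import numpy as np

    def standardised_effect_sizes(control, treated):
        """SES per biomarker: (treated mean - control mean) / pooled SD.

        control, treated: (n_animals, n_biomarkers) arrays.
        """
        n1, n2 = len(control), len(treated)
        diff = treated.mean(axis=0) - control.mean(axis=0)
        pooled_var = ((n1 - 1) * control.var(axis=0, ddof=1)
                      + (n2 - 1) * treated.var(axis=0, ddof=1)) / (n1 + n2 - 2)
        return diff / np.sqrt(pooled_var)

    def bootstrap_mean_abs_ses(control, treated, n_boot=2000, seed=0):
        """Resampling p-value for the mean absolute SES across biomarkers.

        Null distribution: relabel animals at random between the two
        groups and recompute the mean |SES|.
        """
        rng = np.random.default_rng(seed)
        observed = np.mean(np.abs(standardised_effect_sizes(control, treated)))
        pooled = np.vstack([control, treated])
        n1 = len(control)
        count = 0
        for _ in range(n_boot):
            idx = rng.permutation(len(pooled))
            stat = np.mean(np.abs(standardised_effect_sizes(
                pooled[idx[:n1]], pooled[idx[n1:]])))
            if stat >= observed:
                count += 1
        return observed, (count + 1) / (n_boot + 1)

    # hypothetical study: 10 animals per group, 8 biomarkers,
    # with a simulated treatment effect on the first 4 biomarkers
    rng = np.random.default_rng(1)
    control = rng.normal(size=(10, 8))
    treated = rng.normal(size=(10, 8))
    treated[:, :4] += 2.0
    obs, p = bootstrap_mean_abs_ses(control, treated)
    ```

    On this synthetic data the shifted biomarkers dominate the mean absolute SES, so the resampling test rejects the null of no treatment effect.
    
    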

  10. Proteny: discovering and visualizing statistically significant syntenic clusters at the proteome level

    PubMed Central

    Gehrmann, Thies; Reinders, Marcel J.T.

    2015-01-01

    Background: With more and more genomes being sequenced, detecting synteny between genomes becomes more and more important. However, for microorganisms the genomic divergence quickly becomes large, resulting in different codon usage and shuffling of gene order and gene elements such as exons. Results: We present Proteny, a methodology to detect synteny between diverged genomes. It operates on the amino acid sequence level to be insensitive to codon usage adaptations and clusters groups of exons disregarding order to handle diversity in genomic ordering between genomes. Furthermore, Proteny assigns significance levels to the syntenic clusters such that they can be selected on statistical grounds. Finally, Proteny provides novel ways to visualize results at different scales, facilitating the exploration and interpretation of syntenic regions. We test the performance of Proteny on a standard ground truth dataset, and we illustrate the use of Proteny on two closely related genomes (two different strains of Aspergillus niger) and on two distant genomes (two species of Basidiomycota). In comparison to other tools, we find that Proteny finds clusters with more true homologies in fewer clusters that contain more genes, i.e. Proteny is able to identify a more consistent synteny. Further, we show how genome rearrangements, assembly errors, gene duplications and the conservation of specific genes can be easily studied with Proteny. Availability and implementation: Proteny is freely available at the Delft Bioinformatics Lab website http://bioinformatics.tudelft.nl/dbl/software. Contact: t.gehrmann@tudelft.nl Supplementary information: Supplementary data are available at Bioinformatics online. PMID:26116928

  11. Multivariate statistical analysis strategy for multiple misfire detection in internal combustion engines

    NASA Astrophysics Data System (ADS)

    Hu, Chongqing; Li, Aihua; Zhao, Xingyang

    2011-02-01

    This paper proposes a multivariate statistical analysis approach to processing the instantaneous engine speed signal for the purpose of locating multiple misfire events in internal combustion engines. The state of each cylinder is described with a characteristic vector extracted from the instantaneous engine speed signal following a three-step procedure. These characteristic vectors are considered as the values of various procedure parameters of an engine cycle. Therefore, determination of the occurrence of misfire events and identification of misfiring cylinders can be accomplished by a principal component analysis (PCA) based pattern recognition methodology. The proposed algorithm can be implemented easily in practice because the threshold can be defined adaptively without information about operating conditions. In addition, the effect of torsional vibration on the engine speed waveform is interpreted as the presence of a super-powerful cylinder, which is also isolated by the algorithm. The misfiring cylinder and the super-powerful cylinder are often adjacent in the firing sequence, so missed detections and false alarms can be avoided effectively by checking the relationship between the cylinders.
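    The PCA step can be illustrated in a few lines: fit principal components to the characteristic vectors of many engine cycles, then flag vectors whose reconstruction error (Q statistic) exceeds an adaptive, data-derived threshold. A minimal sketch on synthetic data; the two-factor feature model and the median-plus-MAD threshold are illustrative assumptions, not the paper's exact algorithm:

    ```python
    import numpy as np

    def detect_misfire_rows(X, n_components=2, k=5.0):
        """Flag anomalous characteristic vectors by PCA reconstruction error.

        X: (n_cycles, n_features) matrix, one characteristic vector per row.
        Rows whose residual after projection onto the leading principal
        components exceeds median + k * MAD (an adaptive threshold requiring
        no knowledge of the operating condition) are flagged.
        """
        Xc = X - X.mean(axis=0)
        _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
        P = Vt[:n_components].T                 # principal-component loadings
        resid = Xc - (Xc @ P) @ P.T             # out-of-subspace residual
        q = np.sum(resid**2, axis=1)            # Q (squared prediction error)
        med = np.median(q)
        mad = np.median(np.abs(q - med)) + 1e-12
        return np.flatnonzero(q > med + k * mad)

    rng = np.random.default_rng(0)
    # 60 normal engine cycles: 5 features driven by two latent factors + noise
    W = np.array([[1.0, 1.0, 0.0, 0.0, 0.0],
                  [0.0, 0.0, 1.0, 1.0, 0.0]]) / np.sqrt(2.0)
    X = rng.normal(size=(60, 2)) @ W + rng.normal(0.0, 0.05, size=(60, 5))
    # two misfire-like cycles deviating outside the normal subspace (hypothetical)
    misfire = np.tile([0.0, 0.0, 0.0, 0.0, 3.0], (2, 1))
    X = np.vstack([X, misfire + rng.normal(0.0, 0.05, size=(2, 5))])
    flagged = detect_misfire_rows(X)
    ```

    The anomalous rows lie outside the subspace spanned by the normal variation, so their Q statistic is orders of magnitude above the adaptive threshold.
    
    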

  12. FIR statistics of paired galaxies

    NASA Astrophysics Data System (ADS)

    Sulentic, Jack W.

    1990-11-01

    Much progress has been made in understanding the effects of interaction on galaxies (see reviews in this volume by Heckman and Kennicutt). Evidence for enhanced emission from galaxies in pairs first emerged in the radio (Sulentic 1976) and optical (Larson and Tinsley 1978) domains. Results in the far infrared (FIR) lagged behind until the advent of the Infrared Astronomy Satellite (IRAS). The last five years have seen numerous FIR studies of optical and IR selected samples of interacting galaxies (e.g., Cutri and McAlary 1985; Joseph and Wright 1985; Kennicutt et al. 1987; Haynes and Herter 1988). Despite all of this work, there are still contradictory ideas about the level and, even, the reality of an FIR enhancement in interacting galaxies. Much of the confusion originates in differences between the galaxy samples that were studied (i.e., optical morphology and redshift coverage). Here, the authors report on a study of the FIR detection properties for a large sample of interacting galaxies and a matching control sample. They focus on the distance independent detection fraction (DF) statistics of the sample. The results prove useful in interpreting the previously published work. A clarification of the phenomenology provides valuable clues about the physics of the FIR enhancement in galaxies.

  13. FIR statistics of paired galaxies

    NASA Technical Reports Server (NTRS)

    Sulentic, Jack W.

    1990-01-01

    Much progress has been made in understanding the effects of interaction on galaxies (see reviews in this volume by Heckman and Kennicutt). Evidence for enhanced emission from galaxies in pairs first emerged in the radio (Sulentic 1976) and optical (Larson and Tinsley 1978) domains. Results in the far infrared (FIR) lagged behind until the advent of the Infrared Astronomy Satellite (IRAS). The last five years have seen numerous FIR studies of optical and IR selected samples of interacting galaxies (e.g., Cutri and McAlary 1985; Joseph and Wright 1985; Kennicutt et al. 1987; Haynes and Herter 1988). Despite all of this work, there are still contradictory ideas about the level and, even, the reality of an FIR enhancement in interacting galaxies. Much of the confusion originates in differences between the galaxy samples that were studied (i.e., optical morphology and redshift coverage). Here, the authors report on a study of the FIR detection properties for a large sample of interacting galaxies and a matching control sample. They focus on the distance independent detection fraction (DF) statistics of the sample. The results prove useful in interpreting the previously published work. A clarification of the phenomenology provides valuable clues about the physics of the FIR enhancement in galaxies.

  14. Statistical model with a standard Γ distribution

    NASA Astrophysics Data System (ADS)

    Patriarca, Marco; Chakraborti, Anirban; Kaski, Kimmo

    2004-07-01

    We study a statistical model consisting of N basic units which interact with each other by exchanging a physical entity, according to a given microscopic random law, depending on a parameter λ. We focus on the equilibrium or stationary distribution of the entity exchanged and verify through numerical fitting of the simulation data that the final form of the equilibrium distribution is that of a standard Gamma distribution. The model can be interpreted as a simple closed economy in which economic agents trade money and a saving criterion is fixed by the saving propensity λ. Alternatively, from the nature of the equilibrium distribution, we show that the model can also be interpreted as a perfect gas at an effective temperature T(λ), where particles exchange energy in a space with an effective dimension D(λ).
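    A minimal simulation makes the model concrete (a sketch of the standard saving-propensity kinetic exchange rule; parameter values are illustrative). Money is conserved exactly in every trade, and the equilibrium histogram is well fitted by a Gamma distribution of shape D(λ)/2 with D(λ) = 1 + 3λ/(1 − λ):

    ```python
    import numpy as np

    def exchange_model(n_agents=500, n_trades=200_000, lam=0.5, seed=0):
        """Pairwise money exchange with saving propensity lam.

        Each trade: two random agents keep a fraction lam of their own
        money and split the remainder of their pooled money at random.
        Total money is conserved in every trade.
        """
        rng = np.random.default_rng(seed)
        m = np.ones(n_agents)                     # one money unit per agent
        for _ in range(n_trades):
            i, j = rng.choice(n_agents, size=2, replace=False)
            pot = (1.0 - lam) * (m[i] + m[j])     # amount put up for trade
            eps = rng.random()                    # random split fraction
            m[i] = lam * m[i] + eps * pot
            m[j] = lam * m[j] + (1.0 - eps) * pot
        return m

    m = exchange_model()
    # effective dimension and Gamma shape for lam = 0.5:
    lam = 0.5
    shape = 0.5 * (1.0 + 3.0 * lam / (1.0 - lam))  # D(lam)/2 = 2.0
    ```

    A unit-mean Gamma distribution of shape k has variance 1/k, so for λ = 0.5 the simulated wealth distribution should show a variance near 0.5 once equilibrated.
    
    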

  15. 10 CFR 63.5 - Interpretations.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 2 2013-01-01 2013-01-01 false Interpretations. 63.5 Section 63.5 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN A GEOLOGIC REPOSITORY AT YUCCA MOUNTAIN, NEVADA General Provisions § 63.5 Interpretations. Except as specifically authorized by the Commission in writing, no interpretation of...

  16. 10 CFR 20.1006 - Interpretations.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 1 2014-01-01 2014-01-01 false Interpretations. 20.1006 Section 20.1006 Energy NUCLEAR REGULATORY COMMISSION STANDARDS FOR PROTECTION AGAINST RADIATION General Provisions § 20.1006 Interpretations. Except as specifically authorized by the Commission in writing, no interpretation of the meaning of...

  17. Comprehension and Error Monitoring in Simultaneous Interpreters

    ERIC Educational Resources Information Center

    Yudes, Carolina; Macizo, Pedro; Morales, Luis; Bajo, M. Teresa

    2013-01-01

    In the current study we explored lexical, syntactic, and semantic processes during text comprehension in English monolinguals and Spanish/English (first language/second language) bilinguals with different experience in interpreting (nontrained bilinguals, interpreting students and professional interpreters). The participants performed an…

  18. 12 CFR 907.5 - Regulatory Interpretations.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 12 Banks and Banking 8 2012-01-01 2012-01-01 false Regulatory Interpretations. 907.5 Section 907.5... OPERATIONS PROCEDURES Waivers, Approvals, No-Action Letters, and Regulatory Interpretations § 907.5 Regulatory Interpretations. (a) Authority. Finance Board staff, in its discretion, may issue a...

  19. 12 CFR 907.5 - Regulatory Interpretations.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 12 Banks and Banking 7 2010-01-01 2010-01-01 false Regulatory Interpretations. 907.5 Section 907.5... OPERATIONS PROCEDURES Waivers, Approvals, No-Action Letters, and Regulatory Interpretations § 907.5 Regulatory Interpretations. (a) Authority. Finance Board staff, in its discretion, may issue a...

  20. 12 CFR 907.5 - Regulatory Interpretations.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 12 Banks and Banking 8 2014-01-01 2014-01-01 false Regulatory Interpretations. 907.5 Section 907.5... OPERATIONS PROCEDURES Waivers, Approvals, No-Action Letters, and Regulatory Interpretations § 907.5 Regulatory Interpretations. (a) Authority. Finance Board staff, in its discretion, may issue a...

  1. 12 CFR 907.5 - Regulatory Interpretations.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 12 Banks and Banking 7 2011-01-01 2011-01-01 false Regulatory Interpretations. 907.5 Section 907.5... OPERATIONS PROCEDURES Waivers, Approvals, No-Action Letters, and Regulatory Interpretations § 907.5 Regulatory Interpretations. (a) Authority. Finance Board staff, in its discretion, may issue a...

  2. 12 CFR 907.5 - Regulatory Interpretations.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 12 Banks and Banking 8 2013-01-01 2013-01-01 false Regulatory Interpretations. 907.5 Section 907.5... OPERATIONS PROCEDURES Waivers, Approvals, No-Action Letters, and Regulatory Interpretations § 907.5 Regulatory Interpretations. (a) Authority. Finance Board staff, in its discretion, may issue a...

  3. 12 CFR 609.920 - Interpretations.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 12 Banks and Banking 6 2010-01-01 2010-01-01 false Interpretations. 609.920 Section 609.920 Banks and Banking FARM CREDIT ADMINISTRATION FARM CREDIT SYSTEM ELECTRONIC COMMERCE Interpretations and Definitions § 609.920 Interpretations. (a) E-SIGN preempts most statutes and regulations, including the Act and its implementing regulations...

  4. 12 CFR 609.920 - Interpretations.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 12 Banks and Banking 7 2013-01-01 2013-01-01 false Interpretations. 609.920 Section 609.920 Banks and Banking FARM CREDIT ADMINISTRATION FARM CREDIT SYSTEM ELECTRONIC COMMERCE Interpretations and Definitions § 609.920 Interpretations. (a) E-SIGN preempts most statutes and regulations, including the Act and its implementing regulations...

  5. 12 CFR 609.920 - Interpretations.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 12 Banks and Banking 6 2011-01-01 2011-01-01 false Interpretations. 609.920 Section 609.920 Banks and Banking FARM CREDIT ADMINISTRATION FARM CREDIT SYSTEM ELECTRONIC COMMERCE Interpretations and Definitions § 609.920 Interpretations. (a) E-SIGN preempts most statutes and regulations, including the Act and its implementing regulations...

  6. The transactional interpretation of quantum mechanics

    NASA Astrophysics Data System (ADS)

    Cramer, John G.

    1986-07-01

    The interpretational problems of quantum mechanics are considered. The way in which the standard Copenhagen interpretation of quantum mechanics deals with these problems is reviewed. A new interpretation of the formalism of quantum mechanics, the transactional interpretation, is presented. The basic element of this interpretation is the transaction describing a quantum event as an exchange of advanced and retarded waves, as implied by the work of Wheeler and Feynman, Dirac, and others. The transactional interpretation is explicitly nonlocal and thereby consistent with recent tests of the Bell inequality, yet is relativistically invariant and fully causal. A detailed comparison of the transactional and Copenhagen interpretations is made in the context of well-known quantum-mechanical Gedankenexperimente and "paradoxes." The transactional interpretation permits quantum-mechanical wave functions to be interpreted as real waves physically present in space rather than as "mathematical representations of knowledge" as in the Copenhagen interpretation. The transactional interpretation is shown to provide insight into the complex character of the quantum-mechanical state vector and the mechanism associated with its "collapse." It also leads in a natural way to justification of the Heisenberg uncertainty principle and the Born probability law (P=ψψ*), basic elements of the Copenhagen interpretation.

  7. An Online Synchronous Test for Professional Interpreters

    ERIC Educational Resources Information Center

    Chen, Nian-Shing; Ko, Leong

    2010-01-01

    This article is based on an experiment designed to conduct an interpreting test for multiple candidates online, using web-based synchronous cyber classrooms. The test model was based on the accreditation test for Professional Interpreters produced by the National Accreditation Authority of Translators and Interpreters (NAATI) in Australia.…

  8. 10 CFR 50.3 - Interpretations.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 1 2010-01-01 2010-01-01 false Interpretations. 50.3 Section 50.3 Energy NUCLEAR REGULATORY COMMISSION DOMESTIC LICENSING OF PRODUCTION AND UTILIZATION FACILITIES General Provisions § 50.3 Interpretations. Except as specifically authorized by the Commission in writing, no interpretation of the...

  9. 12 CFR 591.6 - Interpretations.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 12 Banks and Banking 6 2013-01-01 2012-01-01 true Interpretations. 591.6 Section 591.6 Banks and Banking OFFICE OF THRIFT SUPERVISION, DEPARTMENT OF THE TREASURY PREEMPTION OF STATE DUE-ON-SALE LAWS § 591.6 Interpretations. The Office periodically will publish Interpretations under section 341 of...

  10. 12 CFR 591.6 - Interpretations.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 12 Banks and Banking 5 2011-01-01 2011-01-01 false Interpretations. 591.6 Section 591.6 Banks and Banking OFFICE OF THRIFT SUPERVISION, DEPARTMENT OF THE TREASURY PREEMPTION OF STATE DUE-ON-SALE LAWS § 591.6 Interpretations. The Office periodically will publish Interpretations under section 341 of...

  11. 12 CFR 191.6 - Interpretations.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 12 Banks and Banking 1 2012-01-01 2012-01-01 false Interpretations. 191.6 Section 191.6 Banks and Banking COMPTROLLER OF THE CURRENCY, DEPARTMENT OF THE TREASURY PREEMPTION OF STATE DUE-ON-SALE LAWS § 191.6 Interpretations. The OCC periodically will publish Interpretations under section 341 of the...

  12. 12 CFR 191.6 - Interpretations.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 12 Banks and Banking 1 2013-01-01 2013-01-01 false Interpretations. 191.6 Section 191.6 Banks and Banking COMPTROLLER OF THE CURRENCY, DEPARTMENT OF THE TREASURY PREEMPTION OF STATE DUE-ON-SALE LAWS § 191.6 Interpretations. The OCC periodically will publish Interpretations under section 341 of the...

  13. Sex differences in interpretation bias in adolescents.

    PubMed

    Gluck, Rachel L; Lynn, Debra A; Dritschel, Barbara; Brown, Gillian R

    2014-03-01

    Interpretation biases, in which ambiguous information is interpreted negatively, have been hypothesized to place adolescent females at greater risk of developing anxiety and mood disorders than same-aged males. We tested the hypothesis that adolescent girls interpret ambiguous scenarios more negatively, and/or less positively, than same-aged males using the Adolescent Interpretation and Belief Questionnaire (N = 67, 11-15 years old). We also tested whether adolescent girls and boys differed in judging positive or negative interpretations to be more believable and whether the scenario content (social vs. non-social) affected any sex difference in interpretation bias. The results showed that girls had higher average negative interpretation scores than boys, with no sex differences in positive interpretation scores. Girls and boys did not differ on which interpretation they found to be most believable. Both sexes reported that positive interpretations were less likely to come to mind, and were less believable, for social than for non-social scenarios. These results provide preliminary evidence for sex differences in interpretation biases in adolescence and support the hypothesis that social scenarios are a specific source of anxiety to this age group. A greater understanding of the aetiology of interpretation biases will potentially enhance sex- and age-specific interventions for anxiety and mood disorders. PMID:24417225

  14. 10 CFR 76.6 - Interpretations.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 2 2010-01-01 2010-01-01 false Interpretations. 76.6 Section 76.6 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) CERTIFICATION OF GASEOUS DIFFUSION PLANTS General Provisions § 76.6 Interpretations. Except as specifically authorized by the Commission in writing, no interpretation of the...

  15. 10 CFR 76.6 - Interpretations.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 2 2011-01-01 2011-01-01 false Interpretations. 76.6 Section 76.6 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) CERTIFICATION OF GASEOUS DIFFUSION PLANTS General Provisions § 76.6 Interpretations. Except as specifically authorized by the Commission in writing, no interpretation of the...

  16. 10 CFR 39.5 - Interpretations.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 1 2012-01-01 2012-01-01 false Interpretations. 39.5 Section 39.5 Energy NUCLEAR REGULATORY COMMISSION LICENSES AND RADIATION SAFETY REQUIREMENTS FOR WELL LOGGING General Provisions § 39.5 Interpretations. Except as specifically authorized by the Commission in writing, no interpretation of the...

  17. 10 CFR 39.5 - Interpretations.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 1 2010-01-01 2010-01-01 false Interpretations. 39.5 Section 39.5 Energy NUCLEAR REGULATORY COMMISSION LICENSES AND RADIATION SAFETY REQUIREMENTS FOR WELL LOGGING General Provisions § 39.5 Interpretations. Except as specifically authorized by the Commission in writing, no interpretation of the...

  18. 10 CFR 39.5 - Interpretations.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 1 2013-01-01 2013-01-01 false Interpretations. 39.5 Section 39.5 Energy NUCLEAR REGULATORY COMMISSION LICENSES AND RADIATION SAFETY REQUIREMENTS FOR WELL LOGGING General Provisions § 39.5 Interpretations. Except as specifically authorized by the Commission in writing, no interpretation of the...

  19. 10 CFR 39.5 - Interpretations.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 1 2011-01-01 2011-01-01 false Interpretations. 39.5 Section 39.5 Energy NUCLEAR REGULATORY COMMISSION LICENSES AND RADIATION SAFETY REQUIREMENTS FOR WELL LOGGING General Provisions § 39.5 Interpretations. Except as specifically authorized by the Commission in writing, no interpretation of the...

  20. 10 CFR 39.5 - Interpretations.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 1 2014-01-01 2014-01-01 false Interpretations. 39.5 Section 39.5 Energy NUCLEAR REGULATORY COMMISSION LICENSES AND RADIATION SAFETY REQUIREMENTS FOR WELL LOGGING General Provisions § 39.5 Interpretations. Except as specifically authorized by the Commission in writing, no interpretation of the...

  1. 10 CFR 1016.7 - Interpretations.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 4 2010-01-01 2010-01-01 false Interpretations. 1016.7 Section 1016.7 Energy DEPARTMENT OF ENERGY (GENERAL PROVISIONS) SAFEGUARDING OF RESTRICTED DATA General Provisions § 1016.7 Interpretations. Except as specifically authorized by the Secretary of Energy in writing, no interpretation of...

  2. 10 CFR 70.6 - Interpretations.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 2 2011-01-01 2011-01-01 false Interpretations. 70.6 Section 70.6 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DOMESTIC LICENSING OF SPECIAL NUCLEAR MATERIAL General Provisions § 70.6 Interpretations. Except as specifically authorized by the Commission in writing, no interpretation of the...

  3. 10 CFR 70.6 - Interpretations.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 2 2012-01-01 2012-01-01 false Interpretations. 70.6 Section 70.6 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DOMESTIC LICENSING OF SPECIAL NUCLEAR MATERIAL General Provisions § 70.6 Interpretations. Except as specifically authorized by the Commission in writing, no interpretation of the...

  4. 10 CFR 7.3 - Interpretations.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 1 2012-01-01 2012-01-01 false Interpretations. 7.3 Section 7.3 Energy NUCLEAR REGULATORY COMMISSION ADVISORY COMMITTEES § 7.3 Interpretations. Except as specifically authorized by the Commission in writing, no interpretation of the meaning of the regulations in this part by an NRC officer or...

  5. 10 CFR 110.3 - Interpretations.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 2 2011-01-01 2011-01-01 false Interpretations. 110.3 Section 110.3 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) EXPORT AND IMPORT OF NUCLEAR EQUIPMENT AND MATERIAL General Provisions § 110.3 Interpretations. Except as authorized by the Commission in writing, no interpretation of...

  6. 10 CFR 7.3 - Interpretations.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 1 2013-01-01 2013-01-01 false Interpretations. 7.3 Section 7.3 Energy NUCLEAR REGULATORY COMMISSION ADVISORY COMMITTEES § 7.3 Interpretations. Except as specifically authorized by the Commission in writing, no interpretation of the meaning of the regulations in this part by an NRC officer or...

  7. 10 CFR 110.3 - Interpretations.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 2 2012-01-01 2012-01-01 false Interpretations. 110.3 Section 110.3 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) EXPORT AND IMPORT OF NUCLEAR EQUIPMENT AND MATERIAL General Provisions § 110.3 Interpretations. Except as authorized by the Commission in writing, no interpretation of...

  8. 10 CFR 25.7 - Interpretations.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 1 2013-01-01 2013-01-01 false Interpretations. 25.7 Section 25.7 Energy NUCLEAR REGULATORY COMMISSION ACCESS AUTHORIZATION General Provisions § 25.7 Interpretations. Except as specifically authorized by the Commission in writing, no interpretation of the meaning of the regulations in this part...

  9. 10 CFR 110.3 - Interpretations.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 2 2013-01-01 2013-01-01 false Interpretations. 110.3 Section 110.3 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) EXPORT AND IMPORT OF NUCLEAR EQUIPMENT AND MATERIAL General Provisions § 110.3 Interpretations. Except as authorized by the Commission in writing, no interpretation of...

  10. 10 CFR 70.6 - Interpretations.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 2 2014-01-01 2014-01-01 false Interpretations. 70.6 Section 70.6 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DOMESTIC LICENSING OF SPECIAL NUCLEAR MATERIAL General Provisions § 70.6 Interpretations. Except as specifically authorized by the Commission in writing, no interpretation of the...

  11. 10 CFR 9.5 - Interpretations.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 1 2012-01-01 2012-01-01 false Interpretations. 9.5 Section 9.5 Energy NUCLEAR REGULATORY COMMISSION PUBLIC RECORDS § 9.5 Interpretations. Except as specifically authorized by the Commission in writing, no interpretation of the meaning of the regulations in this part by an officer or employee of...

  12. 10 CFR 9.5 - Interpretations.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 1 2011-01-01 2011-01-01 false Interpretations. 9.5 Section 9.5 Energy NUCLEAR REGULATORY COMMISSION PUBLIC RECORDS § 9.5 Interpretations. Except as specifically authorized by the Commission in writing, no interpretation of the meaning of the regulations in this part by an officer or employee of...

  13. 10 CFR 110.3 - Interpretations.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 2 2014-01-01 2014-01-01 false Interpretations. 110.3 Section 110.3 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) EXPORT AND IMPORT OF NUCLEAR EQUIPMENT AND MATERIAL General Provisions § 110.3 Interpretations. Except as authorized by the Commission in writing, no interpretation of...

  14. 10 CFR 7.3 - Interpretations.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 1 2010-01-01 2010-01-01 false Interpretations. 7.3 Section 7.3 Energy NUCLEAR REGULATORY COMMISSION ADVISORY COMMITTEES § 7.3 Interpretations. Except as specifically authorized by the Commission in writing, no interpretation of the meaning of the regulations in this part by an NRC officer or...

  15. 10 CFR 9.5 - Interpretations.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 1 2014-01-01 2014-01-01 false Interpretations. 9.5 Section 9.5 Energy NUCLEAR REGULATORY COMMISSION PUBLIC RECORDS § 9.5 Interpretations. Except as specifically authorized by the Commission in writing, no interpretation of the meaning of the regulations in this part by an officer or employee of...

  16. 10 CFR 70.6 - Interpretations.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 2 2013-01-01 2013-01-01 false Interpretations. 70.6 Section 70.6 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DOMESTIC LICENSING OF SPECIAL NUCLEAR MATERIAL General Provisions § 70.6 Interpretations. Except as specifically authorized by the Commission in writing, no interpretation of the...

  17. 10 CFR 70.6 - Interpretations.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 2 2010-01-01 2010-01-01 false Interpretations. 70.6 Section 70.6 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DOMESTIC LICENSING OF SPECIAL NUCLEAR MATERIAL General Provisions § 70.6 Interpretations. Except as specifically authorized by the Commission in writing, no interpretation of the...

  18. 10 CFR 9.5 - Interpretations.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 1 2010-01-01 2010-01-01 false Interpretations. 9.5 Section 9.5 Energy NUCLEAR REGULATORY COMMISSION PUBLIC RECORDS § 9.5 Interpretations. Except as specifically authorized by the Commission in writing, no interpretation of the meaning of the regulations in this part by an officer or employee of...

  19. 10 CFR 7.3 - Interpretations.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 1 2011-01-01 2011-01-01 false Interpretations. 7.3 Section 7.3 Energy NUCLEAR REGULATORY COMMISSION ADVISORY COMMITTEES § 7.3 Interpretations. Except as specifically authorized by the Commission in writing, no interpretation of the meaning of the regulations in this part by an NRC officer or...

  20. 10 CFR 9.5 - Interpretations.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 1 2013-01-01 2013-01-01 false Interpretations. 9.5 Section 9.5 Energy NUCLEAR REGULATORY COMMISSION PUBLIC RECORDS § 9.5 Interpretations. Except as specifically authorized by the Commission in writing, no interpretation of the meaning of the regulations in this part by an officer or employee of...