The Malpractice of Statistical Interpretation
ERIC Educational Resources Information Center
Fraas, John W.; Newman, Isadore
1978-01-01
Problems associated with the use of gain scores, analysis of covariance, multicollinearity, part and partial correlation, and the lack of rectilinearity in regression are discussed. Particular attention is paid to the misuse of statistical techniques. (JKS)
Statistical weld process monitoring with expert interpretation
Cook, G.E.; Barnett, R.J.; Strauss, A.M.; Thompson, F.M. Jr.
1996-12-31
A statistical weld process monitoring system is described. Using data on voltage, current, wire feed speed, gas flow rate, travel speed, and elapsed arc time collected while welding, the welding statistical process control (SPC) tool provides weld process quality control by implementing techniques of data trending analysis, tolerance analysis, and sequential analysis. For purposes of quality control, the control limits required for acceptance are specified in the weld procedure acceptance specifications. The control charts then provide quality assurance documentation for each weld. The statistical data trending analysis performed by the SPC program is not only valuable as a quality assurance monitoring and documentation system; it is also valuable in providing diagnostic assistance in troubleshooting equipment and material problems. Possible equipment/process problems are identified and matched with features of the SPC control charts. To aid in interpreting the voluminous statistical output generated by the SPC system, a large number of if-then rules have been devised to provide computer-based expert advice for pinpointing problems based on out-of-limit variations of the control charts. The paper describes the SPC monitoring tool and the rule-based expert interpreter developed for relating control chart trends to equipment/process problems.
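The tolerance check underlying such an SPC tool can be sketched in a few lines. This is an illustration with assumed values and 3-sigma limits, not the monitoring system described in the abstract:

```python
# Minimal SPC-style tolerance check (illustrative; baseline values and
# the 3-sigma rule are assumptions, not the paper's actual tool).

def control_limits(baseline, k=3.0):
    """Lower/center/upper control limits from a baseline sample."""
    n = len(baseline)
    mean = sum(baseline) / n
    sigma = (sum((x - mean) ** 2 for x in baseline) / (n - 1)) ** 0.5
    return mean - k * sigma, mean, mean + k * sigma

def out_of_limit(samples, lcl, ucl):
    """Indices of samples falling outside the control limits."""
    return [i for i, x in enumerate(samples) if not lcl <= x <= ucl]

baseline = [24.9, 25.1, 25.0, 24.8, 25.2, 25.0, 24.9, 25.1]  # e.g. arc volts
lcl, center, ucl = control_limits(baseline)
flags = out_of_limit([25.0, 25.1, 27.5, 24.9], lcl, ucl)  # 27.5 is a spike
```

Out-of-limit indices like `flags` are the raw events an if-then rule base would then match against known equipment/process problems.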
Vickers, Andrew J
2005-01-01
Analysis of variance (ANOVA) is a statistical method that is widely used in the psychosomatic literature to analyze the results of randomized trials, yet ANOVA does not provide an estimate for the difference between groups, the key variable of interest in a randomized trial. Although the use of ANOVA is frequently justified on the grounds that a trial incorporates more than two groups, the hypothesis tested by ANOVA for these trials--"Are all groups equivalent?"--is often scientifically uninteresting. Regression methods are not only applicable to trials with many groups, but can be designed to address specific questions arising from the study design. ANOVA is also frequently used for trials with repeated measures, but the consequent reporting of "group effects," "time effects," and "time-by-group interactions" is a distraction from statistics of clinical and scientific value. Given that ANOVA is easily misapplied in the analysis of randomized trials, alternative approaches such as regression methods should be considered in preference.
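An illustrative contrast with hypothetical data (not from the paper): regressing the outcome on a 0/1 group indicator estimates the between-group difference directly, the quantity an omnibus ANOVA F-test ("are all groups equivalent?") never reports.

```python
# With a binary 0/1 regressor, the OLS slope equals the difference of
# group means, i.e. the treatment effect of interest in a two-arm trial.

def ols_slope(x, y):
    """Slope of the least-squares line y = a + b*x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    return sxy / sxx

control = [5.0, 6.0, 5.5, 6.5]   # hypothetical outcomes, control arm
treated = [7.0, 8.0, 7.5, 8.5]   # hypothetical outcomes, treated arm
x = [0] * len(control) + [1] * len(treated)
y = control + treated
effect = ols_slope(x, y)  # equals mean(treated) - mean(control)
```

Here `effect` is 2.0, exactly the mean difference; a standard error and confidence interval for this slope answer the trial's actual question.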
Kar, Supratik; Gajewicz, Agnieszka; Puzyn, Tomasz; Roy, Kunal
2014-06-01
As experimental evaluation of the safety of nanoparticles (NPs) is expensive and time-consuming, computational approaches have been found to be an efficient alternative for predicting the potential toxicity of new NPs before mass production. In this background, we have developed here a regression-based nano quantitative structure-activity relationship (nano-QSAR) model to establish statistically significant relationships between the measured cellular uptakes of 109 magnetofluorescent NPs in pancreatic cancer cells with their physical, chemical, and structural properties encoded within easily computable, interpretable and reproducible descriptors. The developed model was rigorously validated internally as well as externally with the application of the principles of Organization for Economic Cooperation and Development (OECD). The test for domain of applicability was also carried out for checking reliability of the predictions. Important fragments contributing to higher/lower cellular uptake of NPs were identified through critical analysis and interpretation of the developed model. Considering all these identified structural attributes, one can choose or design safe, economical and suitable surface modifiers for NPs. The presented approach provides rich information in the context of virtual screening of relevant NP libraries.
Statistical interpretation of “femtomolar” detection
Go, Jonghyun; Alam, Muhammad A.
2009-01-01
We calculate the statistics of diffusion-limited arrival-time distribution by a Monte Carlo method to suggest a simple statistical resolution of the enduring puzzle of nanobiosensors: a persistent gap between reports of analyte detection at approximately femtomolar concentration and theory suggesting the impossibility of approximately subpicomolar detection at the corresponding incubation time. The incubation time used in the theory is actually the mean incubation time, while experimental conditions suggest that device stability limited the minimum incubation time. The difference in incubation times—both described by characteristic power laws—provides an intuitive explanation of different detection limits anticipated by theory and experiments. PMID:19690630
Max Born's Statistical Interpretation of Quantum Mechanics.
Pais, A
1982-12-17
In the summer of 1926, a statistical element was introduced for the first time in the fundamental laws of physics in two papers by Born. After a brief account of Born's earlier involvements with quantum physics, including his bringing the new mechanics to the United States, the motivation for and contents of Born's two papers are discussed. The reaction of his colleagues is described.
The Statistical Interpretation of Entropy: An Activity
ERIC Educational Resources Information Center
Timmberlake, Todd
2010-01-01
The second law of thermodynamics, which states that the entropy of an isolated macroscopic system can increase but will not decrease, is a cornerstone of modern physics. Ludwig Boltzmann argued that the second law arises from the motion of the atoms that compose the system. Boltzmann's statistical mechanics provides deep insight into the…
The Statistical Interpretation of Entropy: An Activity
NASA Astrophysics Data System (ADS)
Timmberlake, Todd
2010-11-01
The second law of thermodynamics, which states that the entropy of an isolated macroscopic system can increase but will not decrease, is a cornerstone of modern physics. Ludwig Boltzmann argued that the second law arises from the motion of the atoms that compose the system. Boltzmann's statistical mechanics provides deep insight into the functioning of the second law and also provided evidence for the existence of atoms at a time when many scientists (like Ernst Mach and Wilhelm Ostwald) were skeptical.
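A small companion calculation in the spirit of such an activity (standard Einstein-solid counting, assumed here rather than taken from the article): Boltzmann's S = k ln W, where W is the number of microstates.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def multiplicity(N, q):
    """Microstates of N oscillators sharing q energy quanta (stars and bars)."""
    return math.comb(q + N - 1, q)

def entropy(N, q):
    """Boltzmann entropy S = k * ln(W)."""
    return K_B * math.log(multiplicity(N, q))

# Sharing energy evenly between two 3-oscillator solids raises W, hence S:
W_uneven = multiplicity(3, 6) * multiplicity(3, 0)  # all 6 quanta on one side
W_even = multiplicity(3, 3) * multiplicity(3, 3)    # 3 quanta on each side
```

Counting gives W_uneven = 28 versus W_even = 100: the evenly shared macrostate is overwhelmingly more probable, which is the statistical content of the second law.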
For a statistical interpretation of Helmholtz' thermal displacement
NASA Astrophysics Data System (ADS)
Podio-Guidugli, Paolo
2016-11-01
Starting from the classic papers by Einstein and Langevin on Brownian motion, two consistent statistical interpretations are given for the thermal displacement, a scalar field formally introduced by Helmholtz whose time derivative is, by definition, the absolute temperature.
The Statistical Interpretation of Classical Thermodynamic Heating and Expansion Processes
ERIC Educational Resources Information Center
Cartier, Stephen F.
2011-01-01
A statistical model has been developed and applied to interpret thermodynamic processes typically presented from the macroscopic, classical perspective. Through this model, students learn and apply the concepts of statistical mechanics, quantum mechanics, and classical thermodynamics in the analysis of the (i) constant volume heating, (ii)…
Pass-Fail Testing: Statistical Requirements and Interpretations
Gilliam, David; Leigh, Stefan; Rukhin, Andrew; Strawderman, William
2009-01-01
Performance standards for detector systems often include requirements for probability of detection and probability of false alarm at a specified level of statistical confidence. This paper reviews the accepted definitions of confidence level and of critical value. It describes the testing requirements for establishing either of these probabilities at a desired confidence level. These requirements are computable in terms of functions that are readily available in statistical software packages and general spreadsheet applications. The statistical interpretations of the critical values are discussed. A table is included for illustration, and a plot is presented showing the minimum required numbers of pass-fail tests. The results given here are applicable to one-sided testing of any system with performance characteristics conforming to a binomial distribution. PMID:27504221
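The zero-failure case of such pass-fail testing follows from standard binomial reasoning (a sketch, not the paper's table): if all n trials pass, a detection probability of at least p0 is demonstrated at confidence C once p0**n <= 1 - C, i.e. n >= log(1 - C) / log(p0).

```python
import math

def min_pass_tests(p0, confidence):
    """Smallest all-pass trial count demonstrating detection prob >= p0.

    From the one-sided binomial tail with zero failures:
    p0**n <= 1 - confidence  =>  n >= log(1 - confidence) / log(p0).
    """
    return math.ceil(math.log(1.0 - confidence) / math.log(p0))

n_required = min_pass_tests(0.90, 0.95)
```

For p0 = 0.90 at 95% confidence this gives the classic requirement of 29 consecutive passes; tightening p0 to 0.95 raises it to 59.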
Statistical Interpretation of Natural and Technological Hazards in China
NASA Astrophysics Data System (ADS)
Borthwick, Alistair, Prof.; Ni, Jinren, Prof.
2010-05-01
China is prone to catastrophic natural hazards from floods, droughts, earthquakes, storms, cyclones, landslides, epidemics, extreme temperatures, forest fires, avalanches, and even tsunami. This paper will list statistics related to the six worst natural disasters in China over the past 100 or so years, ranked according to number of fatalities. The corresponding data for the six worst natural disasters in China over the past decade will also be considered. [The data are abstracted from the International Disaster Database, Centre for Research on the Epidemiology of Disasters (CRED), Université Catholique de Louvain, Brussels, Belgium, http://www.cred.be/ where a disaster is defined as occurring if one of the following criteria is fulfilled: 10 or more people reported killed; 100 or more people reported affected; a call for international assistance; or declaration of a state of emergency.] The statistics include the number of occurrences of each type of natural disaster, the number of deaths, the number of people affected, and the cost in billions of US dollars. Over the past hundred years, the largest disasters may be related to the overabundance or scarcity of water, and to earthquake damage. However, there has been a substantial relative reduction in fatalities due to water related disasters over the past decade, even though the overall numbers of people affected remain huge, as does the economic damage. This change is largely due to the efforts put in by China's water authorities to establish effective early warning systems, the construction of engineering countermeasures for flood protection, the implementation of water pricing and other measures for reducing excessive consumption during times of drought. It should be noted that the dreadful death toll due to the Sichuan Earthquake dominates recent data. Joint research has been undertaken between the Department of Environmental Engineering at Peking University and the Department of Engineering Science at Oxford
A Critique of Divorce Statistics and Their Interpretation.
ERIC Educational Resources Information Center
Crosby, John F.
1980-01-01
Increasingly, appeals to the divorce statistic are employed to substantiate claims that the family is in a state of breakdown and marriage is passe. This article contains a consideration of reasons why the divorce statistics are invalid and/or unreliable as indicators of the present state of marriage and family. (Author)
Workplace Statistical Literacy for Teachers: Interpreting Box Plots
ERIC Educational Resources Information Center
Pierce, Robyn; Chick, Helen
2013-01-01
As a consequence of the increased use of data in workplace environments, there is a need to understand the demands that are placed on users to make sense of such data. In education, teachers are being increasingly expected to interpret and apply complex data about student and school performance, and, yet it is not clear that they always have the…
Statistical characteristics of MST radar echoes and its interpretation
NASA Technical Reports Server (NTRS)
Woodman, Ronald F.
1989-01-01
Two concepts of fundamental importance are reviewed: the autocorrelation function and the frequency power spectrum. In addition, some turbulence concepts, the relationship between radar signals and atmospheric medium statistics, partial reflection, and the characteristics of noise and clutter interference are discussed.
Statistical Interpretation of the Local Field Inside Dielectrics.
ERIC Educational Resources Information Center
Berrera, Ruben G.; Mello, P. A.
1982-01-01
Compares several derivations of the Clausius-Mossotti relation to analyze consistently the nature of approximations used and their range of applicability. Also presents a statistical-mechanical calculation of the local field for classical system of harmonic oscillators interacting via the Coulomb potential. (Author/SK)
Variation in reaction norms: Statistical considerations and biological interpretation.
Morrissey, Michael B; Liefting, Maartje
2016-09-01
Analysis of reaction norms, the functions by which the phenotype produced by a given genotype depends on the environment, is critical to studying many aspects of phenotypic evolution. Different techniques are available for quantifying different aspects of reaction norm variation. We examine what biological inferences can be drawn from some of the more readily applicable analyses for studying reaction norms. We adopt a strongly biologically motivated view, but draw on statistical theory to highlight strengths and drawbacks of different techniques. In particular, consideration of some formal statistical theory leads to revision of some recently, and forcefully, advocated opinions on reaction norm analysis. We clarify what simple analysis of the slope between mean phenotype in two environments can tell us about reaction norms, explore the conditions under which polynomial regression can provide robust inferences about reaction norm shape, and explore how different existing approaches may be used to draw inferences about variation in reaction norm shape. We show how mixed model-based approaches can provide more robust inferences than more commonly used multistep statistical approaches, and derive new metrics of the relative importance of variation in reaction norm intercepts, slopes, and curvatures.
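A minimal polynomial-regression example with hypothetical numbers (not the authors' data): a quadratic reaction norm, phen = intercept + slope*env + curv*env^2, fitted by least squares. With environments centered and symmetric about zero, odd and even terms decouple, so the normal equations reduce to closed form.

```python
env = [-2.0, -1.0, 0.0, 1.0, 2.0]   # centered, symmetric environments
phen = [3.8, 2.9, 2.5, 2.9, 4.1]    # hypothetical mean phenotypes

n = len(env)
s2 = sum(e ** 2 for e in env)       # sum of x^2
s4 = sum(e ** 4 for e in env)       # sum of x^4

# Odd part: the linear slope decouples from the even terms.
slope = sum(e * p for e, p in zip(env, phen)) / s2

# Even part: solve [s4 s2; s2 n] [curv, intercept]^T = [sum(x^2*y), sum(y)].
b1 = sum(e ** 2 * p for e, p in zip(env, phen))
b0 = sum(phen)
det = s4 * n - s2 * s2
curv = (n * b1 - s2 * b0) / det
intercept = (s4 * b0 - s2 * b1) / det
```

The three coefficients are exactly the intercept, slope, and curvature components whose relative importance the metrics discussed above aim to quantify.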
A statistical model for interpreting computerized dynamic posturography data
NASA Technical Reports Server (NTRS)
Feiveson, Alan H.; Metter, E. Jeffrey; Paloski, William H.
2002-01-01
Computerized dynamic posturography (CDP) is widely used for assessment of altered balance control. CDP trials are quantified using the equilibrium score (ES), which ranges from zero to 100, as a decreasing function of peak sway angle. The problem of how best to model and analyze ESs from a controlled study is considered. The ES often exhibits a skewed distribution in repeated trials, which can lead to incorrect inference when applying standard regression or analysis of variance models. Furthermore, CDP trials are terminated when a patient loses balance. In these situations, the ES is not observable, but is assigned the lowest possible score--zero. As a result, the response variable has a mixed discrete-continuous distribution, further compromising inference obtained by standard statistical methods. Here, we develop alternative methodology for analyzing ESs under a stochastic model extending the ES to a continuous latent random variable that always exists, but is unobserved in the event of a fall. Loss of balance occurs conditionally, with probability depending on the realized latent ES. After fitting the model by a form of quasi-maximum-likelihood, one may perform statistical inference to assess the effects of explanatory variables. An example is provided, using data from the NIH/NIA Baltimore Longitudinal Study on Aging.
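A toy simulation of the censoring mechanism described above, with assumed (not fitted) parameter values: every trial has a latent equilibrium score, but a fall, whose probability rises as the latent score drops, records the trial as the lowest possible score, zero.

```python
import math
import random

rng = random.Random(7)

def observed_es(latent, steepness=0.15, midpoint=40.0):
    """Return the latent ES, or 0.0 if a fall occurs.

    The logistic fall model and the steepness/midpoint values are
    illustrative assumptions, not the authors' fitted parameters.
    """
    p_fall = 1.0 / (1.0 + math.exp(steepness * (latent - midpoint)))
    return 0.0 if rng.random() < p_fall else latent

trials = [observed_es(rng.gauss(70.0, 15.0)) for _ in range(1000)]
falls = sum(1 for t in trials if t == 0.0)  # the discrete mass at zero
```

The resulting sample mixes a continuous part (observed latent scores) with a discrete spike at zero, which is why standard regression or ANOVA on such data misleads.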
Inferring the statistical interpretation of quantum mechanics from the classical limit
Gottfried
2000-06-01
It is widely believed that the statistical interpretation of quantum mechanics cannot be inferred from the Schrodinger equation itself, and must be stated as an additional independent axiom. Here I propose that the situation is not so stark. For systems that have both continuous and discrete degrees of freedom (such as coordinates and spin respectively), the statistical interpretation for the discrete variables is implied by requiring that the system's gross motion can be classically described under circumstances specified by the Schrodinger equation. However, this is not a full-fledged derivation of the statistical interpretation because it does not apply to the continuous variables of classical mechanics.
Instruments, methods, statistics, plasmaphysical interpretation of type IIIb bursts
NASA Astrophysics Data System (ADS)
Urbarz, H. W.
Type-IIIb solar bursts in the m-dkm band and the methods used to study them are characterized in a review of recent research. The value of high-resolution spectrographs (with effective apertures of 1000-100,000 sq m, frequency resolution 20 kHz, and time resolution 100 msec) in detecting and investigating type-IIIb bursts is emphasized, and the parameters of the most important instruments are listed in a table. Burst spectra, sources, polarization, flux, occurrence, and association with other types are discussed and illustrated with sample spectra, tables, and histograms. The statistics of observations made at Weissenau Observatory (Tuebingen, FRG) from August, 1978, through December, 1979, are considered in detail. Theories proposed to explain type-III and type-IIIb bursts are summarized, including frequency splitting (FS) of the Langmuir spectrum, FS during the transverse-wave conversion process, FS during propagation-effect transverse-wave escape, and discrete source regions with different f(p) values.
Impact of Equity Models and Statistical Measures on Interpretations of Educational Reform
ERIC Educational Resources Information Center
Rodriguez, Idaykis; Brewe, Eric; Sawtelle, Vashti; Kramer, Laird H.
2012-01-01
We present three models of equity and show how these, along with the statistical measures used to evaluate results, impact interpretation of equity in education reform. Equity can be defined and interpreted in many ways. Most equity education reform research strives to achieve equity by closing achievement gaps between groups. An example is given…
NASA Astrophysics Data System (ADS)
Kuić, Domagoj
2016-05-01
In this paper an alternative approach to statistical mechanics based on the maximum information entropy principle (MaxEnt) is examined, specifically its close relation with the Gibbs method of ensembles. It is shown that the MaxEnt formalism is the logical extension of the Gibbs formalism of equilibrium statistical mechanics that is entirely independent of the frequentist interpretation of probabilities only as factual (i.e. experimentally verifiable) properties of the real world. Furthermore, we show that, consistently with the law of large numbers, the relative frequencies of the ensemble of systems prepared under identical conditions (i.e. identical constraints) actually correspond to the MaxEnt probabilities in the limit of a large number of systems in the ensemble. This result implies that the probabilities in statistical mechanics can be interpreted, independently of the frequency interpretation, on the basis of the maximum information entropy principle.
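A worked MaxEnt example in the spirit of the abstract (the classic Brandeis-dice setup, assumed here for illustration, not material from the paper): the maximum-entropy distribution over die faces 1..6 with a fixed mean has the Gibbs form p_i proportional to exp(-lam * i), with lam found by bisection.

```python
import math

FACES = list(range(1, 7))

def mean_for(lam):
    """Mean of the Gibbs distribution p_i ~ exp(-lam * i)."""
    weights = [math.exp(-lam * i) for i in FACES]
    z = sum(weights)
    return sum(i * w for i, w in zip(FACES, weights)) / z

def maxent_lambda(target_mean, lo=-10.0, hi=10.0):
    """Bisection on lam; mean_for is strictly decreasing in lam."""
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if mean_for(mid) > target_mean:
            lo = mid  # mean still too high: move toward larger lam
        else:
            hi = mid
    return 0.5 * (lo + hi)

lam = maxent_lambda(4.5)
weights = [math.exp(-lam * i) for i in FACES]
z = sum(weights)
probs = [w / z for w in weights]  # biased toward high faces (mean 4.5 > 3.5)
```

An unconstraining mean of 3.5 recovers the uniform distribution (lam = 0), illustrating that MaxEnt adds no structure beyond what the constraints impose.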
The early statistical interpretations of quantum mechanics in the USA and USSR
NASA Astrophysics Data System (ADS)
Pechenkin, Alexander
2012-02-01
This article is devoted to the statistical (ensemble) interpretations of quantum mechanics which appeared in the USA and USSR before World War II and in the early war years. The author emphasizes a remarkable similarity between the statements which arose in different scientific, philosophical, and even political contexts. The comparative analysis extends to the scientific and philosophical traditions which lay behind the American and Soviet statistical interpretations of quantum mechanics. The author insists that the philosophy of quantum mechanics is an autonomous branch rather than an applied philosophy or philosophical physics.
Statistical Tools for the Interpretation of Enzootic West Nile virus Transmission Dynamics.
Caillouët, Kevin A; Robertson, Suzanne
2016-01-01
Interpretation of enzootic West Nile virus (WNV) surveillance indicators requires little advanced mathematical skill, but greatly enhances the ability of public health officials to prescribe effective WNV management tactics. Stepwise procedures for the calculation of mosquito infection rates (IR) and vector index (VI) are presented alongside statistical tools that require additional computation. A brief review of advantages and important considerations for each statistic's use is provided. PMID:27188561
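Simple versions of the two indicators named above can be written directly (standard CDC-style formulas, assumed here; the paper's stepwise procedures may differ):

```python
def minimum_infection_rate(positive_pools, total_tested):
    """MIR per 1,000 mosquitoes tested.

    Assumes one infected mosquito per positive pool (the usual MIR
    approximation; maximum-likelihood IR estimators relax this).
    """
    return 1000.0 * positive_pools / total_tested

def vector_index(abundance_per_trap_night, ir_per_1000):
    """VI: average abundance times estimated proportion infected."""
    return abundance_per_trap_night * ir_per_1000 / 1000.0

mir = minimum_infection_rate(positive_pools=3, total_tested=1500)
vi = vector_index(abundance_per_trap_night=25.0, ir_per_1000=mir)
```

With these illustrative inputs, mir is 2.0 infected per 1,000 tested and vi is 0.05 infected mosquitoes per trap night, the kind of figure public health officials track against action thresholds.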
Material Phase Causality or a Dynamics-Statistical Interpretation of Quantum Mechanics
Koprinkov, I. G.
2010-11-25
The internal phase dynamics of a quantum system interacting with an electromagnetic field is revealed in detail. Theoretical and experimental evidence of a causal relation of the phase of the wave function to the dynamics of the quantum system is presented systematically for the first time. A dynamics-statistical interpretation of quantum mechanics is introduced.
Interpreting Statistical Significance Test Results: A Proposed New "What If" Method.
ERIC Educational Resources Information Center
Kieffer, Kevin M.; Thompson, Bruce
As the 1994 publication manual of the American Psychological Association emphasized, "p" values are affected by sample size. As a result, it can be helpful to interpret the results of statistical significance tests in a sample size context by conducting so-called "what if" analyses. However, these methods can be inaccurate unless "corrected" effect…
On the Interpretation of Running Trends as Summary Statistics for Time Series Analysis
NASA Astrophysics Data System (ADS)
Vigo, Isabel M.; Trottini, Mario; Belda, Santiago
2016-04-01
In recent years, running trends analysis (RTA) has been widely used in climate applied research as summary statistics for time series analysis. There is no doubt that RTA might be a useful descriptive tool, but despite its general use in applied research, precisely what it reveals about the underlying time series is unclear and, as a result, its interpretation is unclear too. This work contributes to such interpretation in two ways: 1) an explicit formula is obtained for the set of time series with a given series of running trends, making it possible to show that running trends, alone, perform very poorly as summary statistics for time series analysis; and 2) an equivalence is established between RTA and the estimation of a (possibly nonlinear) trend component of the underlying time series using a weighted moving average filter. Such equivalence provides a solid ground for RTA implementation and interpretation/validation.
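A minimal running-trends sketch (an illustration; RTA variants in the literature differ): the running trend at position i is the OLS slope of the series over a moving window of length w starting at i.

```python
def ols_slope(window):
    """Least-squares slope of window values against 0, 1, ..., len-1."""
    n = len(window)
    mx = (n - 1) / 2.0
    my = sum(window) / n
    sxy = sum((i - mx) * (y - my) for i, y in enumerate(window))
    sxx = sum((i - mx) ** 2 for i in range(n))
    return sxy / sxx

def running_trends(series, w):
    """Slope of each length-w moving window over the series."""
    return [ols_slope(series[i:i + w]) for i in range(len(series) - w + 1)]

series = [0.0, 1.0, 2.0, 3.0, 2.0, 1.0, 0.0]  # rise then fall
trends = running_trends(series, w=3)
```

Here `trends` is [1.0, 1.0, 0.0, -1.0, -1.0], a smoothed local slope; the caveat raised above is that very different series can produce the same running-trend sequence, so the trends alone underdetermine the underlying series.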
Dotto, G L; Pinto, L A A; Hachicha, M A; Knani, S
2015-03-15
In this work, statistical physics treatment was employed to study the adsorption of food dyes onto chitosan films, in order to obtain new physicochemical interpretations at the molecular level. Experimental equilibrium curves were obtained for the adsorption of four dyes (FD&C red 2, FD&C yellow 5, FD&C blue 2, Acid Red 51) at different temperatures (298, 313 and 328 K). A statistical physics formula was used to interpret these curves, and parameters such as the number of adsorbed dye molecules per site (n), anchorage number (n'), receptor site density (N_M), adsorbed quantity at saturation (N_asat), steric hindrance (τ), concentration at half saturation (c_1/2) and molar adsorption energy (ΔE_a) were estimated. The relation of the above-mentioned parameters with the chemical structure of the dyes and temperature was evaluated and interpreted.
Impact of equity models and statistical measures on interpretations of educational reform
NASA Astrophysics Data System (ADS)
Rodriguez, Idaykis; Brewe, Eric; Sawtelle, Vashti; Kramer, Laird H.
2012-12-01
We present three models of equity and show how these, along with the statistical measures used to evaluate results, impact interpretation of equity in education reform. Equity can be defined and interpreted in many ways. Most equity education reform research strives to achieve equity by closing achievement gaps between groups. An example is given by the study by Lorenzo et al. that shows that interactive engagement methods lead to increased gender equity. In this paper, we reexamine the results of Lorenzo et al. through three models of equity. We find that interpretation of the results strongly depends on the model of equity chosen. Further, we argue that researchers must explicitly state their model of equity as well as use effect size measurements to promote clarity in education reform.
Alternative interpretations of statistics on health effects of low-level radiation
Hamilton, L.D.
1983-11-01
Four examples of the interpretation of statistics of data on low-level radiation are reviewed: (a) genetic effects of the atomic bombs at Hiroshima and Nagasaki, (b) cancer at Rocky Flats, (c) childhood leukemia and fallout in Utah, and (d) cancer among workers at the Portsmouth Naval Shipyard. Aggregation of data, adjustment for age, and other problems related to the determination of health effects of low-level radiation are discussed. Troublesome issues related to post hoc analysis are considered.
Menzerath-Altmann Law: Statistical Mechanical Interpretation as Applied to a Linguistic Organization
NASA Astrophysics Data System (ADS)
Eroglu, Sertac
2014-10-01
The distribution behavior described by the empirical Menzerath-Altmann law is frequently encountered during the self-organization of linguistic and non-linguistic natural organizations at various structural levels. This study presents a statistical mechanical derivation of the law based on the analogy between the classical particles of a statistical mechanical organization and the distinct words of a textual organization. The derived model, a transformed (generalized) form of the Menzerath-Altmann model, was termed the statistical mechanical Menzerath-Altmann model. The derived model allows the model parameters to be interpreted in terms of physical concepts. We also propose that many organizations exhibiting Menzerath-Altmann law behavior, whether linguistic or not, can be methodically examined by the transformed distribution model through a properly defined structure-dependent parameter and the energy-associated states.
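For reference, the classical Menzerath-Altmann form (the standard model in the literature, not the paper's transformed version) is y(x) = a * x^b * exp(-c * x), relating construct size x (e.g., word length in syllables) to mean constituent size y (e.g., syllable length).

```python
import math

def menzerath_altmann(x, a, b, c):
    """Classical Menzerath-Altmann law: y = a * x**b * exp(-c * x)."""
    return a * x ** b * math.exp(-c * x)

# Illustrative parameter values (assumed, not fitted to any corpus).
# With b < 0 and c > 0, constituent size decreases as the construct
# grows: "the longer the word, the shorter its syllables."
sizes = [menzerath_altmann(x, a=3.0, b=-0.3, c=0.05) for x in (1, 2, 3, 4)]
```

The statistical mechanical derivation described in the abstract gives such parameters a physical reading instead of leaving them as purely empirical fitting constants.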
Two Easily Made Astronomical Telescopes.
ERIC Educational Resources Information Center
Hill, M.; Jacobs, D. J.
1991-01-01
The directions and diagrams for making a reflecting telescope and a refracting telescope are presented. These telescopes can be made by students out of plumbing parts and easily obtainable, inexpensive, optical components. (KR)
Bean, Heather D; Pleil, Joachim D; Hill, Jane E
2015-02-01
The broad topic of biomarker research has an often-overlooked component: the documentation and interpretation of the surrounding chemical environment and other meta-data, especially from visualization, analytical and statistical perspectives. A second concern is how the environment interacts with human systems biology, what the variability is in "normal" subjects, and how such biological observations might be reconstructed to infer external stressors. In this article, we report on recent research presentations from a symposium at the 248th American Chemical Society meeting held in San Francisco, 10-14 August 2014, that focused on providing some insight into these important issues.
NASA Technical Reports Server (NTRS)
Shewhart, Mark
1991-01-01
Statistical Process Control (SPC) charts are one of several tools used in quality control. Other tools include flow charts, histograms, cause and effect diagrams, check sheets, Pareto diagrams, graphs, and scatter diagrams. A control chart is simply a graph which indicates process variation over time. The purpose of drawing a control chart is to detect any changes in the process signalled by abnormal points or patterns on the graph. The Artificial Intelligence Support Center (AISC) of the Acquisition Logistics Division has developed a hybrid machine learning expert system prototype which automates the process of constructing and interpreting control charts.
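One common run rule of the kind such an interpreter might encode (an illustration of a Western Electric-style rule, not the AISC prototype itself): flag a process shift when `run` or more consecutive points fall on the same side of the center line.

```python
def runs_rule(points, center, run=8):
    """True if `run` consecutive points sit strictly on one side of center."""
    streak = 0
    last_side = 0
    for p in points:
        if p > center:
            side = 1
        elif p < center:
            side = -1
        else:
            side = 0  # points on the center line break the streak
        if side != 0 and side == last_side:
            streak += 1
        elif side != 0:
            streak = 1
        else:
            streak = 0
        last_side = side
        if streak >= run:
            return True
    return False

shifted = [10.2, 10.1, 10.3, 10.2, 10.4, 10.1, 10.2, 10.3]  # all above 10
flag = runs_rule(shifted, center=10.0, run=8)  # shift detected
```

A pattern like this, with no point outside the 3-sigma limits, is exactly the kind of abnormality a human chart reader or an expert system flags that a naive limit check misses.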
Misuse of statistics in the interpretation of data on low-level radiation
Hamilton, L.D.
1982-01-01
Four misuses of statistics in the interpretation of data of low-level radiation are reviewed: (1) post-hoc analysis and aggregation of data leading to faulty conclusions in the reanalysis of genetic effects of the atomic bomb, and premature conclusions on the Portsmouth Naval Shipyard data; (2) inappropriate adjustment for age and ignoring differences between urban and rural areas leading to potentially spurious increase in incidence of cancer at Rocky Flats; (3) hazard of summary statistics based on ill-conditioned individual rates leading to spurious association between childhood leukemia and fallout in Utah; and (4) the danger of prematurely published preliminary work with inadequate consideration of epidemiological problems - censored data - leading to inappropriate conclusions, needless alarm at the Portsmouth Naval Shipyard, and diversion of scarce research funds.
Soil VisNIR chemometric performance statistics should be interpreted as random variables
NASA Astrophysics Data System (ADS)
Brown, David J.; Gasch, Caley K.; Poggio, Matteo; Morgan, Cristine L. S.
2015-04-01
Chemometric models are normally evaluated using performance statistics such as the Standard Error of Prediction (SEP) or the Root Mean Squared Error of Prediction (RMSEP). These statistics are used to evaluate the quality of chemometric models relative to other published work on a specific soil property or to compare the results from different processing and modeling techniques (e.g. Partial Least Squares Regression or PLSR and random forest algorithms). Claims are commonly made about the overall success of an application or the relative performance of different modeling approaches assuming that these performance statistics are fixed population parameters. While most researchers would acknowledge that small differences in performance statistics are not important, rarely are performance statistics treated as random variables. Given that we are usually comparing modeling approaches for general application, and given that the intent of VisNIR soil spectroscopy is to apply chemometric calibrations to larger populations than are included in our soil-spectral datasets, it is more appropriate to think of performance statistics as random variables with variation introduced through the selection of samples for inclusion in a given study and through the division of samples into calibration and validation sets (including spiking approaches). Here we look at the variation in VisNIR performance statistics for the following soil-spectra datasets: (1) a diverse US Soil Survey soil-spectral library with 3768 samples from all 50 states and 36 different countries; (2) 389 surface and subsoil samples taken from US Geological Survey continental transects; (3) the Texas Soil Spectral Library (TSSL) with 3000 samples; (4) intact soil core scans of Texas soils with 700 samples; (5) approximately 400 in situ scans from the Pacific Northwest region; and (6) miscellaneous local datasets. We find the variation in performance statistics to be surprisingly large. This has important
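The point made above, that RMSEP is itself a random variable, can be demonstrated with a toy resampling experiment. The sketch below is not the authors' procedure: it uses an invented linear dataset and ordinary least squares in place of PLSR, and simply recomputes RMSEP over many random calibration/validation splits to expose its spread.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for a soil-spectra dataset: 200 samples, 10 predictors.
X = rng.normal(size=(200, 10))
y = X @ rng.normal(size=10) + rng.normal(scale=0.5, size=200)

def rmsep_one_split(X, y, rng, cal_frac=0.75):
    """Fit least squares on a random calibration set, return RMSEP on the rest."""
    idx = rng.permutation(len(y))
    n_cal = int(cal_frac * len(y))
    cal, val = idx[:n_cal], idx[n_cal:]
    coef, *_ = np.linalg.lstsq(X[cal], y[cal], rcond=None)
    resid = y[val] - X[val] @ coef
    return float(np.sqrt(np.mean(resid ** 2)))

# Repeating the split shows RMSEP is a distribution, not a fixed parameter.
rmseps = [rmsep_one_split(X, y, rng) for _ in range(200)]
print(f"RMSEP mean={np.mean(rmseps):.3f}  sd={np.std(rmseps):.3f}")
```

Even in this idealised setting the spread is nonzero; with heterogeneous real soil libraries and spiking schemes the variation is, as the abstract notes, considerably larger.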
NASA Astrophysics Data System (ADS)
Jha, Sanjeev Kumar; Comunian, Alessandro; Mariethoz, Gregoire; Kelly, Bryce F. J.
2014-10-01
We develop a stochastic approach to construct channelized 3-D geological models constrained to borehole measurements as well as geological interpretation. The methodology is based on simple 2-D geologist-provided sketches of fluvial depositional elements, which are extruded into the third dimension. Multiple-point geostatistics (MPS) is used to impart horizontal variability to the structures by introducing geometrical transformation parameters. The sketches provided by the geologist are used as elementary training images, whose statistical information is expanded through randomized transformations. We demonstrate the applicability of the approach by applying it to modeling a fluvial valley filling sequence in the Maules Creek catchment, Australia. The facies models are constrained to borehole logs, spatial information borrowed from an analogue and local orientations derived from the present-day stream networks. The connectivity in the 3-D facies models is evaluated using statistical measures and transport simulations. Comparison with a statistically equivalent variogram-based model shows that our approach is better suited for building 3-D facies models that contain structures specific to the channelized environment and which have a significant influence on the transport processes.
Shafieloo, Arman
2012-05-01
By introducing Crossing functions and hyper-parameters, I show that the Bayesian interpretation of the Crossing Statistics [1] can be used trivially for the purpose of model selection among cosmological models. In this approach, to falsify a cosmological model there is no need to compare it with other models or to assume any particular form of parametrization for cosmological quantities such as the luminosity distance, the Hubble parameter, or the equation of state of dark energy. Instead, the hyper-parameters of the Crossing functions perform as discriminators between correct and wrong models. Using this approach one can falsify any assumed cosmological model without putting priors on the underlying actual model of the universe and its parameters; hence, the issue of dark energy parametrization is resolved. It will also be shown that the sensitivity of the method to the intrinsic dispersion of the data is small, which is another important characteristic of the method when testing cosmological models against data with high uncertainties.
Parameter Interpretation and Reduction for a Unified Statistical Mechanical Surface Tension Model.
Boyer, Hallie; Wexler, Anthony; Dutcher, Cari S
2015-09-01
Surface properties of aqueous solutions are important for environments as diverse as atmospheric aerosols and biocellular membranes. Previously, we developed a surface tension model for both electrolyte and nonelectrolyte aqueous solutions across the entire solute concentration range (Wexler and Dutcher, J. Phys. Chem. Lett. 2013, 4, 1723-1726). The model differentiated between adsorption of solute molecules in the bulk and surface of solution using the statistical mechanics of multilayer sorption solution model of Dutcher et al. (J. Phys. Chem. A 2013, 117, 3198-3213). The parameters in the model had physicochemical interpretations, but remained largely empirical. In the current work, these parameters are related to solute molecular properties in aqueous solutions. For nonelectrolytes, sorption tendencies suggest a strong relation with molecular size and functional group spacing. For electrolytes, surface adsorption of ions follows ion surface-bulk partitioning calculations by Pegram and Record (J. Phys. Chem. B 2007, 111, 5411-5417). PMID:26275040
Barber, Chris; Cayley, Alex; Hanser, Thierry; Harding, Alex; Heghes, Crina; Vessey, Jonathan D; Werner, Stephane; Weiner, Sandy K; Wichard, Joerg; Giddings, Amanda; Glowienke, Susanne; Parenty, Alexis; Brigo, Alessandro; Spirkl, Hans-Peter; Amberg, Alexander; Kemper, Ray; Greene, Nigel
2016-04-01
The relative wealth of bacterial mutagenicity data available in the public literature means that in silico quantitative/qualitative structure activity relationship (QSAR) systems can readily be built for this endpoint. A good means of evaluating the performance of such systems is to use private unpublished data sets, which generally represent a more distinct chemical space than publicly available test sets and, as a result, provide a greater challenge to the model. However, raw performance metrics should not be the only factor considered when judging this type of software since expert interpretation of the results obtained may allow for further improvements in predictivity. Enough information should be provided by a QSAR to allow the user to make general, scientifically-based arguments in order to assess and overrule predictions when necessary. With all this in mind, we sought to validate the performance of the statistics-based in vitro bacterial mutagenicity prediction system Sarah Nexus (version 1.1) against private test data sets supplied by nine different pharmaceutical companies. The results of these evaluations were then analysed in order to identify findings presented by the model which would be useful for the user to take into consideration when interpreting the results and making their final decision about the mutagenic potential of a given compound. PMID:26708083
Design of easily testable systems
Rawat, S.S.
1988-01-01
This thesis presents structured testability techniques that can be applied to systolic arrays. Systolic arrays for signal processing have produced processing rates far in excess of general-purpose architectures. Fast testing is considered as one of the design criteria. The main goal is to derive test vectors for one- and two-dimensional systolic arrays. The author seeks to keep the number of test vectors independent of the size of the array under a generic fault model. The testable design is based on pseudo-exhaustive testing. Conventional testing uses Level-Sensitive Scan Design (LSSD) techniques, which are very time consuming for an array of systolic processors. By making the testability analysis early, the logic designer will be able to make early (and repeated) design trade-offs that make design for testability a simple extension of the design process. The author shows how one-dimensional sequential systolic arrays can be designed so that faults can be easily detected and isolated. He also considers unilateral two-dimensional sequential arrays and suggests modifications to make them easily testable. Finally, he shows how a modified carry look-ahead adder of arbitrary size can be tested with just 136 test vectors. Comparisons are made against the standard LSSD technique.
Easily Constructed Microscale Spectroelectrochemical Cell
Strickland, Jordan C.
2013-01-01
The design and performance of an easily constructed cell for microscale spectroelectrochemical analysis is described. A cation exchange polymer film, Nafion, was used as a salt bridge to provide ionic contact between a small sample well containing a coiled wire working electrode and separate, larger wells housing reference and auxiliary electrodes. The cell was evaluated using aqueous ferri/ferrocyanide as a test system and shown to be capable of relatively sensitive visible absorption measurements (path lengths on the order of millimeters) and reasonably rapid bulk electrolysis (~ 5 min) of samples in the 1 to 5 μL volume range. Minor alterations to the cell design are cited that could allow for analysis of sub-microliter volumes, rapid multi-sample analysis, and measurements in the ultraviolet spectral region. PMID:24058214
Szabolcsi, Zoltán; Farkas, Zsuzsa; Borbély, Andrea; Bárány, Gusztáv; Varga, Dániel; Heinrich, Attila; Völgyi, Antónia; Pamjav, Horolma
2015-11-01
When the DNA profile from a crime scene matches that of a suspect, the weight of the DNA evidence depends on an unbiased estimation of the match probability of the profiles. For this reason, databases that reflect the actual allele frequencies in the relevant population must be established and expanded. 21,473 complete DNA profiles from Databank samples were used to establish the allele frequency database representing the population of Hungarian suspects. We used fifteen STR loci (PowerPlex ESI16), including five new ESS loci. The aim was to calculate the statistical, forensic efficiency parameters for the Databank samples and compare the newly detected data to the earlier report. Population substructure caused by relatedness may influence the estimated profile frequencies. As our Databank profiles were considered non-random samples, possible relationships between the suspects can be assumed; therefore, the population inbreeding effect was estimated using the FIS calculation. The overall inbreeding parameter was found to be 0.0106. Furthermore, we tested the impact of the two allele frequency datasets on 101 randomly chosen STR profiles, including full and partial profiles. The 95% confidence interval estimates for the profile frequencies (pM) resulted in a tighter range when we used the new dataset compared to the previously published one. We found that FIS had less effect on frequency values in the 21,473 samples than the application of a minimum allele frequency. No genetic substructure was detected by STRUCTURE analysis. Due to the low level of inbreeding and the high number of samples, the new dataset provides unbiased and precise estimates of LR for the statistical interpretation of forensic casework and allows us to use lower allele frequencies.
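For readers unfamiliar with how an inbreeding coefficient enters a match-probability calculation, the sketch below applies the theta-corrected genotype-frequency formulas recommended in NRC II (1996), with theta set to the FIS value of 0.0106 reported above. The allele frequencies and the three-locus profile are hypothetical, not taken from the Hungarian database.

```python
def genotype_freq(p, q=None, theta=0.0106):
    """NRC II (1996) theta-corrected genotype frequency.
    Homozygote (allele frequency p) if q is None, else heterozygote (p, q)."""
    denom = (1 + theta) * (1 + 2 * theta)
    if q is None:
        return (2*theta + (1 - theta)*p) * (3*theta + (1 - theta)*p) / denom
    return 2 * (theta + (1 - theta)*p) * (theta + (1 - theta)*q) / denom

def match_probability(genotypes):
    """Product of per-locus genotype frequencies for a multi-locus profile."""
    pm = 1.0
    for alleles in genotypes:
        pm *= genotype_freq(*alleles)  # one allele -> homozygote, two -> heterozygote
    return pm

# Hypothetical 3-locus profile: two heterozygous loci, one homozygous locus.
pm = match_probability([(0.12, 0.08), (0.20, 0.05), (0.15,)])
print(pm)
```

A positive theta inflates each genotype frequency slightly, so the resulting match probability is conservative relative to the product rule with theta = 0.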
ERIC Educational Resources Information Center
Boysen, Guy A.
2015-01-01
Student evaluations of teaching are among the most accepted and important indicators of college teachers' performance. However, faculty and administrators can overinterpret small variations in mean teaching evaluations. The current research examined the effect of including statistical information on the interpretation of teaching evaluations.…
Yu, Victoria; Kishan, Amar U.; Cao, Minsong; Low, Daniel; Lee, Percy; Ruan, Dan
2014-03-15
Purpose: To demonstrate a new method of evaluating the dose response of treatment-induced lung radiographic injury after SBRT (stereotactic body radiotherapy) treatment, and the discovery of bimodal dose behavior within clinically identified injury volumes. Methods: Follow-up CT scans at 3, 6, and 12 months were acquired from 24 patients treated with SBRT for stage-1 primary lung cancers or oligometastatic lesions. Injury regions in these scans were propagated to the planning CT coordinates by performing deformable registration of the follow-ups to the planning CTs. A bimodal behavior was repeatedly observed in the probability distribution of dose values within the deformed injury regions. Based on a mixture-Gaussian assumption, an Expectation-Maximization (EM) algorithm was used to obtain characteristic parameters for the distribution. Geometric analysis was performed to interpret these parameters and infer the critical dose level that is potentially inductive of post-SBRT lung injury. Results: The Gaussian mixture obtained from the EM algorithm closely approximates the empirical dose histogram within the injury volume with good consistency. The average Kullback-Leibler divergence values between the empirical differential dose volume histogram and the EM-obtained Gaussian mixture distribution were calculated to be 0.069, 0.063, and 0.092 for the 3, 6, and 12 month follow-up groups, respectively. The lower Gaussian component was located at approximately 70% of the prescription dose (35 Gy) for all three follow-up time points. The higher Gaussian component, contributed by the dose received by the planning target volume, was located at around 107% of the prescription dose. Geometrical analysis suggests the mean of the lower Gaussian component, located at 35 Gy, as a possible indicator for a critical dose that induces lung injury after SBRT. Conclusions: An innovative and improved method for analyzing the correspondence between lung radiographic injury and SBRT treatment dose has
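A minimal version of the EM step described in the abstract can be sketched as follows. The dose samples are synthetic stand-ins (a low mode near 35 Gy, i.e. 70% of an assumed 50 Gy prescription, and a high mode near 107%), not patient data, and the implementation is a generic two-component 1-D EM rather than the authors' code.

```python
import numpy as np

def em_two_gaussians(x, iters=200):
    """Fit a two-component 1-D Gaussian mixture to dose samples via EM."""
    x = np.asarray(x, float)
    # Initialise from data quantiles so the components start separated.
    mu = np.array([np.quantile(x, 0.25), np.quantile(x, 0.75)])
    sigma = np.array([x.std(), x.std()])
    w = np.array([0.5, 0.5])
    for _ in range(iters):
        # E-step: responsibility of each component for each sample.
        pdf = w * np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2) \
              / (sigma * np.sqrt(2 * np.pi))
        r = pdf / pdf.sum(axis=1, keepdims=True)
        # M-step: weighted updates of weights, means, standard deviations.
        n_k = r.sum(axis=0)
        w = n_k / len(x)
        mu = (r * x[:, None]).sum(axis=0) / n_k
        sigma = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / n_k)
    return w, mu, sigma

# Synthetic dose values (Gy) inside an "injury region": modes at 35 and 53.5 Gy.
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(35, 3, 600), rng.normal(53.5, 2, 400)])
w, mu, sigma = em_two_gaussians(x)
print(np.round(np.sort(mu), 1))
```

The recovered component means sit near the two injected modes, mirroring how the paper reads off the lower component's mean as a candidate critical dose.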
"What If" Analyses: Ways to Interpret Statistical Significance Test Results Using EXCEL or "R"
ERIC Educational Resources Information Center
Ozturk, Elif
2012-01-01
The present paper aims to review two motivations to conduct "what if" analyses using Excel and "R" to understand the statistical significance tests through the sample size context. "What if" analyses can be used to teach students what statistical significance tests really do and in applied research either prospectively to estimate what sample size…
NASA Technical Reports Server (NTRS)
Bhatia, A. K.; Underhill, A. B.
1986-01-01
The interpretation of the intensities of the hydrogen and helium emission lines in O and Wolf-Rayet spectra in terms of the abundance of hydrogen relative to helium requires information regarding the distribution of hydrogen and helium atoms and ions over their several energy states. In addition, some estimate is needed regarding the transmission of the radiation through the stellar mantle. The present paper provides new information concerning the population of the energy levels of hydrogen and helium when statistical equilibrium occurs in the presence of a radiation field. The results are applied to an interpretation of the spectra of four Wolf-Rayet stars, taking into account the implications for interpreting the spectra of O stars, OB supergiants, and Be stars.
Phoenix, S.L.; Wu, E.M.
1983-03-01
This paper presents some new data on the strength and stress-rupture of Kevlar-49 fibers, fiber/epoxy strands and pressure vessels, and consolidated data obtained at LLNL over the past 10 years. These data are interpreted using recent theoretical results from a micromechanical model of the statistical failure process, thereby gaining understanding of the roles of the epoxy matrix and ultraviolet radiation in long-term lifetime.
Boyle temperature as a point of ideal gas in Gentile statistics and its economic interpretation
NASA Astrophysics Data System (ADS)
Maslov, V. P.; Maslova, T. V.
2014-07-01
Boyle temperature is interpreted as the temperature at which the formation of dimers becomes impossible. To Irving Fisher's correspondence principle we assign two more quantities: the number of degrees of freedom, and credit. We determine the danger level of the mass of money M when the mutual trust between economic agents begins to fall.
2014-01-01
Background A new algorithm has been developed to enable the interpretation of black box models. The developed algorithm is agnostic to the learning algorithm and open to all structure-based descriptors such as fragments, keys and hashed fingerprints. The algorithm has provided meaningful interpretation of Ames mutagenicity predictions from both random forest and support vector machine models built on a variety of structural fingerprints. A fragmentation algorithm is utilised to investigate the model’s behaviour on specific substructures present in the query. An output is formulated summarising causes of activation and deactivation. The algorithm is able to identify multiple causes of activation or deactivation, in addition to identifying localised deactivations where the prediction for the query is active overall. No loss in performance is seen as there is no change in the prediction; the interpretation is produced directly on the model’s behaviour for the specific query. Results Models have been built using multiple learning algorithms including support vector machine and random forest. The models were built on public Ames mutagenicity data and a variety of fingerprint descriptors were used. These models produced good performance in both internal and external validation, with accuracies around 82%. The models were used to evaluate the interpretation algorithm. The interpretations revealed links closely aligned with understood mechanisms for Ames mutagenicity. Conclusion This methodology allows for a greater utilisation of the predictions made by black box models and can expedite further study based on the output for a (quantitative) structure activity model. Additionally, the algorithm could be utilised for chemical dataset investigation and knowledge extraction/human SAR development. PMID:24661325
ERIC Educational Resources Information Center
Lipsey, Mark W.; Puzio, Kelly; Yun, Cathy; Hebert, Michael A.; Steinka-Fry, Kasia; Cole, Mikel W.; Roberts, Megan; Anthony, Karen S.; Busick, Matthew D.
2012-01-01
This paper is directed to researchers who conduct and report education intervention studies. Its purpose is to stimulate and guide them to go a step beyond reporting the statistics that emerge from their analysis of the differences between experimental groups on the respective outcome variables. With what is often very minimal additional effort,…
Statistics Translated: A Step-by-Step Guide to Analyzing and Interpreting Data
ERIC Educational Resources Information Center
Terrell, Steven R.
2012-01-01
Written in a humorous and encouraging style, this text shows how the most common statistical tools can be used to answer interesting real-world questions, presented as mysteries to be solved. Engaging research examples lead the reader through a series of six steps, from identifying a researchable problem to stating a hypothesis, identifying…
Patton, Charles J.; Gilroy, Edward J.
1999-01-01
Data on which this report is based, including nutrient concentrations in synthetic reference samples determined concurrently with those in real samples, are extensive (greater than 20,000 determinations) and have been published separately. In addition to confirming the well-documented instability of nitrite in acidified samples, this study also demonstrates that when biota are removed from samples at collection sites by 0.45-micrometer membrane filtration, subsequent preservation with sulfuric acid or mercury (II) provides no statistically significant improvement in nutrient concentration stability during storage at 4 degrees Celsius for 30 days. Biocide preservation had no statistically significant effect on the 30-day stability of phosphorus concentrations in whole-water splits from any of the 15 stations, but did stabilize Kjeldahl nitrogen concentrations in whole-water splits from three data-collection stations where ammonium accounted for at least half of the measured Kjeldahl nitrogen.
Eide, I; Zahlsen, K
1996-01-01
The paper describes experimental and statistical methods for the toxicokinetic evaluation of mixtures in inhalation experiments. Synthetic mixtures of three C9 n-paraffinic, naphthenic and aromatic hydrocarbons (n-nonane, trimethylcyclohexane and trimethylbenzene, respectively) were studied in the rat after inhalation for 12 h. The hydrocarbons were mixed according to principles of statistical experimental design using mixture design at four vapour levels (75, 150, 300 and 450 ppm) to support an empirical model with linear, interaction and quadratic terms (Taylor polynomial). Immediately after exposure, concentrations of hydrocarbons were measured by headspace gas chromatography in blood, brain, liver, kidneys and perirenal fat. Multivariate data analysis and modelling were performed with PLS (projections to latent structures). The best models were obtained after removing all interaction terms, suggesting that there were no interactions between the hydrocarbons with respect to absorption and distribution. Uptake of the paraffin and particularly the aromatic is best described by quadratic models, whereas the uptake of the naphthenic hydrocarbon is nearly linear. All models are good, with high correlation (r2) and prediction properties (Q2), the latter after cross-validation. The concentrations of the aromatic in blood were high compared to the other hydrocarbons. At concentrations below 250 ppm, the naphthene reached higher concentrations in the brain than the paraffin and the aromatic. Statistical experimental design, multivariate data analysis and modelling have proved useful for the evaluation of synthetic mixtures. The principles may also be used in the design of liquid mixtures, which may be evaporated partially or completely.
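The Taylor-polynomial model structure mentioned above (linear, interaction, and quadratic terms in the three component concentrations) can be written down explicitly. The sketch below builds such a design matrix and fits it by ordinary least squares to synthetic data, as a stand-in for the PLS modelling actually used; the concentrations and the response are invented.

```python
import numpy as np

def taylor_design(C):
    """Expand three concentration columns into linear, interaction, quadratic terms."""
    c1, c2, c3 = C.T
    return np.column_stack([np.ones(len(C)), c1, c2, c3,
                            c1*c2, c1*c3, c2*c3,    # interaction terms
                            c1**2, c2**2, c3**2])   # quadratic terms

rng = np.random.default_rng(2)
# Hypothetical exposure design: mixture fractions of the three hydrocarbons
# scaled by a total vapour level drawn from the four levels in the study.
frac = rng.dirichlet(np.ones(3), size=40)
C = frac * rng.choice([75.0, 150.0, 300.0, 450.0], size=(40, 1))
# Synthetic tissue concentration with a quadratic dependence on component 3.
y = 0.02 * C[:, 2] + 1e-4 * C[:, 2] ** 2 + rng.normal(scale=0.5, size=40)

beta, *_ = np.linalg.lstsq(taylor_design(C), y, rcond=None)
resid = y - taylor_design(C) @ beta
print(round(float(np.std(resid)), 2))
```

Dropping the interaction columns from `taylor_design` reproduces the reduced model the authors found best, consistent with no toxicokinetic interaction between components.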
Tasker, Gary D.; Granato, Gregory E.
2000-01-01
Decision makers need viable methods for the interpretation of local, regional, and national highway-runoff and urban-stormwater data, including flows, concentrations and loads of chemical constituents and sediment, potential effects on receiving waters, and the potential effectiveness of various best management practices (BMPs). Valid (useful for intended purposes), current, and technically defensible stormwater-runoff models are needed to interpret data collected in field studies, to support existing highway and urban-runoff planning processes, to meet National Pollutant Discharge Elimination System (NPDES) requirements, and to provide methods for computing Total Maximum Daily Loads (TMDLs) systematically and economically. Historically, conceptual, simulation, empirical, and statistical models of varying levels of detail, complexity, and uncertainty have been used to meet various data-quality objectives in the decision-making processes necessary for the planning, design, construction, and maintenance of highways and for other land-use applications. Water-quality simulation models attempt a detailed representation of the physical processes and mechanisms at a given site. Empirical and statistical regional water-quality assessment models provide a more general picture of water quality or changes in water quality over a region. All these modeling techniques share one common aspect: their predictive ability is poor without suitable site-specific data for calibration. To properly apply the correct model, one must understand the classification of variables, the unique characteristics of water-resources data, and the concept of population structure and analysis. Classifying the variables used to analyze data may determine which statistical methods are appropriate for data analysis. An understanding of the characteristics of water-resources data is necessary to evaluate the applicability of different statistical methods, to interpret the results of these techniques
Statistical analyses to support forensic interpretation for a new ten-locus STR profiling system.
Foreman, L A; Evett, I W
2001-01-01
A new ten-locus STR (short tandem repeat) profiling system was recently introduced into casework by the Forensic Science Service (FSS), and statistical analyses are described here based on data collected using this new system for the three major racial groups of the UK: Caucasian, Afro-Caribbean and Asian (of Indo-Pakistani descent). Allele distributions are compared and the FSS position with regard to routine significance testing of DNA frequency databases is discussed. An investigation of match probability calculations is carried out and the consequent analyses are shown to provide support for proposed changes in how the FSS reports DNA results when very small match probabilities are involved.
Rapp, J.B.
1991-01-01
Q-mode factor analysis was used to quantitate the distribution of the major aliphatic hydrocarbon (n-alkanes, pristane, phytane) systems in sediments from a variety of marine environments. The compositions of the pure end members of the systems were obtained from factor scores and the distribution of the systems within each sample was obtained from factor loadings. All the data, from the diverse environments sampled (estuarine (San Francisco Bay), fresh-water (San Francisco Peninsula), polar-marine (Antarctica) and geothermal-marine (Gorda Ridge) sediments), were reduced to three major systems: a terrestrial system (mostly high molecular weight aliphatics with odd-numbered-carbon predominance), a mature system (mostly low molecular weight aliphatics without predominance) and a system containing mostly high molecular weight aliphatics with even-numbered-carbon predominance. With this statistical approach, it is possible to assign the percentage contribution from various sources to the observed distribution of aliphatic hydrocarbons in each sediment sample. © 1991.
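The end-member idea above can be illustrated numerically. The sketch below fabricates samples as mixtures of three synthetic "end member" compositions, row-normalises them (the Q-mode step), and checks via SVD that three factors account for essentially all of the variability. It is a toy, not the published analysis; a real Q-mode treatment would additionally rotate the factors (e.g. varimax) to obtain interpretable end members.

```python
import numpy as np

rng = np.random.default_rng(4)
# Synthetic stand-in: 50 sediment samples as mixtures of 3 hydrocarbon
# end members measured over 12 compounds (all values invented).
end_members = rng.dirichlet(np.ones(12) * 0.5, size=3)   # end-member compositions
loadings_true = rng.dirichlet(np.ones(3), size=50)       # per-sample contributions
data = loadings_true @ end_members

# Q-mode step: normalise each sample vector to unit length, then factor by SVD.
norm = data / np.linalg.norm(data, axis=1, keepdims=True)
u, s, vt = np.linalg.svd(norm, full_matrices=False)
explained = np.cumsum(s ** 2) / np.sum(s ** 2)
print(explained[2] > 0.999)  # prints True: three factors reproduce the mixtures
```

Because the fabricated data are exact three-component mixtures, the first three singular values carry all the variance, mirroring the paper's reduction of diverse sediments to three systems.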
Statistical interpretation of transient current power-law decay in colloidal quantum dot arrays
NASA Astrophysics Data System (ADS)
Sibatov, R. T.
2011-08-01
A new statistical model of the charge transport in colloidal quantum dot arrays is proposed. It takes into account Coulomb blockade forbidding multiple occupancy of nanocrystals and the influence of energetic disorder of interdot space. The model explains power-law current transients and the presence of the memory effect. The fractional differential analogue of the Ohm law is found phenomenologically for nanocrystal arrays. The model combines ideas that were considered as conflicting by other authors: the Scher-Montroll idea about the power-law distribution of waiting times in localized states for disordered semiconductors is applied taking into account Coulomb blockade; Novikov's condition about the asymptotic power-law distribution of time intervals between successful current pulses in conduction channels is fulfilled; and the carrier injection blocking predicted by Ginger and Greenham (2000 J. Appl. Phys. 87 1361) takes place.
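The power-law transients discussed above are commonly characterised by the slope of a log-log plot. The sketch below generates a synthetic transient I(t) ∝ t^(−α) with α = 0.7 (an invented dispersion parameter, not a value from the paper) and recovers the exponent by linear regression in log-log coordinates.

```python
import numpy as np

rng = np.random.default_rng(3)
t = np.logspace(-3, 2, 60)          # time points, seconds (log-spaced)
alpha = 0.7                         # assumed dispersion parameter, 0 < alpha < 1
current = 2.0 * t ** (-alpha) * (1 + rng.normal(scale=0.02, size=t.size))

# The slope of log I versus log t estimates the power-law exponent -alpha.
slope, intercept = np.polyfit(np.log(t), np.log(current), 1)
print(round(-slope, 2))  # prints 0.7
```

In dispersive-transport models of the Scher-Montroll type, such an exponent is tied to the waiting-time distribution in localized states, which is why estimating it from transients is diagnostically useful.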
NASA Astrophysics Data System (ADS)
Chung, Jung R.; DeLaughter, Aimee H.; Baba, Justin S.; Spiegelman, Clifford H.; Amoss, M. S.; Cote, Gerard L.
2003-07-01
The Mueller matrix describes all the polarizing properties of a sample, and therefore the optical differences between cancerous and non-cancerous tissue should be present within the matrix elements. We present in this paper the Mueller matrices of three types of tissue; normal, benign mole, and malignant melanoma on a Sinclair swine model. Feature extraction is done on the Mueller matrix elements resulting in the retardance images, diattenuation images, and depolarization images. These images are analyzed in an attempt to determine the important factors for the identification of cancerous lesions from their benign counterparts. In addition, the extracted features are analyzed using statistical processing to develop an accurate classification scheme and to identify the importance of each parameter in the determination of cancerous versus non-cancerous tissue.
Monroe, Scott; Cai, Li
2015-01-01
This research is concerned with two topics in assessing model fit for categorical data analysis. The first topic involves the application of a limited-information overall test, introduced in the item response theory literature, to structural equation modeling (SEM) of categorical outcome variables. Most popular SEM test statistics assess how well the model reproduces estimated polychoric correlations. In contrast, limited-information test statistics assess how well the underlying categorical data are reproduced. Here, the recently introduced C2 statistic of Cai and Monroe (2014) is applied. The second topic concerns how the root mean square error of approximation (RMSEA) fit index can be affected by the number of categories in the outcome variable. This relationship creates challenges for interpreting RMSEA. While the two topics initially appear unrelated, they may conveniently be studied in tandem since RMSEA is based on an overall test statistic, such as C2. The results are illustrated with an empirical application to data from a large-scale educational survey.
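The dependence of RMSEA on an overall test statistic can be made concrete. The sketch below uses one common point-estimate form, RMSEA = sqrt(max(X² − df, 0) / (df · (N − 1))), with invented values for the statistic, degrees of freedom, and sample size; some software uses N rather than N − 1 in the denominator.

```python
from math import sqrt

def rmsea(stat, df, n):
    """Point estimate of RMSEA from an overall fit statistic such as C2 or X2."""
    return sqrt(max(stat - df, 0.0) / (df * (n - 1)))

# Hypothetical limited-information statistic of 180 on 120 df, N = 1000.
print(round(rmsea(stat=180.0, df=120, n=1000), 4))  # prints 0.0224
```

Because the statistic's reference distribution shifts with the number of response categories, the same formula can yield different RMSEA values for substantively similar misfit, which is the interpretive challenge the abstract raises.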
A Comprehensive Statistically-Based Method to Interpret Real-Time Flowing Measurements
Keita Yoshioka; Pinan Dawkrajai; Analis A. Romero; Ding Zhu; A. D. Hill; Larry W. Lake
2007-01-15
With the recent development of temperature measurement systems, continuous temperature profiles can be obtained with high precision. Small temperature changes can be detected by modern temperature measuring instruments such as fiber optic distributed temperature sensor (DTS) in intelligent completions and will potentially aid the diagnosis of downhole flow conditions. In vertical wells, since elevational geothermal changes make the wellbore temperature sensitive to the amount and the type of fluids produced, temperature logs can be used successfully to diagnose the downhole flow conditions. However, geothermal temperature changes along the wellbore being small for horizontal wells, interpretations of a temperature log become difficult. The primary temperature differences for each phase (oil, water, and gas) are caused by frictional effects. Therefore, in developing a thermal model for horizontal wellbore, subtle temperature changes must be accounted for. In this project, we have rigorously derived governing equations for a producing horizontal wellbore and developed a prediction model of the temperature and pressure by coupling the wellbore and reservoir equations. Also, we applied Ramey's model (1962) to the build section and used an energy balance to infer the temperature profile at the junction. The multilateral wellbore temperature model was applied to a wide range of cases at varying fluid thermal properties, absolute values of temperature and pressure, geothermal gradients, flow rates from each lateral, and the trajectories of each build section. With the prediction models developed, we present inversion studies of synthetic and field examples. These results are essential to identify water or gas entry, to guide flow control devices in intelligent completions, and to decide if reservoir stimulation is needed in particular horizontal sections. This study will complete and validate these inversion studies.
A COMPREHENSIVE STATISTICALLY-BASED METHOD TO INTERPRET REAL-TIME FLOWING MEASUREMENTS
Pinan Dawkrajai; Analis A. Romero; Keita Yoshioka; Ding Zhu; A.D. Hill; Larry W. Lake
2004-10-01
In this project, we are developing new methods for interpreting measurements in complex wells (horizontal, multilateral and multi-branching wells) to determine the profiles of oil, gas, and water entry. These methods are needed to take full advantage of "smart" well instrumentation, a technology that is rapidly evolving to provide the ability to continuously and permanently monitor downhole temperature, pressure, volumetric flow rate, and perhaps other fluid flow properties at many locations along a wellbore, and hence to control and optimize well performance. In this first year, we have made considerable progress on three major parts of the forward problem of predicting the temperature and pressure behavior in complex wells: the temperature and pressure behavior in the reservoir near the wellbore, in the wellbore or laterals in the producing intervals, and in the build sections connecting the laterals. Many models exist to predict pressure behavior in reservoirs and wells, but these are almost always isothermal models. To predict temperature behavior, we derived general mass, momentum, and energy balance equations for these parts of the complex well system. Analytical solutions for the reservoir and wellbore parts under certain special conditions show the magnitude of the thermal effects that could occur. Our preliminary sensitivity analyses show that thermal effects caused by near-wellbore reservoir flow can cause temperature changes that are measurable with smart well technology. This is encouraging for the further development of the inverse model.
NASA Astrophysics Data System (ADS)
Bouzid, Mohamed; Sellaoui, Lotfi; Khalfaoui, Mohamed; Belmabrouk, Hafedh; Lamine, Abdelmottaleb Ben
2016-02-01
In this work, we studied the adsorption of ethanol on three types of activated carbon, namely parent Maxsorb III and two chemically modified activated carbons (H2-Maxsorb III and KOH-H2-Maxsorb III). This investigation was conducted on the basis of the grand canonical formalism of statistical physics and on simplified assumptions, leading to a three-parameter equation describing the adsorption of ethanol onto the three types of activated carbon. There was a good correlation between the experimental data and the results obtained with the newly proposed equation. The parameters characterizing the adsorption isotherm were the number of adsorbed molecules per site n, the density of receptor sites per unit mass of adsorbent Nm, and the energetic parameter p1/2. They were estimated for the studied systems by nonlinear least squares regression. The results show that the ethanol molecules were adsorbed in a perpendicular (or non-parallel) position to the adsorbent surface. The magnitude of the calculated adsorption energies reveals that ethanol is physisorbed onto activated carbon; both van der Waals and hydrogen-bonding interactions were involved in the adsorption process. The calculated values of the specific surface area As proved that the three types of activated carbon have a highly microporous surface.
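The parameter-estimation step described above can be sketched as follows. The functional form N(p) = n·Nm / (1 + (p1/2/p)^n) is one common monolayer equation arising from this grand canonical formalism, and the pressure range, units, and "measured" data below are synthetic assumptions for illustration only:

```python
import numpy as np
from scipy.optimize import curve_fit

def isotherm(p, n, Nm, p_half):
    """Monolayer adsorption isotherm: N(p) = n * Nm / (1 + (p_half/p)**n)."""
    return n * Nm / (1.0 + (p_half / p) ** n)

# Synthetic "measured" uptake generated from known parameters plus noise
rng = np.random.default_rng(0)
p = np.linspace(0.5, 30.0, 40)        # pressure (illustrative units)
true = (1.8, 4.0, 6.0)                # n, Nm, p_half (assumed values)
q = isotherm(p, *true) + rng.normal(0.0, 0.02, p.size)

# Nonlinear least squares estimate of (n, Nm, p_half)
popt, _ = curve_fit(isotherm, p, q, p0=(1.0, 3.0, 5.0))
print(popt)
```

An estimated n above 1 would be read, as in the paper, as multiple molecules anchored per site in a non-parallel orientation.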
Tsitouridou, Roxani; Papazova, Petia; Simeonova, Pavlina; Simeonov, Vasil
2013-01-01
The size distribution of aerosol particles (PM0.015-PM18) in relation to their soluble inorganic species and total water-soluble organic compounds (WSOC) was investigated at an urban site of Thessaloniki, Northern Greece. The sampling period was from February to July 2007. The determined compounds were compared with mass concentrations of the PM fractions for nano (N: 0.015 < Dp < 0.06), ultrafine (UFP: 0.015 < Dp < 0.125), fine (FP: 0.015 < Dp < 2.0) and coarse particles (CP: 2.0 < Dp < 8.0) in order to perform mass closure of the water-soluble content for the respective fractions. Electrolytes were the dominant species in all fractions (24-27%), followed by WSOC (16-23%). The water-soluble inorganic and organic content was found to account for 53% of the nanoparticle, 48% of the ultrafine particle, 45% of the fine particle and 44% of the coarse particle mass. Correlations between the analyzed species were performed and the effect of local and long-range transported emissions was examined by wind direction and backward air mass trajectories. Multivariate statistical analysis (cluster analysis and principal components analysis) of the collected data was performed in order to reveal the specific data structure. Possible sources of air pollution were identified and an attempt was made to find patterns of similarity between the different-sized aerosols and the seasons of monitoring. It was shown that several major latent factors are responsible for the data structure regardless of aerosol size: mineral (soil) dust, sea spray, secondary emissions, combustion sources and industrial impact. The seasonal separation proved not to be very specific. PMID:24007436
Flexible magnetic planning boards are easily transported
NASA Technical Reports Server (NTRS)
1965-01-01
Easily transportable preprinted magnetic planning boards are made by coating thin sheet steel with clear plastic. Flexible magnetic boards used with paper charts are constructed from close mesh steel screen.
An Easily Constructed Trigonal Prism Model.
ERIC Educational Resources Information Center
Yamana, Shukichi
1984-01-01
A model of a trigonal prism which is useful for teaching stereochemistry (especially of the neodymium enneahydrate ion), can be made easily by using a sealed, empty envelope. The steps necessary to accomplish this task are presented. (JN)
NASA Astrophysics Data System (ADS)
Dralle, D.; Karst, N.; Thompson, S. E.
2015-12-01
Multiple competing theories suggest that power law behavior governs the observed first-order dynamics of streamflow recessions - the important process by which catchments dry out via the stream network, altering the availability of surface water resources and in-stream habitat. Frequently modeled as dq/dt = -aq^b, recessions typically exhibit a high degree of variability, even within a single catchment, as revealed by significant shifts in the values of "a" and "b" across recession events. One potential source of this variability lies in underlying, hard-to-observe fluctuations in how catchment water storage is partitioned amongst distinct storage elements, each having different discharge behaviors. Testing this and competing hypotheses with widely available streamflow timeseries, however, has been hindered by a power law scaling artifact that obscures meaningful covariation between the recession parameters, "a" and "b". Here we briefly outline a technique that removes this artifact, revealing intriguing new patterns in the joint distribution of recession parameters. Using long-term flow data from catchments in Northern California, we explore temporal variations, and find that the "a" parameter varies strongly with catchment wetness. Then we explore how the "b" parameter changes with "a", and find that measures of its variation are maximized at intermediate "a" values. We propose an interpretation of this pattern based on statistical mechanics: "b" can be viewed as an indicator of the catchment "microstate" - i.e. the partitioning of storage - and "a" as a measure of the catchment macrostate (i.e. the total storage). In statistical mechanics, entropy (i.e. microstate variance, here the variance of "b") is maximized for intermediate values of extensive variables (i.e. wetness, "a"), as observed in the recession data. This interpretation of "a" and "b" was supported by model runs using a multiple-reservoir catchment toy model, and lends support to the
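The recession parameters can be recovered from a discharge series by linearizing dq/dt = -aq^b in log space, since log(-dq/dt) = log a + b·log q. The synthetic recession below, generated by simple forward integration with assumed values of "a" and "b", is an illustrative stand-in for observed streamflow, not the authors' data:

```python
import numpy as np

# Generate a synthetic recession by integrating dq/dt = -a * q**b
a_true, b_true, dt = 0.05, 1.5, 0.1
q = [10.0]
for _ in range(600):
    q.append(q[-1] - a_true * q[-1] ** b_true * dt)
q = np.array(q)

# Finite-difference estimate of dq/dt, then a linear fit in log-log space:
# log(-dq/dt) = log(a) + b * log(q)
dqdt = np.diff(q) / dt
qm = 0.5 * (q[:-1] + q[1:])          # midpoint discharge for each interval
b_est, log_a = np.polyfit(np.log(qm), np.log(-dqdt), 1)
a_est = np.exp(log_a)
print(a_est, b_est)
```

The scaling artifact discussed in the abstract arises precisely because the fitted intercept (log a) and slope (b) of such regressions are strongly correlated by construction, which is what the authors' technique removes.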
Huang, Ruili; Southall, Noel; Xia, Menghang; Cho, Ming-Hsuang; Jadhav, Ajit; Nguyen, Dac-Trung; Inglese, James; Tice, Raymond R.; Austin, Christopher P.
2009-01-01
In support of the U.S. Tox21 program, we have developed a simple and chemically intuitive model we call weighted feature significance (WFS) to predict the toxicological activity of compounds, based on the statistical enrichment of structural features in toxic compounds. We trained and tested the model on the following: (1) data from quantitative high-throughput screening cytotoxicity and caspase activation assays conducted at the National Institutes of Health Chemical Genomics Center, (2) data from Salmonella typhimurium reverse mutagenicity assays conducted by the U.S. National Toxicology Program, and (3) hepatotoxicity data published in the Registry of Toxic Effects of Chemical Substances. Enrichments of structural features in toxic compounds are evaluated for their statistical significance and compiled into a simple additive model of toxicity and then used to score new compounds for potential toxicity. The predictive power of the model for cytotoxicity was validated using an independent set of compounds from the U.S. Environmental Protection Agency tested also at the National Institutes of Health Chemical Genomics Center. We compared the performance of our WFS approach with classical classification methods such as Naive Bayesian clustering and support vector machines. In most test cases, WFS showed similar or slightly better predictive power, especially in the prediction of hepatotoxic compounds, where WFS appeared to have the best performance among the three methods. The new algorithm has the important advantages of simplicity, power, interpretability, and ease of implementation. PMID:19805409
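The enrich-then-sum idea behind WFS can be sketched as follows. The toy feature sets, the one-sided Fisher exact test, and the -log10(p) weighting are illustrative assumptions for conveying the general scheme, not the NCGC implementation:

```python
from math import log10
from scipy.stats import fisher_exact

# Toy data: each compound is a set of structural features plus a toxicity label
compounds = [
    ({"nitro", "phenol"}, 1), ({"nitro"}, 1), ({"phenol"}, 1),
    ({"alkane"}, 0), ({"alkane", "ether"}, 0), ({"ether"}, 0),
]
features = {f for feats, _ in compounds for f in feats}

# Weight each feature by the significance of its enrichment in toxic compounds
weights = {}
for f in features:
    tox_with = sum(1 for feats, y in compounds if y == 1 and f in feats)
    tox_without = sum(1 for feats, y in compounds if y == 1 and f not in feats)
    non_with = sum(1 for feats, y in compounds if y == 0 and f in feats)
    non_without = sum(1 for feats, y in compounds if y == 0 and f not in feats)
    _, p = fisher_exact([[tox_with, tox_without], [non_with, non_without]],
                        alternative="greater")
    weights[f] = -log10(p)

def wfs_score(feats):
    """Additive toxicity score: sum of enrichment weights of present features."""
    return sum(weights.get(f, 0.0) for f in feats)

print(wfs_score({"nitro"}), wfs_score({"alkane"}))
```

A higher score flags a compound whose features are disproportionately common among known toxicants, which is what makes the model chemically interpretable.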
Links to sources of cancer-related statistics, including the Surveillance, Epidemiology and End Results (SEER) Program, SEER-Medicare datasets, cancer survivor prevalence data, and the Cancer Trends Progress Report.
Electronic modules easily separated from heat sink
NASA Technical Reports Server (NTRS)
1965-01-01
Metal heat sink and electronic modules bonded to a thermal bridge can be easily cleaved for removal of the modules for replacement or repair. A thin film of grease between a fluorocarbon polymer film on the metal heat sink and an adhesive film on the modules acts as the cleavage plane.
ERIC Educational Resources Information Center
Barner, David; Snedeker, Jesse
2008-01-01
Four experiments investigated 4-year-olds' understanding of adjective-noun compositionality and their sensitivity to statistics when interpreting scalar adjectives. In Experiments 1 and 2, children selected "tall" and "short" items from 9 novel objects called "pimwits" (1-9 in. in height) or from this array plus 4 taller or shorter distractor…
Acquire Commodities Easily Card
Soler, E. E.
1998-05-29
Acquire Commodities Easily Card (AceCard) provides an automated end-user method to distribute company credit card charges to internal charge numbers. AceCard will allow cardholders to record card purchases in an on-line order log, enter multiple account distributions per order that can be posted to the General Ledger, track orders, and receipt information, and provide a variety of cardholder and administrative reports. Please note: Customers must contact Ed Soler (423)-576-6151, Lockheed Martin Energy Systems, for help with the installation of the package. The fee for this installation help will be coordinated by the customer and Lockheed Martin and is in addition to cost of the package from ESTSC. Customers should contact Sandy Presley (423)-576-4708 for user help.
Quantum of area ΔA = 8πl_P^2 and a statistical interpretation of black hole entropy
Ropotenko, Kostiantyn
2010-08-15
In contrast to alternative values, the quantum of area ΔA = 8πl_P^2 does not follow from the usual statistical interpretation of black hole entropy; on the contrary, a statistical interpretation follows from it. This interpretation is based on two concepts: nonadditivity of black hole entropy and Landau quantization. Using nonadditivity, a microcanonical distribution for a black hole is found and it is shown that the statistical weight of a black hole should be proportional to its area. By analogy with conventional Landau quantization, it is shown that quantization of a black hole is nothing but Landau quantization. The Landau levels of a black hole and their degeneracy are found. The degree of degeneracy is equal to the number of ways to distribute a patch of area 8πl_P^2 over the horizon. Taking these results into account, it is argued that the black hole entropy should be of the form S_bh = 2π·ΔΓ, where the number of microstates is ΔΓ = A/8πl_P^2. The nature of the degrees of freedom responsible for black hole entropy is elucidated. Applications of the new interpretation are presented. The effect of noncommuting coordinates is discussed.
Easily retrievable objects among the NEO population
NASA Astrophysics Data System (ADS)
García Yárnoz, D.; Sanchez, J. P.; McInnes, C. R.
2013-08-01
Asteroids and comets are of strategic importance for science in an effort to understand the formation, evolution and composition of the Solar System. Near-Earth Objects (NEOs) are of particular interest because of their accessibility from Earth, but also because of their speculated wealth of material resources. The exploitation of these resources has long been discussed as a means to lower the cost of future space endeavours. In this paper, we consider the currently known NEO population and define a family of so-called Easily Retrievable Objects (EROs), objects that can be transported from accessible heliocentric orbits into the Earth's neighbourhood at affordable costs. The asteroid retrieval transfers are sought from the continuum of low energy transfers enabled by the dynamics of invariant manifolds; specifically, the retrieval transfers target planar, vertical Lyapunov and halo orbit families associated with the collinear equilibrium points of the Sun-Earth Circular Restricted Three Body problem. The judicious use of these dynamical features provides the best opportunity to find extremely low energy Earth transfers for asteroid material. A catalogue of asteroid retrieval candidates is then presented. Despite the highly incomplete census of very small asteroids, the ERO catalogue can already be populated with 12 different objects retrievable with less than 500 m/s of Δ v. Moreover, the approach proposed represents a robust search and ranking methodology for future retrieval candidates that can be automatically applied to the growing survey of NEOs.
ERIC Educational Resources Information Center
Cruce, Ty M.
2009-01-01
This methodological note illustrates how a commonly used calculation of the Delta-p statistic is inappropriate for categorical independent variables, and this note provides users of logistic regression with a revised calculation of the Delta-p statistic that is more meaningful when studying the differences in the predicted probability of an…
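The point of the revised Delta-p can be shown with a discrete difference in predicted probabilities: flip the dummy from 0 to 1 with other covariates held at their means, rather than applying a derivative-based approximation that is only valid for marginal changes in continuous variables. The simulated data, coefficient values, and near-unpenalized sklearn fit below are illustrative assumptions, not the note's own calculation:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 2000
x_cat = rng.integers(0, 2, n)            # dummy-coded categorical predictor
x_cont = rng.normal(0.0, 1.0, n)         # continuous covariate
logit = -0.5 + 1.2 * x_cat + 0.8 * x_cont
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

X = np.column_stack([x_cat, x_cont])
model = LogisticRegression(C=1e6).fit(X, y)   # large C ≈ unpenalized MLE

# Revised Delta-p: difference of predicted probabilities when the dummy
# flips 0 -> 1, covariate held at its mean (a discrete change, not a slope)
at_mean = x_cont.mean()
p0 = model.predict_proba([[0.0, at_mean]])[0, 1]
p1 = model.predict_proba([[1.0, at_mean]])[0, 1]
delta_p = p1 - p0
print(delta_p)
```

For a binary predictor the 0-to-1 change is the only change that can actually occur, which is why the discrete difference is the meaningful quantity.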
Asfahani, Jamal
2014-02-01
Factor analysis is proposed in this research for interpreting a combination of nuclear well logs (natural gamma ray, density, and neutron porosity) and electrical well logs (long and short normal) in order to characterize the large extended basaltic areas in southern Syria. Kodana well logging data are used for testing and applying the proposed technique. The four resulting score logs enable the lithological score cross-section of the studied well to be established. The established cross-section clearly shows the distribution and identification of four kinds of basalt: hard massive basalt, hard basalt, pyroclastic basalt, and the basalt alteration product, clay. The factor analysis technique is successfully applied to the Kodana well logging data in southern Syria, and can be used efficiently when several wells and large well logging data sets with a high number of variables have to be interpreted.
Pomeau, Yves; Louët, Sabine
2016-06-01
During the StatPhys Conference on 20th July 2016 in Lyon, France, Yves Pomeau and Daan Frenkel will be awarded the most important prize in the field of Statistical Mechanics: the 2016 Boltzmann Medal, named after the Austrian physicist and philosopher Ludwig Boltzmann. The award recognises Pomeau's key contributions to the Statistical Physics of non-equilibrium phenomena in general. And, in particular, for developing our modern understanding of fluid mechanics, instabilities, pattern formation and chaos. He is recognised as an outstanding theorist bridging disciplines from applied mathematics to statistical physics with a profound impact on the neighbouring fields of turbulence and mechanics. In the article Sabine Louët interviews Pomeau, who is an Editor for the European Physical Journal Special Topics. He shares his views and tells how he experienced the rise of Statistical Mechanics in the past few decades. He also touches upon the need to provide funding to people who have the rare ability to discover new things and ideas, and not just those who are good at filling in grant application forms. PMID:27349556
Bruni, Aline Thaís; Velho, Jesus Antonio; Ferreira, Arthur Serra Lopes; Tasso, Maria Júlia; Ferrari, Raíssa Santos; Yoshida, Ricardo Luís; Dias, Marcos Salvador; Leite, Vitor Barbanti Pereira
2014-08-01
This study uses statistical techniques to evaluate reports on suicide scenes; it utilizes 80 reports from different locations in Brazil, randomly collected from both federal and state jurisdictions. We aimed to assess a heterogeneous group of cases in order to obtain an overall perspective of the problem. We evaluated variables regarding the characteristics of the crime scene, such as the detected traces (blood, instruments and clothes) that were found and we addressed the methodology employed by the experts. A qualitative approach using basic statistics revealed a wide distribution as to how the issue was addressed in the documents. We examined a quantitative approach involving an empirical equation and we used multivariate procedures to validate the quantitative methodology proposed for this empirical equation. The methodology successfully identified the main differences in the information presented in the reports, showing that there is no standardized method of analyzing evidences.
Julià, Olga; Vidal-Mas, Jaume; Panikov, Nicolai S; Vives-Rego, Josep
2010-01-01
We report a skew-Laplace statistical analysis of both flow cytometry scatters and cell size from microbial strains primarily grown in batch cultures, others in chemostat cultures and bacterial aquatic populations. Cytometry scatters best fit the skew-Laplace distribution while cell size as assessed by an electronic particle analyzer exhibited a moderate fitting. Unlike the cultures, the aquatic bacterial communities clearly do not fit to a skew-Laplace distribution. Due to its versatile nature, the skew-Laplace distribution approach offers an easy, efficient, and powerful tool for distribution of frequency analysis in tandem with the flow cytometric cell sorting.
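The distribution-fitting step can be sketched with SciPy's asymmetric Laplace distribution as a stand-in parameterization of the skew-Laplace law (the paper's own parameterization may differ); the synthetic scatter-like data and parameter values are assumptions for illustration:

```python
import numpy as np
from scipy import stats

# Synthetic forward-scatter-like data from an asymmetric (skew) Laplace law
rng = np.random.default_rng(2)
data = stats.laplace_asymmetric.rvs(kappa=0.7, loc=10.0, scale=2.0,
                                    size=5000, random_state=rng)

# Maximum-likelihood fit of the three parameters (kappa, loc, scale)
kappa, loc, scale = stats.laplace_asymmetric.fit(data)
print(kappa, loc, scale)

# Quick goodness-of-fit check via the Kolmogorov-Smirnov statistic
ks = stats.kstest(data, "laplace_asymmetric", args=(kappa, loc, scale))
print(ks.statistic)
```

A poor KS fit on field samples, as with the aquatic communities in the abstract, is itself informative: it signals a population that is not a single homogeneous culture.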
Nash, J. Thomas; Frishman, David
1983-01-01
Analytical results for 61 elements in 370 samples from the Ranger Mine area are reported. Most of the rocks come from drill core in the Ranger No. 1 and Ranger No. 3 deposits, but 20 samples are from unmineralized drill core more than 1 km from ore. Statistical tests show that the elements Mg, Fe, F, Be, Co, Li, Ni, Pb, Sc, Th, Ti, V, Cl, As, Br, Au, Ce, Dy, La, Sc, Eu, Tb, and Yb have positive association with uranium, and Si, Ca, Na, K, Sr, Ba, Ce, and Cs have negative association. For most lithologic subsets Mg, Fe, Li, Cr, Ni, Pb, V, Y, Sm, Sc, Eu, and Yb are significantly enriched in ore-bearing rocks, whereas Ca, Na, K, Sr, Ba, Mn, Ce, and Cs are significantly depleted. These results are consistent with petrographic observations on altered rocks. Lithogeochemistry can aid exploration, but for these rocks requires methods that are expensive and not amenable to routine use.
Kim, Kyoung-Ho; Yun, Seong-Taek; Choi, Byoung-Young; Chae, Gi-Tak; Joo, Yongsung; Kim, Kangjoo; Kim, Hyoung-Soo
2009-07-21
Hydrochemical and multivariate statistical interpretations of 16 physicochemical parameters of 45 groundwater samples from a riverside alluvial aquifer underneath an agricultural area in Osong, central Korea, were performed in this study to understand the spatial controls of nitrate concentrations in terms of biogeochemical processes occurring near oxbow lakes within a fluvial plain. Nitrate concentrations in groundwater showed a large variability from 0.1 to 190.6 mg/L (mean=35.0 mg/L) with significantly lower values near oxbow lakes. The evaluation of hydrochemical data indicated that the groundwater chemistry (especially, degree of nitrate contamination) is mainly controlled by two competing processes: 1) agricultural contamination and 2) redox processes. In addition, results of factorial kriging, consisting of two steps (i.e., co-regionalization and factor analysis), reliably showed a spatial control of the concentrations of nitrate and other redox-sensitive species; in particular, significant denitrification was observed restrictedly near oxbow lakes. The results of this study indicate that sub-oxic conditions in an alluvial groundwater system are developed geologically and geochemically in and near oxbow lakes, which can effectively enhance the natural attenuation of nitrate before the groundwater discharges to nearby streams. This study also demonstrates the usefulness of multivariate statistical analysis in groundwater study as a supplementary tool for interpretation of complex hydrochemical data sets. PMID:19524319
Kissling, Grace E; Haseman, Joseph K; Zeiger, Errol
2015-09-01
A recent article by Gaus (2014) demonstrates a serious misunderstanding of the NTP's statistical analysis and interpretation of rodent carcinogenicity data as reported in Technical Report 578 (Ginkgo biloba) (NTP, 2013), as well as a failure to acknowledge the abundant literature on false positive rates in rodent carcinogenicity studies. The NTP reported Ginkgo biloba extract to be carcinogenic in mice and rats. Gaus claims that, in this study, 4800 statistical comparisons were possible, and that 209 of them were statistically significant (p<0.05) compared with 240 (4800×0.05) expected by chance alone; thus, the carcinogenicity of Ginkgo biloba extract cannot be definitively established. However, his assumptions and calculations are flawed since he incorrectly assumes that the NTP uses no correction for multiple comparisons, and that significance tests for discrete data operate at exactly the nominal level. He also misrepresents the NTP's decision making process, overstates the number of statistical comparisons made, and ignores the fact that the mouse liver tumor effects were so striking (e.g., p<0.0000000000001) that it is virtually impossible that they could be false positive outcomes. Gaus' conclusion that such obvious responses merely "generate a hypothesis" rather than demonstrate a real carcinogenic effect has no scientific credibility. Moreover, his claims regarding the high frequency of false positive outcomes in carcinogenicity studies are misleading because of his methodological misconceptions and errors.
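The chance-expectation arithmetic at issue in this rebuttal is easy to reproduce. The independence assumption below is a deliberate simplification (real tumor endpoints are correlated, and the NTP does apply multiplicity corrections, which is part of the rebuttal's point):

```python
from scipy.stats import binom

alpha, m = 0.05, 4800
expected = m * alpha                      # Gaus' 240 expected "hits" by chance
print(expected)

# Chance of ever seeing p < 1e-13 across 4800 independent tests:
# effectively impossible under the null, unlike 209 hits at p < 0.05
p_extreme = 1.0 - (1.0 - 1e-13) ** m
print(p_extreme)

# Probability of observing 209 or fewer significant results if every
# null hypothesis were true and tests were independent at exactly alpha
p_209_or_fewer = binom.cdf(209, m, alpha)
print(p_209_or_fewer)
```

The contrast is the substance of the rebuttal: 209 nominal hits is unremarkable bookkeeping, while a p-value on the order of 1e-13 cannot plausibly be a multiple-comparisons artifact.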
NASA Astrophysics Data System (ADS)
Alpert, P. A.; Knopf, D. A.
2014-12-01
Ice nucleation is the initial step in forming mixed-phase and cirrus clouds, and is well established as an important influence on global climate. Laboratory studies investigate the cloud-relevant conditions of temperature (T) and relative humidity (RH) at which ice nucleation occurs and, as a result, numerous fundamentally different ice nucleation descriptions have been proposed for implementation in cloud and climate models. We introduce a new immersion freezing model based on first principles of statistics to simulate individual droplet freezing, requiring only three experimental parameters: the total number of droplets, the uncertainty of the applied surface area per droplet, and the heterogeneous ice nucleation rate coefficient, Jhet, as a function of T and water activity (aw), where in equilibrium RH = aw. Previous studies reporting frozen fractions (f) or Jhet for a droplet population are described by our model for mineral, inorganic, organic, and biological ice nuclei and different techniques including cold stage, oil-immersion, continuous flow diffusion chamber, flow tube, cloud chamber, acoustic levitation and wind levitation experiments. Taking advantage of the physically based parameterization of Jhet by Knopf and Alpert (Faraday Discuss., 165, 513-534, 2013), our model can predict immersion freezing for the entire atmospherically relevant range of T, RH, particle surface area, and time scales, even for conditions unattainable in a laboratory setting. Lastly, we present a rigorous experimental uncertainty analysis of laboratory-derived Jhet and f using a Monte Carlo method. These results imply that classical nucleation theory is universal for immersion freezing. In combination with an aw-based description of Jhet, this approach allows for a physically based and computationally undemanding implementation in climate and cloud models.
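The stochastic-freezing idea can be sketched as follows: each droplet freezes in a time step with probability 1 - exp(-Jhet·A·Δt), and the spread in per-droplet surface area A represents the stated area uncertainty. The rate coefficient, area distribution, and time step below are illustrative assumptions, not the study's fitted values:

```python
import numpy as np

rng = np.random.default_rng(3)
n_drops = 10_000
J_het = 100.0                  # nucleation rate coefficient (assumed units/value)
A_mean, A_sigma = 1e-5, 0.3    # lognormal per-droplet surface area (assumed)

# Sample per-droplet surface areas to represent the area uncertainty
A = rng.lognormal(np.log(A_mean), A_sigma, n_drops)

# Each unfrozen droplet freezes in [t, t+dt) with prob. 1 - exp(-J_het*A*dt)
dt, n_steps = 1.0, 100
frozen = np.zeros(n_drops, dtype=bool)
f_curve = []
for _ in range(n_steps):
    p_freeze = 1.0 - np.exp(-J_het * A * dt)
    frozen |= rng.random(n_drops) < p_freeze
    f_curve.append(frozen.mean())

# For a single area A_mean the expectation is f(t) = 1 - exp(-J_het*A_mean*t)
print(f_curve[-1])
```

Comparing the simulated frozen-fraction curve against the single-area expectation shows how area uncertainty alone broadens apparent freezing behavior, the effect the model's three parameters are designed to capture.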
Dziurkowska, Ewelina; Wesolowski, Marek
2015-01-01
Multivariate statistical analysis is widely used in medical studies as a profitable tool facilitating diagnosis of some diseases, for instance, cancer, allergy, pneumonia, or Alzheimer's and psychiatric diseases. Taking this into consideration, the aim of this study was to use two multivariate techniques, hierarchical cluster analysis (HCA) and principal component analysis (PCA), to disclose the relationship between the drugs used in the therapy of major depressive disorder, the salivary cortisol level, and the period of hospitalization. The cortisol content in the saliva of depressed women was quantified by HPLC with UV detection day-to-day during the whole period of hospitalization. A data set with 16 variables (e.g., the patients' age, multiplicity and period of hospitalization, initial and final cortisol level, highest and lowest hormone level, mean contents, and medians) characterizing 97 subjects was used for HCA and PCA calculations. Multivariate statistical analysis reveals that different groups of antidepressants affect the salivary cortisol level to varying degrees. The SSRIs, SNRIs, and polypragmasy reduce hormone secretion most effectively. Thus, both unsupervised pattern recognition methods, HCA and PCA, can be used as complementary tools for interpretation of the results obtained by laboratory diagnostic methods.
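The two unsupervised methods can be combined in a few lines; the sketch below uses random stand-in data with the study's dimensions (97 subjects × 16 variables), since the clinical data themselves are not available here:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(42)
X = rng.normal(size=(97, 16))            # stand-in for the 97 x 16 data set

# PCA via SVD on standardized variables
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
U, s, Vt = np.linalg.svd(Xs, full_matrices=False)
scores = Xs @ Vt[:2].T                   # projections onto the first two PCs
explained = s**2 / np.sum(s**2)          # variance explained per component

# HCA: Ward linkage on the same standardized data, cut into three clusters
labels = fcluster(linkage(Xs, method="ward"), t=3, criterion="maxclust")
print(scores.shape, int(labels.max()))
```

Plotting `scores` coloured by `labels` is the usual way the two methods complement each other: PCA gives a low-dimensional view, HCA the grouping.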
Teaching the Assessment of Normality Using Large Easily-Generated Real Data Sets
ERIC Educational Resources Information Center
Kulp, Christopher W.; Sprechini, Gene D.
2016-01-01
A classroom activity is presented, which can be used in teaching students statistics with an easily generated, large, real world data set. The activity consists of analyzing a video recording of an object. The colour data of the recorded object can then be used as a data set to explore variation in the data using graphs including histograms,…
Easily constructed mini-sextant demonstrates optical principles
NASA Astrophysics Data System (ADS)
Nenninger, Garet G.
2000-04-01
An easily constructed optical instrument for measuring the angle between the Sun and the horizon is described. The miniature sextant relies on multiple reflections to produce multiple images of the Sun at fixed angles away from the true Sun.
Guyonvarch, Estelle; Ramin, Elham; Kulahci, Murat; Plósz, Benedek Gy
2015-10-15
The present study aims at using statistically designed computational fluid dynamics (CFD) simulations as numerical experiments for the identification of one-dimensional (1-D) advection-dispersion models - computationally light tools, used e.g., as sub-models in systems analysis. The objective is to develop a new 1-D framework, referred to as interpreted CFD (iCFD) models, in which statistical meta-models are used to calculate the pseudo-dispersion coefficient (D) as a function of design and flow boundary conditions. The method - presented in a straightforward and transparent way - is illustrated using the example of a circular secondary settling tank (SST). First, the significant design and flow factors are screened out by applying the statistical method of two-level fractional factorial design of experiments. Second, based on the number of significant factors identified through the factor screening study and system understanding, 50 different sets of design and flow conditions are selected using Latin Hypercube Sampling (LHS). The boundary condition sets are imposed on a 2-D axi-symmetrical CFD simulation model of the SST. In the framework, to degenerate the 2-D model structure, CFD model outputs are approximated by the 1-D model through the calibration of three different model structures for D. Correlation equations for the D parameter then are identified as a function of the selected design and flow boundary conditions (meta-models), and their accuracy is evaluated against D values estimated in each numerical experiment. The evaluation and validation of the iCFD model structure is carried out using scenario simulation results obtained with parameters sampled from the corners of the LHS experimental region. For the studied SST, additional iCFD model development was carried out in terms of (i) assessing different density current sub-models; (ii) implementation of a combined flocculation, hindered, transient and compression settling velocity function; and (iii
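Step two of the workflow, drawing 50 boundary-condition sets by Latin Hypercube Sampling, can be sketched with SciPy's quasi-Monte Carlo module (factor names and ranges below are illustrative placeholders, not those of the study):

```python
from scipy.stats import qmc

# 50 design/flow boundary-condition sets via Latin Hypercube Sampling,
# one coordinate per significant factor retained after screening.
sampler = qmc.LatinHypercube(d=3, seed=1)
unit = sampler.random(n=50)              # 50 points in the unit cube [0, 1]^3
lower = [0.5, 1.0, 100.0]                # hypothetical factor lower bounds
upper = [2.0, 5.0, 300.0]                # hypothetical factor upper bounds
designs = qmc.scale(unit, lower, upper)  # rescale to the factor ranges
print(designs.shape)  # (50, 3)
```

Each row of `designs` would then be imposed as one boundary-condition set on the 2-D CFD model, yielding one numerical experiment per row.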
Epoxy-coated containers easily opened by wire band
NASA Technical Reports Server (NTRS)
Mc Coy, J. W.
1966-01-01
Epoxy coating reduces punctures, abrasions, and contamination of synthetic cellular containers used for shipping and storing fragile goods and equipment. A wire band is wound around the closure joint, followed by the epoxy coating. The container can then be easily opened by pulling the wire through the epoxy around the joint.
[Easily implemented cognitive behaviour techniques in primary care (part 2)].
Ibáñez-Tarín, C; Manzanera-Escartí, R
2014-01-01
Cognitive behavioural therapy has been shown to be very effective for treating the vast majority of mental health disorders. In this second part of the article, we continue commenting on those techniques that can be easily used in the Primary Care setting. PMID:24210520
Micromanipulation tool is easily adapted to many uses
NASA Technical Reports Server (NTRS)
Shlichta, P. J.
1967-01-01
A special micromanipulation tool equipped with a plunger mounted in a small tube can be easily adapted to such work operations as cutting, precision clamping, and spot welding of microscopic filaments or other parts. This tool is valuable where extreme steadiness at high magnification is required.
Self-sealing, easily purged quick-disconnect hose coupling
NASA Technical Reports Server (NTRS)
Leyerle, R. B.
1970-01-01
Coupling for pressurized hoses handles gas or liquid, is easily purged, and automatically seals the hose when disconnected. Volatile or toxic materials can be isolated before the connection is broken. This device may interest food processors and manufacturers of fluid delivery systems.
An easily assembled laboratory exercise in computed tomography
NASA Astrophysics Data System (ADS)
Mylott, Elliot; Klepetka, Ryan; Dunlap, Justin C.; Widenhorn, Ralf
2011-09-01
In this paper, we present a laboratory activity in computed tomography (CT) primarily composed of a photogate and a rotary motion sensor that can be assembled quickly and partially automates data collection and analysis. We use an enclosure made with a light filter that is largely opaque in the visible spectrum but mostly transparent to the near IR light of the photogate (880 nm) to scan objects hidden from the human eye. This experiment effectively conveys how an image is formed during a CT scan and highlights the important physical and imaging concepts behind CT such as electromagnetic radiation, the interaction of light and matter, artefacts and windowing. Like our setup, previous undergraduate level laboratory activities which teach the basics of CT have also utilized light sources rather than x-rays; however, they required a more extensive setup and used devices not always easily found in undergraduate laboratories. Our setup is easily implemented with equipment found in many teaching laboratories.
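The image-formation principle the activity conveys can be illustrated with a toy parallel-beam simulation: project a phantom at several angles, then reconstruct by simple unfiltered back-projection (a sketch only; the classroom apparatus measures real attenuation profiles with the photogate rather than simulating them):

```python
import numpy as np
from scipy.ndimage import rotate

# Toy parallel-beam CT on a 64 x 64 phantom with a bright square "object"
phantom = np.zeros((64, 64))
phantom[24:40, 24:40] = 1.0

# Forward projection: rotate the object, sum along the beam direction
angles = np.arange(0, 180, 10)
sinogram = np.array([rotate(phantom, a, reshape=False).sum(axis=0)
                     for a in angles])

# Unfiltered back-projection: smear each profile back across the image
recon = np.zeros_like(phantom)
for a, proj in zip(angles, sinogram):
    smear = np.tile(proj, (64, 1))
    recon += rotate(smear, -a, reshape=False)
recon /= len(angles)
print(recon.shape)  # (64, 64)
```

The reconstruction is blurred, as unfiltered back-projection always is, which motivates the filtering and windowing concepts the activity highlights.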
An assessment of approximating aspheres with more easily manufactured surfaces.
Howells, M R; Anspach, J; Bender, J
1998-05-01
In designing optical systems for synchrotron radiation, one is often led to conclude that optimal performance can be obtained from optical surfaces described by conic sections of revolution, usually paraboloids and ellipsoids. The resulting design can lead to prescriptions for three-dimensional optical surfaces that are difficult to fabricate accurately. Under some circumstances satisfactory system performance can be achieved through the use of more easily manufactured surfaces such as cylinders, cones, bent cones, toroids and elliptical cylinders. These surfaces often have the additional benefits of scalability to large aperture, lower surface roughness and improved surface figure accuracy. In this paper we explore some of the conditions under which these more easily manufactured surfaces can be utilized without sacrificing performance.
[Easily closed gun-barrel enterostomy. A new technique].
Belliard, R; Saric, J; Dost, C; Vergne, P; Perissat, J
1982-05-15
The availability of continuous low-rate enteral and parenteral feeding has broadened the indications for enterostomy, notably in patients who have undergone multiple operations. However, closing an enterostomy, which may be high up in the small bowel, raises technical problems and is not always without risk. In this study, a new technique of gun-barrel enterostomy, easily closed with automatic sutures and without reopening the abdominal wall, is presented.
The advent of new higher throughput analytical instrumentation has put a strain on interpreting and explaining the results from complex studies. Contemporary human, environmental, and biomonitoring data sets are comprised of tens or hundreds of analytes, multiple repeat measures...
ERIC Educational Resources Information Center
Harrison, Judith; Thompson, Bruce; Vannest, Kimberly J.
2009-01-01
This article reviews the literature on interventions targeting the academic performance of students with attention-deficit/hyperactivity disorder (ADHD) and does so within the context of the statistical significance testing controversy. Both the arguments for and against null hypothesis statistical significance tests are reviewed. Recent standards…
Between Teacher & Parent: Helping the Child Who Cries Easily
ERIC Educational Resources Information Center
Brodkin, Adele M.
2004-01-01
Parents need to remember that crying is the first method of communication for children younger than 5 or 6. It is their way of getting attention. While it isn't easy for new parents to interpret their baby's cries, most learn to distinguish the "I am hungry--feed me" cry from the "My tummy hurts" or the "I am just fussy and bored" cry. This…
An easily fabricated high performance ionic polymer based sensor network
NASA Astrophysics Data System (ADS)
Zhu, Zicai; Wang, Yanjie; Hu, Xiaopin; Sun, Xiaofei; Chang, Longfei; Lu, Pin
2016-08-01
Ionic polymer materials can generate an electrical potential from ion migration under an external force. For traditional ionic polymer metal composite sensors, the output voltage is very small (a few millivolts), and the fabrication process is complex and time-consuming. This letter presents an ionic polymer based network of pressure sensors which is easily and quickly constructed, and which can generate high voltage. A 3 × 3 sensor array was prepared by casting Nafion solution directly over copper wires. Under applied pressure, two different levels of voltage response were observed among the nine nodes in the array. For the group producing the higher level, peak voltages reached as high as 25 mV. Computational stress analysis revealed the physical origin of the different responses. High voltages resulting from the stress concentration and asymmetric structure can be further utilized to modify subsequent designs to improve the performance of similar sensors.
Modern Matrons: can they be easily identified by hospital patients?
Bufton, Sally
Modern Matrons were introduced into hospital Trusts in April 2002 to improve the basics of patient care. They were to be easily identifiable, highly visible and authoritative figures. This article reports on a quantitative study done to ascertain whether patients can identify the Modern Matron in one acute NHS Trust. A researcher-developed questionnaire was sent to 20 Modern Matrons and a different questionnaire was distributed to 72 randomly selected patients. The results demonstrated that only 5% of patients surveyed were able to correctly identify the Modern Matron by their uniform. This may be explained by the response from the Modern Matrons when asked how much time was spent with patients: 67% of their normal working day was taken up with management of staff, paperwork and meetings, leaving very little direct patient time.
Design of easily testable and reconfigurable systolic arrays
Kim, J.H.
1987-01-01
Systolic arrays are considered to be preferred architectures for executing linear algebraic operations. In this thesis, easily testable and reconfigurable (ETAR) systolic arrays are studied to achieve yield enhancement. New 2-D systolic arrays that lend themselves to easy reconfiguration as well as efficient implementations of algorithms are proposed. The 2-D bidirectional and unidirectional systolic arrays proposed here are often better architectures than the rectangular and hexagonal systolic arrays proposed earlier, if one considers area, time and reconfigurability. Methods to design linear and 2-D ETAR systolic arrays are proposed, and procedures to design linear and 2-D unidirectional and bidirectional systolic arrays are given. The main feature of the proposed designs is that the COMUs of the PEs in the linear array can all be tested simultaneously. Another feature is that the throughputs of the reconfigured linear unidirectional and bidirectional arrays can remain equal to those of the fault-free linear arrays. A reconfiguration algorithm for 2-D systolic arrays is also proposed.
A highly versatile and easily configurable system for plant electrophysiology.
Gunsé, Benet; Poschenrieder, Charlotte; Rankl, Simone; Schröeder, Peter; Rodrigo-Moreno, Ana; Barceló, Juan
2016-01-01
In this study we present a highly versatile and easily configurable system for measuring plant electrophysiological parameters and ionic flow rates, connected to a computer-controlled, highly accurate positioning device. The modular software used allows easily customizable configurations for the measurement of electrophysiological parameters. Both the operational tests and the experiments already performed have been fully successful and rendered a low-noise and highly stable signal. Assembly, programming and configuration examples are discussed. The system is a powerful technique that not only gives precise measurement of plant electrophysiological status, but also allows easy development of ad hoc configurations that are not constrained to plant studies. •We developed a highly modular system for electrophysiology measurements that can be used on either organs or cells and performs either steady or dynamic intra- and extracellular measurements, taking advantage of the ease of visual object-oriented programming.•High accuracy in data acquisition under electrically noisy conditions allows it to run even in a laboratory close to electrical equipment that produces electrical noise.•The system improves on currently used systems for monitoring and controlling high-precision measurements and micromanipulation, providing an open and customizable environment for multiple experimental needs. PMID:27298766
Triazolophthalazines: Easily Accessible Compounds with Potent Antitubercular Activity.
Veau, Damien; Krykun, Serhii; Mori, Giorgia; Orena, Beatrice S; Pasca, Maria R; Frongia, Céline; Lobjois, Valérie; Chassaing, Stefan; Lherbet, Christian; Baltas, Michel
2016-05-19
Tuberculosis (TB) remains one of the major causes of death worldwide, in particular because of the emergence of multidrug-resistant TB. Herein we explored the potential of an alternative class of molecules as anti-TB agents. Thus, a series of novel 3-substituted triazolophthalazines was quickly and easily prepared from commercial hydralazine hydrochloride as starting material, and the compounds were further evaluated for their antimycobacterial activities and cytotoxicities. Four of the synthesized compounds were found to effectively inhibit the Mycobacterium tuberculosis (M.tb) H37Rv strain with minimum inhibitory concentration (MIC) values <10 μg mL−1, whereas no compounds displayed cytotoxicity against HCT116 human cell lines (IC50 >100 μM). More remarkably, the most potent compounds proved to be active to a similar extent against various multidrug-resistant M.tb strains, thus uncovering a mode of action distinct from that of standard antitubercular agents. Overall, their ease of preparation, combined with their attractive antimycobacterial activities, makes such triazolophthalazine-based derivatives promising leads for further development. PMID:27097919
ERIC Educational Resources Information Center
Bargmann, Rolf E.
The studies embodied in this report propose some statistical methods of ordering and attaining relevancy to help the educational researcher choose among such variables as tests and behavior ratings. Construction of a model for the analysis of contingency tables, determination of the most appropriate ordering principle in step-down analysis for the…
NASA Technical Reports Server (NTRS)
Feiveson, Alan H.; Foy, Millennia; Ploutz-Snyder, Robert; Fiedler, James
2014-01-01
Do you have elevated p-values? Is the data analysis process getting you down? Do you experience anxiety when you need to respond to criticism of statistical methods in your manuscript? You may be suffering from Insufficient Statistical Support Syndrome (ISSS). For symptomatic relief of ISSS, come for a free consultation with JSC biostatisticians at our help desk during the poster sessions at the HRP Investigators Workshop. Get answers to common questions about sample size, missing data, multiple testing, when to trust the results of your analyses and more. Side effects may include sudden loss of statistics anxiety, improved interpretation of your data, and increased confidence in your results.
Metview and VAPOR: Exploring ECMWF forecasts easily in four dimensions
NASA Astrophysics Data System (ADS)
Siemen, Stephan; Kertesz, Sandor; Carver, Glenn
2014-05-01
The European Centre for Medium-Range Weather Forecasts (ECMWF) is an international organisation providing its member states and co-operating states with forecasts in the medium time range of up to 15 days, as well as other forecasts and analyses. As part of its mission, ECMWF generates an increasing number of forecast data products for its users. To support the work of forecasters and researchers and to let them make best use of ECMWF forecasts, the Centre also provides tools and interfaces to visualise their products. This allows users to make use of and explore forecasts without having to transfer large amounts of raw data. This is especially true for products based on ECMWF's 50-member ensemble forecast. Users can choose to explore ECMWF's forecasts from the web or through visualisation tools installed locally or at ECMWF. ECMWF also develops, in co-operation with INPE, Brazil, the Metview meteorological workstation and batch system. Metview enables users to easily analyse and visualise forecasts, and is routinely used by scientists and forecasters at ECMWF and other institutions. While Metview offers high-quality visualisation in two-dimensional plots and animations, it uses external tools to visualise data in four dimensions. VAPOR is the Visualization and Analysis Platform for Ocean, Atmosphere, and Solar Researchers. VAPOR provides an interactive 3D visualisation environment that runs on most UNIX and Windows systems equipped with modern 3D graphics cards. VAPOR development is led by the National Center for Atmospheric Research's Scientific Computing Division in collaboration with U.C. Davis and Ohio State University. In this paper we will give an overview of how users, with Metview and access to ECMWF's archive, can visualise forecast data in four dimensions within VAPOR. The process of preparing the data in Metview is the key step and is described in detail. The benefits to researchers are highlighted with a case study analysing a given weather scenario.
Langley, R G; Reich, K
2013-12-01
Psoriasis is a chronic disease requiring long-term therapy, which makes finding treatments with favourable long-term safety and efficacy profiles crucial. The goal of this review is to provide the background needed to properly evaluate long-term studies of biologic treatments for psoriasis. Firstly, important elements of design and analysis strategies are described. Secondly, data from published trials of biologic therapies for psoriasis are reviewed in light of the design and analysis choices implemented in the studies. Published reports of clinical trials of biologic treatments (adalimumab, alefacept, etanercept, infliximab or ustekinumab) that lasted 33 weeks or longer and included efficacy results and statistical analysis were reviewed. Study designs and statistical analyses were evaluated and summarized, emphasizing patient follow-up methods and handling of missing data. Various trial designs and data handling methods are used in long-term studies of biologic psoriasis treatments. Responder analyses in long-term trials can be conducted in responder enrichment, re-treated nonresponder or intent-to-treat trials. Missing data can be handled in four ways, including, from most to least conservative, nonresponder imputation, last-observation-carried-forward, as-observed analysis and anytime analysis. Long-term clinical trials have shown that adalimumab, alefacept, etanercept, infliximab and ustekinumab are efficacious for psoriasis treatment; however, without common standards for these trials, direct comparisons of these agents are difficult. Understanding differences in trial design and data handling is essential to make informed treatment decisions. PMID:23937204
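Two of the four missing-data approaches named above, last-observation-carried-forward and nonresponder imputation, are easy to state in code (a minimal sketch with invented data; real trial analyses operate on full longitudinal records):

```python
import math

def locf(visits):
    """Last-observation-carried-forward over one patient's visit series.
    NaN marks a missing visit; leading missing values stay missing."""
    out = list(visits)
    for i in range(1, len(out)):
        if isinstance(out[i], float) and math.isnan(out[i]):
            out[i] = out[i - 1]
    return out

def nonresponder_imputation(responded, dropped_out):
    """Nonresponder imputation: any patient with missing outcome data
    is counted as a treatment failure."""
    return [False if missing else ok
            for ok, missing in zip(responded, dropped_out)]

print(locf([70.0, 65.0, float("nan"), float("nan")]))  # [70.0, 65.0, 65.0, 65.0]
print(nonresponder_imputation([True, True, True], [False, True, False]))  # [True, False, True]
```

The two functions make the conservatism ranking concrete: nonresponder imputation converts every dropout to a failure, while LOCF optimistically freezes the last measured value.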
Bogen, K; Hamilton, T F; Brown, T A; Martinelli, R E; Marchetti, A A; Kehl, S R; Langston, R G
2007-05-01
We have developed refined statistical and modeling techniques to assess low-level uptake and urinary excretion of plutonium from different population groups in the northern Marshall Islands. Urinary excretion rates of plutonium from the resident population on Enewetak Atoll and from resettlement workers living on Rongelap Atoll range from <1 to 8 µBq per day and are well below action levels established in the United States under Department of Energy regulation 10 CFR 835 for in vitro bioassay monitoring of ²³⁹Pu. However, our statistical analyses show that urinary excretion of plutonium-239 (²³⁹Pu) from both cohort groups is significantly positively associated with volunteer age, especially for the resident population living on Enewetak Atoll. Urinary excretion of ²³⁹Pu from the Enewetak cohort was also found to be positively associated with estimates of cumulative exposure to worldwide fallout. Consequently, the age-related trends in urinary excretion of plutonium from Marshallese populations can be described by a long-term component from residual systemic burdens acquired from previous exposures to worldwide fallout, by a prompt (and eventual long-term) component acquired from low-level systemic intakes of plutonium associated with resettlement of the northern Marshall Islands, or by some combination of both.
Making large amounts of meteorological plots easily accessible to users
NASA Astrophysics Data System (ADS)
Lamy-Thepaut, Sylvie; Siemen, Stephan; Sahin, Cihan; Raoult, Baudouin
2015-04-01
implementation, it presents the user's products in a single interface with fast access to the original product and possibilities for synchronous animation between products. Its functionality is being extended to give users the freedom to collect not only ecCharts's 2D maps and graphs, but also other ECMWF web products such as monthly and seasonal products, scores, and observation monitoring. The dashboard will play a key role in helping the user interpret the large amount of information that ECMWF provides. This talk will present examples of how the new user interface can organise complex meteorological maps and graphs, and will show the new possibilities users have gained by using the web as a medium.
P value interpretations and considerations
Ronna, Brenden; Ott, Ulrike
2016-01-01
Application and interpretation of statistical evaluations of relationships is a necessary element of biomedical research. Statistical analyses rely on the P value to demonstrate relationships. The traditional level of significance, P<0.05, can be negatively impacted by small sample size, bias, and random error, and its use has evolved to include interpretation of statistical trends, correction factors for multiple analyses, and acceptance of statistical significance at P>0.05 for complex relationships such as effect modification. PMID:27747028
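The sensitivity of P to sample size noted above can be demonstrated in a few lines (simulated data with an assumed true effect, analysed with SciPy's two-sample t-test):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
effect = 0.5                              # assumed true group difference
control = rng.normal(0.0, 1.0, 2000)
treated = rng.normal(effect, 1.0, 2000)

# The same underlying effect, tested at two sample sizes
p_small = stats.ttest_ind(control[:10], treated[:10]).pvalue
p_large = stats.ttest_ind(control, treated).pvalue
print(p_small, p_large)
```

With n = 10 per group the test is underpowered and the effect may well appear "non-significant"; with n = 2000 the same effect yields an extremely small P, which is why P values should be read alongside effect sizes and study context.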
Spirakis, C.S.; Pierson, C.T.; Santos, E.S.; Fishman, N.S.
1983-01-01
Statistical treatment of analytical data from 106 samples of uranium-mineralized and unmineralized or weakly mineralized rocks of the Morrison Formation from the northeastern part of the Church Rock area of the Grants uranium region indicates that along with uranium, the deposits in the northeast Church Rock area are enriched in barium, sulfur, sodium, vanadium and equivalent uranium. Selenium and molybdenum are sporadically enriched in the deposits and calcium, manganese, strontium, and yttrium are depleted. Unlike the primary deposits of the San Juan Basin, the deposits in the northeast part of the Church Rock area contain little organic carbon and several elements that are characteristically enriched in the primary deposits are not enriched or are enriched to a much lesser degree in the Church Rock deposits. The suite of elements associated with the deposits in the northeast part of the Church Rock area is also different from the suite of elements associated with the redistributed deposits in the Ambrosia Lake district. This suggests that the genesis of the Church Rock deposits is different, at least in part, from the genesis of the primary deposits of the San Juan Basin or the redistributed deposits at Ambrosia Lake.
NASA Astrophysics Data System (ADS)
Tema, E.; Zanella, E.; Pavón-Carrasco, F. J.; Kondopoulou, D.; Pavlides, S.
2015-10-01
We present the results of palaeomagnetic analysis of Late Bronze Age pottery from Santorini, carried out in order to estimate the thermal effect of the Minoan eruption on the pre-Minoan habitation level. A total of 170 specimens from 108 ceramic fragments have been studied. The ceramics were collected from the surface of the pre-Minoan palaeosol at six different sites, including samples from the Akrotiri archaeological site. The deposition temperatures of the first pyroclastic products have been estimated from the maximum overlap of the re-heating temperature intervals given by the individual fragments at site level. A new statistical treatment of the temperature data is also proposed, calculating the re-heating temperatures at each site at the 95 per cent probability level. The results show that the precursor tephra layer and the first pumice fall of the eruption were hot enough to re-heat the underlying ceramics to temperatures of 160-230 °C at the non-inhabited sites, while the temperatures recorded inside the Akrotiri village are slightly lower, varying from 130 to 200 °C. The lower temperatures registered in the human settlements suggest that there was some interaction between the buildings and the pumice fall deposits, while the layer of building debris produced by the preceding and syn-eruption earthquakes probably also contributed to the decrease of the recorded re-heating temperatures.
Linda Stetzenbach; Lauren Nemnich; Davor Novosel
2009-08-31
Three independent tasks were performed (Stetzenbach 2008, Stetzenbach 2008b, Stetzenbach 2009) to measure a variety of parameters in normative buildings across the United States. For each of these tasks, 10 buildings were selected as normative indoor environments. Task 01 focused on office buildings, Task 13 on public schools, and Task 0606 on high-performance buildings. To perform this task it was necessary to restructure the database for the Indoor Environmental Quality (IEQ) data and the sound measurements, as several issues were identified and resolved prior to and during the transfer of these data sets into SPSS. During overview discussions with the statistician engaged for this task, it was determined that because the indoor zones (1-6) were selected independently within each task, zones were not related by location across tasks. No comparison across zones would therefore be valid for the 30 buildings, so the by-location (zone) analyses were limited to three sets, one for the buildings within each task. In addition, different collection procedures for lighting were used in Task 0606 as compared with Tasks 01 and 13 to improve sample collection. These data sets therefore could not be merged and compared, so by-day analyses were run separately for Task 0606 and only the Task 01 and 13 data were merged. Results of the statistical analysis of the IEQ parameters show statistically significant differences among days and zones for all tasks, although no differences were found by day for the Draft Rate data from Task 0606 (p>0.05). Thursday measurements of IEQ parameters differed significantly from Tuesday, and from most Wednesday measures, for all variables of Tasks 01 and 13. Data for all three days appeared to vary for Operative Temperature, whereas only Tuesday and Thursday differed for Draft Rate at 1 m. Although no Draft Rate measures within Task 0606 were found to differ significantly by day, Temperature measurements for Tuesday and
NASA Astrophysics Data System (ADS)
Xu, C.; Shyu, J. B. H.; Xu, X.
2014-07-01
The 12 January 2010 Port-au-Prince, Haiti, earthquake (Mw = 7.0) triggered tens of thousands of landslides. The purpose of this study is to investigate the correlations of the occurrence of landslides and the thicknesses of their erosion with topographic, geologic, and seismic parameters. A total of 30 828 landslides triggered by the earthquake covered a total area of 15.736 km2, distributed over an area of more than 3000 km2, and the volume of landslide accumulation materials is estimated to be about 29 700 000 m3. These landslides are of various types, mostly shallow disrupted landslides and rock falls, but they also include coherent deep-seated landslides and rock slides. The landslides were delineated using pre- and post-earthquake high-resolution satellite images. Spatial distribution maps and contour maps of landslide number density, landslide area percentage, and landslide erosion thickness were constructed in order to analyze the spatial distribution patterns of co-seismic landslides. Statistics of the size distribution and morphometric parameters of co-seismic landslides were compiled and compared with those of other earthquake events in the world. Four proxies of co-seismic landslide abundance, landslide centroid number density (LCND), landslide top number density (LTND), landslide area percentage (LAP), and landslide erosion thickness (LET), were used to correlate co-seismic landslides with various environmental parameters. These parameters include elevation, slope angle, slope aspect, slope curvature, topographic position, distance from drainages, lithology, distance from the epicenter, distance from the Enriquillo-Plantain Garden fault, distance along the fault, and peak ground acceleration (PGA). A comparison of these parameters shows that slope angle has the strongest influence on co-seismic landslide occurrence. Our co-seismic landslide inventory is much more detailed than other inventories in several
NASA Astrophysics Data System (ADS)
Xu, C.; Shyu, J. B. H.; Xu, X.-W.
2014-02-01
The 12 January 2010 Port-au-Prince, Haiti, earthquake (Mw 7.0) triggered tens of thousands of landslides. The purpose of this study is to investigate the correlations of the occurrence of landslides and their erosion thicknesses with topographic factors, seismic parameters, and distance from roads. A total of 30 828 landslides triggered by the earthquake covered a total area of 15.736 km2, distributed over an area of more than 3000 km2, and the volume of landslide accumulation materials is estimated to be about 29 700 000 m3. These landslides are of various types, mostly shallow disrupted landslides and rock falls, but they also include coherent deep-seated landslides and rock slides. The landslides were delineated using pre- and post-earthquake high-resolution satellite images. Spatial distribution maps and contour maps of landslide number density, landslide area percentage, and landslide erosion thickness were constructed in order to analyze the spatial distribution patterns of co-seismic landslides. Statistics of the size distribution and morphometric parameters of co-seismic landslides were compiled and compared with those of other earthquake events in the world. Four proxies of co-seismic landslide abundance, landslide centroid number density (LCND), landslide top number density (LTND), landslide area percentage (LAP), and landslide erosion thickness (LET), were used to correlate co-seismic landslides with various landslide-controlling parameters. These controlling parameters include elevation, slope angle, slope aspect, slope curvature, topographic position, distance from drainages, lithology, distance from the epicenter, distance from the Enriquillo-Plantain Garden fault, distance along the fault, and peak ground acceleration (PGA). A comparison of these parameters shows that slope angle has the strongest influence on co-seismic landslide occurrence. Our co-seismic landslide inventory is much more
NASA Astrophysics Data System (ADS)
Msaddek, Mohamed Haythem; Moumni, Yahya; Chenini, Ismail; Mercier, Eric; Dlala, Mahmoud
2016-11-01
The quantitative analysis of fractures in carbonate rocks across termination folds is important for understanding the distribution and arrangement of fracture networks. In this study, we performed a quantitative analysis and interpretation of fracture networks to identify the fracture network type, using a multi-criteria statistical analysis. The distribution of directional families across all measured stations and their elemental distribution were examined first. We then analysed the directional criteria for each pair and each triplet of neighbouring stations. Finally, elemental analyses of fracture families crossing others were carried out. This methodology was applied to the folds of the Jebal Chamsi and Jebal Belkhir areas in south-western Tunisia, which are characterized by simple folds of carbonate geological formations. The global and elemental statistical analyses of the directional-family criteria show a random arrangement of fractures. However, elemental analysis of two and three neighbouring stations for families crossing one another shows a pseudo-organization of the fracture arrangements.
ERIC Educational Resources Information Center
DeHaan, Frank, Ed.
1977-01-01
Describes an interpretative experiment involving the application of symmetry and temperature-dependent proton and fluorine nmr spectroscopy to the solution of structural and kinetic problems in coordination chemistry. (MLH)
ERIC Educational Resources Information Center
Pankhurst, Anne
1994-01-01
This paper examines some of the problems associated with interpreting metonymy, a figure of speech in which an attribute or commonly associated feature is used to name or designate something. After defining metonymy and outlining the principles of metonymy, the paper explains the differences between metonymy, synecdoche, and metaphor. It is…
CAinterprTools: An R package to help interpreting Correspondence Analysis' results
NASA Astrophysics Data System (ADS)
Alberti, Gianmarco
2015-09-01
Correspondence Analysis (CA) is an exploratory statistical technique frequently used in many research fields to visualize the structure of contingency tables graphically. Many programs, both commercial and free, perform CA, but none of them as yet provides a visual aid to the interpretation of the results. The 'CAinterprTools' package, designed to be used in the free R statistical environment, aims at filling that gap. It targets novice-to-intermediate R users. Fifteen commands make it easy to obtain charts that help in (and are relevant to) interpreting the CA results, freeing the user from the need to inspect and scrutinize tabular CA outputs and to look up values and statistics on which further calculations would be necessary. The package also implements tests to assess the significance of the input table's total inertia and of the individual dimensions.
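The significance test for a table's total inertia mentioned above can be sketched outside R as well: in CA, total inertia equals the table's chi-square statistic divided by the grand total, so a chi-square test of the table doubles as a test of the inertia. A minimal sketch with a hypothetical contingency table (not an example from the package):

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical contingency table: rows = sites, columns = artifact types.
table = np.array([[30, 10,  5],
                  [15, 25, 10],
                  [ 5, 10, 20]])

chi2, p, dof, expected = chi2_contingency(table)
total_inertia = chi2 / table.sum()  # CA total inertia = chi2 / grand total
print(f"total inertia = {total_inertia:.3f}, p = {p:.3g}")
```

A significant p-value here indicates that the table's departure from row/column independence, which is what the CA map displays, is unlikely to be noise.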
A flexible, interpretable framework for assessing sensitivity to unmeasured confounding.
Dorie, Vincent; Harada, Masataka; Carnegie, Nicole Bohme; Hill, Jennifer
2016-09-10
When estimating causal effects, unmeasured confounding and model misspecification are both potential sources of bias. We propose a method to simultaneously address both issues in the form of a semi-parametric sensitivity analysis. In particular, our approach incorporates Bayesian Additive Regression Trees into a two-parameter sensitivity analysis strategy that assesses sensitivity of posterior distributions of treatment effects to choices of sensitivity parameters. This results in an easily interpretable framework for testing for the impact of an unmeasured confounder that also limits the number of modeling assumptions. We evaluate our approach in a large-scale simulation setting and with high blood pressure data taken from the Third National Health and Nutrition Examination Survey. The model is implemented as open-source software, integrated into the treatSens package for the R statistical programming language. © 2016 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd. PMID:27139250
Reeve, Joanne
2010-01-01
Patient-centredness is a core value of general practice; it is defined as the interpersonal processes that support the holistic care of individuals. To date, efforts to demonstrate their relationship to patient outcomes have been disappointing, whilst some studies suggest values may be more rhetoric than reality. Contextual issues influence the quality of patient-centred consultations, impacting on outcomes. The legitimate use of knowledge, or evidence, is a defining aspect of modern practice, and has implications for patient-centredness. Based on a critical review of the literature, on my own empirical research, and on reflections from my clinical practice, I critique current models of the use of knowledge in supporting individualised care. Evidence-Based Medicine (EBM), and its implementation within health policy as Scientific Bureaucratic Medicine (SBM), define best evidence in terms of an epistemological emphasis on scientific knowledge over clinical experience. It provides objective knowledge of disease, including quantitative estimates of the certainty of that knowledge. Whilst arguably appropriate for secondary care, involving episodic care of selected populations referred in for specialist diagnosis and treatment of disease, application to general practice can be questioned given the complex, dynamic and uncertain nature of much of the illness that is treated. I propose that general practice is better described by a model of Interpretive Medicine (IM): the critical, thoughtful, professional use of an appropriate range of knowledges in the dynamic, shared exploration and interpretation of individual illness experience, in order to support the creative capacity of individuals in maintaining their daily lives. Whilst the generation of interpreted knowledge is an essential part of daily general practice, the profession does not have an adequate framework by which this activity can be externally judged to have been done well. Drawing on theory related to the
SLAR image interpretation keys for geographic analysis
NASA Technical Reports Server (NTRS)
Coiner, J. C.
1972-01-01
A means for side-looking airborne radar (SLAR) imagery to become a more widely used data source in geoscience and agriculture is suggested by providing interpretation keys as an easily implemented interpretation model. Interpretation problems faced by the researcher wishing to employ SLAR are specifically described, and the use of various types of image interpretation keys to overcome these problems is suggested. With examples drawn from agriculture and vegetation mapping, direct and associate dichotomous image interpretation keys are discussed and methods of constructing keys are outlined. Initial testing of the keys, key-based automated decision rules, and the role of the keys in an information system for agriculture are developed.
CT Colonography: Pitfalls in Interpretation
Pickhardt, Perry J.; Kim, David H.
2012-01-01
As with any radiologic imaging test, there are a number of potential interpretive pitfalls at CT colonography (CTC) that need to be recognized and handled appropriately. Perhaps the single most important step in learning to avoid most of these diagnostic traps is simply to be aware of their existence. With a little experience, most of these potential pitfalls will be easily recognized. This review systematically covers the key pitfalls confronting the radiologist at CTC interpretation, dividing them primarily into those related to technique and those related to underlying anatomy. Tips and pointers for handling these potential pitfalls effectively are included. PMID:23182508
LED champing: statistically blessed?
Wang, Zhuo
2015-06-10
LED champing (smart mixing of individual LEDs to match the desired color and lumens) and color-mixing strategies have been widely used to maintain the color consistency of light engines. Light engines with champed LEDs can easily achieve color consistency within a couple of MacAdam steps, even starting from widely distributed LEDs. From a statistical point of view, the distributions of the color coordinates and of the flux after champing are studied. The related statistical parameters are derived, which facilitate process improvements such as Six Sigma and are instrumental for statistical quality control in mass production. PMID:26192863
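Under the simplifying assumption that a light engine's total flux is the sum of independent LED fluxes, the parameters of the mixed distribution follow from the additivity of means and variances. A minimal sketch with hypothetical per-LED statistics (not values from the paper):

```python
import math

# Hypothetical (mean, variance) of luminous flux, in lumens, for three
# LEDs drawn from different bins and combined into one light engine.
leds = [(100.0, 4.0), (95.0, 9.0), (110.0, 6.25)]

# For independent LEDs whose fluxes add, means add and variances add.
mean_total = sum(m for m, v in leds)
var_total = sum(v for m, v in leds)
sd_total = math.sqrt(var_total)
print(mean_total, sd_total)  # mean 305.0 lm, sd ~ 4.39 lm
```

The same additivity argument is what lets a mixing strategy trade a wide per-bin spread for a tight spread of the engine-level total.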
ERIC Educational Resources Information Center
Rettich, Timothy R.; Battino, Rubin
1989-01-01
Presents a low cost system with easily replaced electrodes for use in general chemistry. Notes the accuracy and wide applicability permit easy use in physical or quantitative chemistry experiments. Provides schematic, theory, and helpful suggestions. (MVL)
How to limit clinical errors in interpretation of data.
Wright, P; Jansen, C; Wyatt, J C
1998-11-01
We all assume that we can understand and correctly interpret what we read. However, interpretation is a collection of subtle processes that are easily influenced by poor presentation or wording of information. This article examines how evidence-based principles of information design can be applied to medical records to enhance clinical understanding and accuracy in interpretation of the detailed data that they contain.
Invention Activities Support Statistical Reasoning
ERIC Educational Resources Information Center
Smith, Carmen Petrick; Kenlan, Kris
2016-01-01
Students' experiences with statistics and data analysis in middle school are often limited to little more than making and interpreting graphs. Although students may develop fluency in statistical procedures and vocabulary, they frequently lack the skills necessary to apply statistical reasoning in situations other than clear-cut textbook examples.…
ERIC Educational Resources Information Center
Adams, Wendy K.; Alhadlaq, Hisham; Malley, Christopher V.; Perkins, Katherine K.; Olson, Jonathan; Alshaya, Fahad; Alabdulkareem, Saleh; Wieman, Carl E.
2012-01-01
The PhET Interactive Simulations Project partnered with the Excellence Research Center of Science and Mathematics Education at King Saud University with the joint goal of making simulations useable worldwide. One of the main challenges of this partnership is to make PhET simulations and the website easily translatable into any language. The PhET…
Cosmic statistics of statistics
NASA Astrophysics Data System (ADS)
Szapudi, István; Colombi, Stéphane; Bernardeau, Francis
1999-12-01
The errors on statistics measured in finite galaxy catalogues are exhaustively investigated. The theory of errors on factorial moments by Szapudi & Colombi is applied to cumulants via a series expansion method. All results are subsequently extended to the weakly non-linear regime. Together with previous investigations this yields an analytic theory of the errors for moments and connected moments of counts in cells from highly non-linear to weakly non-linear scales. For non-linear functions of unbiased estimators, such as the cumulants, the phenomenon of cosmic bias is identified and computed. Since it is subdued by the cosmic errors in the range of applicability of the theory, correction for it is inconsequential. In addition, the method of Colombi, Szapudi & Szalay concerning sampling effects is generalized, adapting the theory for inhomogeneous galaxy catalogues. While previous work focused on the variance only, the present article calculates the cross-correlations between moments and connected moments as well, for a statistically complete description. The final analytic formulae representing the full theory are explicit but somewhat complicated. Therefore we have made available a Fortran program capable of calculating the described quantities numerically (for further details e-mail SC at colombi@iap.fr). An important special case is the evaluation of the errors on the two-point correlation function, for which this should be more accurate than any method put forward previously. This tool will be immensely useful in the future for assessing the precision of measurements from existing catalogues, as well as aiding the design of new galaxy surveys. To illustrate the applicability of the results and to explore the numerical aspects of the theory qualitatively and quantitatively, the errors and cross-correlations are predicted under a wide range of assumptions for the future Sloan Digital Sky Survey. The principal results concerning the cumulants ξ, Q3 and Q4 are that
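The factorial moments at the base of this error theory are straightforward to estimate from counts in cells. A toy sketch on a synthetic, unclustered Poisson field, for which the k-th factorial moment of a field with mean λ equals λ^k (so any measured excess signals clustering); the cell counts here are simulated, not survey data:

```python
import numpy as np

rng = np.random.default_rng(0)
counts = rng.poisson(3.0, size=10_000)  # synthetic galaxy counts in 10,000 cells

def factorial_moment(n, k):
    """Estimate F_k = <N(N-1)...(N-k+1)>, the k-th factorial moment
    of the counts-in-cells distribution."""
    prod = np.ones_like(n, dtype=float)
    for j in range(k):
        prod *= (n - j)
    return prod.mean()

# For a Poisson field of mean 3, F_2 should be close to 3**2 = 9.
print(factorial_moment(counts, 2))
```

Factorial moments, unlike raw moments, automatically subtract the discreteness (shot-noise) contribution, which is why the error theory is formulated in terms of them.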
ERIC Educational Resources Information Center
Scott, Leslie A.; Ingels, Steven J.
2007-01-01
The search for an understandable reporting format has led the National Assessment Governing Board to explore the possibility of measuring and interpreting student performance on the 12th-grade National Assessment of Educational Progress (NAEP), the Nation's Report Card, in terms of readiness for college, the workplace, and the military. This…
Statistical Reform in School Psychology Research: A Synthesis
ERIC Educational Resources Information Center
Swaminathan, Hariharan; Rogers, H. Jane
2007-01-01
Statistical reform in school psychology research is discussed in terms of research designs, measurement issues, statistical modeling and analysis procedures, interpretation and reporting of statistical results, and finally statistics education.
An easily regenerable enzyme reactor prepared from polymerized high internal phase emulsions.
Ruan, Guihua; Wu, Zhenwei; Huang, Yipeng; Wei, Meiping; Su, Rihui; Du, Fuyou
2016-04-22
A large-scale, high-efficiency enzyme reactor based on a polymerized high internal phase emulsion monolith (polyHIPE) was prepared. First, a porous cross-linked polyHIPE monolith was prepared by in-situ thermal polymerization of a high internal phase emulsion containing styrene, divinylbenzene and polyglutaraldehyde. The enzyme TPCK-trypsin was then immobilized on the monolithic polyHIPE. The performance of the resulting enzyme reactor was assessed by its ability to convert Nα-benzoyl-l-arginine ethyl ester to Nα-benzoyl-l-arginine, and by the protein digestibility of bovine serum albumin (BSA) and cytochrome c (Cyt-C). The results showed that the prepared enzyme reactor exhibited high enzyme-immobilization efficiency and fast, easily controlled protein digestion. BSA and Cyt-C could be digested in 10 min with sequence coverages of 59% and 78%, respectively. The peptides and residual protein could be easily rinsed out of the reactor, and the reactor could be regenerated easily with 4 M HCl without any structural destruction. Its multiple interconnected chambers with good permeability, fast digestion, and easy reproducibility indicate that the polyHIPE enzyme reactor is a good candidate for application in proteomics and catalysis.
Summary and interpretive synthesis
1995-05-01
This chapter summarizes the major advances made through our integrated geological studies of the Lisburne Group in northern Alaska. The depositional history of the Lisburne Group is discussed in a framework of depositional sequence stratigraphy. Although individual parasequences (small-scale carbonate cycles) of the Wahoo Limestone cannot be correlated with certainty, parasequence sets can be interpreted as different systems tracts within the large-scale depositional sequences, providing insights into the paleoenvironments, paleogeography, and platform geometry. Conodont biostratigraphy precisely established the position of the Mississippian-Pennsylvanian boundary within an important reference section, where the established foraminiferal biostratigraphy is inconsistent with conodont-based time-rock boundaries. However, existing Carboniferous conodont zonations are not readily applicable because most zonal indicators are absent, so a local zonation scheme was developed. Diagenetic studies of the Lisburne Group recognized nineteen subaerial exposure surfaces and developed a cement stratigraphy that includes early cements associated with subaerial exposure surfaces in the Lisburne Group, cements associated with the sub-Permian unconformity, and later burial cements. Subaerial exposure surfaces in the Alapah Limestone are easily explained, being associated with peritidal environments at the boundaries of Sequence A. The Lisburne Group exposed in ANWR is generally tightly cemented and supermature, but it could still be a good reservoir target in the adjacent subsurface of ANWR given the appropriate diagenetic, deformational, and thermal history. We hope that our ongoing research on the Lisburne Group will provide additional insights in future publications.
Institute of Paper Science Technology
2004-01-30
In recent years, the world has expressed an increasing interest in the recycling of waste paper to supplement the use of virgin fiber as a way to protect the environment. Statistics show that major countries are increasing their use of recycled paper. For example, from 1991 to 1996 the U.S. increased its recovered-paper utilization rate from 31% to 39%, Germany went from 50% to 60%, the UK went from 60% to 70%, France increased from 46% to 49%, and China went from 32% to 35% [1]. As recycled fiber levels and water-system closures both increase, recycled product quality will need to improve in order for recycled products to compete with products made from virgin fiber [2]. The use of recycled fiber has introduced an increasing level of metal, plastic, and adhesive contamination into the papermaking process, which has added to the complexity of the already overwhelming task of providing a uniform and clean recycled furnish. The most harmful of these contaminants is a mixture of adhesives and polymeric substances commonly known as stickies. Stickies, which enter the mill with the pulp furnish, are not easily removed in the repulper and become more difficult to remove the further down the system they travel. This can be detrimental to final product quality. Stickies are hydrophobic, tacky, polymeric materials that are introduced into the papermaking system from a mixture of recycled fiber sources. The properties of stickies are very similar to those of the fibers used in papermaking, viz. size, density, hydrophobicity, and electrokinetic charge. This reduces the probability of their removal by conventional separation processes, such as screening and cleaning, which are based on such properties. Moreover, their physical and chemical structure allows them to extrude through screens and to attach to fibers, process equipment, wires, and felts. Stickies can break down, then reagglomerate, and appear at seemingly any place in the mill. When subjected to a number of factors including changes
Interpreters, Interpreting, and the Study of Bilingualism.
ERIC Educational Resources Information Center
Valdes, Guadalupe; Angelelli, Claudia
2003-01-01
Discusses research on interpreting focused specifically on issues raised by this literature about the nature of bilingualism. Suggests research carried out on interpreting--while primarily produced with a professional audience in mind and concerned with improving the practice of interpreting--provides valuable insights about complex aspects of…
Alkaline-encrusted pyelitis and cystitis: an easily missed and life-threatening urinary infection
Lieten, Siddhartha; Schelfaut, Dan; Wissing, Karl Martin; Geers, Caroline; Tielemans, Christian
2011-01-01
Alkaline-encrusted pyelitis is a urinary infection characterised by encrustations in the wall of the urinary tract. It is caused by fastidious, urea-splitting microorganisms, mainly Corynebacterium group D2. The diagnosis is easily missed; it should be suspected on the basis of sterile pyuria, alkaline urine pH, and calcifications of the urinary excretory tract on CT scan, and then confirmed by prolonged culture on appropriate media. The authors report here the case of a patient who, after a delayed diagnosis, died of recurrent septic urinary infections. PMID:22700348
A Graphical Interpretation of Probit Coefficients.
ERIC Educational Resources Information Center
Becker, William E.; Waldman, Donald M.
1989-01-01
Contends that, when discrete choice models are taught, particularly the probit model, it is the method rather than the interpretation of the results that is emphasized. This article provides a graphical technique for interpretation of an estimated probit coefficient that will be useful in statistics and econometrics courses. (GG)
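The graphical point the abstract makes rests on the fact that a probit coefficient is not itself a marginal effect: the effect of a regressor on the choice probability is the coefficient scaled by the standard normal density evaluated at x'β, so it varies along the curve. A minimal sketch with illustrative numbers (not from the article):

```python
import math

def probit_marginal_effect(beta, xb):
    """Marginal effect dP/dx = phi(x'beta) * beta for a probit model,
    where phi is the standard normal density."""
    phi = math.exp(-xb ** 2 / 2) / math.sqrt(2 * math.pi)
    return phi * beta

# A coefficient of 0.5 evaluated at x'beta = 0 (where P = 0.5):
print(probit_marginal_effect(0.5, 0.0))  # ~ 0.199
# The same coefficient far in the tail has a much smaller effect:
print(probit_marginal_effect(0.5, 2.0))  # ~ 0.027
```

This is why the graphical reading matters: the same estimated coefficient implies a large probability change near the middle of the curve and a small one in the tails.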
An AAA-DDD triply hydrogen-bonded complex easily accessible for supramolecular polymers.
Han, Yi-Fei; Chen, Wen-Qiang; Wang, Hong-Bo; Yuan, Ying-Xue; Wu, Na-Na; Song, Xiang-Zhi; Yang, Lan
2014-12-15
For a complementary hydrogen-bonded complex in which every hydrogen-bond acceptor is on one side and every hydrogen-bond donor is on the other, all secondary interactions are attractive and the complex is highly stable. AAA-DDD (A=acceptor, D=donor) is considered the most stable of the triply hydrogen-bonded sequences. An AAA-DDD system that is easily synthesized and further derivatized is very desirable for hydrogen-bonded functional materials. In this work, AAA and DDD, starting from 4-methoxybenzaldehyde, were synthesized with the Hantzsch pyridine synthesis and the Friedländer annulation reaction. The association constant determined by fluorescence titration in chloroform at room temperature is 2.09×10⁷ M⁻¹. The AAA and DDD components are not coplanar, but form a V shape in the solid state. Supramolecular polymers based on the AAA-DDD triple hydrogen bond have also been developed. This work may make AAA-DDD triply hydrogen-bonded sequences easily accessible for stimuli-responsive materials.
NASA Astrophysics Data System (ADS)
Adams, Wendy K.; Alhadlaq, Hisham; Malley, Christopher V.; Perkins, Katherine K.; Olson, Jonathan; Alshaya, Fahad; Alabdulkareem, Saleh; Wieman, Carl E.
2012-02-01
The PhET Interactive Simulations Project partnered with the Excellence Research Center of Science and Mathematics Education at King Saud University with the joint goal of making simulations useable worldwide. One of the main challenges of this partnership is to make PhET simulations and the website easily translatable into any language. The PhET project team overcame this challenge by creating the Translation Utility. This tool allows a person fluent in both English and another language to easily translate any of the PhET simulations and requires minimal computer expertise. In this paper we discuss the technical issues involved in this software solution, as well as the issues involved in obtaining accurate translations. We share our solutions to many of the unexpected problems we encountered that would apply generally to making on-line scientific course materials available in many different languages, including working with: languages written right-to-left, different character sets, and different conventions for expressing equations, variables, units and scientific notation.
Targeting Lexicon in Interpreting.
ERIC Educational Resources Information Center
Farghal, Mohammed; Shakir, Abdullah
1994-01-01
Studies student interpreters in the Master's Translation Program at Yarmouk University in Jordan. Analyzes the difficulties of these students, particularly regarding lexical competence, when interpreting from Arabic to English, emphasizing the need to teach lexicon all through interpreting programs. (HB)
Motivating Play Using Statistical Reasoning
ERIC Educational Resources Information Center
Cross Francis, Dionne I.; Hudson, Rick A.; Lee, Mi Yeon; Rapacki, Lauren; Vesperman, Crystal Marie
2014-01-01
Statistical literacy is essential in everyone's personal lives as consumers, citizens, and professionals. To make informed life and professional decisions, students are required to read, understand, and interpret vast amounts of information, much of which is quantitative. To develop statistical literacy so students are able to make sense of…
SOCR: Statistics Online Computational Resource
ERIC Educational Resources Information Center
Dinov, Ivo D.
2006-01-01
The need for hands-on computer laboratory experience in undergraduate and graduate statistics education has been firmly established in the past decade. As a result a number of attempts have been undertaken to develop novel approaches for problem-driven statistical thinking, data analysis and result interpretation. In this paper we describe an…
Statistics 101 for Radiologists.
Anvari, Arash; Halpern, Elkan F; Samir, Anthony E
2015-10-01
Diagnostic tests have wide clinical applications, including screening, diagnosis, measuring treatment effect, and determining prognosis. Interpreting diagnostic test results requires an understanding of key statistical concepts used to evaluate test efficacy. This review explains descriptive statistics and discusses probability, including mutually exclusive and independent events and conditional probability. In the inferential statistics section, a statistical perspective on study design is provided, together with an explanation of how to select appropriate statistical tests. Key concepts in recruiting study samples are discussed, including representativeness and random sampling. Variable types are defined, including predictor, outcome, and covariate variables, and the relationship of these variables to one another. In the hypothesis testing section, we explain how to determine if observed differences between groups are likely to be due to chance. We explain type I and II errors, statistical significance, and study power, followed by an explanation of effect sizes and how confidence intervals can be used to generalize observed effect sizes to the larger population. Statistical tests are explained in four categories: t tests and analysis of variance, proportion analysis tests, nonparametric tests, and regression techniques. We discuss sensitivity, specificity, accuracy, receiver operating characteristic analysis, and likelihood ratios. Measures of reliability and agreement, including κ statistics, intraclass correlation coefficients, and Bland-Altman graphs and analysis, are introduced. PMID:26466186
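The diagnostic-test measures surveyed in this review all follow directly from a 2×2 confusion matrix. As a hedged sketch of how they relate (the function name and the counts below are illustrative assumptions, not taken from the article):

```python
# Illustrative only: common diagnostic-test statistics from a hypothetical
# 2x2 confusion matrix (tp/fp/fn/tn counts are made-up example numbers).
def diagnostic_stats(tp, fp, fn, tn):
    """Return sensitivity, specificity, accuracy, and likelihood ratios."""
    sensitivity = tp / (tp + fn)               # P(test+ | disease present)
    specificity = tn / (tn + fp)               # P(test- | disease absent)
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    lr_pos = sensitivity / (1 - specificity)   # positive likelihood ratio
    lr_neg = (1 - sensitivity) / specificity   # negative likelihood ratio
    return sensitivity, specificity, accuracy, lr_pos, lr_neg

sens, spec, acc, lrp, lrn = diagnostic_stats(tp=90, fp=20, fn=10, tn=80)
print(f"sensitivity={sens:.2f} specificity={spec:.2f} "
      f"accuracy={acc:.2f} LR+={lrp:.2f} LR-={lrn:.2f}")
```

With these example counts, sensitivity is 0.90, specificity 0.80, and the positive likelihood ratio 4.5, i.e. a positive result is 4.5 times as likely in a diseased as in a non-diseased patient.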
NASA Astrophysics Data System (ADS)
Ichihara, Yasuyo G.; Okabe, Masataka; Iga, Koichi; Tanaka, Yosuke; Musha, Kohei; Ito, Kei
2008-01-01
The objective of this project is to establish a practical application of the concept of Color Universal Design (CUD), design that is recognizable to all color-vision types. In our research, we looked for a clearly distinguishable combination of hues of four colors - black, red, green, and blue - which are frequently used in these circumstances. People with red-green color-vision deficiency do not confuse every kind of red with every kind of green. By selecting particular hues for each color, the ability to distinguish between the four colors should be greatly improved. Our study thus concluded that, by carefully selecting hues within the range of each color category, it is possible to establish color combinations that are easily distinguishable to people of all color-vision types, thereby facilitating visual communication.
A 2D zinc-organic network being easily exfoliated into isolated sheets
NASA Astrophysics Data System (ADS)
Yu, Guihong; Li, Ruiqing; Leng, Zhihua; Gan, Shucai
2016-08-01
A metal-organic aggregate, namely {Zn2Cl2(BBC)}n (BBC = 4,4‧,4‧‧-(benzene-1,3,5-triyl-tris(benzene-4,1-diyl))tribenzoate), was obtained by solvothermal synthesis. Its structure features Zn2(COO)3 paddle-wheels with two chloride anions in the axial positions and hexagonal pores in the layers. The exclusion of water from the precursor and the solvent plays a crucial role in the formation of the target compound. The compound is easily dissolved in alkaline solution and exfoliated into isolated sheets, suggesting a novel route for the preparation of 2D materials.
Solving block linear systems with low-rank off-diagonal blocks is easily parallelizable
Menkov, V.
1996-12-31
An easily and efficiently parallelizable direct method is given for solving a block linear system Bx = y, where B = D + Q is the sum of a non-singular block diagonal matrix D and a matrix Q with low-rank blocks. This implicitly defines a new preconditioning method with an operation count close to the cost of calculating a matrix-vector product Qw for some w, plus at most twice the cost of calculating such a product. When implemented on a parallel machine, the processor utilization can be as good as that of those operations. Order estimates are given for the general case, and an implementation is compared to block SSOR preconditioning.
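A minimal sketch of the underlying idea (an assumption-laden illustration, not Menkov's implementation): if the low-rank part is factored as Q = U Vᵀ, the Sherman-Morrison-Woodbury identity reduces solving (D + U Vᵀ)x = y to solves with the block-diagonal D plus one small k×k dense system; the D-solves and the products with U and V are the embarrassingly parallel parts. For brevity D is taken here as plain diagonal:

```python
import numpy as np

# Hypothetical sketch: solve (diag(d) + U @ V.T) x = y via the
# Sherman-Morrison-Woodbury identity, where U, V are n x k with small k.
def woodbury_solve(d, U, V, y):
    """Use only diagonal solves and one small k x k dense system."""
    Dinv_y = y / d                  # D^{-1} y (parallel per block/entry)
    Dinv_U = U / d[:, None]         # D^{-1} U
    k = U.shape[1]
    S = np.eye(k) + V.T @ Dinv_U    # small "capacitance" matrix I + V^T D^{-1} U
    # x = D^{-1} y - D^{-1} U S^{-1} V^T D^{-1} y
    return Dinv_y - Dinv_U @ np.linalg.solve(S, V.T @ Dinv_y)

rng = np.random.default_rng(0)
n, k = 8, 2
d = rng.uniform(1.0, 2.0, n)
U, V = rng.standard_normal((n, k)), rng.standard_normal((n, k))
y = rng.standard_normal(n)
x = woodbury_solve(d, U, V, y)
print(np.allclose((np.diag(d) + U @ V.T) @ x, y))  # True
```

Only the k×k system is solved densely, so for k ≪ n the cost is dominated by the products with U and V, matching the operation count the abstract describes.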
NASA Astrophysics Data System (ADS)
Julián-López, Beatriz; Gonell, Francisco; Lima, Patricia P.; Freitas, Vânia T.; André, Paulo S.; Carlos, Luis D.; Ferreira, Rute A. S.
2015-10-01
This manuscript reports the synthesis and characterization of the first organic-inorganic hybrid material exhibiting efficient multimodal spectral-converting properties. The nanocomposite, made of Er3+,Yb3+-codoped zirconia nanoparticles (NPs) entrapped in a di-ureasil d-U(600) hybrid matrix, is prepared by an easy two-step sol-gel synthesis leading to homogeneous and transparent materials that can be very easily processed as monoliths or films. Extensive structural characterization reveals that zirconia nanocrystals 10-20 nm in size are efficiently dispersed in the hybrid matrix and that the local structure of the di-ureasil is not affected by the presence of the NPs. A significant enhancement in the refractive index of the di-ureasil matrix with the incorporation of the ZrO2 nanocrystals is observed. The optical study demonstrates that the luminescent properties of both constituents are perfectly preserved in the final hybrid. Thus, the material displays white-light photoluminescence from the di-ureasil component upon UV/visible excitation, as well as intense green and red emissions from the Er3+- and Yb3+-doped NPs after NIR excitation. The dynamics of the optical processes were also studied as a function of the lanthanide content and the thickness of the films. Our results indicate that these luminescent hybrids represent a low-cost, environmentally friendly, size-controlled, easily processed and chemically stable alternative material for use in light-harvesting devices such as luminescent solar concentrators, optical fibres and sensors. Furthermore, this synthetic approach can be extended to a wide variety of luminescent NPs entrapped in hybrid matrices, leading to multifunctional and versatile materials for efficient tuneable nonlinear optical nanodevices.
The study on development of easily chewable and swallowable foods for elderly
Kim, Soojeong
2015-01-01
BACKGROUND/OBJECTIVES: When the functions involved in the ingestion of food fail, not only is the enjoyment of eating lost, but protein-energy malnutrition may follow. Chewing and swallowing difficulties occur in various diseases and are a major consequence of aging, and the number of elderly people with such difficulties is expected to increase rapidly in an aging society. SUBJECTS/METHODS: In this study, we surveyed nutritionists working in elderly care facilities, examining the characteristics of the foods offered and the demand for the development of easily chewable and swallowable foods for elderly people who can crush food with their tongues but sometimes have difficulty drinking water or tea. RESULTS: Elderly care facilities were found to provide finely chopped food, or food ground with water in a blender, to residents with chewing difficulties. Overall satisfaction with the foods provided was low. When the applicability of foods for the elderly and the willingness to include them on menus were investigated, gelification, a technique from molecular gastronomy, received the highest response rate. Among frequently served foods based on beef, pork, white fish, anchovies, and spinach, the dishes offered most often were Korean barbecue beef, hot-pepper-paste stir-fried pork, pan-fried white fish, stir-fried anchovy, and seasoned spinach. CONCLUSIONS: This study provides a basis for the development of easily chewable and swallowable foods for the elderly using gelification. It also suggests that gelified foods may reduce the risk of food going down the wrong pipe and improve overall food preference in the elderly. PMID:26244082
Smith, Alwyn
1969-01-01
This paper is based on an analysis of questionnaires sent to the health ministries of Member States of WHO asking for information about the extent, nature, and scope of morbidity statistical information. It is clear that most countries collect some statistics of morbidity and many countries collect extensive data. However, few countries relate their collection to the needs of health administrators for information, and many countries collect statistics principally for publication in annual volumes which may appear anything up to 3 years after the year to which they refer. The desiderata of morbidity statistics may be summarized as reliability, representativeness, and relevance to current health problems. PMID:5306722
ERIC Educational Resources Information Center
Petocz, Peter; Sowey, Eric
2012-01-01
The term "data snooping" refers to the practice of choosing which statistical analyses to apply to a set of data after having first looked at those data. Data snooping contradicts a fundamental precept of applied statistics, that the scheme of analysis is to be planned in advance. In this column, the authors shall elucidate the statistical…
Easily regenerable solid adsorbents based on polyamines for carbon dioxide capture from the air.
Goeppert, Alain; Zhang, Hang; Czaun, Miklos; May, Robert B; Prakash, G K Surya; Olah, George A; Narayanan, S R
2014-05-01
Adsorbents prepared easily by impregnation of fumed silica with polyethylenimine (PEI) are promising candidates for the capture of CO2 directly from the air. These inexpensive adsorbents have a high CO2 adsorption capacity at ambient temperature and can be regenerated in repeated cycles under mild conditions. Despite the very low CO2 concentration, they are able to scrub all CO2 out of the air efficiently in the initial hours of the experiments. The influence of parameters such as PEI loading, adsorption and desorption temperature, particle size, and PEI molecular weight on the adsorption behavior was investigated. The mild regeneration temperatures required could allow the use of waste heat available in many industrial processes, as well as solar heat. CO2 adsorption from the air has a number of applications. Removal of CO2 from a closed environment, such as a submarine or a space vehicle, is essential for life support. The supply of CO2-free air is also critical for alkaline fuel cells and batteries. Direct air capture of CO2 could also help mitigate the rising concerns about atmospheric CO2 concentration and associated climatic changes while, at the same time, providing the first step for an anthropogenic carbon cycle.
Binford, Greta J; Gillespie, Rosemary G; Maddison, Wayne P
2016-05-01
Spider venom composition typically differs between sexes. This pattern is anecdotally thought to reflect differences in adult feeding biology. We used a phylogenetic approach to compare intersexual venom dimorphism between species that differ in adult niche dimorphism. Male and female venoms were compared within and between related species of Hawaiian Tetragnatha, a mainland congener, and outgroups. In some species of Hawaiian Tetragnatha adult females spin orb-webs and adult males capture prey while wandering, while in other species both males and females capture prey by wandering. We predicted that, if venom sexual dimorphism is primarily explained by differences in adult feeding biology, species in which both sexes forage by wandering would have monomorphic venoms or venoms with reduced dimorphism relative to species with different adult feeding biology. However, we found striking sexual dimorphism in venoms of both wandering and orb-weaving Tetragnatha species with males having high molecular weight components in their venoms that were absent in females, and a reduced concentration of low molecular weight components relative to females. Intersexual differences in venom composition within Tetragnatha were significantly larger than in non-Tetragnatha species. Diet composition was not different between sexes. This striking venom dimorphism is not easily explained by differences in feeding ecology or behavior. Rather, we hypothesize that the dimorphism reflects male-specific components that play a role in mating biology possibly in sexual stimulation, nuptial gifts and/or mate recognition.
Kenjo, Y; Antoku, Y; Akazawa, K; Hanada, E; Kinukawa, N; Nose, Y
2000-05-01
In a randomized clinical trial, random allocation of patients to treatment groups should balance the distribution of prognostic factors. In a multi-institutional randomized clinical trial, random allocation is conducted by a coordinating center, independent of the medical institution where the attending doctor practices. This study provides a sophisticated system for exact random allocation of patients to treatment groups. The minimization method proposed by Pocock was applied in this system to balance the distribution of prognostic factors between two treatment groups, even when the number of registered patients is relatively small (S.J. Pocock, Allocation of patients to treatment in clinical trials, Biometrics 35 (1979) 183-197). Furthermore, Zelen's method is used to balance the number of patients allocated to the two groups within each institution (M. Zelen, The randomization and stratification of patients to clinical trials, J. Chron. Dis. 27 (1974) 365-375). The system was written in the Perl language as a common gateway interface (CGI) script, and can therefore be easily extended to include a data entry function for attending doctors as well as the random allocation function. The system is being used effectively in thirteen multi-institutional randomized clinical trials for stomach, colorectal and breast cancers in Japan.
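An illustrative sketch of the minimization idea for two arms (an assumed reconstruction for demonstration, not the authors' Perl/CGI code; the function name and data layout are invented): each new patient is assigned to whichever arm would minimize the total imbalance across that patient's prognostic-factor levels, with ties broken at random.

```python
import random

# Illustrative two-arm minimization in the spirit of Pocock's method.
# counts[factor][level] is a two-element list: patients with that
# prognostic-factor level already allocated to arm 0 and arm 1.
def minimization_assign(counts, patient, rng=random.Random(42)):
    """Assign a patient (dict: factor -> level) to arm 0 or 1 so as to
    minimize total imbalance over the patient's factor levels."""
    imbalance = [0, 0]
    for factor, level in patient.items():
        arm_counts = counts[factor][level]
        for arm in (0, 1):
            trial = arm_counts.copy()
            trial[arm] += 1            # imbalance if patient joined `arm`
            imbalance[arm] += abs(trial[0] - trial[1])
    if imbalance[0] == imbalance[1]:
        choice = rng.randrange(2)      # break ties at random
    else:
        choice = 0 if imbalance[0] < imbalance[1] else 1
    for factor, level in patient.items():
        counts[factor][level][choice] += 1
    return choice

counts = {"sex": {"M": [0, 0], "F": [0, 0]},
          "stage": {"I": [0, 0], "II": [0, 0]}}
first = minimization_assign(counts, {"sex": "M", "stage": "II"})
second = minimization_assign(counts, {"sex": "M", "stage": "II"})
print(first, second)  # the second identical patient goes to the other arm
```

The first patient faces a tie and is randomized; a second patient with the same profile is then deterministically steered to the opposite arm, which is the balancing behavior minimization is designed to produce.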
Bar-Eyal, Leeat; Eisenberg, Ido; Faust, Adam; Raanan, Hagai; Nevo, Reinat; Rappaport, Fabrice; Krieger-Liszkay, Anja; Sétif, Pierre; Thurotte, Adrien; Reich, Ziv; Kaplan, Aaron; Ohad, Itzhak; Paltiel, Yossi; Keren, Nir
2015-10-01
Biological desert sand crusts are the foundation of desert ecosystems, stabilizing the sands and allowing colonization by higher order organisms. The first colonizers of the desert sands are cyanobacteria. Facing the harsh conditions of the desert, these organisms must withstand frequent desiccation-hydration cycles, combined with high light intensities. Here, we characterize structural and functional modifications to the photosynthetic apparatus that enable a cyanobacterium, Leptolyngbya sp., to thrive under these conditions. Using multiple in vivo spectroscopic and imaging techniques, we identified two complementary mechanisms for dissipating absorbed energy in the desiccated state. The first mechanism involves the reorganization of the phycobilisome antenna system, increasing excitonic coupling between antenna components. This provides better energy dissipation in the antenna rather than directed exciton transfer to the reaction center. The second mechanism is driven by constriction of the thylakoid lumen which limits diffusion of plastocyanin to P700. The accumulation of P700(+) not only prevents light-induced charge separation but also efficiently quenches excitation energy. These protection mechanisms employ existing components of the photosynthetic apparatus, forming two distinct functional modes. Small changes in the structure of the thylakoid membranes are sufficient for quenching of all absorbed energy in the desiccated state, protecting the photosynthetic apparatus from photoinhibitory damage. These changes can be easily reversed upon rehydration, returning the system to its high photosynthetic quantum efficiency.
Shaft seals with an easily removable cylinder holder for low-pressure steam turbines
NASA Astrophysics Data System (ADS)
Zakharov, A. E.; Rodionov, D. A.; Pimenov, E. V.; Sobolev, A. S.
2016-01-01
The article is devoted to the problems that occur during the operation of the shaft seals (SS) of low-pressure cylinders (LPC) of turbines, particularly their bearings. The problems arising from the deterioration of the oil-protecting rings of the SS and bearings, and the consequences to which they can lead, are considered. The existing types of SS housing construction are reviewed and their operational features are specified. A new SS construction type with an easily removable holder is presented, and the construction of its main elements is described. A sequence of operations for the repair personnel to follow when restoring the clearances of the new SS type is proposed. A comparative analysis of the new and existing SS construction types is carried out, and the efficiency, operational convenience, and economic effect of installing the new type of seals are assessed. Conclusions about the prospects of the proposed construction are drawn from this comparative analysis and assessment. The main advantage of the design is that the clearances both in the SS and in the oil-protecting rings can be restored during a short-term stop of a turbine, even without cooling it. The construction was successfully tested on a working K-300-23.5 LMP turbine, and its adaptation for other turbines is quite possible.
Zhang, Huai-Zhi; Zhang, Chang; Zeng, Guang-Ming; Gong, Ji-Lai; Ou, Xiao-Ming; Huan, Shuang-Yan
2016-06-01
Silver nanoparticle-decorated magnetic graphene oxide (MGO-Ag) was synthesized by doping silver and Fe3O4 nanoparticles on the surface of GO and was used as an antibacterial agent. MGO-Ag was characterized by scanning electron microscopy (SEM), transmission electron microscopy (TEM), energy-dispersive X-ray spectroscopy (EDS), X-ray diffraction (XRD), Raman spectroscopy and magnetic property tests. The magnetic iron oxide nanoparticles and nano-Ag were found to be well dispersed on the graphene oxide, and MGO-Ag exhibited excellent antibacterial activity against Escherichia coli and Staphylococcus aureus. Several factors affecting the antibacterial performance of MGO-Ag were investigated, including temperature, time, pH and bacterial concentration. MGO-Ag maintained high inactivation rates after six cycles of use and can be separated easily after the antibacterial process. Moreover, the antibacterial mechanism is discussed; the synergistic effect of GO, Fe3O4 nanoparticles and nano-Ag accounts for the high inactivation efficiency of MGO-Ag. PMID:26994349
Open Window: When Easily Identifiable Genomes and Traits Are in the Public Domain
Angrist, Misha
2014-01-01
“One can't be of an enquiring and experimental nature, and still be very sensible.” - Charles Fort [1] As the costs of personal genetic testing “self-quantification” fall, publicly accessible databases housing people's genotypic and phenotypic information are gradually increasing in number and scope. The latest entrant is openSNP, which allows participants to upload their personal genetic/genomic and self-reported phenotypic data. I believe the emergence of such open repositories of human biological data is a natural reflection of inquisitive and digitally literate people's desires to make genomic and phenotypic information more easily available to a community beyond the research establishment. Such unfettered databases hold the promise of contributing mightily to science, science education and medicine. That said, in an age of increasingly widespread governmental and corporate surveillance, we would do well to be mindful that genomic DNA is uniquely identifying. Participants in open biological databases are engaged in a real-time experiment whose outcome is unknown. PMID:24647311
Wang, Congzhi; Peng, Xi; Liang, Dong; Xiao, Yang; Qiu, Weibao; Qian, Ming; Zheng, Hairong
2015-01-01
In ultrafast ultrasound imaging, maintaining a high frame rate while improving image quality as far as possible has become a significant issue. Several novel beamforming methods based on compressive sensing (CS) theory have been proposed in the literature, but all have their own limitations, such as excessively large memory consumption and the errors caused by the short-time discrete Fourier transform (STDFT). In this study, a novel CS-based time-domain beamformer for plane-wave ultrasound imaging is proposed, and its image quality is verified to be better than that of the traditional delay-and-sum (DAS) method and even the popular coherent compounding method on several simulated phantoms. Compared to existing CS methods, the memory consumption of our method is significantly reduced, since the encoding matrix can be expressed sparsely. In addition, the time delays of the echo signals are calculated directly in the time domain using a dictionary concept, avoiding the errors induced by the short-time Fourier transform calculation in frequency-domain methods. The proposed method can be easily implemented on low-cost hardware platforms and can produce ultrasound images with both a high frame rate and good image quality, giving it great potential for clinical application.
Why can organic liquids move easily on smooth alkyl-terminated surfaces?
Urata, Chihiro; Masheder, Benjamin; Cheng, Dalton F; Miranda, Daniel F; Dunderdale, Gary J; Miyamae, Takayuki; Hozumi, Atsushi
2014-04-15
The dynamic dewettability of a smooth alkyl-terminated sol-gel hybrid film surface against 17 probe liquids (polar and nonpolar, with high and low surface tensions) was systematically investigated using contact angle (CA) hysteresis and substrate tilt angle (TA) measurements, in terms of their physicochemical properties such as surface tension, molecular weight/volume, dielectric constant, density, and viscosity. We found that the dynamic dewettability of the hybrid film markedly depended not on the surface tensions but on the dielectric constants of the probe liquids, displaying lower resistance to liquid drop movement with decreasing dielectric constant (ε < 30). Interfacial analysis using the sum-frequency generation (SFG) technique confirmed that the conformation of surface-tethered alkyl chains was markedly altered before and after contact with the different types of probe liquids. When probe liquids with low dielectric constants were in contact with our surface, CH3 groups were preferentially exposed at the solid/liquid interface, leading to a reduction in surface energy. Because of such local changes in surface energy at the three-phase contact line of the probe liquid, the contact line can move continuously from low-surface-energy (solid/liquid) areas to surrounding high-surface-energy (solid/air) areas without pinning. Consequently, the organic probe liquids with low dielectric constants can move easily and roll off when tilted only slightly, independent of the magnitude of CAs, without relying on conventional surface roughening and perfluorination.
The Surveillance, Epidemiology, and End Results (SEER) Program of the National Cancer Institute works to provide information on cancer statistics in an effort to reduce the burden of cancer among the U.S. population.
NASA Astrophysics Data System (ADS)
Hermann, Claudine
Statistical physics bridges the properties of a macroscopic system and the microscopic behavior of its constituent particles, a connection that would otherwise be impossible to establish because of the enormous magnitude of Avogadro's number. Numerous systems at the heart of today's key technologies - such as semiconductors or lasers - are macroscopic quantum objects, and only statistical physics allows their fundamentals to be understood. Accordingly, this graduate text also focuses on particular applications, such as the properties of electrons in solids, radiation thermodynamics, and the greenhouse effect.
Perception in statistical graphics
NASA Astrophysics Data System (ADS)
VanderPlas, Susan Ruth
There has been quite a bit of research on statistical graphics and visualization, generally focused on new types of graphics, new software to create graphics, interactivity, and usability studies. Our ability to interpret and use statistical graphics hinges on the interface between the graph itself and the brain that perceives and interprets it, and there is substantially less research on the interplay between graph, eye, brain, and mind than is sufficient to understand the nature of these relationships. The goal of the work presented here is to further explore the interplay between a static graph, the translation of that graph from paper to mental representation (the journey from eye to brain), and the mental processes that operate on that graph once it is transferred into memory (mind). Understanding the perception of statistical graphics should allow researchers to create more effective graphs which produce fewer distortions and viewer errors while reducing the cognitive load necessary to understand the information presented in the graph. Taken together, these experiments should lay a foundation for exploring the perception of statistical graphics. There has been considerable research into the accuracy of numerical judgments viewers make from graphs, and these studies are useful, but it is more effective to understand how errors in these judgments occur so that the root cause of the error can be addressed directly. Understanding how visual reasoning relates to the ability to make judgments from graphs allows us to tailor graphics to particular target audiences. In addition, understanding the hierarchy of salient features in statistical graphics allows us to clearly communicate the important message from data or statistical models by constructing graphics which are designed specifically for the perceptual system.
Kokal, Idil; Engel, Annerose; Kirschner, Sebastian; Keysers, Christian
2011-01-01
Why does chanting, drumming or dancing together make people feel united? Here we investigate the neural mechanisms underlying interpersonal synchrony and its subsequent effects on prosocial behavior among synchronized individuals. We hypothesized that areas of the brain associated with the processing of reward would be active when individuals experience synchrony during drumming, and that these reward signals would increase prosocial behavior toward this synchronous drum partner. 18 female non-musicians were scanned with functional magnetic resonance imaging while they drummed a rhythm, in alternating blocks, with two different experimenters: one drumming in-synchrony and the other out-of-synchrony relative to the participant. In the last scanning part, which served as the experimental manipulation for the following prosocial behavioral test, one of the experimenters drummed with one half of the participants in-synchrony and with the other out-of-synchrony. After scanning, this experimenter “accidentally” dropped eight pencils, and the number of pencils collected by the participants was used as a measure of prosocial commitment. Results revealed that participants who mastered the novel rhythm easily before scanning showed increased activity in the caudate during synchronous drumming. The same area also responded to monetary reward in a localizer task with the same participants. The activity in the caudate during experiencing synchronous drumming also predicted the number of pencils the participants later collected to help the synchronous experimenter of the manipulation run. In addition, participants collected more pencils to help the experimenter when she had drummed in-synchrony than out-of-synchrony during the manipulation run. By showing an overlap in activated areas during synchronized drumming and monetary reward, our findings suggest that interpersonal synchrony is related to the brain's reward system. PMID:22110623
An easily-automated assay for the physiological state quantification of Pseudomonas sp. M18.
Dong, Dexian; Zhou, Kejun; Zhou, Quan; Huang, Xianqing; Xu, Yuquan; Li, Rongxiu
2008-12-01
Industrial microbial fermentation can benefit greatly from knowledge of the physiological state of cells, making it possible to identify poorly performing cultures before energy is wasted scaling them up. Here we propose an easily automated method for determining physiological state. We designed one universal rRNA-specific probe for bacteria and developed a novel signal probe hybridization (SPH) assay that requires neither RNA extraction nor PCR amplification to quantify the physiological state of microbial cells. Cells were lysed with sonication and SDS. Signal probes were hybridized to protect the rRNA target, and S1 nuclease was then applied to remove excess signal probes, single-stranded RNA, and mismatched RNA/DNA hybrids. The remaining signal probe was captured by a corresponding capture probe immobilized on a microplate and quantified with a horseradish peroxidase-conjugated color reaction. After systematic optimization of the assay, the cell limit of detection (LOD) and cell limit of quantification (LOQ) were 2.64 x 10(4) cells and 9.86 x 10(4) cells per microplate well, respectively; the LOD and LOQ for the signal probe were 49.0 fM and 344.0 fM, respectively. Using this technique, we quantified 16S rRNA levels during the fermentation process of Pseudomonas sp. M18. Our results indicate that 16S rRNA levels directly report the physiological state of microbial cells. This technique has great potential for application in the microbial fermentation industry.
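The LOD/LOQ figures quoted above can be estimated from a calibration curve. The sketch below uses the common 3.3·sigma/slope and 10·sigma/slope rules on synthetic data; the paper's exact procedure may differ, so treat this as a generic illustration rather than the authors' method.

```python
import numpy as np

def lod_loq(concentrations, signals):
    """Calibration-based detection limits: LOD = 3.3*sigma/slope and
    LOQ = 10*sigma/slope, where sigma is the residual standard
    deviation of a linear calibration fit (a generic sketch, not the
    paper's exact procedure)."""
    conc = np.asarray(concentrations, dtype=float)
    sig = np.asarray(signals, dtype=float)
    slope, intercept = np.polyfit(conc, sig, 1)
    residuals = sig - (slope * conc + intercept)
    sigma = residuals.std(ddof=2)          # two fitted parameters
    return 3.3 * sigma / slope, 10.0 * sigma / slope

# Synthetic linear calibration with a little noise:
rng = np.random.default_rng(0)
conc = np.linspace(50, 500, 10)
sig = 0.8 * conc + 5.0 + rng.normal(scale=2.0, size=conc.size)
lod, loq = lod_loq(conc, sig)
```

By construction the LOQ/LOD ratio is 10/3.3, so the two limits always move together as calibration noise changes.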
Farcau, Cosmin; Potara, Monica; Leordean, Cosmin; Boca, Sanda; Astilean, Simion
2013-01-21
The ability to easily prepare Surface Enhanced Raman Scattering (SERS) substrates by the assembly of chemically synthesized gold nanocolloids is of great interest for the advancement of SERS-based optical detection and identification of molecular species of biological or chemical interest, pollutants or warfare agents. In this work we employ three very simple strategies, which can be implemented in any laboratory without the need for specialized equipment, to prepare assemblies of citrate-stabilized spherical gold colloids: (i) drop-coating, which induces the assembly of colloids in so-called coffee rings; (ii) a simplified variant of convective self-assembly (CSA), based on water evaporation in a constrained geometry, which yields highly uniform strips of nanoparticles (NP); (iii) assembly onto chemically functionalized glass surfaces which yields randomly assembled colloids and colloidal clusters. The SERS properties of the resulting colloidal assemblies are comparatively evaluated under multiple excitation lines with p-aminothiophenol (pATP) as a model Raman scatterer. The NP strips obtained by CSA prove to be SERS-active both in the visible and NIR and possess a highly uniform SERS response as demonstrated by spectra at individually selected sites and by confocal SERS mapping. Further it is shown that these NP strips are effective for the detection of cytosine, a DNA component, and for multi-analyte SERS detection. These results, showing how an efficient SERS substrate can be obtained by a very simple assembly method from easy-to-synthesize colloidal gold NP, can have an impact on the development of analytical SERS applications. PMID:23171872
Hijnen, W A M; Biraud, D; Cornelissen, E R; van der Kooij, D
2009-07-01
One of the major impediments to the application of spiral-wound membranes in water treatment or desalination is clogging of the feed channel by biofouling, which is induced by nutrients in the feedwater. Under most conditions, organic carbon limits microbial growth. The objective of this study is to assess the relationship between the concentration of an easily assimilable organic compound, such as acetate, in the feedwater and the pressure drop increase in the feed channel. For this purpose the membrane fouling simulator (MFS) was used as a model for the feed channel of a spiral-wound membrane. The MFS unit was supplied with drinking water enriched with acetate at concentrations ranging from 1 to 1000 microg C x L(-1). The pressure drop (PD) in the feed channel increased at all tested concentrations but not with the blank. The PD increase could be described by a first-order process based on theoretical considerations concerning biofilm formation rate and porosity decline. The relationship between the first-order fouling rate constant R(f) and the acetate concentration is described by a saturation function corresponding to the growth kinetics of bacteria. Under the applied conditions the maximum R(f) (0.555 d(-1)) was reached at 25 microg acetate-C x L(-1), and the half-saturation constant k(f) was estimated at 15 microg acetate-C x L(-1). This value is higher than k(s) values for suspended bacteria grown on acetate, which is attributed to substrate-limited growth conditions in the biofilm. The threshold concentration for biofouling of the feed channel is about 1 microg acetate-C x L(-1).
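Using the constants reported in the abstract (maximum R(f) = 0.555 d(-1), k(f) = 15 microg acetate-C x L(-1)), the saturation relationship can be sketched as below. The exponential form of the first-order pressure-drop model is an assumption made here for illustration; the paper gives only the qualitative description.

```python
import numpy as np

def fouling_rate(acetate_ug_per_l, rf_max=0.555, kf=15.0):
    """Monod-type saturation of the first-order fouling rate constant
    R_f (d^-1) with acetate concentration (ug C/L), using the
    constants reported in the abstract."""
    return rf_max * acetate_ug_per_l / (kf + acetate_ug_per_l)

def pressure_drop(t_days, pd0, rf):
    """Assumed exponential first-order pressure-drop increase:
    PD(t) = PD0 * exp(rf * t)."""
    return pd0 * np.exp(rf * t_days)

# At the half-saturation constant the rate is half of rf_max:
half_rate = fouling_rate(15.0)
```

At high acetate concentrations the rate approaches rf_max, matching the plateau described in the abstract.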
Denion, Eric; Lux, Anne-Laure; Mouriaux, Frédéric; Béraud, Guillaume
2016-01-01
Introduction We aimed to determine the limbal lighting illuminance thresholds (LLITs) required to trigger perception of sclerotic scatter at the opposite non-illuminated limbus (i.e. perception of a light limbal scleral arc) under different levels of ambient lighting illuminance (ALI). Material and Methods Twenty healthy volunteers were enrolled. The iris shade (light or dark) was graded by retrieving the median value of the pixels of a pre-determined zone of a gray-level iris photograph. Mean keratometry and central corneal pachymetry were recorded. Each subject was asked to lie down, and the ALI at eye level was set to mesopic values (10, 20, 40 lux), then photopic values (60, 80, 100, 150, 200 lux). For each ALI level, a light beam of gradually increasing illuminance was applied to the right temporal limbus until the LLIT was reached, i.e. the level required to produce the faint light arc that is characteristic of sclerotic scatter at the nasal limbus. Results After log-log transformation, a linear relationship between the logarithm of ALI and the logarithm of the LLIT was found (p<0.001), a 10% increase in ALI being associated with an average increase in the LLIT of 28.9%. Higher keratometry values were associated with higher LLIT values (p = 0.008) under low ALI levels, but the coefficient of the interaction was very small, representing a very limited effect. Iris shade and central corneal thickness values were not significantly associated with the LLIT. We also developed a censored linear model for ALI values ≤ 40 lux, showing a linear relationship between ALI and the LLIT, in which the LLIT value was 34.4 times greater than the ALI value. Conclusion Sclerotic scatter is more easily elicited under mesopic conditions than under photopic conditions and requires the LLIT value to be much higher than the ALI value, i.e. it requires extreme contrast. PMID:26964096
Basic statistics in cell biology.
Vaux, David L
2014-01-01
The physicist Ernest Rutherford said, "If your experiment needs statistics, you ought to have done a better experiment." Although this aphorism remains true for much of today's research in cell biology, a basic understanding of statistics can be useful to cell biologists to help in monitoring the conduct of their experiments, in interpreting the results, in presenting them in publications, and when critically evaluating research by others. However, training in statistics is often focused on the sophisticated needs of clinical researchers, psychologists, and epidemiologists, whose conclusions depend wholly on statistics, rather than the practical needs of cell biologists, whose experiments often provide evidence that is not statistical in nature. This review describes some of the basic statistical principles that may be of use to experimental biologists, but it does not cover the sophisticated statistics needed for papers that contain evidence of no other kind.
Comprehensive Interpretive Planning.
ERIC Educational Resources Information Center
Kohen, Richard; Sikoryak, Kim
1999-01-01
Discusses interpretive planning and provides information on how to maximize a sense of ownership shared by managers, staff, and other organizational shareholders. Presents practical and effective plans for providing interpretive services. (CCM)
Journalists as Interpretive Communities.
ERIC Educational Resources Information Center
Zelizer, Barbie
1993-01-01
Proposes viewing journalists as members of an interpretive community (not a profession) united by its shared discourse and collective interpretations of key public events. Applies the frame of the interpretive community to journalistic discourse about two events central for American journalists--Watergate and McCarthyism. (SR)
A Note on a Geometric Interpretation of the Correlation Coefficient.
ERIC Educational Resources Information Center
Marks, Edmond
1982-01-01
An alternate geometric interpretation of the correlation coefficient to that given in most statistics texts for psychology and education is presented. This interpretation is considered to be more consistent with the statistical model for the data, and richer in geometric meaning. (Author)
Enhancing Table Interpretation Skills via Training in Table Creation
ERIC Educational Resources Information Center
Karazsia, Bryan T.
2013-01-01
Quantitative and statistical literacy are core domains in the undergraduate psychology curriculum. An important component of such literacy includes interpretation of visual aids, such as tables containing results from statistical analyses. This article presents a new technique for enhancing student interpretation of American Psychological…
On statistical aspects of Qjets
NASA Astrophysics Data System (ADS)
Ellis, Stephen D.; Hornig, Andrew; Krohn, David; Roy, Tuhin S.
2015-01-01
The process by which jet algorithms construct jets and subjets is inherently ambiguous and equally well motivated algorithms often return very different answers. The Qjets procedure was introduced by the authors to account for this ambiguity by considering many reconstructions of a jet at once, allowing one to assign a weight to each interpretation of the jet. Employing these weighted interpretations leads to an improvement in the statistical stability of many measurements. Here we explore in detail the statistical properties of these sets of weighted measurements and demonstrate how they can be used to improve the reach of jet-based studies.
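The idea of combining many weighted interpretations of one jet can be illustrated with generic weighted summary statistics. The estimator and the Kish effective sample size below are standard textbook quantities used here as a sketch, not the authors' specific Qjets observables.

```python
import numpy as np

def weighted_stats(values, weights):
    """Weighted mean and variance over many interpretations of a
    single measurement, plus the Kish effective sample size
    (a generic sketch of combining weighted reconstructions)."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()                    # normalize the weights
    v = np.asarray(values, dtype=float)
    mean = np.sum(w * v)
    var = np.sum(w * (v - mean) ** 2)
    n_eff = 1.0 / np.sum(w ** 2)       # Kish effective sample size
    return mean, var, n_eff
```

With equal weights this reduces to the ordinary mean and population variance, and n_eff equals the number of interpretations; strongly peaked weights shrink n_eff toward 1.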
Interpreting Abstract Interpretations in Membership Equational Logic
NASA Technical Reports Server (NTRS)
Fischer, Bernd; Rosu, Grigore
2001-01-01
We present a logical framework in which abstract interpretations can be naturally specified and then verified. Our approach is based on membership equational logic which extends equational logics by membership axioms, asserting that a term has a certain sort. We represent an abstract interpretation as a membership equational logic specification, usually as an overloaded order-sorted signature with membership axioms. It turns out that, for any term, its least sort over this specification corresponds to its most concrete abstract value. Maude implements membership equational logic and provides mechanisms to calculate the least sort of a term efficiently. We first show how Maude can be used to get prototyping of abstract interpretations "for free." Building on the meta-logic facilities of Maude, we further develop a tool that automatically checks an abstract interpretation against a set of user-defined properties. This can be used to select an appropriate abstract interpretation, to characterize the specified loss of information during abstraction, and to compare different abstractions with each other.
Parametric trial-by-trial prediction of pain by easily available physiological measures.
Geuter, Stephan; Gamer, Matthias; Onat, Selim; Büchel, Christian
2014-05-01
Pain is commonly assessed by subjective reports on rating scales. However, in many experimental and clinical settings, an additional, objective indicator of pain is desirable. In order to identify an objective, parametric signature of pain intensity that is predictive at the individual stimulus level across subjects, we recorded skin conductance and pupil diameter responses to heat pain stimuli of different durations and temperatures in 34 healthy subjects. The temporal profiles of trial-wise physiological responses were characterized by component scores obtained from principal component analysis. These component scores were then used as predictors in a linear regression analysis, resulting in accurate pain predictions for individual trials. Using the temporal information encoded in the principal component scores explained the data better than prediction by a single summary statistic (i.e., maximum amplitude). These results indicate that perceived pain is best reflected by the temporal dynamics of autonomic responses. Application of the regression model to an independent data set of 20 subjects resulted in a very good prediction of the pain ratings demonstrating the generalizability of the identified temporal pattern. Utilizing the readily available temporal information from skin conductance and pupil diameter responses thus allows parametric prediction of pain in human subjects.
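The pipeline described above (PCA component scores of trial-wise physiological response profiles used as predictors in a linear regression on pain ratings) can be sketched with numpy on synthetic data. The data-generating model, dimensions, and number of retained components below are invented for illustration, not taken from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for trial-wise response time courses
# (n_trials x n_timepoints); real data would be skin conductance
# or pupil diameter traces.
n_trials, n_time = 100, 50
latent = rng.normal(size=(n_trials, 1))            # per-trial amplitude
profile = np.sin(np.linspace(0, np.pi, n_time))    # shared temporal shape
X = latent * profile + 0.1 * rng.normal(size=(n_trials, n_time))
pain = 2.0 * latent[:, 0] + 0.2 * rng.normal(size=n_trials)

# PCA via SVD of the mean-centred data; component scores = U * S.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = U[:, :3] * S[:3]                  # first three component scores

# Linear regression of pain ratings on the component scores.
design = np.column_stack([np.ones(n_trials), scores])
beta, *_ = np.linalg.lstsq(design, pain, rcond=None)
predicted = design @ beta
r = np.corrcoef(predicted, pain)[0, 1]     # in-sample prediction accuracy
```

Because the component scores retain the temporal profile of each trial, this kind of model can outperform a single summary statistic such as the maximum amplitude, which is the paper's central point.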
Harrigan, George G; Harrison, Jay M
2012-01-01
New transgenic (GM) crops are subjected to extensive safety assessments that include compositional comparisons with conventional counterparts as a cornerstone of the process. The influence of germplasm, location, environment, and agronomic treatments on compositional variability is, however, often obscured in these pair-wise comparisons. Furthermore, classical statistical significance testing can often provide an incomplete and over-simplified summary of highly responsive variables such as crop composition. In order to more clearly describe the influence of the numerous sources of compositional variation we present an introduction to two alternative but complementary approaches to data analysis and interpretation. These include i) exploratory data analysis (EDA) with its emphasis on visualization and graphics-based approaches and ii) Bayesian statistical methodology that provides easily interpretable and meaningful evaluations of data in terms of probability distributions. The EDA case-studies include analyses of herbicide-tolerant GM soybean and insect-protected GM maize and soybean. Bayesian approaches are presented in an analysis of herbicide-tolerant GM soybean. Advantages of these approaches over classical frequentist significance testing include the more direct interpretation of results in terms of probabilities pertaining to quantities of interest and no confusion over the application of corrections for multiple comparisons. It is concluded that a standardized framework for these methodologies could provide specific advantages through enhanced clarity of presentation and interpretation in comparative assessments of crop composition.
ERIC Educational Resources Information Center
Christensen, Timothy J.; Labov, Jay B.
1997-01-01
Details the construction of a viewing chamber for fruit flies that connects to a dissecting microscope and features a design that enables students to easily move fruit flies in and out of the chamber. (DDR)
Fluorescence lifetimes: fundamentals and interpretations.
Noomnarm, Ulai; Clegg, Robert M
2009-01-01
Fluorescence measurements have been an established mainstay of photosynthesis experiments for many decades. Because in the photosynthesis literature the basics of excited states and their fates are not usually described, we have presented here an easily understandable text for biology students in the style of a chapter in a text book. In this review we give an educational overview of fundamental physical principles of fluorescence, with emphasis on the temporal response of emission. Escape from the excited state of a molecule is a dynamic event, and the fluorescence emission is in direct kinetic competition with several other pathways of de-excitation. It is essentially through a kinetic competition between all the pathways of de-excitation that we gain information about the fluorescent sample on the molecular scale. A simple probability allegory is presented that illustrates the basic ideas that are important for understanding and interpreting most fluorescence experiments. We also briefly point out challenges that confront the experimenter when interpreting time-resolved fluorescence responses.
Misuse of statistics in surgical literature
Thiese, Matthew S; Ronna, Brenden; Robbins, Riann B.
2016-01-01
Statistical analyses are a key part of biomedical research. Traditionally surgical research has relied upon a few statistical methods for evaluation and interpretation of data to improve clinical practice. As research methods have increased in both rigor and complexity, statistical analyses and interpretation have fallen behind. Some evidence suggests that surgical research studies are being designed and analyzed improperly given the specific study question. The goal of this article is to discuss the complexities of surgical research analyses and interpretation, and provide some resources to aid in these processes. PMID:27621909
Interpretation biases in paranoia.
Savulich, George; Freeman, Daniel; Shergill, Sukhi; Yiend, Jenny
2015-01-01
Information in the environment is frequently ambiguous in meaning. Emotional ambiguity, such as the stare of a stranger, or the scream of a child, encompasses possible good or bad emotional consequences. Those with elevated vulnerability to affective disorders tend to interpret such material more negatively than those without, a phenomenon known as "negative interpretation bias." In this study we examined the relationship between vulnerability to psychosis, measured by trait paranoia, and interpretation bias. One set of material permitted broadly positive/negative (valenced) interpretations, while another allowed more or less paranoid interpretations, allowing us to also investigate the content specificity of interpretation biases associated with paranoia. Regression analyses (n=70) revealed that trait paranoia, trait anxiety, and cognitive inflexibility predicted paranoid interpretation bias, whereas trait anxiety and cognitive inflexibility predicted negative interpretation bias. In a group comparison those with high levels of trait paranoia were negatively biased in their interpretations of ambiguous information relative to those with low trait paranoia, and this effect was most pronounced for material directly related to paranoid concerns. Together these data suggest that a negative interpretation bias occurs in those with elevated vulnerability to paranoia, and that this bias may be strongest for material matching paranoid beliefs. We conclude that content-specific biases may be important in the cause and maintenance of paranoid symptoms.
[Blood proteins in African trypanosomiasis: variations and statistical interpretations].
Cailliez, M; Poupin, F; Pages, J P; Savel, J
1982-01-01
The estimation of blood orosomucoid, haptoglobin, C-reactive protein, and immunoglobulin levels enabled us to establish a protein profile specific to human African trypanosomiasis, as compared with profiles from other parasitic diseases and from a healthy African reference group. Computer processing of the data by principal component analysis provides a valuable tool for epidemiological surveys.
Interpretation of psychophysics response curves using statistical physics.
Knani, S; Khalfaoui, M; Hachicha, M A; Mathlouthi, M; Ben Lamine, A
2014-05-15
Experimental gustatory curves have been fitted for four sugars (sucrose, fructose, glucose and maltitol), using a double layer adsorption model. Three parameters of the model are fitted, namely the number of molecules per site n, the maximum response RM and the concentration at half saturation C1/2. The behaviours of these parameters are discussed in relationship to each molecule's characteristics. Starting from the double layer adsorption model, we determined (in addition) the adsorption energy of each molecule on taste receptor sites. The use of the threshold expression allowed us to gain information about the adsorption occupation rate of a receptor site which fires a minimal response at a gustatory nerve. Finally, by means of this model we could calculate the configurational entropy of the adsorption system, which can describe the order and disorder of the adsorbent surface.
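As a rough illustration of fitting the three parameters named above (maximum response RM, half-saturation concentration C1/2, and molecules per site n) to a dose-response curve, the sketch below uses a generic Hill-type saturation function as a stand-in for the paper's double-layer adsorption model, whose exact expression differs. The data are synthetic.

```python
import numpy as np
from scipy.optimize import curve_fit

def response(C, RM, C_half, n):
    """Hill-type saturation response, used here as a simple stand-in
    for the double-layer adsorption model: RM = maximum response,
    C_half = concentration at half saturation, n = molecules per site."""
    return RM * (C / C_half) ** n / (1.0 + (C / C_half) ** n)

# Synthetic "gustatory curve": concentrations and noisy responses.
C = np.linspace(1, 500, 30)
true = response(C, RM=100.0, C_half=120.0, n=1.5)
rng = np.random.default_rng(1)
data = true + rng.normal(scale=1.0, size=C.size)

# Nonlinear least-squares fit of the three model parameters.
popt, _ = curve_fit(response, C, data, p0=[80.0, 100.0, 1.0])
RM_fit, C_half_fit, n_fit = popt
```

The fitted parameters can then be compared across sugars, as the paper does, to relate each molecule's characteristics to its receptor-binding behaviour.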
The Statistical Literacy Needed to Interpret School Assessment Data
ERIC Educational Resources Information Center
Chick, Helen; Pierce, Robyn
2013-01-01
State-wide and national testing in areas such as literacy and numeracy produces reports containing graphs and tables illustrating school and individual performance. These are intended to inform teachers, principals, and education organisations about student and school outcomes, to guide change and improvement. Given the complexity of the…
NASA Astrophysics Data System (ADS)
Goodman, Joseph W.
2000-07-01
The Wiley Classics Library consists of selected books that have become recognized classics in their respective fields. With these new unabridged and inexpensive editions, Wiley hopes to extend the life of these important works by making them available to future generations of mathematicians and scientists. Currently available in the Series: T. W. Anderson The Statistical Analysis of Time Series T. S. Arthanari & Yadolah Dodge Mathematical Programming in Statistics Emil Artin Geometric Algebra Norman T. J. Bailey The Elements of Stochastic Processes with Applications to the Natural Sciences Robert G. Bartle The Elements of Integration and Lebesgue Measure George E. P. Box & Norman R. Draper Evolutionary Operation: A Statistical Method for Process Improvement George E. P. Box & George C. Tiao Bayesian Inference in Statistical Analysis R. W. Carter Finite Groups of Lie Type: Conjugacy Classes and Complex Characters R. W. Carter Simple Groups of Lie Type William G. Cochran & Gertrude M. Cox Experimental Designs, Second Edition Richard Courant Differential and Integral Calculus, Volume I Richard Courant Differential and Integral Calculus, Volume II Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume I Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume II D. R. Cox Planning of Experiments Harold S. M. Coxeter Introduction to Geometry, Second Edition Charles W. Curtis & Irving Reiner Representation Theory of Finite Groups and Associative Algebras Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume I Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume II Cuthbert Daniel Fitting Equations to Data: Computer Analysis of Multifactor Data, Second Edition Bruno de Finetti Theory of Probability, Volume I Bruno de Finetti Theory of Probability, Volume II W. Edwards Deming Sample Design in Business Research
Fit Indices Versus Test Statistics
ERIC Educational Resources Information Center
Yuan, Ke-Hai
2005-01-01
Model evaluation is one of the most important aspects of structural equation modeling (SEM). Many model fit indices have been developed. It is not an exaggeration to say that nearly every publication using the SEM methodology has reported at least one fit index. Most fit indices are defined through test statistics. Studies and interpretation of…
1986-01-01
Official population data for the USSR are presented for 1985 and 1986. Part 1 (pp. 65-72) contains data on capitals of union republics and cities with over one million inhabitants, including population estimates for 1986 and vital statistics for 1985. Part 2 (p. 72) presents population estimates by sex and union republic, 1986. Part 3 (pp. 73-6) presents data on population growth, including birth, death, and natural increase rates, 1984-1985; seasonal distribution of births and deaths; birth order; age-specific birth rates in urban and rural areas and by union republic; marriages; age at marriage; and divorces. PMID:12178831
Sign Interpretation in Preschool.
ERIC Educational Resources Information Center
Luetke-Stahlman, Barbara
1991-01-01
A special set of skills is essential for interpreting for mainstreamed deaf preschool students. Eleven issues in clarifying the job of the preschool interpreter are discussed, such as whether hearing children should learn to sign and how to encourage communication among hearing and deaf children. (JDD)
Higher Education Interpreting.
ERIC Educational Resources Information Center
Woll, Bencie; Porcari li Destri, Giulia
This paper discusses issues related to the training and provision of interpreters for deaf students at institutions of higher education in the United Kingdom. Background information provided notes the increasing numbers of deaf and partially hearing students, the existence of funding to pay for interpreters, and trends in the availability of…
ERIC Educational Resources Information Center
Erekson, James A.
2010-01-01
Prosody is a means for "reading with expression" and is one aspect of oral reading competence. This theoretical inquiry asserts that prosody is central to interpreting text, and draws distinctions between "syntactic" prosody (for phrasing) and "emphatic" prosody (for interpretation). While reading with expression appears as a criterion in major…
NASA Astrophysics Data System (ADS)
Altarelli, Fabrizio; Monasson, Rémi; Zamponi, Francesco
2007-02-01
For large clause-to-variable ratios, typical K-SAT instances drawn from the uniform distribution have no solution. We argue, based on statistical mechanics calculations using the replica and cavity methods, that rare satisfiable instances from the uniform distribution are very similar to typical instances drawn from the so-called planted distribution, where instances are chosen uniformly between the ones that admit a given solution. It then follows, from a recent article by Feige, Mossel and Vilenchik (2006 Complete convergence of message passing algorithms for some satisfiability problems Proc. Random 2006 pp 339-50), that these rare instances can be easily recognized (in O(log N) time and with probability close to 1) by a simple message-passing algorithm.
NASA Technical Reports Server (NTRS)
Owre, Sam; Shankar, Natarajan; Butler, Ricky W. (Technical Monitor)
2001-01-01
The purpose of this task was to provide a mechanism for theory interpretations in a prototype verification system (PVS) so that it is possible to demonstrate the consistency of a theory by exhibiting an interpretation that validates the axioms. The mechanization makes it possible to show that one collection of theories is correctly interpreted by another collection of theories under a user-specified interpretation for the uninterpreted types and constants. A theory instance is generated and imported, while the axiom instances are generated as proof obligations to ensure that the interpretation is valid. Interpretations can be used to show that an implementation is a correct refinement of a specification, that an axiomatically defined specification is consistent, or that an axiomatically defined specification captures its intended models. In addition, the theory parameter mechanism has been extended with a notion of theory as parameter so that a theory instance can be given as an actual parameter to an imported theory. Theory interpretations can thus be used to refine an abstract specification or to demonstrate the consistency of an axiomatic theory. In this report we describe the mechanism in detail. This extension is a part of PVS version 3.0, which will be publicly released in mid-2001.
NASA Technical Reports Server (NTRS)
Firstenberg, H.
1971-01-01
The statistics of the Monte Carlo method are considered relative to the interpretation of the NUGAM2 and NUGAM3 computer code results. A numerical experiment using the NUGAM2 code is presented, and the results are interpreted statistically.
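The kind of statistical interpretation described here can be illustrated with a generic Monte Carlo experiment (unrelated to the NUGAM codes, whose internals are not given in this abstract): estimate a quantity from random samples and attach a one-sigma statistical error to it.

```python
import math
import random

def mc_pi(n_samples, rng):
    """Monte Carlo estimate of pi with its one-sigma statistical error."""
    hits = sum(rng.random() ** 2 + rng.random() ** 2 <= 1.0
               for _ in range(n_samples))
    p = hits / n_samples                       # fraction inside quarter circle
    est = 4.0 * p
    # Binomial standard error on p, propagated to the estimate of pi;
    # it shrinks like 1/sqrt(N), the hallmark of Monte Carlo statistics.
    err = 4.0 * math.sqrt(p * (1.0 - p) / n_samples)
    return est, err

rng = random.Random(1)
est, err = mc_pi(100_000, rng)
print(f"pi ~ {est:.3f} +/- {err:.3f}")
```

Interpreting the output means checking that the true value lies within a few multiples of the quoted error, exactly the exercise a numerical experiment with a Monte Carlo code calls for.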
SOCR: Statistics Online Computational Resource
Dinov, Ivo D.
2011-01-01
The need for hands-on computer laboratory experience in undergraduate and graduate statistics education has been firmly established in the past decade. As a result, a number of attempts have been undertaken to develop novel approaches for problem-driven statistical thinking, data analysis and result interpretation. In this paper we describe an integrated educational web-based framework for: interactive distribution modeling, virtual online probability experimentation, statistical data analysis, visualization and integration. Following years of experience in statistical teaching at all college levels using established licensed statistical software packages such as STATA, S-PLUS, R, SPSS, SAS and Systat, we have attempted to engineer a new statistics education environment, the Statistics Online Computational Resource (SOCR). This resource performs many of the standard types of statistical analysis, much like other classical tools. In addition, it is designed in a plug-in object-oriented architecture and is completely platform independent, web-based, interactive, extensible and secure. Over the past 4 years we have tested, fine-tuned and reanalyzed the SOCR framework in many of our undergraduate and graduate probability and statistics courses and have evidence that SOCR resources build students' intuition and enhance their learning. PMID:21451741
Chatellier, J.Y.; Gustavo, F.; Magaly, Q.
1996-12-31
Integrating different petroleum geology disciplines provides insight and helps in analyzing data and in checking the quality of different interpretations. Simple approaches and affordable programs allow rapid visualization of data in 3-D. Displaying geological data from stratigraphy, diagenesis, and structural geology together allows identification of anomalies (i.e. development targets) and often gives clues to the controlling processes. Four case studies from world class fields are used to illustrate the vital need to integrate quality control of interpretation across disciplines. Distribution of diagenetic alterations is revealed by visualizing diagenetic and petrographic data against faults in a 3-D statistical program. Faults are transferred from 3-D seismic into such a program and then analyzed against other data. Fault intersections wrongly correlated are also easily picked. Other powerful tools include a modified use of the Bischke Plots that allow the identification of missing sections previously identified as fault cut-outs. The quality of interpretation has sometimes been assessed from the presence of stacked anomalies of varying expression. In other cases repeated unexpected isopach trends revealed subtle faults such as Riedel shears sealing and compartmentalizing the reservoirs. Occasionally the timing of fault reactivation was assessed precisely whereas all other techniques failed even to identify these hidden features. Unrecognized porosity-depth trends were identified after filtering data for stratigraphy or sedimentology and studying them in their geographical and tectonic context. Three-dimensional visualization was needed in cases of quartz overgrowth where grain size, depth, stratigraphy and location with respect to faults were all important.
Hold My Calls: An Activity for Introducing the Statistical Process
ERIC Educational Resources Information Center
Abel, Todd; Poling, Lisa
2015-01-01
Working with practicing teachers, this article demonstrates, through the facilitation of a statistical activity, how to introduce and investigate the unique qualities of the statistical process: formulating a question, collecting data, analyzing the data, and interpreting the results.
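The four-step process named above can be sketched in a few lines of code; the call-duration data below are invented purely for illustration.

```python
import statistics

# 1. Formulate a question: "How long is a typical phone call in our sample?"
# 2. Collect data (hypothetical call durations, in minutes):
calls = [2.1, 5.4, 3.3, 8.0, 1.2, 4.7, 6.5, 3.9, 2.8, 5.1]

# 3. Analyze data: compute summary statistics.
mean = statistics.mean(calls)
sd = statistics.stdev(calls)

# 4. Interpret data: report the numbers back in the context of the question.
print(f"Typical call: {mean:.1f} min (sd {sd:.1f} min, n={len(calls)})")
```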
Interpretation of Biosphere Reserves.
ERIC Educational Resources Information Center
Merriman, Tim
1994-01-01
Introduces the Man and the Biosphere Programme (MAB), which monitors the 193 biogeographical provinces of the Earth, and the creation of biosphere reserves. Highlights the need for interpreters to become familiar or involved with MAB program activities. (LZ)
ERIC Educational Resources Information Center
Smith, P. Sean; Ford, Brent A.
1994-01-01
Presents a brief introduction to our atmosphere, a guide to reading and interpreting weather maps, and a set of activities to help teachers enhance student understanding of the Earth's atmosphere. (ZWH)
BIOMONITORING: INTERPRETATION AND USES
With advanced technologies, it is now possible to measure very low levels of many chemicals in biological fluids. However, the appropriate use and interpretation of biomarkers will depend upon many factors associated with the exposure, adsorption, deposition, metabolism, and eli...
NASA Astrophysics Data System (ADS)
Burns, T. J.; Swanson, E. S.
2016-09-01
A variety of options for interpreting the DØ state X(5568) are examined. We find that threshold, cusp, molecular, and tetraquark models are all unfavoured. Several experimental tests for unravelling the nature of the signal are suggested.
Interpretation of Bernoulli's Equation.
ERIC Educational Resources Information Center
Bauman, Robert P.; Schwaneberg, Rolf
1994-01-01
Discusses Bernoulli's equation with regards to: horizontal flow of incompressible fluids, change of height of incompressible fluids, gases, liquids and gases, and viscous fluids. Provides an interpretation, properties, terminology, and applications of Bernoulli's equation. (MVL)
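For reference, the form of Bernoulli's equation underlying the horizontal-flow and change-of-height cases discussed above is the steady, inviscid, incompressible statement along a streamline (viscous and compressible flows require the modified forms the article treats separately):

```latex
% Bernoulli's equation along a streamline for steady, inviscid,
% incompressible flow (p: pressure, rho: density, v: speed, h: height):
p + \tfrac{1}{2}\rho v^{2} + \rho g h = \text{const}
```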
NASA Astrophysics Data System (ADS)
Fink, Thomas
2015-03-01
We introduce a simple class of distribution networks which withstand damage by being repairable instead of redundant. Instead of asking how hard it is to disconnect nodes through damage, we ask how easy it is to reconnect nodes after damage. We prove that optimal networks on regular lattices have an expected cost of reconnection proportional to the lattice length, and that such networks have exactly three levels of structural hierarchy. We extend our results to networks subject to repeated attacks, in which the repairs themselves must be repairable. We find that, in exchange for a modest increase in repair cost, such networks are able to withstand any number of attacks. We acknowledge support from the Defense Threat Reduction Agency, BCG and EU FP7 (Growthcom).
Staining bacterial flagella easily.
Heimbrook, M E; Wang, W L; Campbell, G
1989-01-01
A wet-mount technique for staining bacterial flagella is highly successful when a stable stain and regular slides and cover slips are used. Although not producing a permanent mount, the technique is simple for routine use when the number and arrangement of flagella are critical in identifying species of motile bacteria. PMID:2478573
Large, Easily Deployable Structures
NASA Technical Reports Server (NTRS)
Agan, W. E.
1983-01-01
Study of concepts for large space structures will interest those designing scaffolding, radio towers, rescue equipment, and prefabricated shelters. Double-fold, double-cell module was selected for further design and for zero gravity testing. Concept is viable for deployment by humans outside space vehicle as well as by remotely operated manipulator.
Interpreter-mediated dentistry.
Bridges, Susan; Drew, Paul; Zayts, Olga; McGrath, Colman; Yiu, Cynthia K Y; Wong, H M; Au, T K F
2015-05-01
The global movements of healthcare professionals and patient populations have increased the complexities of medical interactions at the point of service. This study examines interpreter mediated talk in cross-cultural general dentistry in Hong Kong where assisting para-professionals, in this case bilingual or multilingual Dental Surgery Assistants (DSAs), perform the dual capabilities of clinical assistant and interpreter. An initial language use survey was conducted with Polyclinic DSAs (n = 41) using a logbook approach to provide self-report data on language use in clinics. Frequencies of mean scores using a 10-point visual analogue scale (VAS) indicated that the majority of DSAs spoke mainly Cantonese in clinics and interpreted for postgraduates and professors. Conversation Analysis (CA) examined recipient design across a corpus (n = 23) of video-recorded review consultations between non-Cantonese speaking expatriate dentists and their Cantonese L1 patients. Three patterns of mediated interpreting indicated were: dentist designated expansions; dentist initiated interpretations; and assistant initiated interpretations to both the dentist and patient. The third, rather than being perceived as negative, was found to be framed either in response to patient difficulties or within the specific task routines of general dentistry. The findings illustrate trends in dentistry towards personalized care and patient empowerment as a reaction to product delivery approaches to patient management. Implications are indicated for both treatment adherence and the education of dental professionals. PMID:25828074
The National Lakes Assessment (NLA) and other lake survey and monitoring efforts increasingly rely upon biological assemblage data to define lake condition. Information concerning the multiple dimensions of physical and chemical habitat is necessary to interpret this biological ...
Screencast Tutorials Enhance Student Learning of Statistics
ERIC Educational Resources Information Center
Lloyd, Steven A.; Robertson, Chuck L.
2012-01-01
Although the use of computer-assisted instruction has rapidly increased, there is little empirical research evaluating these technologies, specifically within the context of teaching statistics. The authors assessed the effect of screencast tutorials on learning outcomes, including statistical knowledge, application, and interpretation. Students…
Graphs and Statistics: A Resource Handbook.
ERIC Educational Resources Information Center
New York State Education Dept., Albany. Bureau of General Education Curriculum Development.
Graphical representation of statistical data is the focus of this resource handbook. Only graphs which present numerical information are discussed. Activities involving making, interpreting, and using various types of graphs and tables are included. Sections are also included which discuss statistical terms, normal distribution and…
Statistical ecology comes of age
Gimenez, Olivier; Buckland, Stephen T.; Morgan, Byron J. T.; Bez, Nicolas; Bertrand, Sophie; Choquet, Rémi; Dray, Stéphane; Etienne, Marie-Pierre; Fewster, Rachel; Gosselin, Frédéric; Mérigot, Bastien; Monestiez, Pascal; Morales, Juan M.; Mortier, Frédéric; Munoz, François; Ovaskainen, Otso; Pavoine, Sandrine; Pradel, Roger; Schurr, Frank M.; Thomas, Len; Thuiller, Wilfried; Trenkel, Verena; de Valpine, Perry; Rexstad, Eric
2014-01-01
The desire to predict the consequences of global environmental change has been the driver towards more realistic models embracing the variability and uncertainties inherent in ecology. Statistical ecology has gelled over the past decade as a discipline that moves away from describing patterns towards modelling the ecological processes that generate these patterns. Following the fourth International Statistical Ecology Conference (1–4 July 2014) in Montpellier, France, we analyse current trends in statistical ecology. Important advances in the analysis of individual movement, and in the modelling of population dynamics and species distributions, are made possible by the increasing use of hierarchical and hidden process models. Exciting research perspectives include the development of methods to interpret citizen science data and of efficient, flexible computational algorithms for model fitting. Statistical ecology has come of age: it now provides a general and mathematically rigorous framework linking ecological theory and empirical data. PMID:25540151
Kim, Youngwoo; Woo, Kyoohee; Kim, Inhyuk; Cho, Yong Soo; Jeong, Sunho; Moon, Jooho
2013-11-01
Among various candidate materials, Cu2ZnSnS4 (CZTS) is a promising earth-abundant semiconductor for low-cost thin film solar cells. We report a facile, less toxic, highly concentrated synthetic method utilizing the heretofore unrecognized, easily decomposable capping ligand of triphenylphosphate, where phase-pure, single-crystalline, and well-dispersed colloidal CZTS nanocrystals were obtained. The favorable influence of the easily decomposable capping ligand on the microstructural evolution of device-quality CZTS absorber layers was clarified based on a comparative study with commonly used oleylamine-capped CZTS nanoparticles. The resulting CZTS nanoparticles enabled us to produce a dense and crack-free absorbing layer through annealing under a N2 + H2S (4%) atmosphere, demonstrating a solar cell with an efficiency of 3.6% under AM 1.5 illumination.
NASA Astrophysics Data System (ADS)
Li, Shijie; Zhang, Lisha; Wang, Huanli; Chen, Zhigang; Hu, Junqing; Xu, Kaibing; Liu, Jianshe
2014-02-01
Traditional nanosized photocatalysts usually have high photocatalytic activity but cannot be efficiently recycled. Film-shaped photocatalysts on substrates can be easily recycled, but they have low surface area and/or high production cost. To solve these problems, we report on the design and preparation of efficient and easily recyclable macroscale photocatalysts with nanostructure, using Ta3N5 as a model semiconductor. Ta3N5-Pt nonwoven cloth has been prepared by an electrospinning-calcination-nitridation-wet impregnation method, and it is composed of Ta3N5 fibers with diameters of 150-200 nm and hierarchical pores. Furthermore, these fibers are constructed from Ta3N5 nanoparticles with diameter of ~25 nm which are decorated with Pt nanoparticles with diameter of ~2.5 nm. Importantly, Ta3N5-Pt cloth can be used as an efficient and easily recyclable macroscale photocatalyst with wide visible-light response for the degradation of methylene blue and parachlorophenol, probably resulting in a very promising application as a "photocatalyst dam" for polluted rivers.
Ranald Macdonald and statistical inference.
Smith, Philip T
2009-05-01
Ranald Roderick Macdonald (1945-2007) was an important contributor to mathematical psychology in the UK, as a referee and action editor for British Journal of Mathematical and Statistical Psychology and as a participant and organizer at the British Psychological Society's Mathematics, statistics and computing section meetings. This appreciation argues that his most important contribution was to the foundations of significance testing, where his concern about what information was relevant in interpreting the results of significance tests led him to be a persuasive advocate for the 'Weak Fisherian' form of hypothesis testing. PMID:19351454
Linking numbers, spin, and statistics of solitons
NASA Technical Reports Server (NTRS)
Wilczek, F.; Zee, A.
1983-01-01
The spin and statistics of solitons in the (2 + 1)- and (3 + 1)-dimensional nonlinear sigma models are considered. For the (2 + 1)-dimensional case, there is the possibility of fractional spin and exotic statistics; for 3 + 1 dimensions, the usual spin-statistics relation is demonstrated. The linking-number interpretation of the Hopf invariant and the use of suspension considerably simplify the analysis.
Hospitals as interpretation systems.
Thomas, J B; McDaniel, R R; Anderson, R A
1991-01-01
In this study of 162 hospitals, it was found that the chief executive officer's (CEO's) interpretation of strategic issues is related to the existing hospital strategy and the hospital's information processing structure. Strategy was related to interpretation in terms of the extent to which a given strategic issue was perceived as controllable or uncontrollable. Structure was related to the extent to which an issue was defined as positive or negative, was labeled as controllable or uncontrollable, and was perceived as leading to a gain or a loss. Together, strategy and structure accounted for a significant part of the variance in CEO interpretations of strategic events. The theoretical and managerial implications of these findings are discussed. PMID:1991677
Statistical physics and ecology
NASA Astrophysics Data System (ADS)
Volkov, Igor
This work addresses the applications of the methods of statistical physics to problems in population ecology. A theoretical framework based on stochastic Markov processes for the unified neutral theory of biodiversity is presented, and an analytical solution for the relative species abundance distribution, both in the large meta-community and in the small local community, is obtained. It is shown that the framework of the current neutral theory in ecology can be easily generalized to incorporate symmetric density dependence. An analytically tractable model is studied that provides an accurate description of beta-diversity and exhibits novel scaling behavior that leads to links between ecological measures such as relative species abundance and the species area relationship. We develop a simple framework that incorporates the Janzen-Connell, dispersal and immigration effects and leads to a description of the distribution of relative species abundance, the equilibrium species richness, beta-diversity and the species area relationship, in good accord with data. It is also shown that an ecosystem can be mapped into an unconventional statistical ensemble and is quite generally tuned in the vicinity of a phase transition where bio-diversity and the use of resources are optimized. We also perform a detailed study of the unconventional statistical ensemble, in which, unlike in physics, the total number of particles and the energy are not fixed but bounded. We show that the temperature and the chemical potential play a dual role: they determine the average energy and the population of the levels in the system, and at the same time they act as an imbalance between the energy and population ceilings and the corresponding average values. Different types of statistics (Boltzmann, Bose-Einstein, Fermi-Dirac and one corresponding to the description of a simple ecosystem) are considered. In all cases, we show that the systems may undergo a first- or second-order phase transition.
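The familiar occupation statistics named above (Boltzmann, Bose-Einstein, Fermi-Dirac) can be compared numerically; a minimal sketch with illustrative energies and temperature, not tied to the ecological ensemble of the thesis:

```python
import math

def occupancy(eps, mu, kT, kind):
    """Mean occupation number of a single level at energy `eps`.
    kind: 'boltzmann', 'bose' (Bose-Einstein), or 'fermi' (Fermi-Dirac)."""
    x = math.exp((eps - mu) / kT)
    if kind == "boltzmann":
        return 1.0 / x
    if kind == "bose":
        return 1.0 / (x - 1.0)   # requires eps > mu
    if kind == "fermi":
        return 1.0 / (x + 1.0)   # always between 0 and 1
    raise ValueError(kind)

# Far above the chemical potential all three statistics converge
# to the classical (Boltzmann) limit:
for kind in ("boltzmann", "bose", "fermi"):
    print(kind, occupancy(eps=10.0, mu=0.0, kT=1.0, kind=kind))
```

At any finite energy the ordering is always Fermi-Dirac < Boltzmann < Bose-Einstein, reflecting Pauli exclusion on one side and bosonic bunching on the other.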
Explaining the Interpretive Mind.
ERIC Educational Resources Information Center
Brockmeier, Jens
1996-01-01
Examines two prominent positions in the epistemological foundations of psychology--Piaget's causal explanatory claims and Vygotsky's interpretive understanding; contends that they need to be placed in their wider philosophical contexts. Argues that the danger of causally explaining cultural practices through which human beings construct and…
Psychosemantics and Simultaneous Interpretation.
ERIC Educational Resources Information Center
Le Ny, Jean-Francois
A comprehension model of simultaneous interpretation activity raises three types of problems: structure of semantic information stored in long-term memory, modalities of input processing and specific restrictions due to situation. A useful concept of semantic mnesic structures includes: (1) a componential-predicative lexicon; (2) a propositional…
Interpreting the Constitution.
ERIC Educational Resources Information Center
Brennan, William J., Jr.
1987-01-01
Discusses constitutional interpretations relating to capital punishment and protection of human dignity. Points out the document's effectiveness in creating a new society by adapting its principles to current problems and needs. Considers two views of the Constitution that lead to controversy over the legitimacy of judicial decisions. (PS)
Listening and Message Interpretation
ERIC Educational Resources Information Center
Edwards, Renee
2011-01-01
Message interpretation, the notion that individuals assign meaning to stimuli, is related to listening presage, listening process, and listening product. As a central notion of communication, meaning includes (a) denotation and connotation, and (b) content and relational meanings, which can vary in ambiguity and vagueness. Past research on message…
Interpretation and containment.
Lafarge, L
2000-02-01
The author explores two aspects of the analyst's effort to imagine the inner world of his patient and the way that they are manifest in the clinical moment. The first of these is the analyst's recognition and interpretation of his patient's elaborated fantasies. This current of the analyst's imagination is most often evoked by the patient's communication of whole-object transferences, which occurs largely in his verbal associations. The second is the analyst's reception and transformation of his patient's primitive emotional experience, a process that Bion has called containment. This second imaginative current is most often evoked by the patient's communication of part-object transferences, which occurs largely in affect and action. Interpretation and containment both go on at once in clinical work, although one or the other is usually dominant. Attention to the interplay of interpretation and containment in the clinical moment enables us to identify the articulation of whole- and part-object transferences and to integrate ego-psychological and Kleinian frames of reference in clinical work. In addition, the concept of mutual containment opens Kleinian theory to the possibility of a two-person psychology in which the roles of analyst and patient are more symmetrical than they are usually conceived to be within this frame of reference. The author presents two clinical examples to demonstrate the interplay of interpretation and containment. In the first, these processes operate smoothly. In the second, the process of containment is strained but ultimately successful.
Interpretation of Image Content.
ERIC Educational Resources Information Center
Pettersson, Rune
1988-01-01
Describes experiments and studies which investigated perception and image interpretation on different cognitive levels. Subjects were asked to name, describe, index, and assess image contents; write legends; create images; complete stories; illustrate stories; and produce informative materials. Results confirmed the theory of a dual stage…
Interpreting Contradictory Communications.
ERIC Educational Resources Information Center
Lightfoot, Cynthia
Preschool children, elementary school students, and adults participated in a study that examined various processes used to interpret contradictory communications. A screening test determined that all subjects were capable of discriminating between contradictory and congruent communications. Subjects were presented with contradictory verbal-facial…
Interpreting & Biomechanics. PEPNet Tipsheet
ERIC Educational Resources Information Center
PEPNet-Northeast, 2001
2001-01-01
Cumulative trauma disorder (CTD) refers to a collection of disorders associated with nerves, muscles, tendons, bones, and the neurovascular (nerves and related blood vessels) system. CTD symptoms may involve the neck, back, shoulders, arms, wrists, or hands. Interpreters with CTD may experience a variety of symptoms including: pain, joint…
Cosmetic Plastic Surgery Statistics
2014 Plastic Surgery Statistics Report: Cosmetic Procedure Trends. Please credit the American Society of Plastic Surgeons when citing statistical data or using ...
NASA Astrophysics Data System (ADS)
Kim, Youngwoo; Woo, Kyoohee; Kim, Inhyuk; Cho, Yong Soo; Jeong, Sunho; Moon, Jooho
2013-10-01
Among various candidate materials, Cu2ZnSnS4 (CZTS) is a promising earth-abundant semiconductor for low-cost thin film solar cells. We report a facile, less toxic, highly concentrated synthetic method utilizing the heretofore unrecognized, easily decomposable capping ligand of triphenylphosphate, where phase-pure, single-crystalline, and well-dispersed colloidal CZTS nanocrystals were obtained. The favorable influence of the easily decomposable capping ligand on the microstructural evolution of device-quality CZTS absorber layers was clarified based on a comparative study with commonly used oleylamine-capped CZTS nanoparticles. The resulting CZTS nanoparticles enabled us to produce a dense and crack-free absorbing layer through annealing under a N2 + H2S (4%) atmosphere, demonstrating a solar cell with an efficiency of 3.6% under AM 1.5 illumination. Electronic supplementary information (ESI) available: Experimental methods for CZTS nanocrystal synthesis, device fabrication, and characterization; the size distribution and energy dispersive X-ray (EDX) spectra of the synthesized CZTS nanoparticles; UV-vis spectra of the
Luo,Y.; Tepikian, S.; Fischer, W.; Robert-Demolaize, G.; Trbojevic, D.
2009-01-02
Based on the contributions of the chromatic sextupole families to the half-integer resonance driving terms, we discuss how to sort the chromatic sextupoles in the arcs of the Relativistic Heavy Ion Collider (RHIC) to easily and effectively correct the second-order chromaticities. We propose a method with 4 knobs, corresponding to 4 pairs of chromatic sextupole families, to correct the second-order chromaticities online. Numerical simulation validates this method, showing that it reduces the imbalance in the correction strengths of the sextupole families and avoids the reversal of sextupole polarities. Therefore, this method yields larger dynamic apertures for the proposed RHIC 2009 100 GeV polarized proton run lattices.
1999-09-30
The Institute studied the adsorption of cationic pressure-sensitive adhesive (PSA) on wood fiber, and the buildup of PSA in a closed water system during paper recycling; the results are presented. Georgia Tech worked to develop an environmentally friendly polymerization process to synthesize a novel re-dispersible PSA by co-polymerizing an oil-soluble monomer (butyl acrylate) and a cationic monomer MAEPTAC; results are presented. At the University of Georgia at Athens the project focused on the synthesis of water-soluble and easily removable cationic polymer PSAs.
The insignificance of statistical significance testing
Johnson, Douglas H.
1999-01-01
Despite their use in scientific journals such as The Journal of Wildlife Management, statistical hypothesis tests add very little value to the products of research. Indeed, they frequently confuse the interpretation of data. This paper describes how statistical hypothesis tests are often viewed, and then contrasts that interpretation with the correct one. I discuss the arbitrariness of P-values, conclusions that the null hypothesis is true, power analysis, and distinctions between statistical and biological significance. Statistical hypothesis testing, in which the null hypothesis about the properties of a population is almost always known a priori to be false, is contrasted with scientific hypothesis testing, which examines a credible null hypothesis about phenomena in nature. More meaningful alternatives are briefly outlined, including estimation and confidence intervals for determining the importance of factors, decision theory for guiding actions in the face of uncertainty, and Bayesian approaches to hypothesis testing and other statistical practices.
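The estimation alternative the abstract recommends is easy to illustrate. The sketch below (hypothetical counts, not data from the paper) reports a large-sample 95% confidence interval for a difference in means, which conveys magnitude and uncertainty rather than a bare reject/fail-to-reject verdict:

```python
import math

def mean(xs):
    return sum(xs) / len(xs)

def ci_difference(a, b, z=1.96):
    """95% CI for the difference in means (large-sample normal approximation)."""
    ma, mb = mean(a), mean(b)
    var_a = sum((x - ma) ** 2 for x in a) / (len(a) - 1)  # sample variances
    var_b = sum((x - mb) ** 2 for x in b) / (len(b) - 1)
    se = math.sqrt(var_a / len(a) + var_b / len(b))       # standard error of the difference
    return (ma - mb) - z * se, (ma - mb) + z * se
```

An interval such as (0.4, 1.6) tells the reader both that the effect is positive and how large it plausibly is, which a P-value alone does not.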
Phillips, Craig B; Iline, Ilia I; Richards, Nicola K; Novoselov, Max; McNeill, Mark R
2013-10-01
Quickly, accurately, and easily assessing the efficacy of treatments to control sessile arthropods (e.g., scale insects) and stationary immature life stages (e.g., eggs and pupae) is problematic because it is difficult to tell whether treated organisms are alive or dead. Current approaches usually involve either maintaining organisms in the laboratory to observe them for development, gauging their response to physical stimulation, or assessing morphological characters such as turgidity and color. These can be slow, technically difficult, or subjective, and the validity of methods other than laboratory rearing has seldom been tested. Here, we describe the development and validation of a quick, easily used biochemical colorimetric assay for measuring the viability of arthropods that is sufficiently sensitive to test even very small organisms such as whitefly eggs. The assay was adapted from a technique for staining the enzyme hexokinase to signal the presence of adenosine triphosphate in viable specimens by reducing a tetrazolium salt to formazan. Basic laboratory facilities and skills are required for production of the stain, but no specialist equipment, expertise, or facilities are needed for its use.
Rago, Angela; Latagliata, Roberto; Montanaro, Marco; Montefusco, Enrico; Andriani, Alessandro; Crescenzi, Sabrina Leonetti; Mecarocci, Sergio; Spirito, Francesca; Spadea, Antonio; Recine, Umberto; Cicconi, Laura; Avvisati, Giuseppe; Cedrone, Michele; Breccia, Massimo; Porrini, Raffaele; Villivà, Nicoletta; De Gregoris, Cinzia; Alimena, Giuliana; D'Arcangelo, Enzo; Guglielmelli, Paola; Lo-Coco, Francesco; Vannucchi, Alessandro; Cimino, Giuseppe
2015-03-01
To predict leukemic transformation (LT), we evaluated easily detectable diagnostic parameters in 338 patients with primary myelofibrosis (PMF) followed in the Latium region (Italy) between 1981 and 2010. Forty patients (11.8%) progressed to leukemia, with a resulting 10-year leukemia-free survival (LFS) rate of 72%. Hb (<10 g/dL) and circulating blasts (≥1%) were the only two independent prognostic factors for LT at multivariate analysis. Two hundred fifty patients with both parameters available were grouped as follows: low risk (no or one factor), 216 patients; high risk (both factors), 31 patients. The median LFS times were 269 and 45 months for the low- and high-risk groups, respectively (P<.0001). The LT predictive power of these two parameters was confirmed in an external series of 270 PMF patients from Tuscany, in whom the median LFS was not reached in the low-risk group and was 61 months in the high-risk group (P<.0001). These results establish anemia and circulating blasts, two easily and universally available parameters, as strong predictors of LT in PMF and may help improve the prognostic stratification of these patients, particularly in countries with low resources where more sophisticated molecular testing is unavailable. PMID:25636356
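The two-factor rule reported above is simple enough to state as code. The sketch below restates the published grouping (the function name is hypothetical, and this is an illustration of the stratification, not a validated clinical tool):

```python
def lt_risk_group(hb_g_dl, blasts_pct):
    """Stratify LT risk from the two parameters in the study:
    Hb < 10 g/dL and circulating blasts >= 1%.
    'high' = both factors present; 'low' = none or one factor."""
    factors = (hb_g_dl < 10) + (blasts_pct >= 1)  # bools sum as 0/1/2
    return "high" if factors == 2 else "low"
```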
12 CFR Supplement I to Part 1002 - Official Interpretations
Code of Federal Regulations, 2013 CFR
2013-01-01
... interpretation of Regulation B (12 CFR part 1002) issued by the Bureau of Consumer Financial Protection... covered by Regulation Z (Truth in Lending) (12 CFR part 1026). Further, the definition of creditor is not... continues to meet recognized professional statistical standards for statistical soundness. To ensure...
12 CFR Supplement I to Part 1002 - Official Interpretations
Code of Federal Regulations, 2014 CFR
2014-01-01
... interpretation of Regulation B (12 CFR part 1002) issued by the Bureau of Consumer Financial Protection... covered by Regulation Z (Truth in Lending) (12 CFR part 1026). Further, the definition of creditor is not... continues to meet recognized professional statistical standards for statistical soundness. To ensure...
12 CFR Supplement I to Part 1002 - Official Interpretations
Code of Federal Regulations, 2012 CFR
2012-01-01
... interpretation of Regulation B (12 CFR Part 1002) issued by the Bureau of Consumer Financial Protection... covered by Regulation Z (Truth in Lending) (12 CFR Part 1026). Further, the definition of creditor is not... continues to meet recognized professional statistical standards for statistical soundness. To ensure...
Semantic interpretation of nominalizations
Hull, R.D.; Gomez, F.
1996-12-31
A computational approach to the semantic interpretation of nominalizations is described. Interpretation of nominalizations involves three tasks: deciding whether the nominalization is being used in a verbal or non-verbal sense; disambiguating the nominalized verb when a verbal sense is used; and determining the fillers of the thematic roles of the verbal concept or predicate of the nominalization. A verbal sense can be recognized by the presence of modifiers that represent the arguments of the verbal concept. It is these same modifiers which provide the semantic clues to disambiguate the nominalized verb. In the absence of explicit modifiers, heuristics are used to discriminate between verbal and non-verbal senses. A correspondence between verbs and their nominalizations is exploited so that only a small amount of additional knowledge is needed to handle the nominal form. These methods are tested in the domain of encyclopedic texts and the results are shown.
Evaluation of Psychotherapeutic Interpretations
POGGE, DAVID L.; DOUGHER, MICHAEL J.
1992-01-01
If much psychotherapy literature goes unread and unused by therapists, one reason may be the apparent irrelevance of theory-derived hypotheses to actual practice. Methods that uncover tacit knowledge that practicing therapists already possess can provide the empirical basis for more relevant theories and the testing of more meaningful hypotheses. This study demonstrates application of the phenomenological method to the question of evaluating psychotherapy. To discover how experienced psychotherapists evaluate interpretations made in actual psychotherapy sessions, therapists were asked to evaluate such interpretations from videotapes; analysis of responses yielded a set of 10 dimensions of evaluation. Such methods offer both practical utility and a source of theoretical growth anchored in the real world of the practicing therapist. PMID:22700101
Taxonomy of interpretation trees
NASA Astrophysics Data System (ADS)
Flynn, Patrick J.; Jain, Anil K.
1992-02-01
This paper explores alternative models of the interpretation tree (IT), whose search is one of the dominant paradigms for object recognition. Recurrence relations for the unpruned size of eight different types of search tree are introduced. Since exhaustive search of the IT in most recognition systems is impractical, pruning of various types is employed. It is therefore useful to see how much of the IT will be explored in a typical recognition problem. Probabilistic models of the search process have been proposed in the literature and used as a basis for theoretical bounds on search tree size, but experiments on a large number of images suggest that for 3-D object recognition from range data, the error probabilities (assumed to be constant) display significant variation. Hence, the theoretical bounds on the interpretation tree's size can serve only as rough estimates of the computational burden incurred during object recognition.
Interpreting coagulation assays.
Green, David
2010-09-01
The interpretation of coagulation assays requires knowledge of the principal clotting pathways. The activated partial thromboplastin time is sensitive to all hemostatic factors except FVII, whereas the prothrombin time reflects levels of prothrombin and FV, FVII, and FX. Using the two tests in concert is helpful in identifying hemophilia, the coagulopathy of liver disease, and disseminated intravascular coagulation. In addition, the activated partial thromboplastin time and prothrombin time are used for monitoring anticoagulant therapy with heparin and warfarin, respectively. Measurement of D-dimer is informative in patients suspected of having thrombotic disorders and determining the risk of thrombosis recurrence. Mixing tests distinguish clotting factor deficiencies from circulating anticoagulants such as heparin, the lupus anticoagulant, and antibodies directed against specific clotting factors. The modified Bethesda assay detects and provides an indication of the strength of FVIII inhibitors. However, interpreting the results of coagulation assays is not always straightforward, and expert consultation is occasionally required to resolve difficult clinical situations. PMID:20855988
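The two-test logic described above (aPTT sensitive to all factors except FVII; PT reflecting prothrombin, FV, FVII, and FX) can be sketched as a coarse first-pass lookup. This is an illustrative simplification of the abstract's reasoning, not a clinical algorithm; the function name and wording are assumptions:

```python
def interpret_screen(aptt_prolonged, pt_prolonged):
    """Coarse first-pass reading of the two screening tests.
    Real interpretation requires mixing studies, factor assays,
    and clinical context, as the abstract emphasizes."""
    if aptt_prolonged and pt_prolonged:
        return "common/multiple pathways: consider liver disease or DIC"
    if aptt_prolonged:
        return "intrinsic pathway: consider hemophilia or heparin effect"
    if pt_prolonged:
        return "extrinsic pathway: consider FVII deficiency or warfarin effect"
    return "screening tests normal"
```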
Interpretation and creationism.
Ahumada, J L
1994-08-01
This paper is an attempt to raise questions about certain underlying and implicit assumptions in some hermeneutic and narrative approaches to psychoanalysis. Starting from the view that Freud saw interpretation in the clinical setting as an attempt to unveil the analysand's psychic reality, it is argued that he envisaged that psychoanalysis aims to interpret what is real in the analysand's inner world--an empirical line of thought underpinned by the idea of analytic neutrality and an emphasis on the analysand's capacity to judge reality. By contrast, the tendency within the hermeneutic-narrative tradition is to demote psychic reality in favour of an emphasis on the analyst's capacity to interpret in order to help his analysand construct meaning. This approach may be said to put the analyst's words in the place of those of the Creator; in other words, it amounts to a 'verbal creationism', which the author argues is rooted in the idealistic philosophy of Hegel, Vico and Descartes and, further back, can be traced to the Book of Genesis--a conclusion causing the author to express some reservations. PMID:7989142
Interpreting uncertainty terms.
Holtgraves, Thomas
2014-08-01
Uncertainty terms (e.g., some, possible, good, etc.) are words that do not have a fixed referent and hence are relatively ambiguous. A model is proposed that specifies how, from the hearer's perspective, recognition of facework as a potential motive for the use of an uncertainty term results in a calibration of the intended meaning of that term. Four experiments are reported that examine the impact of face threat, and the variables that affect it (e.g., power), on the manner in which a variety of uncertainty terms (probability terms, quantifiers, frequency terms, etc.) are interpreted. Overall, the results demonstrate that increased face threat in a situation will result in a more negative interpretation of an utterance containing an uncertainty term. That the interpretation of so many different types of uncertainty terms is affected in the same way suggests the operation of a fundamental principle of language use, one with important implications for the communication of risk, subjective experience, and so on.
Connecting q-mutator theory with experimental tests of the spin-statistics connection
NASA Astrophysics Data System (ADS)
Hilborn, Robert C.
2000-11-01
The q-mutator theory is used to connect the value of 1-|q|, the parameter measuring the "difference" between quons and ordinary bosons and fermions, to experiments that test the spin-statistics connection. Such calculations are best carried out using a density matrix formulation because a superselection rule prevents transitions between states associated with different representations of the permutation group. However, interpreting the experimental results in terms of a quantitative limit on 1-|q| can easily be led astray by the density matrix formulation. As a concrete example, the theory is applied to a spin-statistics test for photons. The formalism is then applied to spin-statistics tests for electrons in atomic helium and for 16O nuclei in molecules. Finally, the analysis is used to extend experimental limits on composite systems such as 16O nuclei to provide a test of the spin-statistics connection for the constituents of those composite systems (nucleons and quarks in the case of oxygen nuclei).
The disagreeable behaviour of the kappa statistic.
Flight, Laura; Julious, Steven A
2015-01-01
It is often of interest to measure the agreement between a number of raters when an outcome is nominal or ordinal. The kappa statistic is used as a measure of agreement. The statistic is highly sensitive to the distribution of the marginal totals and can produce unreliable results. Other statistics such as the proportion of concordance, maximum attainable kappa and prevalence and bias adjusted kappa should be considered to indicate how well the kappa statistic represents agreement in the data. Each kappa should be considered and interpreted based on the context of the data being analysed.
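The marginal-total sensitivity is easy to reproduce numerically. In the sketch below (hypothetical 2x2 tables), two raters agree on 90% of cases in both tables, yet Cohen's kappa differs sharply between them while the prevalence- and bias-adjusted kappa (PABAK = 2·po − 1) does not:

```python
def cohens_kappa(table):
    """Cohen's kappa for a 2x2 agreement table [[a, b], [c, d]]."""
    (a, b), (c, d) = table
    n = a + b + c + d
    po = (a + d) / n                                        # observed agreement
    pe = ((a + b) * (a + c) + (c + d) * (b + d)) / n ** 2   # chance agreement from marginals
    return (po - pe) / (1 - pe)

def pabak(table):
    """Prevalence- and bias-adjusted kappa: 2*po - 1."""
    (a, b), (c, d) = table
    po = (a + d) / (a + b + c + d)
    return 2 * po - 1
```

With balanced marginals ([[45, 5], [5, 45]]) kappa is 0.80, but with skewed marginals ([[85, 5], [5, 5]]) it drops to about 0.44 despite identical observed agreement; PABAK stays at 0.80 in both cases, which is exactly the behaviour the abstract warns about.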
Enhancing the Teaching of Statistics: Portfolio Theory, an Application of Statistics in Finance
ERIC Educational Resources Information Center
Christou, Nicolas
2008-01-01
In this paper we present an application of statistics using real stock market data. Most, if not all, students have some familiarity with the stock market (or at least they have heard about it) and therefore can understand the problem easily. It is the real data analysis that students find interesting. Here we explore the building of efficient…
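As one concrete classroom exercise of this kind (a sketch with hypothetical variances, not the data used in the paper), the two-asset minimum-variance portfolio has a closed-form solution that students can verify directly:

```python
def min_variance_weights(var1, var2, cov12):
    """Weights of the two-asset minimum-variance portfolio (weights sum to 1)."""
    w1 = (var2 - cov12) / (var1 + var2 - 2 * cov12)
    return w1, 1 - w1

def portfolio_variance(w1, var1, var2, cov12):
    """Variance of a portfolio with weight w1 in asset 1 and 1 - w1 in asset 2."""
    w2 = 1 - w1
    return w1 ** 2 * var1 + w2 ** 2 * var2 + 2 * w1 * w2 * cov12
```

Checking that the computed weights beat any other split (e.g., 50/50 or all-in-one) is a useful way to connect the statistics of variance and covariance to a tangible financial question.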
On easily tunable wide-bandpass X-ray monochromators based on refraction in arrays of prisms.
Jark, Werner
2012-07-01
Refractive lenses focus X-rays chromatically owing to a significant variation of the refractive index of the lens material with photon energy. Then, in combination with an exit slit in the focal plane, such lenses can be used as monochromators. The spectral resolution obtainable with refractive lenses based on prism arrays was recently systematically investigated experimentally. This contribution will show that a wide-bandpass performance can be predicted with a rather simple analytical approach. Based on the good agreement with the experimental data, one can then more rapidly and systematically optimize the lens structure for a given application. This contribution will then discuss more flexible solutions for the monochromator operation. It will be shown that a new monochromator scheme could easily provide tuning in a fixed-exit slit.
Xu Chengjian; Schaaf, Arjen van der; Schilstra, Cornelis; Langendijk, Johannes A.; Veld, Aart A. van't
2012-03-15
Purpose: To study the impact of different statistical learning methods on the prediction performance of multivariate normal tissue complication probability (NTCP) models. Methods and Materials: In this study, three learning methods, stepwise selection, least absolute shrinkage and selection operator (LASSO), and Bayesian model averaging (BMA), were used to build NTCP models of xerostomia following radiotherapy treatment for head and neck cancer. Performance of each learning method was evaluated by a repeated cross-validation scheme in order to obtain a fair comparison among methods. Results: It was found that the LASSO and BMA methods produced models with significantly better predictive power than that of the stepwise selection method. Furthermore, the LASSO method yields an easily interpretable model as the stepwise method does, in contrast to the less intuitive BMA method. Conclusions: The commonly used stepwise selection method, which is simple to execute, may be insufficient for NTCP modeling. The LASSO method is recommended.
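The interpretability advantage attributed to the LASSO comes from its soft-thresholding step, which sets small coefficients exactly to zero and so leaves a sparse, readable model. A minimal sketch of that operator (an illustration of the general technique, not the authors' NTCP code):

```python
def soft_threshold(z, lam):
    """Proximal operator at the core of LASSO fitting: shrinks a coefficient
    toward zero by lam, and sets it exactly to zero when |z| <= lam.
    This zeroing is what produces sparse, interpretable models."""
    if z > lam:
        return z - lam
    if z < -lam:
        return z + lam
    return 0.0
```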
NASA Astrophysics Data System (ADS)
Biass, Sébastien; Frischknecht, Corine; Dell'Oro, Luca; Senegas, Olivier; Bonadonna, Costanza
2010-05-01
In order to answer the needs of contingency planning, we present a GIS-based method for risk assessment of tephra deposits, which is flexible enough to work with datasets of variable precision and resolution depending on data availability. Due to the constant increase of population density around volcanoes and the large dispersal of tephra from volcanic plumes, a wide range of threats such as roof collapses, destruction of crops, blockage of vital lifelines and health problems concern even remote communities. In the field of disaster management, there is a general agreement that a global but incomplete method, subject to revision and improvement, is better than no information at all. In this framework, our method is able to provide fast, rough insights into possible eruptive scenarios and their potential consequences on surrounding populations from only a few available data, which can easily be refined later. Therefore, the knowledge of both the expected hazard (frequency and magnitude) and the vulnerability of elements at risk is required by planners in order to produce efficient emergency planning prior to a crisis. The Cotopaxi volcano, one of Ecuador's most active volcanoes, was used to develop and test this method. Cotopaxi volcano is located 60 km south of Quito and threatens a highly populated valley. Based on field data, historical reports and the Smithsonian catalogue, our hazard assessment was carried out using the numerical model TEPHRA2. We first applied a deterministic approach that evolved towards a fully probabilistic method in order to account for the most likely eruptive scenarios as well as the variability of atmospheric conditions. In parallel, we carried out a vulnerability assessment of the physical (crops and roofs), social (populations) and systemic elements at risk, using mainly free and easily accessible data. Both hazard and vulnerability assessments were compiled with GIS tools to draw comprehensive and tangible thematic risk maps
Cai, Liu-xin; Wei, Fang-qiang; Yu, Yi-chen; Cai, Xiu-jun
2016-01-01
Objective: The liver hanging maneuver (LHM) is rarely applied in laparoscopic right hepatectomy (LRH) because of the difficulty encountered in retrohepatic tunnel (RT) dissection and tape positioning. Thus far no report has detailed how to quickly and easily establish RT for laparoscopic LHM in LRH, nor has employment of the Goldfinger dissector to create a total RT been reported. This study’s aim was to evaluate the safety and feasibility of establishing RT for laparoscopic LHM using the Goldfinger dissector in LRH. Methods: Between March 2015 and July 2015, five consecutive patients underwent LRH via the caudal approach with laparoscopic LHM. A five-step strategy using the Goldfinger dissector to establish RT for laparoscopic LHM was adopted. Perioperative data were analyzed. Results: The median age of patients was 58 (range, 51–65) years. Surgery was performed for one intrahepatic lithiasis and four hepatocellular carcinomas with a median size of 90 (40–150) mm. The median operative time was 320 (282–358) min with a median blood loss of 200 (200–600) ml. Laparoscopic LHM was achieved in a median of 31 (21–62) min, and the median postoperative hospital stay was 14 (9–16) d. No transfusion or conversion was required, and no severe liver-related morbidity or death was observed. Conclusions: The Goldfinger dissector is a useful instrument for the establishment of RT. A five-step strategy using the Goldfinger dissector can quickly and easily facilitate an RT for a laparoscopic LHM in LRH. PMID:27604863
How to use and interpret hormone ratios.
Sollberger, Silja; Ehlert, Ulrike
2016-01-01
Hormone ratios have become increasingly popular throughout the neuroendocrine literature since they offer a straightforward way to simultaneously analyze the effects of two interdependent hormones. However, the analysis of ratios is associated with statistical and interpretational concerns which have not been sufficiently considered in the context of endocrine research. The aim of this article, therefore, is to demonstrate and discuss these issues, and to suggest suitable ways to address them. In a first step, we use exemplary testosterone and cortisol data to illustrate that one major concern of ratios lies in their distribution and inherent asymmetry. As a consequence, results of parametric statistical analyses are affected by the ultimately arbitrary decision of which way around the ratio is computed (i.e., A/B or B/A). We suggest the use of non-parametric methods as well as the log-transformation of hormone ratios as appropriate methods to deal with these statistical problems. However, in a second step, we also discuss the complicated interpretation of ratios, and propose moderation analysis as an alternative and oftentimes more insightful approach to ratio analysis. In conclusion, we suggest that researchers carefully consider which statistical approach is best suited to investigate reciprocal hormone effects. With regard to the hormone ratio method, further research is needed to specify what exactly this index reflects on the biological level and in which cases it is a meaningful variable to analyze.
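The asymmetry problem and the log-transform remedy can be shown in a few lines (hypothetical testosterone and cortisol values, not data from the article):

```python
import math

# Hypothetical hormone values for three participants.
t = [20.0, 10.0, 8.0]   # testosterone
c = [5.0, 20.0, 8.0]    # cortisol

tc = [a / b for a, b in zip(t, c)]  # ratios computed as T/C
ct = [b / a for a, b in zip(t, c)]  # the same data as C/T

# The mean of T/C is NOT the reciprocal of the mean of C/T,
# so parametric results depend on the arbitrary choice of direction.
mean_tc = sum(tc) / len(tc)
mean_ct = sum(ct) / len(ct)

# After log-transformation the two directions are exact mirror images
# (log(A/B) = -log(B/A)), so the direction no longer drives the result.
log_tc = sum(math.log(r) for r in tc) / len(tc)
log_ct = sum(math.log(r) for r in ct) / len(ct)
```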
Interpreting digoxin concentrations.
Canaday, B R
1992-11-01
In all cases, clinical assessment of the patient is the most critical factor in determining dose and interpreting concentrations. When done accurately, laboratory assessment of drug concentrations represents only one source of information. Serum concentrations must be taken into account along with all other relevant clinical data before one can arrive at appropriate management decisions. They must not be considered in isolation and out of context. If the laboratory report is at variance with your clinical judgment, "it will often be the better part of wisdom to question (or reject) the report." PMID:1442543
Interpretation of extragalactic jets
Norman, M.L.
1985-01-01
The nature of extragalactic radio jets is modeled. The basic hypothesis of these models is that extragalactic jets are outflows of matter which can be described within the framework of fluid dynamics and that the outflows are essentially continuous. The discussion is limited to the interpretation of large-scale (i.e., kiloparsec-scale) jets. The central problem is to infer the physical parameters of the jets from observed distributions of total and polarized intensity and angle of polarization as a function of frequency. 60 refs., 6 figs.
Interpretation of Conventional Mass
NASA Astrophysics Data System (ADS)
Lee, Sungjun; Kim, Kwang Pyo
The conventional mass is not a precise physical quantity but useful virtual one in mass metrology. Because the precise level of conventional mass is related to the OIML class, it is necessary to check if the assignment of weight class is under control. The documents of OIML (International Organization of Legal Metrology) D 28 and R 111 describe the limitation of the quantity in real application. In this presentation, we are trying to interpret and review the concept of conventional mass, for example, by estimating buoyancy deviation and maximum permissible error, in weight calibrations in Korea. Note from Publisher: This article contains the abstract only.
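For illustration, the OIML definition of conventional mass (the mass of a hypothetical reference weight of density 8000 kg/m³ that balances the object in air of density 1.2 kg/m³ at 20 °C) reduces to a one-line buoyancy correction. This sketch applies the standard formula from OIML D 28, not code from the presentation:

```python
def conventional_mass(true_mass_kg, density_kg_m3,
                      rho_air=1.2, rho_ref=8000.0):
    """Conventional mass m_c from true mass m and material density rho:
    m_c * (1 - rho_air/rho_ref) = m * (1 - rho_air/rho)."""
    return true_mass_kg * (1 - rho_air / density_kg_m3) / (1 - rho_air / rho_ref)
```

For a weight already made of 8000 kg/m³ material the conventional mass equals the true mass; for a less dense material such as aluminium the larger buoyancy makes the conventional mass slightly smaller, which is the deviation to be compared against the maximum permissible error of the weight class.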
NASA Astrophysics Data System (ADS)
Pospieszalski, M. W.
2010-10-01
The simple noise models of field effect and bipolar transistors reviewed in this article are quite useful in engineering practice, as illustrated by measured and modeled results. The exact and approximate expressions for the noise parameters of FETs and bipolar transistors reveal certain common noise properties and some general noise properties of both devices. The usefulness of these expressions in interpreting the dependence of measured noise parameters on frequency, bias, and temperature and, consequently, in checking of consistency of measured data has been demonstrated.
Predict! Teaching Statistics Using Informational Statistical Inference
ERIC Educational Resources Information Center
Makar, Katie
2013-01-01
Statistics is one of the most widely used topics for everyday life in the school mathematics curriculum. Unfortunately, the statistics taught in schools focuses on calculations and procedures before students have a chance to see it as a useful and powerful tool. Researchers have found that a dominant view of statistics is as an assortment of tools…
Physical interpretation of antigravity
NASA Astrophysics Data System (ADS)
Bars, Itzhak; James, Albin
2016-02-01
Geodesic incompleteness is a problem in both general relativity and string theory. The Weyl-invariant Standard Model coupled to general relativity (SM +GR ), and a similar treatment of string theory, are improved theories that are geodesically complete. A notable prediction of this approach is that there must be antigravity regions of spacetime connected to gravity regions through gravitational singularities such as those that occur in black holes and cosmological bang/crunch. Antigravity regions introduce apparent problems of ghosts that raise several questions of physical interpretation. It was shown that unitarity is not violated, but there may be an instability associated with negative kinetic energies in the antigravity regions. In this paper we show that the apparent problems can be resolved with the interpretation of the theory from the perspective of observers strictly in the gravity region. Such observers cannot experience the negative kinetic energy in antigravity directly, but can only detect in and out signals that interact with the antigravity region. This is no different from a spacetime black box for which the information about its interior is encoded in scattering amplitudes for in/out states at its exterior. Through examples we show that negative kinetic energy in antigravity presents no problems of principles but is an interesting topic for physical investigations of fundamental significance.
Monitoring and interpreting bioremediation effectiveness
Bragg, J.R.; Prince, R.C.; Harner, J.; Atlas, R.M.
1993-12-31
Following the Exxon Valdez oil spill in 1989, extensive research was conducted by the US Environmental Protection Agency and Exxon to develop and implement bioremediation techniques for oil spill cleanup. A key challenge of this program was to develop effective methods for monitoring and interpreting bioremediation effectiveness on extremely heterogeneous intertidal shorelines. Fertilizers were applied to shorelines at concentrations known to be safe, and the effectiveness achieved in accelerating biodegradation of oil residues was measured using several techniques. This paper describes the most definitive method identified, which monitors biodegradation loss by measuring changes in the ratios of hydrocarbons to hopane, a cycloalkane present in the oil that showed no measurable degradation. Rates of loss measured by the hopane ratio method have high levels of statistical confidence, and show that fertilizer addition stimulated biodegradation rates as much as fivefold. Multiple regression analyses of the data show that the concentration of fertilizer nitrogen in interstitial pore water per unit of oil load was the most important parameter affecting biodegradation rate, and the results suggest that monitoring nitrogen concentrations in the subsurface pore water is the preferred technique for determining fertilizer dosage and reapplication frequency.
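The hopane ratio method lends itself to a short worked example. Because hopane is conserved, normalizing each analyte concentration to hopane cancels out sampling heterogeneity, and percent loss follows directly (hypothetical concentrations; the function name is an assumption, not from the paper):

```python
def percent_degraded(analyte_0, hopane_0, analyte_t, hopane_t):
    """Biodegradation loss estimated from hopane-normalized concentrations,
    using hopane as a conserved (non-degrading) internal marker."""
    r0 = analyte_0 / hopane_0   # analyte/hopane ratio at time zero
    rt = analyte_t / hopane_t   # analyte/hopane ratio at time t
    return 100.0 * (1.0 - rt / r0)
```

Note that the estimate is unchanged even if the whole residue has been concentrated or diluted between samplings, since both analyte and hopane scale together; that is the point of normalizing to a conserved marker.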
Generic interpreters and microprocessor verification
NASA Technical Reports Server (NTRS)
Windley, Phillip J.
1990-01-01
The following topics are covered in viewgraph form: (1) generic interpreters; (2) Viper microprocessors; (3) microprocessor verification; (4) determining correctness; (5) hierarchical decomposition; (6) interpreter theory; (7) AVM-1; (8) phase-level specification; and future work.
Asynchronous interpretation of parallel microprograms
Bandman, O.L.
1984-03-01
In this article, the authors demonstrate how to pass from a given synchronous interpretation of a parallel microprogram to an equivalent asynchronous interpretation, and investigate the cost associated with the rejection of external synchronization in parallel microprogram structures.
Laws and chances in statistical mechanics
NASA Astrophysics Data System (ADS)
Winsberg, Eric
Statistical mechanics involves probabilities. At the same time, most approaches to the foundations of statistical mechanics (programs whose goal is to understand the macroscopic laws of thermal physics from the point of view of microphysics) are classical; they begin with the assumption that the underlying dynamical laws that govern the microscopic furniture of the world are (or can without loss of generality be treated as) deterministic. This raises some potential puzzles about the proper interpretation of these probabilities.
Ding, Yaobin; Tang, Hebin; Zhang, Shenghua; Wang, Songbo; Tang, Heqing
2016-11-01
Microscaled CuFeO2 particles (micro-CuFeO2) were rapidly prepared via a microwave-assisted hydrothermal method and characterized by scanning electron microscopy, X-ray powder diffraction and X-ray photoelectron spectroscopy. It was found that the micro-CuFeO2 was of pure phase and rhombohedral structure, with sizes in the range of 2.8±0.6 μm. The micro-CuFeO2 efficiently catalyzed the activation of peroxymonosulfate (PMS) to generate sulfate radicals (SO4-), causing fast degradation of carbamazepine (CBZ). The catalytic activity of micro-CuFeO2 was observed to be 6.9 and 25.3 times that of micro-Cu2O and micro-Fe2O3, respectively. The enhanced activity of micro-CuFeO2 for the activation of PMS was attributed to a synergistic effect of surface-bonded Cu(I) and Fe(III). Sulfate radical was the primary radical species responsible for CBZ degradation. As a microscaled catalyst, micro-CuFeO2 can be easily recovered by gravity settlement, and it exhibited improved catalytic stability compared with micro-Cu2O over five successive degradation cycles. Oxidative degradation of CBZ by the PMS/CuFeO2 couple was effective in the actual aqueous environmental systems studied.
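Catalytic activity comparisons like the 6.9- and 25.3-fold figures above are commonly expressed as ratios of apparent pseudo-first-order rate constants fitted from ln(C0/C) versus time. A hedged sketch with synthetic decay data (the kinetic model and all numbers here are assumptions for illustration, not values from the paper):

```python
import math

def pseudo_first_order_k(times, concentrations):
    """Least-squares slope of ln(C0/C) versus t, i.e. the apparent rate constant."""
    c0 = concentrations[0]
    ys = [math.log(c0 / c) for c in concentrations]
    n = len(times)
    mx = sum(times) / n
    my = sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(times, ys))
    den = sum((x - mx) ** 2 for x in times)
    return num / den

# Illustrative decay data (minutes, arbitrary concentration units),
# constructed to be exactly first order with k = 0.1 min^-1.
times = [0, 5, 10, 20]
conc = [1.0, math.exp(-0.5), math.exp(-1.0), math.exp(-2.0)]
k = pseudo_first_order_k(times, conc)
print(k)  # 0.1
```

Fitting k for two catalysts on the same substrate and taking the ratio k1/k2 gives the kind of activity comparison quoted in the abstract.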
NASA Astrophysics Data System (ADS)
Mesci, Gunkut; Schwartz, Renee'S.
2016-02-01
The purpose of this study was to assess preservice teachers' views of Nature of Science (NOS), identify aspects that were challenging for conceptual change, and explore reasons why. This study particularly focused on why and how some concepts of NOS may be more easily altered than others. Fourteen preservice science teachers enrolled in a NOS and Science Inquiry course participated in this study. Data were collected by using a pre/post format with the Views of Nature of Science questionnaire (VNOS-270), the Views of Scientific Inquiry questionnaire (VOSI-270), follow-up interviews, and classroom artifacts. The results indicated that most students initially held naïve views about certain aspects of NOS like tentativeness and subjectivity. By the end of the semester, almost all students dramatically improved their understanding about almost all aspects of NOS. However, several students still struggled with certain aspects like the differences between scientific theory and law, tentativeness, and socio-cultural embeddedness. Results suggested that instructional, motivational, and socio-cultural factors may influence if and how students changed their views about targeted NOS aspects. Students thought that classroom activities, discussions, and readings were most helpful to improve their views about NOS. The findings from the research have the potential to translate as practical advice for teachers, science educators, and future researchers.
Rotstein, Benjamin H; Liang, Steven H; Placzek, Michael S; Hooker, Jacob M; Gee, Antony D; Dollé, Frédéric; Wilson, Alan A; Vasdev, Neil
2016-08-22
The positron-emitting radionuclide carbon-11 ((11)C, t1/2 = 20.3 min) possesses the unique potential for radiolabeling of any biological, naturally occurring, or synthetic organic molecule for in vivo positron emission tomography (PET) imaging. Carbon-11 is most often incorporated into small molecules by methylation of alcohol, thiol, amine or carboxylic acid precursors using [(11)C]methyl iodide or [(11)C]methyl triflate (generated from [(11)C]carbon dioxide or [(11)C]methane). Consequently, small molecules that lack an easily substituted (11)C-methyl group are often considered to have non-obvious strategies for radiolabeling and require a more customized approach. [(11)C]Carbon dioxide itself, [(11)C]carbon monoxide, [(11)C]cyanide, and [(11)C]phosgene represent alternative reactants to enable (11)C-carbonylation. Methodologies developed for preparation of (11)C-carbonyl groups have had a tremendous impact on the development of novel PET tracers and provided key tools for clinical research. (11)C-Carbonyl radiopharmaceuticals based on labeled carboxylic acids, amides, carbamates and ureas now account for a substantial number of important imaging agents that have seen translation to higher species and clinical research of previously inaccessible targets, which is a testament to the creativity, utility and practicality of the underlying radiochemistry. PMID:27276357
Williams, D L; Kowalski, D
1993-01-01
The Epstein-Barr virus (EBV) origin of plasmid replication (oriP) includes two known cis-acting components, the dyad symmetry region and the family of repeats. We used P1 nuclease, a single-strand-specific endonuclease, to probe EBV oriP for DNA sequences that are intrinsically easy to unwind on a negatively supercoiled plasmid. Selective nuclease hypersensitivity was detected in the family of repeats on an oriP-containing plasmid and in the dyad symmetry region on a plasmid that lacks the family of repeats, indicating that the DNA in both cis-acting components is intrinsically easy to unwind. The hierarchy of nuclease hypersensitivity indicates that the family of repeats is more easily unwound than the dyad symmetry region, consistent with the hierarchy of helical stability predicted by computer analysis of the DNA sequence. A specific subset of the family of repeats is nuclease hypersensitive, and the DNA structure deduced from nucleotide-level analysis of the P1 nuclease nicks is a cruciform near a single-stranded bubble. The dyad symmetry region unwinds to form a broad single-stranded bubble containing hairpins in the 65-bp dyad sequence. We propose that the intrinsic ease of unwinding the dyad symmetry region, the actual origin of DNA replication, is an important component in the mechanism of initiation. PMID:8386273
Improve MWD data interpretation
Santley, D.J.; Ardrey, W.E.
1987-01-01
This article reports that measurement-while-drilling (MWD) technology is being used today in a broad range of real-time drilling applications. In its infancy, MWD was limited to providing directional survey and steering information. Today, the addition of formation sensors (resistivity, gamma) and drilling efficiency sensors (WOB, torque) has made MWD a much more useful drilling decision tool. In the process, the desirability of combining downhole MWD data with powerful analytical software and interpretive techniques has been recognized by both operators and service companies. However, the usual form in which MWD and wellsite analytical capabilities are combined leaves much to be desired. The most common approach is to incorporate MWD with large-scale computerized mud logging (CML) systems. Essentially, MWD decoding and display equipment is added to existing full-blown CML surface units.
NASA Astrophysics Data System (ADS)
Moore, Gregory F.
2009-05-01
This volume is a brief introduction aimed at those who wish to gain a basic and relatively quick understanding of the interpretation of three-dimensional (3-D) seismic reflection data. The book is well written, clearly illustrated, and easy to follow. Enough elementary mathematics is presented for a basic understanding of seismic methods, but more complex mathematical derivations are avoided. References are listed for readers interested in more advanced explanations. After a brief introduction, the book logically begins with a succinct chapter on modern 3-D seismic data acquisition and processing. Standard 3-D acquisition methods are presented, and an appendix expands on more recent acquisition techniques, such as multiple-azimuth and wide-azimuth acquisition. Although this chapter covers the basics of standard time processing quite well, there is only a single sentence about prestack depth imaging, and anisotropic processing is not mentioned at all, even though both techniques are now becoming standard.
Cytological artifacts masquerading interpretation
Sahay, Khushboo; Mehendiratta, Monica; Rehani, Shweta; Kumra, Madhumani; Sharma, Rashi; Kardam, Priyanka
2013-01-01
Background: Cytological artifacts are important to learn about because an error in routine laboratory practice can produce an erroneous result. Aims: The aim of this study was to analyze the effects of delayed fixation, and of morphological discrepancies created by the deliberate addition of extraneous factors, on the interpretation and/or diagnosis of an oral cytosmear. Materials and Methods: A prospective study was carried out using Papanicolaou and hematoxylin and eosin-stained oral smears, 6 each from 66 volunteer dental students, with deliberate variation in fixation delay timings, with and without changes in temperature, undue pressure during smear making, and intentional addition of contaminants. The fixation delay at room temperature was carried out at intervals of 30 minutes, 1 day and 1 week, and was continued to the end of 1 day, 1 week, and 1 month, respectively. The temperature variations were 60 to 70°C and 3 to 4°C. Results: Under light microscopy, the effect of delayed fixation at room temperature appeared first in the cytoplasm and then in the nucleus, within the first 2 hours and on the 4th day, respectively, until complete cytoplasmic degeneration on the 23rd day. Delayed fixation at variable temperature, however, brought faster degenerative changes at higher temperature than at lower temperature. The effect of extraneous factors revealed some interesting findings. Conclusions: To justify a cytosmear interpretation, a cytologist must be well acquainted with delayed-fixation-induced cellular changes and the microscopic appearances of common contaminants, so as to support better prognosis and therapy. PMID:24648667
Statistical methods for material characterization and qualification
Hunn, John D; Kercher, Andrew K
2005-01-01
This document describes a suite of statistical methods that can be used to infer lot parameters from the data obtained from inspection/testing of random samples taken from that lot. Some of these methods will be needed to perform the statistical acceptance tests required by the Advanced Gas Reactor Fuel Development and Qualification (AGR) Program. Special focus has been placed on proper interpretation of acceptance criteria and unambiguous methods of reporting the statistical results. In addition, modified statistical methods are described that can provide valuable measures of quality for different lots of material. This document has been written for use as a reference and a guide for performing these statistical calculations. Examples of each method are provided. Uncertainty analysis (e.g., measurement uncertainty due to instrumental bias) is not included in this document, but should be considered when reporting statistical results.
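As one concrete illustration of inferring lot quality from a random sample, the operating characteristic of a simple attributes acceptance plan (accept the lot if at most c defectives appear in a sample of n) follows from the binomial distribution. The plan parameters below are hypothetical, not taken from the AGR program specifications:

```python
from math import comb

def acceptance_probability(lot_fraction_defective, n, c):
    """P(accept) for an attributes sampling plan: accept if <= c defectives
    are found in a random sample of n items. Binomial approximation,
    valid when the sample is small relative to the lot."""
    p = lot_fraction_defective
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(c + 1))

# Operating characteristic of a hypothetical plan n=50, c=1 at two quality levels:
good = acceptance_probability(0.01, 50, 1)   # good lot (1% defective) is usually accepted
bad = acceptance_probability(0.10, 50, 1)    # bad lot (10% defective) is usually rejected
print(good, bad)
```

Tabulating this probability over a range of defect rates gives the plan's full operating characteristic curve, which is how such acceptance criteria are usually evaluated and reported.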
Applications of Statistical Tests in Hand Surgery
Song, Jae W.; Haas, Ann; Chung, Kevin C.
2015-01-01
During the nineteenth century, with the emergence of public health as a goal to improve hygiene and the conditions of the poor, statistics established itself as a distinct scientific field important for critically interpreting studies of public health concerns. During the twentieth century, statistics began to evolve mathematically and methodologically with hypothesis testing and experimental design. Today, the design of medical experiments centers on clinical trials and observational studies, and with the use of statistics, the collected data are summarized, weighed, and presented to direct both physicians and the public towards evidence-based medicine. Having a basic understanding of statistics is mandatory for evaluating the validity of published literature and applying it to patient care. In this review, we aim to take a practical approach in discussing basic statistical tests by providing a guide to choosing the correct statistical test, along with examples relevant to hand surgery research. PMID:19969193
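As a worked example of one common test choice, comparing a continuous outcome between two independent groups might use Welch's two-sample t test, which does not assume equal variances. A self-contained sketch (the grip-strength numbers are hypothetical, not from any study):

```python
import math

def welch_t(a, b):
    """Welch's two-sample t statistic and Welch-Satterthwaite degrees of freedom."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)   # sample variances
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    se2 = va / na + vb / nb
    t = (ma - mb) / math.sqrt(se2)
    df = se2 ** 2 / ((va / na) ** 2 / (na - 1) + (vb / nb) ** 2 / (nb - 1))
    return t, df

grip_a = [22, 25, 27, 24, 26]   # hypothetical post-treatment grip scores, group A
grip_b = [18, 20, 19, 22, 21]   # hypothetical control group B
t, df = welch_t(grip_a, grip_b)
```

The t statistic and degrees of freedom would then be referred to the t distribution for a p value; for paired measurements or more than two groups, a different test would be the correct choice.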
Foolad, Mahsa; Ong, Say Leong; Hu, Jiangyong
2015-11-01
Pharmaceutical and personal care products (PPCPs) and artificial sweeteners (ASs) are emerging organic contaminants (EOCs) in the aquatic environment. The presence of PPCPs and ASs in water bodies poses a potential ecological risk and health concern. It is therefore necessary to trace pollution sources by understanding the transport behavior of sewage molecular markers in the subsurface. The aim of this study was to evaluate the transport of nine selected molecular markers through saturated soil column experiments. The selected sewage molecular markers were six PPCPs, acetaminophen (ACT), carbamazepine (CBZ), caffeine (CF), crotamiton (CTMT), diethyltoluamide (DEET) and salicylic acid (SA), and three ASs, acesulfame (ACF), cyclamate (CYC) and saccharin (SAC). Results confirmed that ACF, CBZ, CTMT, CYC and SAC were suitable for use as sewage molecular markers, since they were almost stable against sorption and biodegradation during the soil column experiments. In contrast, transport of ACT, CF and DEET was limited by both sorption and biodegradation, and 100% removal efficiency was achieved in the biotic column. Moreover, this study also examined the effect of different concentrations of acetate (0-100 mg/L), an easily biodegradable primary substrate, on the removal of PPCPs and ASs. Results showed a negative correlation (r(2)>0.75) between the removal of some of the selected sewage chemical markers (ACF, CF, ACT, CYC, SAC) and acetate concentration. CTMT removal also decreased with the addition of acetate, but increasing the acetate concentration did not further affect its removal. CBZ and DEET removal did not depend on the presence of acetate. PMID:26210019
Kokal, Idil; Engel, Annerose; Kirschner, Sebastian; Keysers, Christian
2011-01-01
Why does chanting, drumming or dancing together make people feel united? Here we investigate the neural mechanisms underlying interpersonal synchrony and its subsequent effects on prosocial behavior among synchronized individuals. We hypothesized that areas of the brain associated with the processing of reward would be active when individuals experience synchrony during drumming, and that these reward signals would increase prosocial behavior toward this synchronous drum partner. Eighteen female non-musicians were scanned with functional magnetic resonance imaging while they drummed a rhythm, in alternating blocks, with two different experimenters: one drumming in-synchrony and the other out-of-synchrony relative to the participant. In the last scanning part, which served as the experimental manipulation for the following prosocial behavioral test, one of the experimenters drummed with one half of the participants in-synchrony and with the other out-of-synchrony. After scanning, this experimenter "accidentally" dropped eight pencils, and the number of pencils collected by the participants was used as a measure of prosocial commitment. Results revealed that participants who mastered the novel rhythm easily before scanning showed increased activity in the caudate during synchronous drumming. The same area also responded to monetary reward in a localizer task with the same participants. The activity in the caudate during synchronous drumming also predicted the number of pencils the participants later collected to help the synchronous experimenter of the manipulation run. In addition, participants collected more pencils to help the experimenter when she had drummed in-synchrony than out-of-synchrony during the manipulation run. By showing an overlap in activated areas during synchronized drumming and monetary reward, our findings suggest that interpersonal synchrony is related to the brain's reward system. PMID:22110623
Wild, Birgit; Schnecker, Jörg; Alves, Ricardo J Eloy; Barsukov, Pavel; Bárta, Jiří; Capek, Petr; Gentsch, Norman; Gittel, Antje; Guggenberger, Georg; Lashchinskiy, Nikolay; Mikutta, Robert; Rusalimova, Olga; Santrůčková, Hana; Shibistova, Olga; Urich, Tim; Watzka, Margarete; Zrazhevskaya, Galina; Richter, Andreas
2014-08-01
Rising temperatures in the Arctic can affect soil organic matter (SOM) decomposition directly and indirectly, by increasing plant primary production and thus the allocation of plant-derived organic compounds into the soil. Such compounds, for example root exudates or decaying fine roots, are easily available for microorganisms, and can alter the decomposition of older SOM ("priming effect"). We here report on a SOM priming experiment in the active layer of a permafrost soil from the central Siberian Arctic, comparing responses of organic topsoil, mineral subsoil, and cryoturbated subsoil material (i.e., poorly decomposed topsoil material subducted into the subsoil by freeze-thaw processes) to additions of (13)C-labeled glucose, cellulose, a mixture of amino acids, and protein (added at levels corresponding to approximately 1% of soil organic carbon). SOM decomposition in the topsoil was barely affected by higher availability of organic compounds, whereas SOM decomposition in both subsoil horizons responded strongly. In the mineral subsoil, SOM decomposition increased by a factor of two to three after any substrate addition (glucose, cellulose, amino acids, protein), suggesting that the microbial decomposer community was limited in energy to break down more complex components of SOM. In the cryoturbated horizon, SOM decomposition increased by a factor of two after addition of amino acids or protein, but was not significantly affected by glucose or cellulose, indicating nitrogen rather than energy limitation. Since the stimulation of SOM decomposition in cryoturbated material was not connected to microbial growth or to a change in microbial community composition, the additional nitrogen was likely invested in the production of extracellular enzymes required for SOM decomposition. Our findings provide a first mechanistic understanding of priming in permafrost soils and suggest that an increase in the availability of organic carbon or nitrogen, e.g., by increased plant
Audiometry screening and interpretation.
Walker, Jennifer Junnila; Cleveland, Leanne M; Davis, Jenny L; Seales, Jennifer S
2013-01-01
The prevalence of hearing loss varies with age, affecting at least 25 percent of patients older than 50 years and more than 50 percent of those older than 80 years. Adolescents and young adults represent groups in which the prevalence of hearing loss is increasing and may therefore benefit from screening. If offered, screening can be performed periodically by asking the patient or family if there are perceived hearing problems, or by using clinical office tests such as whispered voice, finger rub, or audiometry. Audiometry in the family medicine clinic setting is a relatively simple procedure that can be interpreted by a trained health care professional. Pure-tone testing presents tones across the speech spectrum (500 to 4,000 Hz) to determine if the patient's hearing levels fall within normal limits. A quiet testing environment, calibrated audiometric equipment, and appropriately trained personnel are required for in-office testing. Pure-tone audiometry may help physicians appropriately refer patients to an audiologist or otolaryngologist. Unilateral or asymmetrical hearing loss can be symptomatic of a central nervous system lesion and requires additional evaluation. PMID:23317024
Statistical Reference Datasets
National Institute of Standards and Technology Data Gateway
Statistical Reference Datasets (Web, free access) The Statistical Reference Datasets is also supported by the Standard Reference Data Program. The purpose of this project is to improve the accuracy of statistical software by providing reference datasets with certified computational results that enable the objective evaluation of statistical software.
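The pass/fail logic of evaluating statistical software against certified computational results can be sketched as follows. The dataset here is a synthetic exact line, not an actual StRD dataset, and the "certified" values are simply the known true coefficients used for illustration:

```python
def ols_slope_intercept(xs, ys):
    """Ordinary least-squares fit y = a + b*x, the kind of computation
    a reference dataset certifies to many significant digits."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

# Synthetic data lying exactly on y = 2 + 3x; the fitted coefficients should
# match the known values to tight tolerance, the same comparison one would
# make against a certified StRD result.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [5.0, 8.0, 11.0, 14.0]
a, b = ols_slope_intercept(xs, ys)
certified_a, certified_b = 2.0, 3.0
ok = abs(a - certified_a) < 1e-12 and abs(b - certified_b) < 1e-12
```

Real StRD datasets are deliberately ill-conditioned, so the number of agreeing digits against the certified values becomes an objective accuracy score for a statistical package.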
Explorations in statistics: statistical facets of reproducibility.
Curran-Everett, Douglas
2016-06-01
Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This eleventh installment of Explorations in Statistics explores statistical facets of reproducibility. If we obtain an experimental result that is scientifically meaningful and statistically unusual, we would like to know that our result reflects a general biological phenomenon that another researcher could reproduce if (s)he repeated our experiment. But more often than not, we may learn that this researcher cannot replicate our result. The National Institutes of Health and the Federation of American Societies for Experimental Biology have created training modules and outlined strategies to help improve the reproducibility of research. These particular approaches are necessary, but they are not sufficient. The principles of hypothesis testing and estimation are inherent to the notion of reproducibility in science. If we want to improve the reproducibility of our research, then we need to rethink how we apply fundamental concepts of statistics to our science.
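One statistical facet of reproducibility is that even an exact replication of a real effect succeeds only with probability equal to the study's power. A small simulation sketch (the effect size, sample size, and normal-approximation test are all illustrative assumptions):

```python
import math
import random
import statistics

random.seed(0)

def two_sample_p_approx(a, b):
    """Two-sided p value from a normal approximation to the two-sample test;
    adequate for a sketch, not a substitute for an exact t test."""
    z = abs(statistics.mean(a) - statistics.mean(b)) / math.sqrt(
        statistics.variance(a) / len(a) + statistics.variance(b) / len(b))
    return 2 * (1 - 0.5 * (1 + math.erf(z / math.sqrt(2))))

def replication_rate(effect, n, trials=2000):
    """Fraction of exact replications (same design, same n per group) that
    reach p < 0.05 when the true standardized effect is `effect`."""
    hits = 0
    for _ in range(trials):
        a = [random.gauss(effect, 1) for _ in range(n)]
        b = [random.gauss(0, 1) for _ in range(n)]
        if two_sample_p_approx(a, b) < 0.05:
            hits += 1
    return hits / trials

rate = replication_rate(effect=0.5, n=30)
```

With a medium effect and 30 subjects per group, roughly half of honest replications fail to reach significance, which is why reproducibility discussions must involve power and estimation, not just hypothesis tests.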
Marano, Grazia; Gronewold, Claas; Frank, Martin; Merling, Anette; Kliem, Christian; Sauer, Sandra; Wiessler, Manfred; Frei, Eva
2012-01-01
Oligosaccharides aberrantly expressed on tumor cells influence processes such as cell adhesion and modulation of the cell’s microenvironment resulting in an increased malignancy. Schmidt’s imidate strategy offers an effective method to synthesize libraries of various oligosaccharide mimetics. With the aim to perturb interactions of tumor cells with extracellular matrix proteins and host cells, molecules with 3,4-bis(hydroxymethyl)furan as core structure were synthesized and screened in biological assays for their abilities to interfere in cell adhesion and other steps of the metastatic cascade, such as tumor-induced angiogenesis. The most active compound, (4-{[(β-D-galactopyranosyl)oxy]methyl}furan-3-yl)methyl hydrogen sulfate (GSF), inhibited the activation of matrix-metalloproteinase-2 (MMP-2) as well as migration of the human melanoma cells of the lines WM-115 and WM-266-4 in a two-dimensional migration assay. GSF inhibited completely the adhesion of WM-115 cells to the extracellular matrix (ECM) proteins, fibrinogen and fibronectin. In an in vitro angiogenesis assay with human endothelial cells, GSF very effectively inhibited endothelial tubule formation and sprouting of blood vessels, as well as the adhesion of endothelial cells to ECM proteins. GSF was not cytotoxic at biologically active concentrations; neither were 3,4-bis{[(β-D-galactopyranosyl)oxy]methyl}furan (BGF) nor methyl β-D-galactopyranoside nor 3,4-bis(hydroxymethyl)furan, which were used as controls, eliciting comparable biological activity. In silico modeling experiments, in which binding of GSF to the extracellular domain of the integrin αvβ3 was determined, revealed specific docking of GSF to the same binding site as the natural peptidic ligands of this integrin. The sulfate in the molecule coordinated with one manganese ion in the binding site. These studies show that this chemically easily accessible molecule GSF, synthesized in three steps from 3,4-bis
Somjee, Saika; Yu, Lolie C; Hagar, Arthur F; Hempe, James M
2004-02-01
Hb Iowa is a rare hemoglobin (Hb) variant with a Gly --> Ala substitution at position 119 of beta-globin. It was previously reported only in an African American infant who was also heterozygous for Hb S [beta6(A3)Glu --> Val] and her mother (Hb A/Iowa). Here we describe the second report of Hb Iowa, the first in conjunction with Hb C [beta6(A3)Glu --> Lys]. The patient was an African American girl, originally diagnosed as homozygous Hb C during neonatal screening. When seen in our clinic, hematological data for both the child and her mother (Hb C trait) indicated mild anemia with slightly low mean corpuscular volume (MCV) but normal red blood cell (RBC) count. Analysis of blood from the child by capillary isoelectric focusing (cIEF) identified Hb C and an unknown Hb variant with an isoelectric point (pI) intermediate to that of Hbs F and A. The unknown variant was identified as Hb Iowa by DNA sequence analysis of the beta-globin gene (GGC --> GCC). Both reported cases of Hb Iowa indicated that there are no abnormal hematological manifestations associated with this rare Hb variant. In both cases, however, Hb Iowa was mistaken for Hb F during routine neonatal screening by high performance liquid chromatography (HPLC) and/or gel IEF. Neonatal misidentification of Hb Iowa led to misdiagnosis of sickle cell disease and Hb C disease, respectively. In our patient, Hb Iowa was also misidentified as Hb A at 2 years of age by a commercial reference laboratory using cellulose acetate and citrate agar gel electrophoresis. This led to an incorrect diagnosis of Hb C trait. These results show that commonly used analytical methods can easily misidentify Hb Iowa as Hbs F or A in neonates, or older individuals, resulting in incorrect identification of the Hb phenotype. We conclude that the presence of Hb Iowa, or other variants with similar pIs, should be considered in cases where the results of follow-up testing conflict with neonatal diagnosis of sickle cell or Hb C disease, or
Interpretation and display of research results
Kulkarni, Dilip Kumar
2016-01-01
It is important to properly collect, code, clean, and edit the data before interpreting and displaying the research results. Computers play a major role in the different phases of research, from the conceptual, design, and planning stages through data collection, data analysis, and publication. The main objective of data display is to summarize the characteristics of a data set and to make the data more comprehensible and meaningful. Data are usually presented in tables and graphs, depending on the type of data. This not only helps in understanding the behaviour of the data, but is also useful in choosing which statistical tests to apply. PMID:27729693
Kutchinsky, B
1991-01-01
from extreme scarcity to relative abundance. If (violent) pornography causes rape, this exceptional development in the availability of (violent) pornography should definitely somehow influence the rape statistics. Since, however, the rape figures could not simply be expected to remain steady during the period in question (when it is well known that most other crimes increased considerably), the development of rape rates was compared with that of non-sexual violent offences and nonviolent sexual offences (in so far as available statistics permitted). The results showed that in none of the countries did rape increase more than nonsexual violent crimes. This finding in itself would seem sufficient to discard the hypothesis that pornography causes rape.(ABSTRACT TRUNCATED AT 400 WORDS) PMID:2032762
Components of Simultaneous Interpreting: Comparing Interpreting with Shadowing and Paraphrasing
ERIC Educational Resources Information Center
Christoffels, Ingrid K.; de Groot, Annette M. B.
2004-01-01
Simultaneous interpreting is a complex task where the interpreter is routinely involved in comprehending, translating and producing language at the same time. This study assessed two components that are likely to be major sources of complexity in SI: The simultaneity of comprehension and production, and transformation of the input. Furthermore,…
Geochemical Interpretation of Collision Volcanism
NASA Astrophysics Data System (ADS)
Pearce, Julian
2014-05-01
Collision volcanism can be defined as volcanism that takes place during an orogeny from the moment that continental subduction starts to the end of orogenic collapse. Its importance in the Geological Record is greatly underestimated as collision volcanics are easily misinterpreted as being of volcanic arc, extensional or mantle plume origin. There are many types of collision volcanic province: continent-island arc collision (e.g. Banda arc); continent-active margin collision (e.g. Tibet, Turkey-Iran); continent-rear-arc collision (e.g. Bolivia); continent-continent collision (e.g. Tuscany); and island arc-island arc collision (e.g. Taiwan). Superimposed on this variability is the fact that every orogeny is different in detail. Nonetheless, there is a general theme of cyclicity on different time scales. This starts with syn-collision volcanism resulting from the subduction of an ocean-continent transition and continental lithosphere, and continues through post-collision volcanism. The latter can be subdivided into orogenic volcanism, which is related to thickened crust, and post-orogenic, which is related to orogenic collapse. Typically, but not always, collision volcanism is preceded by normal arc volcanism and followed by normal intraplate volcanism. Identification and interpretation of collision volcanism in the Geologic Record is greatly facilitated if a dated stratigraphic sequence is present so that the petrogenic evolution can be traced. In any case, the basis of fingerprinting collision terranes is to use geochemical proxies for mantle and subduction fluxes, slab temperatures, and depths and degrees of melting. For example, syn-collision volcanism is characterized by a high subduction flux relative to mantle flux because of the high input flux of fusible sediment and crust coupled with limited mantle flow, and because of high slab temperatures resulting from the decrease in subduction rate. The resulting geochemical patterns are similar regardless of
DNA statistics, overlapping word paradox and Conway equation
Pevzner, P.A.
1993-12-31
The overlapping word paradox, known in combinatorics for 20 years, is to this day disregarded in many papers on DNA statistics. The author considers the Conway equation for the "best bet for simpletons" game as an example of the overlapping word paradox. He gives a new short proof of the Conway equation and discusses the implications of the overlapping word paradox for DNA statistics. In particular, he demonstrates that ignoring the overlapping word paradox in DNA statistics can easily lead to 500% errors in estimates of statistical significance. He also presents formulas allowing one to find "anomalous" words in DNA texts.
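The paradox the abstract refers to is that a word's self-overlaps change how long you wait, on average, before it first appears in a random sequence. A minimal sketch (not the paper's own formulas) of the standard correlation-polynomial result: for i.i.d. uniform letters over an alphabet of size A, the expected waiting time for a word is the sum of A^k over every overlap length k at which the word's length-k prefix equals its length-k suffix.

```python
def expected_wait(word, alphabet_size=4):
    """Mean waiting time for `word` in an i.i.d. uniform letter stream.

    Sums alphabet_size**k over every overlap length k at which the
    length-k prefix of the word equals its length-k suffix (k equal to
    len(word) always counts). Ignoring the extra overlap terms is
    exactly the mistake the overlapping word paradox warns about."""
    n = len(word)
    total = 0
    for k in range(1, n + 1):
        if word[:k] == word[n - k:]:
            total += alphabet_size ** k
    return total

# Self-overlapping words take longer to appear on average:
print(expected_wait("AA"))  # → 20 (4 + 16, because "A" overlaps itself)
print(expected_wait("AT"))  # → 16 (no proper self-overlap)
```

Both words have the same probability 1/16 at any fixed position, yet "AA" waits 25% longer on average; significance estimates that ignore this structure are biased accordingly.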
Developments in Statistical Education.
ERIC Educational Resources Information Center
Kapadia, Ramesh
1980-01-01
The current status of statistics education at the secondary level is reviewed, with particular attention focused on the various instructional programs in England. A description and preliminary evaluation of the Schools Council Project on Statistical Education is included. (MP)
Mathematical and statistical analysis
NASA Technical Reports Server (NTRS)
Houston, A. Glen
1988-01-01
The goal of the mathematical and statistical analysis component of RICIS is to research, develop, and evaluate mathematical and statistical techniques for aerospace technology applications. Specific research areas of interest include modeling, simulation, experiment design, reliability assessment, and numerical analysis.
Basic Interpreting Strategies for Parents.
ERIC Educational Resources Information Center
Luetke-Stahlman, Barbara
1993-01-01
Some deaf interpreting strategies are offered to parents of children who are deaf or hard of hearing. Parents are urged to utilize space in their interpreting, use name signs, utilize sight lines to distinguish characters in stories, use exaggerated signs to translate nursery rhymes, place themselves carefully at a public performance, and learn…
Remote sensing and image interpretation
NASA Technical Reports Server (NTRS)
Lillesand, T. M.; Kiefer, R. W. (Principal Investigator)
1979-01-01
A textbook prepared primarily for use in introductory courses in remote sensing is presented. Topics covered include concepts and foundations of remote sensing; elements of photographic systems; introduction to airphoto interpretation; airphoto interpretation for terrain evaluation; photogrammetry; radiometric characteristics of aerial photographs; aerial thermography; multispectral scanning and spectral pattern recognition; microwave sensing; and remote sensing from space.
Educators' Interpretations of Ambiguous Accommodations
ERIC Educational Resources Information Center
Byrnes, MaryAnn
2008-01-01
This exploratory case study examined how general and special education teachers in one school district interpreted three frequently used accommodations. Although a majority of both groups agreed on interpretations of extended time, there was little agreement, considerable variation, and some contradiction in their understanding of the changes…
Interpreting Recoil for Undergraduate Students
ERIC Educational Resources Information Center
Elsayed, Tarek A.
2012-01-01
The phenomenon of recoil is usually explained to students in the context of Newton's third law. Typically, when a projectile is fired, the recoil of the launch mechanism is interpreted as a reaction to the ejection of the smaller projectile. The same phenomenon is also interpreted in the context of the conservation of linear momentum, which is…
Social Work through An Interpreter.
ERIC Educational Resources Information Center
Baker, Nicholas G.
1981-01-01
Reviews problems related to working with non-English-speaking clients. Suggests characteristics and qualifications of a good interpreter. Recommends social workers establish a good working relationship with interpreters to effectively help clients and avoid confusion and misunderstanding resulting from cultural differences. Makes recommendations…
Statistical analysis of planetary surfaces
NASA Astrophysics Data System (ADS)
Schmidt, Frederic; Landais, Francois; Lovejoy, Shaun
2015-04-01
In the last decades, a huge amount of topographic data has been obtained by several techniques (laser and radar altimetry, DTM…) for different bodies in the solar system, including Earth, Mars, and the Moon. In each case, topographic fields exhibit extremely high variability, with details at every scale from millimeters to thousands of kilometers. This complexity seems to prohibit global descriptions or global topography models. Nevertheless, topographic complexity is well known to exhibit scaling laws that establish a similarity between scales and permit simpler descriptions and models. Indeed, efficient simulations can be made using the statistical properties of scaling fields (fractals). But realistic simulations of global topographic fields must exhibit multi- (not mono-) scaling behaviour, reflecting the extreme variability and intermittency observed in real fields, which cannot be generated by simple scaling models. A multiscaling theory has been developed in order to model high variability and intermittency. This theory is a good statistical candidate for modeling the topography field with a limited number of parameters (the multifractal parameters). In our study, we show that the statistical properties of the Martian topography are accurately reproduced by this model, leading to new interpretations of geomorphological processes.
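The mono- vs multi-scaling distinction above is usually diagnosed with structure functions: S_q(l) = ⟨|h(x+l) − h(x)|^q⟩ scales as l^ζ(q), and ζ(q) is linear in q for a monofractal but curved for a multifractal. A sketch of that check (not the authors' actual pipeline) on a synthetic monofractal profile:

```python
import numpy as np

rng = np.random.default_rng(0)
# synthetic monofractal "topography": Brownian profile, H = 0.5
profile = np.cumsum(rng.standard_normal(1 << 15))

def zeta(profile, q, lags=(1, 2, 4, 8, 16, 32, 64)):
    """Scaling exponent of the q-th order structure function
    S_q(l) = <|h(x+l) - h(x)|^q> ~ l**zeta(q), from a log-log fit."""
    log_s = [np.log(np.mean(np.abs(profile[l:] - profile[:-l]) ** q))
             for l in lags]
    slope, _ = np.polyfit(np.log(lags), log_s, 1)
    return slope

# For a monofractal, zeta(q) ~ q*H, i.e. linear in q:
print(zeta(profile, 1), zeta(profile, 2))  # ~0.5 and ~1.0
```

On real planetary topography, departure of ζ(q) from a straight line is what motivates the multifractal parameterization the abstract describes.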
Environmental statistics and optimal regulation
NASA Astrophysics Data System (ADS)
Sivak, David; Thomson, Matt
2015-03-01
The precision with which an organism can detect its environment, and the timescale for and statistics of environmental change, will affect the suitability of different strategies for regulating protein levels in response to environmental inputs. We propose a general framework--here applied to the enzymatic regulation of metabolism in response to changing nutrient concentrations--to predict the optimal regulatory strategy given the statistics of fluctuations in the environment and measurement apparatus, and the costs associated with enzyme production. We find: (i) relative convexity of enzyme expression cost and benefit influences the fitness of thresholding or graded responses; (ii) intermediate levels of measurement uncertainty call for a sophisticated Bayesian decision rule; and (iii) in dynamic contexts, intermediate levels of uncertainty call for retaining memory of the past. Statistical properties of the environment, such as variability and correlation times, set optimal biochemical parameters, such as thresholds and decay rates in signaling pathways. Our framework provides a theoretical basis for interpreting molecular signal processing algorithms and a classification scheme that organizes known regulatory strategies and may help conceptualize heretofore unknown ones.
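The "sophisticated Bayesian decision rule" at intermediate measurement uncertainty can be illustrated with a two-state toy version of the problem (all numbers here are hypothetical, not from the paper): the cell observes a noisy Gaussian readout of nutrient level and expresses the enzyme only when the posterior expected benefit exceeds the fixed production cost.

```python
import math

def posterior_high(measurement, mu_low=0.0, mu_high=2.0, sigma=1.0,
                   prior_high=0.5):
    """Posterior probability that the nutrient state is 'high', given one
    Gaussian-noise measurement (all parameters are illustrative)."""
    def lik(mu):
        return math.exp(-0.5 * ((measurement - mu) / sigma) ** 2)
    num = lik(mu_high) * prior_high
    return num / (num + lik(mu_low) * (1 - prior_high))

def express_enzyme(measurement, benefit=5.0, cost=2.0):
    """Express iff the benefit (earned only in the high state), weighted
    by the posterior, exceeds the fixed cost of enzyme production."""
    return posterior_high(measurement) * benefit > cost

print(express_enzyme(2.5), express_enzyme(-1.0))  # → True False
```

Raising sigma (more measurement uncertainty) flattens the posterior and pushes the rule from a sharp threshold toward prior-dominated behavior, which is the qualitative trade-off the abstract describes.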
Remote interpretation of chest roentgenograms.
Andrus, W S; Hunter, C H; Bird, K T
1975-04-01
A series of 98 chest films was interpreted by two physicians on the basis of monitor display of the transmitted television signal representing the roentgenographic image. The transmission path was 14 miles long and included one active repeater station. Receiver operating characteristic curves were drawn to compare interpretations rendered on television view of the image with classic, direct-view interpretations of the same films. Performance in these two viewing modes was found to be quite similar. When films containing only hazy densities, lacking internal structure or sharp margins, were removed from the sample, interpretation of the remaining films was essentially identical via the two modes. Since hazy densities are visible on retrospective examination, interpretation of roentgenograms at a distance via television appears to be a feasible route for delivery of radiologic services.
Dertinger, Stephen D.; Avlasevich, Svetlana L.; Bemis, Jeffrey C.; Chen, Yuhchyau; MacGregor, James T.
2015-01-01
This laboratory has previously described a method for scoring the incidence of rodent blood Pig-a mutant phenotype erythrocytes using immunomagnetic separation in conjunction with flow cytometric analysis (In Vivo MutaFlow®). The current work extends this approach to human blood. The frequencies of CD59- and CD55-negative reticulocytes (RETCD59−/CD55−) and erythrocytes (RBCCD59−/CD55−) serve as phenotypic reporters of PIG-A gene mutation. Immunomagnetic separation was found to provide an effective means of increasing the number of reticulocytes and erythrocytes evaluated. Technical replicates were utilized to provide a sufficient number of cells for precise scoring while at the same time controlling for procedural accuracy by allowing comparison of replicate values. Cold whole blood samples could be held for at least one week without affecting reticulocyte, RETCD59−/CD55−, or RBCCD59−/CD55− frequencies. Specimens from a total of 52 nonsmoking, self-reported healthy adult subjects were evaluated. The mean frequencies of RETCD59−/CD55− and RBCCD59−/CD55− were 6.0 × 10−6 and 2.9 × 10−6, respectively. The difference is consistent with a modest selective pressure against mutant phenotype erythrocytes in the circulation, and suggests advantages of studying both populations of erythrocytes. Whereas intra-subject variability was low, inter-subject variability was relatively high, with RETCD59−/CD55− frequencies differing by more than 30-fold. There was an apparent correlation between age and mutant cell frequencies. Taken together, the results indicate that the frequency of human PIG-A mutant phenotype cells can be efficiently and reliably estimated using a labeling and analysis protocol that is well established for rodent-based studies. The applicability of the assay across species, its simplicity and statistical power, and the relatively non-invasive nature of the assay should benefit myriad research areas involving DNA damage
ERIC Educational Resources Information Center
Bopp, Richard E.; Van Der Laan, Sharon J.
1985-01-01
Presents a search strategy for locating time-series or cross-sectional statistical data in published sources which was designed for undergraduate students who require 30 units of data for five separate variables in a statistical model. Instructional context and the broader applicability of the search strategy for general statistical research is…
ERIC Educational Resources Information Center
Strasser, Nora
2007-01-01
Avoiding statistical mistakes is important for educators at all levels. Basic concepts will help you to avoid making mistakes using statistics and to look at data with a critical eye. Statistical data is used at educational institutions for many purposes. It can be used to support budget requests, changes in educational philosophy, changes to…
ERIC Educational Resources Information Center
Lenard, Christopher; McCarthy, Sally; Mills, Terence
2014-01-01
There are many different aspects of statistics. Statistics involves mathematics, computing, and applications to almost every field of endeavour. Each aspect provides an opportunity to spark someone's interest in the subject. In this paper we discuss some ethical aspects of statistics, and describe how an introduction to ethics has been…
Statistical quality management
NASA Astrophysics Data System (ADS)
Vanderlaan, Paul
1992-10-01
Some aspects of statistical quality management are discussed. Quality has to be defined as a concrete, measurable quantity. The concepts of Total Quality Management (TQM), Statistical Process Control (SPC), and inspection are explained. In most cases SPC is better than inspection. It can be concluded that statistics has great possibilities in the field of TQM.
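The core SPC mechanism the abstract contrasts with inspection can be sketched in a few lines: estimate 3-sigma Shewhart control limits from in-control baseline data, then flag any later sample that falls outside them. The baseline values below are made-up illustrative numbers.

```python
def control_limits(baseline):
    """3-sigma Shewhart limits from in-control baseline measurements."""
    n = len(baseline)
    mean = sum(baseline) / n
    sd = (sum((x - mean) ** 2 for x in baseline) / (n - 1)) ** 0.5
    return mean - 3 * sd, mean + 3 * sd

def out_of_control(samples, lo, hi):
    """Indices of samples violating the control limits."""
    return [i for i, x in enumerate(samples) if not (lo <= x <= hi)]

baseline = [24.9, 25.1, 25.0, 24.8, 25.2, 25.0, 24.9, 25.1]  # illustrative process values
lo, hi = control_limits(baseline)
print(out_of_control([25.0, 24.9, 27.5, 25.1], lo, hi))  # → [2]
```

Unlike end-of-line inspection, the chart reacts while the process runs, which is why SPC usually catches drift earlier.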
Principles of Equilibrium Statistical Mechanics
NASA Astrophysics Data System (ADS)
Chowdhury, Debashish; Stauffer, Dietrich
2000-09-01
This modern textbook provides a complete survey of the broad field of statistical mechanics. Based on a series of lectures, it adopts a special pedagogical approach. The authors, both excellent lecturers, clearly distinguish between general principles and their applications in solving problems. Analogies between phase transitions in fluids and magnets using continuum and spin models are emphasized, leading to a better understanding. Such special features as historical notes, summaries, problems, mathematical appendix, computer programs and order of magnitude estimations distinguish this volume from competing works. Due to its ambitious level and an extensive list of references for technical details on advanced topics, this is equally a must for researchers in condensed matter physics, materials science, polymer science, solid state physics, and astrophysics. From the contents: Thermostatics: phase stability, phase equilibria, phase transitions; Statistical Mechanics: calculation, correlation functions, ideal classical gases, ideal quantum gases; Interacting Systems: models, computer simulation, mean-field approximation; Interacting Systems beyond Mean-field Theory: scaling and renormalization group, foundations of statistical mechanics. "The present book, however, is unique in that it is both written in a very pedagogic, easily comprehensible style, and, nevertheless, goes from the basic principles all the way to these modern topics, containing several chapters on the various approaches of mean field theory, and a chapter on computer simulation. A characteristic feature of this book is that often first some qualitative arguments are given, or a "pedestrians's approach", and then a more general and/or more rigorous derivation is presented as well. Particularly useful are also "supplementary notes", pointing out interesting applications and further developments of the subject, a detailed bibliography, problems and historical notes, and many pedagogic figures."
Defining and Interpreting Suppressor Effects: Advantages and Limitations.
ERIC Educational Resources Information Center
Lancaster, Brian P.
Suppressor effects are considered one of the most elusive dynamics in the interpretation of statistical data. A suppressor variable has been defined as a predictor that has a zero correlation with the dependent variable while still, paradoxically, contributing to the predictive validity of the test battery (P. Horst, 1941). This paper explores the…
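Horst's paradox, as described above, is easy to reproduce in simulation (this is an illustrative construction, not from the paper): make one predictor that equals the valid predictor's error component. It correlates essentially zero with the criterion, yet adding it to the battery sharply raises R² because it "suppresses" the irrelevant variance in the other predictor.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10_000
signal = rng.standard_normal(n)
noise = rng.standard_normal(n)
x1 = signal + noise   # valid but noisy predictor
x2 = noise            # suppressor: ~zero correlation with the criterion
y = signal

def r_squared(predictors, y):
    """R^2 of an OLS fit with intercept."""
    X = np.column_stack([np.ones(len(y)), *predictors])
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    resid = y - X @ beta
    return 1 - resid.var() / y.var()

print(np.corrcoef(x2, y)[0, 1])        # essentially zero
print(r_squared([x1], y))              # ~0.5 with x1 alone
print(r_squared([x1, x2], y))          # ~1.0 once x2 removes x1's noise
```

The suppressor earns its keep not by predicting y but by accounting for the criterion-irrelevant variance in x1, which is exactly why such variables look useless in a correlation matrix.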
Developing and Assessing Students' Abilities To Interpret Research.
ERIC Educational Resources Information Center
Forsyth, G. Alfred; And Others
A recent conference on statistics education recommended that more emphasis be placed on the interpretation of research (IOR). Ways for developing and assessing IOR and providing a systematic framework for creating and selecting instructional materials for the independent assessment of specific IOR concepts are the focus of this paper. The…
Philosophical perspectives on quantum chaos: Models and interpretations
NASA Astrophysics Data System (ADS)
Bokulich, Alisa Nicole
2001-09-01
The problem of quantum chaos is a special case of the larger problem of understanding how the classical world emerges from quantum mechanics. While we have learned that chaos is pervasive in classical systems, it appears to be almost entirely absent in quantum systems. The aim of this dissertation is to determine what implications the interpretation of quantum mechanics has for attempts to explain the emergence of classical chaos. There are three interpretations of quantum mechanics that have set out programs for solving the problem of quantum chaos: the standard interpretation, the statistical interpretation, and the deBroglie-Bohm causal interpretation. One of the main conclusions of this dissertation is that an interpretation alone is insufficient for solving the problem of quantum chaos and that the phenomenon of decoherence must be taken into account. Although a completely satisfactory solution of the problem of quantum chaos is still outstanding, I argue that the deBroglie-Bohm interpretation with the help of decoherence outlines the most promising research program to pursue. In addition to making a contribution to the debate in the philosophy of physics concerning the interpretation of quantum mechanics, this dissertation reveals two important methodological lessons for the philosophy of science. First, issues of reductionism and intertheoretic relations cannot be divorced from questions concerning the interpretation of the theories involved. Not only is the exploration of intertheoretic relations a central part of the articulation and interpretation of an individual theory, but the very terms used to discuss intertheoretic relations, such as `state' and `classical limit', are themselves defined by particular interpretations of the theory. The second lesson that emerges is that, when it comes to characterizing the relationship between classical chaos and quantum mechanics, the traditional approaches to intertheoretic relations, namely reductionism and
What Is the next Trend in Usage Statistics in Libraries?
ERIC Educational Resources Information Center
King, Douglas
2009-01-01
In answering the question "What is the next trend in usage statistics in libraries?" an eclectic group of respondents has presented an assortment of possibilities, suggestions, complaints and, of course, questions of their own. Undoubtedly, usage statistics collection, interpretation, and application are areas of growth and increasing complexity…
ALISE Library and Information Science Education Statistical Report, 1999.
ERIC Educational Resources Information Center
Daniel, Evelyn H., Ed.; Saye, Jerry D., Ed.
This volume is the twentieth annual statistical report on library and information science (LIS) education published by the Association for Library and Information Science Education (ALISE). Its purpose is to compile, analyze, interpret, and report statistical (and other descriptive) information about library/information science programs offered by…
The Effect Size Statistic: Overview of Various Choices.
ERIC Educational Resources Information Center
Mahadevan, Lakshmi
Over the years, methodologists have been recommending that researchers use magnitude of effect estimates in result interpretation to highlight the distinction between statistical and practical significance (cf. R. Kirk, 1996). A magnitude of effect statistic (i.e., effect size) tells to what degree the dependent variable can be controlled,…
Evaluation of the TV Series "Statistics" (SABC-ERTV1).
ERIC Educational Resources Information Center
Stupart, J. D. C.; Duby, Aliza
A summative evaluation of the effectiveness of the educational television series, "Statistics," that aired on South African television is presented. The two episodes chosen from the six-episode series covered pie charts, pictograms, and pictographs (episode 1); and point-of-view interpretations of statistics (episode 4). The evaluation was…
Securing wide appreciation of health statistics
Pyrrait, A. M. DO Amaral; Aubenque, M. J.; Benjamin, B.; DE Groot, Meindert J. W.; Kohn, R.
1954-01-01
All the authors are agreed on the need for a certain publicizing of health statistics, but do Amaral Pyrrait points out that the medical profession prefers to convince itself rather than to be convinced. While there is great utility in articles and reviews in the professional press (especially for paramedical personnel) Aubenque, de Groot, and Kohn show how appreciation can effectively be secured by making statistics more easily understandable to the non-expert by, for instance, including readable commentaries in official publications, simplifying charts and tables, and preparing simple manuals on statistical methods. Aubenque and Kohn also stress the importance of linking health statistics to other economic and social information. Benjamin suggests that the principles of market research could to advantage be applied to health statistics to determine the precise needs of the “consumers”. At the same time, Aubenque points out that the value of the ultimate results must be clear to those who provide the data; for this, Kohn suggests that the enumerators must know exactly what is wanted and why. There is general agreement that some explanation of statistical methods and their uses should be given in the curricula of medical schools and that lectures and postgraduate courses should be arranged for practising physicians. PMID:13199668
Universal Approach for Structural Interpretation of QSAR/QSPR Models.
Polishchuk, Pavel G; Kuz'min, Victor E; Artemenko, Anatoly G; Muratov, Eugene N
2013-10-01
In this paper we offer a novel approach for the structural interpretation of QSAR models. The major advantage of our developed methodology is its universality, i.e., it can be applied to any QSAR/QSPR model irrespective of chemical descriptors and machine learning methods applied. This universality was achieved by using only the information obtained from substructures of the compounds of interest to interpret model outcomes. Reliability of the offered approach was confirmed by the results of three case studies, including end-points of different types (continuous and binary classification) and nature (solubility, mutagenicity, and inhibition of Transglutaminase 2), various fragment and whole-molecule descriptors (Simplex and Dragon), and multiple modeling techniques (partial least squares, random forest, and support vector machines). We compared the global contributions of molecular fragments obtained using our methodology with known SAR rules derived experimentally. In all cases high concordance between our interpretation and results published by others was observed. Although the proposed interpretation approach could be easily extended to any type of descriptors, we would recommend using Simplex descriptors to achieve a larger variety of investigated molecular fragments. The developed approach is a good tool for interpretation of such "black box" models like random forest, neural networks, etc. Analysis of fragment global contributions and their deviation across a dataset could be useful for the identification of key fragments and structural alerts. This information could be helpful to maximize the positive influence of structural surroundings on the given fragment and to decrease the negative effects. PMID:27480236
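The model-agnostic idea described above can be sketched as a difference of predictions: a fragment's contribution is the model's output for the parent structure minus its output for the structure with that fragment removed, so no access to model internals is needed. The toy "model" and fragment weights below are hypothetical stand-ins for any trained QSAR model and substructure enumeration.

```python
def predict(fragment_counts):
    """Stand-in for a trained black-box QSAR model; the weights are
    made up for illustration only."""
    weights = {"OH": 1.5, "NO2": -2.0, "C6H5": 0.4}
    return sum(weights.get(f, 0.0) * n for f, n in fragment_counts.items())

def fragment_contribution(mol, fragment):
    """Contribution = prediction(parent) - prediction(parent minus fragment)."""
    without = dict(mol)
    without[fragment] = without.get(fragment, 0) - 1
    return predict(mol) - predict(without)

mol = {"OH": 2, "NO2": 1, "C6H5": 1}
print(fragment_contribution(mol, "NO2"))  # recovers the nitro group's effect
```

Because only predictions are differenced, the same loop works unchanged over a random forest, SVM, or neural network, which is the universality the abstract claims.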
Using and interpreting diagnostic tests.
McKenna, Shawn L B; Dohoo, Ian R
2006-03-01
Diagnostic tests are invaluable to the practice of veterinary medicine. Using them correctly and interpreting the results appropriately depend on having a good understanding of the basic principles outlined in this article. Topics covered include sensitivity and specificity, agreement among tests, using multiple tests, and other issues related to the use and interpretation of diagnostic tests. The most important principle is recognition that the interpretation of test results varies across populations and requires an estimate of the prevalence of the infection (or disease) in the population being studied.
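The final point, that interpretation varies with prevalence, follows directly from Bayes' theorem, as this short sketch shows (the sensitivity, specificity, and prevalence figures are illustrative, not from the article):

```python
def predictive_values(sensitivity, specificity, prevalence):
    """Positive and negative predictive values via Bayes' theorem."""
    tp = sensitivity * prevalence
    fp = (1 - specificity) * (1 - prevalence)
    tn = specificity * (1 - prevalence)
    fn = (1 - sensitivity) * prevalence
    return tp / (tp + fp), tn / (tn + fn)

# The same test (95% sensitive, 95% specific) means very different
# things in a low- vs a high-prevalence population:
print(round(predictive_values(0.95, 0.95, 0.01)[0], 2))  # → 0.16
print(round(predictive_values(0.95, 0.95, 0.30)[0], 2))  # → 0.89
```

At 1% prevalence, five of every six positives are false, which is why a prevalence estimate is required before a result can be interpreted.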
a Contextualist Interpretation of Mathematics
NASA Astrophysics Data System (ADS)
Liu, Jie
2014-03-01
The nature of mathematics has been the subject of heated debate among mathematicians and philosophers throughout the ages. The realist and anti-realist positions have long debated this problem, but some of the most important recent developments have focused on interpretations; each of the above positions has its own interpretation of the nature of mathematics. In this paper I argue for a contextualist interpretation of mathematics, one that elucidates the essential features of mathematical context. That is, being integral and having concrete structure, mathematical context is a recontextualizational process with a determinate boundary.
ABSTRACT: Total Petroleum hydrocarbons (TPH) as a lumped parameter can be easily and rapidly measured or monitored. Despite interpretational problems, it has become an accepted regulatory benchmark used widely to evaluate the extent of petroleum product contamination. Three cu...
ERIC Educational Resources Information Center
Sotos, Ana Elisa Castro; Vanhoof, Stijn; Van den Noortgate, Wim; Onghena, Patrick
2007-01-01
A solid understanding of "inferential statistics" is of major importance for designing and interpreting empirical results in any scientific discipline. However, students are prone to many misconceptions regarding this topic. This article structurally summarizes and describes these misconceptions by presenting a systematic review of publications…
An Experimental Approach to Teaching and Learning Elementary Statistical Mechanics
ERIC Educational Resources Information Center
Ellis, Frank B.; Ellis, David C.
2008-01-01
Introductory statistical mechanics is studied for a simple two-state system using an inexpensive and easily built apparatus. A large variety of demonstrations, suitable for students in high school and introductory university chemistry courses, are possible. This article details demonstrations for exothermic and endothermic reactions, the dynamic…
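The statistics of a simple two-state system reduce to the Boltzmann factor, which is presumably what such demonstrations measure; a minimal numerical companion (the energy gap and temperatures are illustrative):

```python
import math

def excited_fraction(gap_over_k, temperature):
    """Fraction of a two-state ensemble in the upper level.

    gap_over_k is the energy gap divided by Boltzmann's constant,
    i.e. Delta_E / k_B, expressed in kelvin."""
    boltz = math.exp(-gap_over_k / temperature)
    return boltz / (1 + boltz)

# Populations approach 50/50 when T >> Delta_E/k_B and vanish as T -> 0:
print(excited_fraction(500.0, 10_000.0))  # close to 0.5
print(excited_fraction(500.0, 100.0))     # nearly empty upper level
```

Plotting this fraction against temperature reproduces the saturation behavior that distinguishes a two-state system from a classical equipartition picture.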
Annual summary of vital statistics--1985.
Wegman, M E
1986-12-01
Data for this article, as in previous reports, are drawn principally from Monthly Vital Statistics Report, published by the National Center for Health Statistics (NCHS). The international data come from the Demographic Yearbook and the quarterly Population and Vital Statistics Reports, both published by the Statistical Office of the United Nations, which has also been kind enough to provide directly more recent data. Except for mortality data by cause and age, which are based on a 10% sample, all the US data for 1984 are estimates by place of occurrence, based upon a count of certificates received in state offices between two dates, 1 month apart, regardless of when the event occurred. Experience has shown that for the country as a whole the estimates, with few exceptions, are close to the subsequent final figures. There are, however, considerable variations in some states, particularly in comparing data by place of occurrence and place of residence. State information should be interpreted cautiously. PMID:3786054
Students' Interpretation of a Function Associated with a Real-Life Problem from Its Graph
ERIC Educational Resources Information Center
Mahir, Nevin
2010-01-01
The properties of a function such as limit, continuity, derivative, growth, or concavity can be determined more easily from its graph than by doing any algebraic operation. For this reason, it is important for students of mathematics to interpret some of the properties of a function from its graph. In this study, we investigated the competence of…
Car Troubles: An Interpretive Approach.
ERIC Educational Resources Information Center
Dawson, Leslie
1995-01-01
The growing amount of U.S. surface area being paved increases interpretive opportunities for teaching about the environmental impacts of automobiles. Provides methods and suggestions for educating high school students. Provides several computer graphics. (LZ)
Interpreting Results from Multiscore Batteries.
ERIC Educational Resources Information Center
Anastasi, Anne
1985-01-01
Describes the role of information on score reliabilities, significance of score differences, intercorrelations of scores, and differential validity of score patterns on the interpretation of results from multiscore batteries. (Author)
ENVIRONMENTAL PHOTOGRAPHIC INTERPRETATION CENTER (EPIC)
The Environmental Sciences Division (ESD) in the National Exposure Research Laboratory (NERL) of the Office of Research and Development provides remote sensing technical support including aerial photograph acquisition and interpretation to the EPA Program Offices, ORD Laboratorie...
Hébert-Dufresne, Laurent; Grochow, Joshua A.; Allard, Antoine
2016-01-01
We introduce a network statistic that measures structural properties at the micro-, meso-, and macroscopic scales, while still being easy to compute and interpretable at a glance. Our statistic, the onion spectrum, is based on the onion decomposition, which refines the k-core decomposition, a standard network fingerprinting method. The onion spectrum is exactly as easy to compute as the k-cores: It is based on the stages at which each vertex gets removed from a graph in the standard algorithm for computing the k-cores. Yet, the onion spectrum reveals much more information about a network, and at multiple scales; for example, it can be used to quantify node heterogeneity, degree correlations, centrality, and tree- or lattice-likeness. Furthermore, unlike the k-core decomposition, the combined degree-onion spectrum immediately gives a clear local picture of the network around each node which allows the detection of interesting subgraphs whose topological structure differs from the global network organization. This local description can also be leveraged to easily generate samples from the ensemble of networks with a given joint degree-onion distribution. We demonstrate the utility of the onion spectrum for understanding both static and dynamic properties on several standard graph models and on many real-world networks. PMID:27535466
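The peeling procedure behind the onion spectrum can be sketched in plain Python. The helper below (`onion_decomposition`, a hypothetical name) follows the standard k-core algorithm while recording the round in which each vertex is removed; the resulting (coreness, layer) pairs are the raw material for the onion spectrum:

```python
def onion_decomposition(adj):
    """k-core number and onion layer for each vertex of an undirected graph.

    adj maps vertex -> set of neighbours. Vertices are peeled in rounds:
    at the current core value k, every vertex whose remaining degree is
    <= k is removed, and each round of removals forms one onion layer.
    """
    adj = {v: set(nbrs) for v, nbrs in adj.items()}
    degree = {v: len(nbrs) for v, nbrs in adj.items()}
    coreness, layer = {}, {}
    k, current_layer = 0, 0
    while degree:
        k = max(k, min(degree.values()))
        batch = [v for v, d in degree.items() if d <= k]
        while batch:
            current_layer += 1
            for v in batch:
                coreness[v], layer[v] = k, current_layer
                for u in adj[v]:
                    adj[u].discard(v)
                    degree[u] -= 1
                del degree[v], adj[v]
            batch = [v for v, d in degree.items() if d <= k]
    return coreness, layer
```

On a triangle a-b-c with a pendant vertex d attached to a, vertex d is peeled first (layer 1, coreness 1) and the triangle is peeled together in the next round (layer 2, coreness 2).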
Measuring statistical heterogeneity: The Pietra index
NASA Astrophysics Data System (ADS)
Eliazar, Iddo I.; Sokolov, Igor M.
2010-01-01
There are various ways of quantifying the statistical heterogeneity of a given probability law: Statistics uses variance, which measures the law's dispersion around its mean; Physics and Information Theory use entropy, which measures the law's randomness; Economics uses the Gini index, which measures the law's egalitarianism. In this research we explore an alternative to the Gini index, the Pietra index, which is a counterpart of the Kolmogorov-Smirnov statistic. The Pietra index is shown to be a natural and elemental measure of statistical heterogeneity, which is especially useful in the case of asymmetric and skewed probability laws, and in the case of asymptotically Paretian laws with finite mean and infinite variance. Moreover, the Pietra index is shown to have immediate and fundamental interpretations within the following applications: renewal processes and continuous-time random walks; infinite-server queueing systems and shot noise processes; and financial derivatives. The interpretation of the Pietra index within the context of financial derivatives implies that derivative markets, in effect, use the Pietra index as their benchmark measure of statistical heterogeneity.
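For a finite sample the Pietra index reduces to half the relative mean absolute deviation, equivalently the maximal vertical gap between the Lorenz curve and the line of perfect equality. A minimal sketch (hypothetical `pietra_index` helper):

```python
def pietra_index(values):
    """Pietra index of a non-negative sample: half the mean absolute
    deviation divided by the mean. Equivalently, the maximal vertical
    distance between the Lorenz curve and the line of perfect equality.
    0 means perfect equality; values near 1 mean extreme concentration."""
    n = len(values)
    mean = sum(values) / n
    return sum(abs(x - mean) for x in values) / (2 * n * mean)
```

A perfectly equal sample scores 0, while the two-point sample [0, 2] scores 0.5, the maximum attainable with two observations.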
Review of robust multivariate statistical methods in high dimension.
Filzmoser, Peter; Todorov, Valentin
2011-10-31
General ideas of robust statistics, and specifically robust statistical methods for calibration and dimension reduction are discussed. The emphasis is on analyzing high-dimensional data. The discussed methods are applied using the packages chemometrics and rrcov of the statistical software environment R. It is demonstrated how the functions can be applied to real high-dimensional data from chemometrics, and how the results can be interpreted.
Interpreter services in emergency medicine.
Chan, Yu-Feng; Alagappan, Kumar; Rella, Joseph; Bentley, Suzanne; Soto-Greene, Marie; Martin, Marcus
2010-02-01
Emergency physicians are routinely confronted with problems associated with language barriers. It is important for emergency health care providers and the health system to strive for cultural competency when communicating with members of an increasingly diverse society. Possible solutions that can be implemented include appropriate staffing, use of new technology, and efforts to develop new kinds of ties to the community served. Linguistically specific solutions include professional interpretation, telephone interpretation, the use of multilingual staff members, the use of ad hoc interpreters, and, more recently, the use of mobile computer technology at the bedside. Each of these methods carries a specific set of advantages and disadvantages. Although professionally trained medical interpreters offer improved communication, improved patient satisfaction, and overall cost savings, they are often underutilized due to their perceived inefficiency and the inconclusive results of their effect on patient care outcomes. Ultimately, the best solution for each emergency department will vary depending on the population served and available resources. Access to the multiple interpretation options outlined above and solid support and commitment from hospital institutions are necessary to provide proper and culturally competent care for patients. Appropriate communications inclusive of interpreter services are essential for culturally and linguistically competent provider/health systems and overall improved patient care and satisfaction. PMID:18571358
Tannery, Nancy Hrinya; Silverman, Deborah L; Epstein, Barbara A
2002-01-01
Online use statistics can provide libraries with a tool to be used when developing an online collection of resources. Statistics can provide information on overall use of a collection, individual print and electronic journal use, and collection use by specific user populations. They can also be used to determine the number of user licenses to purchase. This paper focuses on the issue of use statistics made available for one collection of online resources.
Statistics of silicate units in binary glasses
NASA Astrophysics Data System (ADS)
Gaddam, Anuraag; Montagne, Lionel; Ferreira, José M. F.
2016-09-01
In this paper, we derive a new model to determine the distribution of silicate units in binary glasses (or liquids). The model is based on statistical mechanics and assumes a grand canonical ensemble of silicate units that exchange energy and network modifiers with the reservoir. This model complements experimental techniques that measure short-range order in glasses, such as nuclear magnetic resonance (NMR) spectroscopy. The model has potential for calculating the amounts of liquid-liquid phase segregation and crystal nucleation, and it can be easily extended to more complicated compositions. The structural relaxation of the glass as probed by NMR spectroscopy is also reported, where the model could prove useful.
Statistical Performances of Resistive Active Power Splitter
NASA Astrophysics Data System (ADS)
Lalléchère, Sébastien; Ravelo, Blaise; Thakur, Atul
2016-03-01
In this paper, the synthesis and sensitivity analysis of an active power splitter (PWS) is proposed. It is based on an active cell composed of a field-effect transistor in cascade with a shunted resistor at the input and the output (resistive amplifier topology). The PWS uncertainty versus resistance tolerances is assessed using a stochastic method. Furthermore, with the proposed topology, we can easily control the device gain by varying a resistance. This provides a useful tool for analysing the statistical sensitivity of the system in an uncertain environment.
Statistical distribution sampling
NASA Technical Reports Server (NTRS)
Johnson, E. S.
1975-01-01
Determining the distribution of statistics by sampling was investigated. Characteristic functions, the quadratic regression problem, and the differential equations for the characteristic functions are analyzed.
The standard map: From Boltzmann-Gibbs statistics to Tsallis statistics
NASA Astrophysics Data System (ADS)
Tirnakli, Ugur; Borges, Ernesto P.
2016-03-01
As is well known, Boltzmann-Gibbs statistics is the correct way of thermostatistically approaching ergodic systems. On the other hand, nontrivial ergodicity breakdown and strong correlations typically drag the system into out-of-equilibrium states where Boltzmann-Gibbs statistics fails. For a wide class of such systems, it has been shown in recent years that the correct approach is to use Tsallis statistics instead. Here we show how the dynamics of the paradigmatic conservative (area-preserving) standard map exhibits, in an exceptionally clear manner, the crossover from one statistics to the other. Our results unambiguously illustrate the domains of validity of both Boltzmann-Gibbs and Tsallis statistical distributions. Since various important physical systems from particle confinement in magnetic traps to autoionization of molecular Rydberg states, through particle dynamics in accelerators and comet dynamics, can be reduced to the standard map, our results are expected to enlighten and enable an improved interpretation of diverse experimental and observational results.
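The map itself is two lines of arithmetic. A minimal orbit-iteration sketch (hypothetical `standard_map_orbit` helper) is:

```python
import math

TWO_PI = 2 * math.pi

def standard_map_orbit(theta0, p0, K, n_steps):
    """Iterate the Chirikov standard map on the torus:
        p_{t+1}     = p_t + K * sin(theta_t)   (mod 2*pi)
        theta_{t+1} = theta_t + p_{t+1}        (mod 2*pi)
    Returns the orbit as a list of (theta, p) pairs."""
    theta, p = theta0, p0
    orbit = [(theta, p)]
    for _ in range(n_steps):
        p = (p + K * math.sin(theta)) % TWO_PI
        theta = (theta + p) % TWO_PI
        orbit.append((theta, p))
    return orbit
```

Aggregating many such orbits at small versus large K is the kind of numerical experiment in which the crossover between Tsallis-like and Boltzmann-Gibbs-like distributions is observed.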
Intelligent Collection Environment for an Interpretation System
Maurer, W J
2001-07-19
An Intelligent Collection Environment for a data interpretation system is described. The environment accepts two inputs: A data model and a number between 0.0 and 1.0. The data model is as simple as a single word or as complex as a multi-level/multidimensional model. The number between 0.0 and 1.0 is a control knob to indicate the user's desire to allow loose matching of the data (things are ambiguous and unknown) versus strict matching of the data (things are precise and known). The environment produces a set of possible interpretations, a set of requirements to further strengthen or to differentiate a particular subset of the possible interpretations from the others, a set of inconsistencies, and a logic map that graphically shows the lines of reasoning used to derive the above output. The environment is comprised of a knowledge editor, model explorer, expertise server, and the World Wide Web. The Knowledge Editor is used by a subject matter expert to define Linguistic Types, Term Sets, detailed explanations, and dynamically created URIs, and to create rule bases using a straightforward hyper matrix representation. The Model Explorer allows rapid construction and browsing of multi-level models. A multi-level model is a model whose elements may also be models themselves. The Expertise Server is an inference engine used to interpret the data submitted. It incorporates a semantic network knowledge representation, an assumption based truth maintenance system, and a fuzzy logic calculus. It can be extended by employing any classifier (e.g. statistical/neural networks) of complex data types. The World Wide Web is an unstructured data space accessed by the URIs supplied as part of the output of the environment. By recognizing the input data model as a query, the environment serves as a deductive search engine. Applications include (but are not limited to) interpretation of geophysical phenomena, a navigation aid for very large web sites, monitoring of computer or
Admixture, Population Structure, and F-Statistics.
Peter, Benjamin M
2016-04-01
Many questions about human genetic history can be addressed by examining the patterns of shared genetic variation between sets of populations. A useful methodological framework for this purpose is F-statistics, which measure shared genetic drift between sets of two, three, and four populations and can be used to test simple and complex hypotheses about admixture between populations. This article provides context from phylogenetic and population genetic theory. I review how F-statistics can be interpreted as branch lengths or paths and derive new interpretations using coalescent theory. I further show that the admixture tests can be interpreted as testing general properties of phylogenies, allowing extension of some of these applications to arbitrary phylogenetic trees. The new results are used to investigate the behavior of the statistics under different models of population structure and show how population substructure complicates inference. The results lead to simplified estimators in many cases, and I recommend replacing F3 with the average number of pairwise differences for estimating population divergence.
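The admixture test built on F3 can be illustrated with a naive plug-in estimator over allele frequencies (hypothetical `f3` helper; real tools such as ADMIXTOOLS add a bias correction for finite sample sizes):

```python
def f3(freq_c, freq_a, freq_b):
    """Naive plug-in F3(C; A, B): the average over loci of
    (c - a) * (c - b), where a, b, c are allele frequencies in
    populations A, B and the test population C. A significantly
    negative value indicates that C is admixed between sources
    related to A and B."""
    terms = [(c - a) * (c - b) for c, a, b in zip(freq_c, freq_a, freq_b)]
    return sum(terms) / len(terms)
```

If C's frequencies sit between those of A and B at every locus, as a fresh admixture would produce, every term is negative; if C coincides with A, the statistic is exactly zero.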
Multidimensional Visual Statistical Learning
ERIC Educational Resources Information Center
Turk-Browne, Nicholas B.; Isola, Phillip J.; Scholl, Brian J.; Treat, Teresa A.
2008-01-01
Recent studies of visual statistical learning (VSL) have demonstrated that statistical regularities in sequences of visual stimuli can be automatically extracted, even without intent or awareness. Despite much work on this topic, however, several fundamental questions remain about the nature of VSL. In particular, previous experiments have not…
Croarkin, M. Carroll
2001-01-01
For more than 50 years, the Statistical Engineering Division (SED) has been instrumental in the success of a broad spectrum of metrology projects at NBS/NIST. This paper highlights fundamental contributions of NBS/NIST statisticians to statistics and to measurement science and technology. Published methods developed by SED staff, especially during the early years, endure as cornerstones of statistics not only in metrology and standards applications, but as data-analytic resources used across all disciplines. The history of statistics at NBS/NIST began with the formation of what is now the SED. Examples from the first five decades of the SED illustrate the critical role of the division in the successful resolution of a few of the highly visible, and sometimes controversial, statistical studies of national importance. A review of the history of major early publications of the division on statistical methods, design of experiments, and error analysis and uncertainty is followed by a survey of several thematic areas. The accompanying examples illustrate the importance of SED in the history of statistics, measurements and standards: calibration and measurement assurance, interlaboratory tests, development of measurement methods, Standard Reference Materials, statistical computing, and dissemination of measurement technology. A brief look forward sketches the expanding opportunity and demand for SED statisticians created by current trends in research and development at NIST. PMID:27500023
Explorations in Statistics: Regression
ERIC Educational Resources Information Center
Curran-Everett, Douglas
2011-01-01
Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This seventh installment of "Explorations in Statistics" explores regression, a technique that estimates the nature of the relationship between two things for which we may only surmise a mechanistic or predictive connection.…
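The core computation such an exploration builds on is the ordinary least-squares fit; a minimal sketch (hypothetical `least_squares` helper, not from the article) is:

```python
def least_squares(x, y):
    """Ordinary least-squares fit y ~ a + b*x, returning (a, b).

    The slope b estimates the change in y per unit change in x,
    which is usually the quantity of direct scientific interest."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b = sxy / sxx
    return my - b * mx, b
```

On data lying exactly on a line, the fit recovers the line's intercept and slope.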
ERIC Educational Resources Information Center
Huberty, Carl J.
An approach to statistical testing, which combines Neyman-Pearson hypothesis testing and Fisher significance testing, is recommended. The use of P-values in this approach is discussed in some detail. The author also discusses some problems which are often found in introductory statistics textbooks. The problems involve the definitions of…
Reform in Statistical Education
ERIC Educational Resources Information Center
Huck, Schuyler W.
2007-01-01
Two questions are considered in this article: (a) What should professionals in school psychology do in an effort to stay current with developments in applied statistics? (b) What should they do with their existing knowledge to move from surface understanding of statistics to deep understanding? Written for school psychologists who have completed…
Demonstrating Poisson Statistics.
ERIC Educational Resources Information Center
Vetterling, William T.
1980-01-01
Describes an apparatus that offers a very lucid demonstration of Poisson statistics as applied to electrical currents, and the manner in which such statistics account for shot noise when applied to macroscopic currents. The experiment described is intended for undergraduate physics students. (HM)
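The signature the apparatus demonstrates, variance equal to mean, can also be checked numerically. The sketch below draws Poisson variates with Knuth's multiplication method (hypothetical `poisson_sample` helper; a simulation, not the article's apparatus):

```python
import math
import random

def poisson_sample(lam, rng):
    """Draw one Poisson(lam) variate using Knuth's multiplication method:
    count how many uniform factors are needed before the running product
    drops below exp(-lam)."""
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= threshold:
            return k
        k += 1

# Shot-noise signature: for Poisson counts the variance equals the mean.
rng = random.Random(42)
samples = [poisson_sample(4.0, rng) for _ in range(20000)]
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
```

With 20,000 draws the sample mean and variance both land close to the rate parameter, the same relationship that accounts for shot noise in macroscopic currents.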
Statistical Summaries: Public Institutions.
ERIC Educational Resources Information Center
Virginia State Council of Higher Education, Richmond.
This document presents a statistical portrait of Virginia's 17 public higher education institutions. Data provided include: enrollment figures (broken down into categories such as sex, residency, full- and part-time status, residence, ethnicity, age, and level of postsecondary education); FTE figures; admissions statistics (such as number…
Explorations in Statistics: Power
ERIC Educational Resources Information Center
Curran-Everett, Douglas
2010-01-01
Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This fifth installment of "Explorations in Statistics" revisits power, a concept fundamental to the test of a null hypothesis. Power is the probability that we reject the null hypothesis when it is false. Four things affect…
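The concept can be made concrete with a closed-form normal-approximation power calculation for a two-sided two-sample test (hypothetical `power_two_sample` helper, not from the article):

```python
import math
from statistics import NormalDist

def power_two_sample(delta, sigma, n, alpha=0.05):
    """Approximate power of a two-sided two-sample z-test with n subjects
    per group, true mean difference delta, and common SD sigma. Power is
    the probability of rejecting the null hypothesis when it is false."""
    nd = NormalDist()
    z_crit = nd.inv_cdf(1 - alpha / 2)   # critical value, ~1.96 for alpha=0.05
    se = sigma * math.sqrt(2 / n)        # SE of the difference in means
    shift = delta / se                   # standardized effect ("noncentrality")
    return nd.cdf(shift - z_crit) + nd.cdf(-shift - z_crit)
```

With delta equal to sigma and 16 subjects per group this gives roughly 0.8, the textbook benchmark, and the formula makes visible how effect size, variability, sample size, and alpha each move the result.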
ERIC Educational Resources Information Center
Huizingh, Eelko K. R. E.
2007-01-01
Accessibly written and easy to use, "Applied Statistics Using SPSS" is an all-in-one self-study guide to SPSS and do-it-yourself guide to statistics. What is unique about Eelko Huizingh's approach is that this book is based around the needs of undergraduate students embarking on their own research project, and its self-help style is designed to…
ERIC Educational Resources Information Center
Council of Ontario Universities, Toronto.
Summary statistics on application and registration patterns of applicants wishing to pursue full-time study in first-year places in Ontario universities (for the fall of 1987) are given. Data on registrations were received indirectly from the universities as part of their annual submission of USIS/UAR enrollment data to Statistics Canada and MCU.…
Introduction to Statistical Physics
NASA Astrophysics Data System (ADS)
Casquilho, João Paulo; Ivo Cortez Teixeira, Paulo
2014-12-01
Preface; 1. Random walks; 2. Review of thermodynamics; 3. The postulates of statistical physics. Thermodynamic equilibrium; 4. Statistical thermodynamics – developments and applications; 5. The classical ideal gas; 6. The quantum ideal gas; 7. Magnetism; 8. The Ising model; 9. Liquid crystals; 10. Phase transitions and critical phenomena; 11. Irreversible processes; Appendixes; Index.
Deconstructing Statistical Analysis
ERIC Educational Resources Information Center
Snell, Joel
2014-01-01
Using a very complex statistical analysis and research method for the sake of enhancing the prestige of an article, or of making a new product or service seem legitimate, needs to be monitored and questioned for accuracy. (1) The more complicated the statistical analysis and research, the fewer learned readers can understand it. This adds a…
ERIC Educational Resources Information Center
Hodgson, Ted; Andersen, Lyle; Robison-Cox, Jim; Jones, Clain
2004-01-01
Water quality experiments, especially the use of macroinvertebrates as indicators of water quality, offer an ideal context for connecting statistics and science. In the STAR program for secondary students and teachers, water quality experiments were also used as a context for teaching statistics. In this article, we trace one activity that uses…
Understanding Undergraduate Statistical Anxiety
ERIC Educational Resources Information Center
McKim, Courtney
2014-01-01
The purpose of this study was to understand undergraduate students' views of statistics. Results reveal that students with less anxiety have a higher interest in statistics and also believe in their ability to perform well in the course. Also students who have a more positive attitude about the class tend to have a higher belief in their…
Explorations in Statistics: Correlation
ERIC Educational Resources Information Center
Curran-Everett, Douglas
2010-01-01
Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This sixth installment of "Explorations in Statistics" explores correlation, a familiar technique that estimates the magnitude of a straight-line relationship between two variables. Correlation is meaningful only when the…
Facade Interpretation Using a Marked Point Process
NASA Astrophysics Data System (ADS)
Wenzel, Susanne; Förstner, Wolfgang
2016-06-01
Our objective is the interpretation of facade images in a top-down manner, using a Markov marked point process formulated as a Gibbs process. Given single rectified facade images, we aim at the accurate detection of relevant facade objects such as windows and entrances, using prior knowledge about their possible configurations within facade images. We represent facade objects by a simplified rectangular object model and present an energy model, which evaluates the agreement of a proposed configuration with the given image and the statistics about typical configurations, which we learned from training data. We show promising results on different datasets and provide a qualitative evaluation, which demonstrates the capability of complete and accurate detection of facade objects.
Water isotope systematics: Improving our palaeoclimate interpretations
NASA Astrophysics Data System (ADS)
Jones, M. D.; Dee, S.; Anderson, L.; Baker, A.; Bowen, G.; Noone, D. C.
2016-01-01
The stable isotopes of oxygen and hydrogen, measured in a variety of archives, are widely used proxies in Quaternary Science. Understanding the processes that control δ18O change has long been a focus of research (e.g. Shackleton and Opdyke, 1973; Talbot, 1990; Leng, 2006). Both the dynamics of water isotope cycling and the appropriate interpretation of geological water-isotope proxy time series remain subjects of active research and debate. It is clear that achieving a complete understanding of the isotope systematics for any given archive type, and ideally each individual archive, is vital if these palaeo-data are to be used to their full potential, including comparison with climate model experiments of the past. Combining information from modern monitoring and process studies, climate models, and proxy data is crucial for improving our statistical constraints on reconstructions of past climate variability.
Cerebral lateralization in simultaneous interpretation.
Fabbro, F; Gran, L; Basso, G; Bava, A
1990-07-01
Cerebral asymmetries for L1 (Italian), L2 (English), and L3 (French, German, Spanish, or Russian) were studied, by using a verbal-manual interference paradigm, in a group of Italian right-handed polyglot female students at the Scuola Superiore di Lingue Moderne per Interpreti e Traduttori (SSLM-School for Interpreters and Translators) of the University of Trieste and in a control group of right-handed monolingual female students at the Medical School of the University of Trieste. In an automatic speech production task no significant cerebral lateralization was found for the mother tongue (L1) either in the interpreting students or in the control group; the interpreting students were not significantly lateralized for the third language (L3), while weak left hemispheric lateralization was shown for L2. A significantly higher degree of verbal-manual interference was found for L1 than for L2 and L3. A significantly higher disruption rate occurred in the meaning-based mode of simultaneous interpretation (from L2 into L1 and vice versa) than in the word-for-word mode (from L2 into L1 and vice versa). No significant overall or hemispheric differences were found during simultaneous interpretation from L1 into L2 or from L2 into L1. PMID:2207622
Environmental Statistics and Optimal Regulation
2014-01-01
Any organism is embedded in an environment that changes over time. The timescale for and statistics of environmental change, the precision with which the organism can detect its environment, and the costs and benefits of particular protein expression levels all will affect the suitability of different strategies, such as constitutive expression or graded response, for regulating protein levels in response to environmental inputs. We propose a general framework (here specifically applied to the enzymatic regulation of metabolism in response to changing concentrations of a basic nutrient) to predict the optimal regulatory strategy given the statistics of fluctuations in the environment and measurement apparatus, respectively, and the costs associated with enzyme production. We use this framework to address three fundamental questions: (i) when a cell should prefer thresholding to a graded response; (ii) when there is a fitness advantage to implementing a Bayesian decision rule; and (iii) when retaining memory of the past provides a selective advantage. We specifically find that: (i) relative convexity of enzyme expression cost and benefit influences the fitness of thresholding or graded responses; (ii) intermediate levels of measurement uncertainty call for a sophisticated Bayesian decision rule; and (iii) in dynamic contexts, intermediate levels of uncertainty call for retaining memory of the past. Statistical properties of the environment, such as variability and correlation times, set optimal biochemical parameters, such as thresholds and decay rates in signaling pathways. Our framework provides a theoretical basis for interpreting molecular signal processing algorithms and a classification scheme that organizes known regulatory strategies and may help conceptualize heretofore unknown ones. PMID:25254493
Some easily analyzable convolutional codes
NASA Technical Reports Server (NTRS)
Mceliece, R.; Dolinar, S.; Pollara, F.; Vantilborg, H.
1989-01-01
Convolutional codes have played and will play a key role in the downlink telemetry systems of many NASA deep-space probes, including Voyager, Magellan, and Galileo. One of the chief difficulties associated with the use of convolutional codes, however, is that they are notoriously difficult to analyze. Given a convolutional code as specified, say, by its generator polynomials, it is no easy matter to say how well that code will perform on a given noisy channel. The usual first step in such an analysis is to compute the code's free distance; this can be done with an algorithm whose complexity is exponential in the code's constraint length. The second step is often to calculate the transfer function in one, two, or three variables, or at least a few terms of its power series expansion. This step is quite hard, and for many codes of relatively short constraint length it can be intractable. However, a large class of convolutional codes was discovered for which the free distance can be computed by inspection, and for which there is a closed-form expression for the three-variable transfer function. Although for large constraint lengths these codes have relatively low rates, they are nevertheless interesting and potentially useful. Furthermore, the ideas developed here to analyze these specialized codes may well extend to a much larger class.
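The exponential-time first step, computing the free distance, can be illustrated with a small search over the code trellis. The rate-1/2, memory-2 code with octal generators (7, 5) used below is a standard textbook example, not one of the specialized codes the authors construct; the Dijkstra-style search finds the minimum output weight of any path that diverges from and remerges with the all-zero state:

```python
import heapq

def popcount(x):
    return bin(x).count("1")

def free_distance(gens, memory):
    """Dijkstra over the code trellis: minimum output Hamming weight of a
    path that leaves the all-zero state and later returns to it."""
    mask = (1 << memory) - 1

    def step(state, bit):
        reg = (state << 1) | bit              # memory+1 register bits
        out_w = sum(popcount(reg & g) & 1 for g in gens)
        return reg & mask, out_w

    s0, w0 = step(0, 1)                       # must diverge with input 1
    heap = [(w0, s0)]
    best = {s0: w0}
    while heap:
        w, s = heapq.heappop(heap)
        if s == 0:
            return w                          # remerged: this is d_free
        if w > best.get(s, float("inf")):
            continue
        for b in (0, 1):
            ns, dw = step(s, b)
            if w + dw < best.get(ns, float("inf")):
                best[ns] = w + dw
                heapq.heappush(heap, (w + dw, ns))
    return None

# rate-1/2 textbook code, generators 7 and 5 (octal), memory 2
print(free_distance([0o7, 0o5], 2))  # → 5
```

The same search run on the constraint-length-7 NASA code with generators (171, 133) recovers its well-known free distance of 10, but the state space, and hence the cost, grows exponentially with the memory.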
ERIC Educational Resources Information Center
Stanly, Pat
2009-01-01
Rough patches occur at both ends of the education pipeline, as students enter community colleges and move on to work or enrollment in four-year institutions. Career pathways--sequences of coherent, articulated, and rigorous career and academic courses that lead to an industry-recognized certificate or a college degree--are a promising approach to…
A Local Interpretation of Quantum Mechanics
NASA Astrophysics Data System (ADS)
Lopez, Carlos
2016-04-01
A local interpretation of quantum mechanics is presented. Its main ingredients are: first, a label attached to one of the "virtual" paths in the path integral formalism, determining the output of a measurement of position or momentum; second, a mathematical model for spin states, equivalent to the path integral formalism for point particles in space-time, with the corresponding label. The mathematical machinery of orthodox quantum mechanics is maintained, in particular probability amplitudes and Born's rule; therefore, Bell-type inequality theorems do not apply. It is shown that statistical correlations for pairs of particles with entangled spins have a description completely equivalent to the two-slit experiment; that is, interference (wave-like behaviour) rather than non-locality accounts for the process. The interpretation is grounded in the experimental evidence of the point-like character of electrons, and in the hypothetical existence of a wave-like companion system (the de Broglie wave). A correspondence between the extended Hilbert spaces of hidden physical states and the orthodox quantum mechanical Hilbert space shows the mathematical equivalence of the two theories. Paradoxical behaviour with respect to the action-reaction principle is analysed, and an experimental set-up, a modified two-slit experiment, is proposed to look for the companion system.
Narrative pedagogy and art interpretation.
Ewing, Bonnie; Hayden-Miles, Marie
2011-04-01
Contemporary practices in nursing education call for changes that will assist students in understanding a complex, rapidly changing world. Narrative pedagogy is an approach that offers teachers a way to actively engage students in the process of teaching and learning. The narrative approach provides ways to think critically, make connections, and ask questions to gain understanding through dialogue. The hermeneutic circle of understanding offers a way to interpret stories and discover meaning. Narratives exist in art forms that can be interpreted to evoke discussions and thinking that relate to nursing practice. Art interpretation is a way to gain access to others and acquire a deeper appreciation for multiple perspectives in the teaching-learning process.
Interpretative reports and critical values.
Piva, Elisa; Plebani, Mario
2009-06-01
In the clinical laboratory, post-analytical activity has two goals in the effort to improve patient safety and allow an effective testing process: result interpretation and communication of critical values. Both are important issues, and their success requires a cooperative effort. Misinterpretation of laboratory test results, or ineffective notification of them, can lead to diagnostic errors or to failures in identifying critical patient conditions. With the awareness that incorrect interpretation of tests and breakdowns in the communication of critical values are preventable errors, laboratorians should make every effort to prevent the types of errors that can harm patients. In order to improve the reliability of laboratories, we explain how interpretative reporting and automated notification of critical values can be used to reduce errors. Clinical laboratories can therefore work to improve clinical effectiveness, without forgetting that everything should be designed to provide the best outcomes for patients.
Statistical controversies in clinical research: statistical significance-too much of a good thing ….
Buyse, M; Hurvitz, S A; Andre, F; Jiang, Z; Burris, H A; Toi, M; Eiermann, W; Lindsay, M-A; Slamon, D
2016-05-01
The use and interpretation of P values is a matter of debate in applied research. We argue that P values are useful as a pragmatic guide to interpret the results of a clinical trial, not as a strict binary boundary that separates real treatment effects from lack thereof. We illustrate our point using the result of BOLERO-1, a randomized, double-blind trial evaluating the efficacy and safety of adding everolimus to trastuzumab and paclitaxel as first-line therapy for HER2+ advanced breast cancer. In this trial, the benefit of everolimus was seen only in the predefined subset of patients with hormone receptor-negative breast cancer at baseline (progression-free survival hazard ratio = 0.66, P = 0.0049). A strict interpretation of this finding, based on complex 'alpha splitting' rules to assess statistical significance, led to the conclusion that the benefit of everolimus was not statistically significant either overall or in the subset. We contend that this interpretation does not do justice to the data, and we argue that the benefit of everolimus in hormone receptor-negative breast cancer is both statistically compelling and clinically relevant. PMID:26861602
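The arithmetic behind such a P value is easy to reproduce, which underlines how fragile a binary significance verdict is. In the sketch below the standard error of the log hazard ratio is an assumed illustrative figure (the abstract reports only HR = 0.66 and P = 0.0049), chosen so the normal approximation lands near the reported value:

```python
import math

def p_from_log_hr(hr, se):
    """Two-sided p-value for a hazard ratio via the normal approximation:
    z = log(HR)/SE, p = erfc(|z|/sqrt(2))."""
    z = math.log(hr) / se
    return math.erfc(abs(z) / math.sqrt(2))

# HR = 0.66 with an assumed SE(log HR) of 0.148 gives p close to 0.005
print(round(p_from_log_hr(0.66, 0.148), 4))
```

Whether p lands just below or just above a 0.005 'alpha-split' boundary depends on the third decimal of the standard error, a poor basis for declaring a treatment effect real or absent.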
Interpretational Confounding or Confounded Interpretations of Causal Indicators?
ERIC Educational Resources Information Center
Bainter, Sierra A.; Bollen, Kenneth A.
2014-01-01
In measurement theory, causal indicators are controversial and little understood. Methodological disagreement concerning causal indicators has centered on the question of whether causal indicators are inherently sensitive to interpretational confounding, which occurs when the empirical meaning of a latent construct departs from the meaning…
Default Sarcastic Interpretations: On the Priority of Nonsalient Interpretations
ERIC Educational Resources Information Center
Giora, Rachel; Drucker, Ari; Fein, Ofer; Mendelson, Itamar
2015-01-01
Findings from five experiments support the view that negation generates sarcastic utterance-interpretations by default. When presented in isolation, novel negative constructions ("Punctuality is not his forte," "Thoroughness is not her most distinctive feature"), free of semantic anomaly or internal incongruity, were…
Automatic interpretation of Schlumberger soundings
Ushijima, K.
1980-09-01
The automatic interpretation of apparent resistivity curves from horizontally layered earth models is carried out by the curve-fitting method in three steps: (1) the observed VES data are interpolated at equidistant points of electrode separation on the logarithmic scale by using a cubic spline function; (2) the layer parameters, namely resistivities and depths, are predicted from the sampled apparent resistivity values by the SALS system program; and (3) the theoretical VES curves for the models are calculated by Ghosh's linear filter method using Zohdy's computer program. Two soundings taken over the Takenoyu geothermal area were chosen to test the procedure of automatic interpretation.
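Step (1) of this workflow, resampling field readings at log-equidistant spacings, can be sketched as follows. The spacings and resistivities below are invented for illustration, and plain log-log linear interpolation stands in for the cubic spline used in the paper:

```python
import numpy as np

# Illustrative Schlumberger readings: half electrode spacings AB/2 (m)
# and apparent resistivities (ohm-m)
ab2 = np.array([1.0, 1.5, 2.5, 4.0, 6.0, 10.0, 16.0, 25.0])
rho_a = np.array([120, 110, 95, 80, 72, 70, 75, 85.0])

# resample at equidistant points on the logarithmic spacing axis;
# interpolating in log-log space respects the curve's power-law character
n_samples = 16
log_grid = np.linspace(np.log10(ab2[0]), np.log10(ab2[-1]), n_samples)
rho_resampled = 10 ** np.interp(log_grid, np.log10(ab2), np.log10(rho_a))

print(rho_resampled.round(1))
```

The resampled curve is what a layered-model inversion routine (step 2) would then be fitted to.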
Calibrated Peer Review for Interpreting Linear Regression Parameters: Results from a Graduate Course
ERIC Educational Resources Information Center
Enders, Felicity B.; Jenkins, Sarah; Hoverman, Verna
2010-01-01
Biostatistics is traditionally a difficult subject for students to learn. While the mathematical aspects are challenging, it can also be demanding for students to learn the exact language to use to correctly interpret statistical results. In particular, correctly interpreting the parameters from linear regression is both a vital tool and a…
Machtay; Glatstein
1998-01-01
have shown overall survivals superior to age-matched controls). It is fallacious and illogical to compare nonrandomized series of observation to those of aggressive therapy. In addition to the above problem, the use of DSS introduces another potential issue, which we will call the bias of cause-of-death interpretation. All statistical endpoints (e.g., response rates, local-regional control, freedom from brain metastases), except OS, are known to depend heavily on the methods used to define the endpoint and are often subject to significant interobserver variability. There is no reason to believe that this problem does not occasionally occur with respect to defining a death as due to the index cancer or to intercurrent disease, even though this issue has been poorly studied. In many oncologic situations, for example metastatic lung cancer, this form of bias does not exist. In some situations, such as head and neck cancer, this could be an intermediate problem (Was that lethal chest tumor a second primary or a metastasis? Would the fatal aspiration pneumonia have occurred if he still had a tongue? And what about Mr. B., described above?). In some situations, particularly relatively "good prognosis" neoplasms, this could be a substantial problem, particularly if the adjudication of whether or not a death is cancer-related is performed solely by researchers who have an "interest" in demonstrating a good DSS. What we are most concerned about with this form of bias relates to recent series on observation, such as in early prostate cancer. It is interesting to note that although only 10% of the "observed" patients die from prostate cancer, many develop distant metastases by 10 years (approximately 40% among patients with intermediate-grade tumors). Thus, it is implied that prostate cancer metastases are usually not of themselves lethal, which is a misconception to anyone experienced in taking care of prostate cancer patients. This is inconsistent with U.S. studies of
Winters, Ryan; Winters, Andrew; Amedee, Ronald G.
2010-01-01
The Accreditation Council for Graduate Medical Education sets forth a number of required educational topics that must be addressed in residency and fellowship programs. We sought to provide a primer on some of the important basic statistical concepts to consider when examining the medical literature. It is not essential to understand the exact workings and methodology of every statistical test encountered, but it is necessary to understand selected concepts such as parametric and nonparametric tests, correlation, and numerical versus categorical data. This working knowledge will allow you to spot obvious irregularities in statistical analyses that you encounter. PMID:21603381
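The distinction the authors draw between parametric and nonparametric approaches can be made concrete with correlation: Pearson's coefficient measures linear association on the raw numbers, while Spearman's works on ranks, so a monotone but curved relationship scores differently under each. A minimal self-contained sketch:

```python
def rank(xs):
    """Average ranks, with ties sharing their mean rank."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(xs):
        j = i
        while j + 1 < len(xs) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

def spearman(x, y):
    return pearson(rank(x), rank(y))

x = [1, 2, 3, 4, 5]
y = [1, 4, 9, 16, 26]   # monotone but nonlinear
print(round(pearson(x, y), 3), spearman(x, y))  # Spearman is exactly 1.0
```

Spotting that a paper used Pearson's r on ranked or heavily skewed data is exactly the kind of irregularity this working knowledge lets a reader catch.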
NASA Technical Reports Server (NTRS)
Young, M.; Koslovsky, M.; Schaefer, Caroline M.; Feiveson, A. H.
2017-01-01
Back by popular demand, the JSC Biostatistics Laboratory and LSAH statisticians are offering an opportunity to discuss your statistical challenges and needs. Take the opportunity to meet the individuals offering expert statistical support to the JSC community. Join us for an informal conversation about any questions you may have encountered with issues of experimental design, analysis, or data visualization. Get answers to common questions about sample size, repeated measures, statistical assumptions, missing data, multiple testing, time-to-event data, and when to trust the results of your analyses.
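One of the most common of those questions, minimum sample size, has a closed normal-approximation answer for a two-group comparison of means. The z-values below are the familiar constants for a two-sided 5% test at 80% power; this is a back-of-envelope sketch, not a substitute for a proper power analysis:

```python
import math

def n_per_group(effect_size, z_alpha=1.96, z_beta=0.84):
    """Normal-approximation sample size per group for a two-sample
    comparison of means; effect_size is the standardized difference
    (Cohen's d). Defaults: two-sided alpha = 0.05, power = 0.80."""
    return math.ceil(2 * (z_alpha + z_beta) ** 2 / effect_size ** 2)

print(n_per_group(0.5))  # medium effect, d = 0.5 → 63 per group
```

The quadratic dependence on effect size is the practical message: halving the detectable effect quadruples the required sample.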
Martyna, Agnieszka; Michalska, Aleksandra; Zadora, Grzegorz
2015-05-01
The problem of interpretation of common provenance of the samples within the infrared spectra database of polypropylene samples from car body parts and plastic containers as well as Raman spectra databases of blue solid and metallic automotive paints was under investigation. The research involved statistical tools such as likelihood ratio (LR) approach for expressing the evidential value of observed similarities and differences in the recorded spectra. Since the LR models can be easily proposed for databases described by a few variables, research focused on the problem of spectra dimensionality reduction characterised by more than a thousand variables. The objective of the studies was to combine the chemometric tools easily dealing with multidimensionality with an LR approach. The final variables used for LR models' construction were derived from the discrete wavelet transform (DWT) as a data dimensionality reduction technique supported by methods for variance analysis and corresponded with chemical information, i.e. typical absorption bands for polypropylene and peaks associated with pigments present in the car paints. Univariate and multivariate LR models were proposed, aiming at obtaining more information about the chemical structure of the samples. Their performance was controlled by estimating the levels of false positive and false negative answers and using the empirical cross entropy approach. The results for most of the LR models were satisfactory and enabled solving the stated comparison problems. The results prove that the variables generated from DWT preserve signal characteristic, being a sparse representation of the original signal by keeping its shape and relevant chemical information. PMID:25757825
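For a single variable, the LR computation described here has a classical closed form under a normal model. The sketch below is the textbook two-trace likelihood ratio with assumed within-source and population spreads; it is not the multivariate DWT-based models of the study:

```python
import math

def norm_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def likelihood_ratio(y1, y2, sigma_within, mu_pop, sigma_pop):
    """Univariate two-trace LR under a normal model. H1 (same source):
    the pair is a common unknown mean plus within-source noise, so the
    difference and the midpoint are independent normals. H2 (different
    sources): each trace is an independent draw from the population."""
    # H1: y1 - y2 ~ N(0, 2*sw^2); (y1 + y2)/2 ~ N(mu_pop, sp^2 + sw^2/2)
    s_diff = math.sqrt(2) * sigma_within
    s_mean = math.sqrt(sigma_pop ** 2 + sigma_within ** 2 / 2)
    num = norm_pdf(y1 - y2, 0.0, s_diff) * norm_pdf((y1 + y2) / 2, mu_pop, s_mean)
    # H2: each trace marginally ~ N(mu_pop, sp^2 + sw^2)
    s_marg = math.sqrt(sigma_pop ** 2 + sigma_within ** 2)
    den = norm_pdf(y1, mu_pop, s_marg) * norm_pdf(y2, mu_pop, s_marg)
    return num / den
```

Values above 1 support common provenance and values below 1 support different sources, which is the evidential reading the study controls via false positive/negative rates and empirical cross entropy.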
Interpreter Training Program: Program Review.
ERIC Educational Resources Information Center
Massoud, LindaLee
This report describes in detail the deaf interpreter training program offered at Mott Community College (Flint, Michigan). The program features field-based learning experiences, internships, team teaching, a field practicum, the goal of having students meet certification standards, and proficiency examinations. The program has special…
Probability Interpretation of Quantum Mechanics.
ERIC Educational Resources Information Center
Newton, Roger G.
1980-01-01
This paper draws attention to the frequency meaning of the probability concept and its implications for quantum mechanics. It emphasizes that the very meaning of probability implies the ensemble interpretation of both pure and mixed states. As a result some of the "paradoxical" aspects of quantum mechanics lose their counterintuitive character.…
Eleven Interpretations of Personal Suffering.
ERIC Educational Resources Information Center
Foley, Daniel P.
This document defines suffering as the affective aspect of the pain experience, while the cognitive aspect of the pain experience is the sensation of pain. It considers personal suffering, which means one's own suffering, and not the suffering of other people. It notes that a particular interpretation of suffering may be formulated in any number…
Design Document. EKG Interpretation Program.
ERIC Educational Resources Information Center
Webb, Sandra M.
This teaching plan is designed to assist nursing instructors assigned to advanced medical surgical nursing courses in acquainting students with the basic skills needed to perform electrocardiographic (ECG or EKG) interpretations. The first part of the teaching plan contains a statement of purpose; audience recommendations; a flow chart detailing…
EKG Interpretation Program. Trainers Manual.
ERIC Educational Resources Information Center
Webb, Sandra M.
This trainer's manual is designed to assist nursing instructors assigned to advanced medical surgical nursing courses in teaching students how to make basic interpretations of their patients' electrocardiographic (EKG) strips. Included in the manual are pre- and posttests and instructional units dealing with the following topics: EKG indicators,…
Miscommunication in Interpreted Classroom Communication.
ERIC Educational Resources Information Center
Johnson, Kristen
1991-01-01
Presents a consumer's viewpoint of problems inherent in the use of interpretation to get deaf class members into the stream of vocally expressed communication, focusing on the kinds of misunderstandings that can arise when one language is expressed in the three dimensions of space and the other has only the dimensions of speech. (38 references)…
Transcendental meditation: a psychological interpretation.
Avila, D; Nummela, R
1977-07-01
The authors suggest that Transcendental Meditation offers a great deal of promise for use in helping relationships. They also suggest that the technique might receive wider acceptance if it could be explained in other than a purely philosophical or mystical way. For that reason, in their article they offer a psychological interpretation of the TM process.
Interpretive Reproduction in Children's Play
ERIC Educational Resources Information Center
Corsaro, William A.
2012-01-01
The author looks at children's play from the perspective of interpretive reproduction, emphasizing the way children create their own unique peer cultures, which he defines as a set of routines, artifacts, values, and concerns that children engage in with their playmates. The article focuses on two types of routines in the peer culture of preschool…
Interpreting Data: The Hybrid Mind
ERIC Educational Resources Information Center
Heisterkamp, Kimberly; Talanquer, Vicente
2015-01-01
The central goal of this study was to characterize major patterns of reasoning exhibited by college chemistry students when analyzing and interpreting chemical data. Using a case study approach, we investigated how a representative student used chemical models to explain patterns in the data based on structure-property relationships. Our results…
Interpretable functional principal component analysis.
Lin, Zhenhua; Wang, Liangliang; Cao, Jiguo
2016-09-01
Functional principal component analysis (FPCA) is a popular approach to explore major sources of variation in a sample of random curves. These major sources of variation are represented by functional principal components (FPCs). The intervals where the values of FPCs are significant are interpreted as where sample curves have major variations. However, these intervals are often hard for naïve users to identify, because of the vague definition of "significant values". In this article, we develop a novel penalty-based method to derive FPCs that are only nonzero precisely in the intervals where the values of FPCs are significant, whence the derived FPCs possess better interpretability than the FPCs derived from existing methods. To compute the proposed FPCs, we devise an efficient algorithm based on projection deflation techniques. We show that the proposed interpretable FPCs are strongly consistent and asymptotically normal under mild conditions. Simulation studies confirm that with a competitive performance in explaining variations of sample curves, the proposed FPCs are more interpretable than the traditional counterparts. This advantage is demonstrated by analyzing two real datasets, namely, electroencephalography data and Canadian weather data.
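The penalty idea, loadings that are exactly zero outside the intervals of genuine variation, can be caricatured with ordinary (non-functional) data: power iteration on the covariance matrix with a soft-thresholding step yields a sparse leading component. This is a toy analogue of the shared idea, not the authors' penalty or their projection-deflation algorithm:

```python
import numpy as np

def soft_threshold(v, lam):
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

def sparse_pc(X, lam, n_iter=100):
    """Leading component via soft-thresholded power iteration: loadings
    whose covariance signal stays below lam are driven exactly to zero."""
    C = np.cov(X, rowvar=False)
    v = np.ones(C.shape[0]) / np.sqrt(C.shape[0])  # deterministic start
    for _ in range(n_iter):
        v = soft_threshold(C @ v, lam)
        norm = np.linalg.norm(v)
        if norm == 0.0:
            break
        v /= norm
    return v

# "curves" that vary only over their first half: the sparse loading
# vector ends up nonzero exactly where the real variation lives
rng = np.random.default_rng(1)
scores = rng.standard_normal(100)
X = np.outer(scores, np.r_[np.ones(10), np.zeros(10)]) \
    + 0.05 * rng.standard_normal((100, 20))
v = sparse_pc(X, lam=0.5)
print(np.abs(v[:10]).min() > 0.1, np.abs(v[10:]).max() == 0.0)
```

An ordinary principal component would carry small nonzero loadings over the flat half; the thresholded one does not, which is what makes the interval of variation readable directly from the loading vector.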
Conflicting Interpretations of Scientific Pedagogy
ERIC Educational Resources Information Center
Galamba, Arthur
2016-01-01
Not surprisingly historical studies have suggested that there is a distance between concepts of teaching methods, their interpretations and their actual use in the classroom. This issue, however, is not always pitched to the personal level in historical studies, which may provide an alternative insight on how teachers conceptualise and engage with…
Art Lessons: Learning To Interpret.
ERIC Educational Resources Information Center
Carpenter, B. Stephen, II
1999-01-01
When required to interpret works of art, students arrive at a broad-based, well-grounded understanding of the nature, value, and meaning of art in their lives. Teachers should offer art works, like those of Amalia Mesa-Bains, Joseph Stella, and Beverly Buchanan, whose narratives are complex and challenging, but not conceptually dense or…
Smartberries: Interpreting Erdrich's Love Medicine
ERIC Educational Resources Information Center
Treuer, David
2005-01-01
The structure of "Love Medicine" is interpreted by Hertha D. Sweet Wong, who claims that the book's "multiple narrators confound conventional Western expectations of an autonomous protagonist, a dominant narrative voice, and a consistently chronological narrative". "Love Medicine" makes brilliant use of the Western literary tactics that create the…
Interpretation of the Weyl tensor
NASA Astrophysics Data System (ADS)
Hofmann, Stefan; Niedermann, Florian; Schneider, Robert
2013-09-01
According to folklore in general relativity, the Weyl tensor can be decomposed into parts corresponding to Newton-like, incoming and outgoing wavelike field components. It is shown here that this one-to-one correspondence does not hold for space-time geometries with cylindrical isometries. This is done by investigating some well-known exact solutions of Einstein’s field equations with whole-cylindrical symmetry, for which the physical interpretation is very clear, but for which the standard Weyl interpretation would give contradictory results. For planar or spherical geometries, however, the standard interpretation works for both static and dynamical space-times. It is argued that one reason for the failure in the cylindrical case is that for waves spreading in two spatial dimensions there is no local criterion to distinguish incoming and outgoing waves already at the linear level. It turns out that Thorne’s local energy notion, subject to certain qualifications, provides an efficient diagnostic tool to extract the proper physical interpretation of the space-time geometry in the case of cylindrical configurations.
Focus: Oral Interpretation and Drama.
ERIC Educational Resources Information Center
Mullican, James S., Ed.
1976-01-01
The 12 articles in this issue of "Indiana English Journal" are concerned with drama and oral interpretation in the classroom. Titles of articles are: "Up in the Tree, Down in the Cave, and Back to Reading: Creative Dramatics"; "Pantomime: The Stepping Stone to Drama"; "The Living Literature of Readers' Theatre"; "Do-It-Yourself Drama"; "Drama for…
ERIC Educational Resources Information Center
Melton, T. R.
A computer-assisted instruction system, called IT1 (Interpretive Tutor), is described which is intended to assist a student's efforts to learn the content of textual material and to evaluate his efforts toward that goal. The text is represented internally in the form of semantic networks with auxiliary structures which relate network nodes to…
Studies in Interpretation. Volume II.
ERIC Educational Resources Information Center
Doyle, Esther M., Ed.; Floyd, Virginia Hastings, Ed.
The purpose of this second book of 21 self-contained essays is the same as that of the first volume published in 1972: to bring together the scholarly theory and current research regarding oral interpretation. One third of the essays are centered on literature itself: prose fiction, poetry, and the drama. These essays discuss topics such as point…
NASA Astrophysics Data System (ADS)
Richfield, Jon; bookfeller
2016-07-01
In reply to Ralph Kenna and Pádraig Mac Carron's feature article “Maths meets myths” in which they describe how they are using techniques from statistical physics to characterize the societies depicted in ancient Icelandic sagas.
... facts and statistics here include brain and central nervous system tumors (including spinal cord, pituitary and pineal gland ... U.S. living with a primary brain and central nervous system tumor. This year, nearly 17,000 people will ...
Titanic: A Statistical Exploration.
ERIC Educational Resources Information Center
Takis, Sandra L.
1999-01-01
Uses the available data about the Titanic's passengers to interest students in exploring categorical data and the chi-square distribution. Describes activities incorporated into a statistics class and gives additional resources for collecting information about the Titanic. (ASK)
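The chi-square activity the article describes can be reproduced in a few lines. The 2x2 counts below approximate the Titanic passenger data commonly distributed for this classroom exercise and should be treated as illustrative:

```python
# survival by sex, passenger counts (approximate, for illustration)
observed = {("female", "survived"): 308, ("female", "died"): 154,
            ("male", "survived"): 142, ("male", "died"): 709}
rows, cols = ("female", "male"), ("survived", "died")

n = sum(observed.values())
row_tot = {r: sum(observed[(r, c)] for c in cols) for r in rows}
col_tot = {c: sum(observed[(r, c)] for r in rows) for c in cols}

# chi-square statistic: sum of (O - E)^2 / E over the four cells,
# with E the product of the margins divided by the grand total
chi2 = sum((observed[(r, c)] - row_tot[r] * col_tot[c] / n) ** 2
           / (row_tot[r] * col_tot[c] / n)
           for r in rows for c in cols)
print(round(chi2, 1))  # far beyond the 3.84 critical value at df = 1, alpha = 0.05
```

The enormous statistic is what makes the dataset pedagogically effective: students see an association too strong to attribute to chance.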
NASA Astrophysics Data System (ADS)
Grégoire, G.
2016-05-01
This chapter is devoted to two objectives. The first is to answer the request, expressed by attendees of the first Astrostatistics School (Annecy, October 2013), to be provided with an elementary vademecum of statistics that would facilitate understanding of the given courses. In this spirit we recall very basic notions, that is, definitions and properties that we think sufficient to benefit from the courses given in the Astrostatistics School. Thus we briefly give definitions and elementary properties of random variables and vectors, distributions, estimation and tests, and maximum likelihood methodology. We intend to present basic ideas in a hopefully comprehensible way. We do not attempt a rigorous presentation and, given the space devoted to this chapter, can cover only a rather limited field of statistics. The second aim is to focus on some statistical tools that are useful in classification: a basic introduction to Bayesian statistics, maximum likelihood methodology, Gaussian vectors and Gaussian mixture models.
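As a concrete instance of the maximum likelihood methodology the chapter recalls: for an i.i.d. normal sample the likelihood is maximised in closed form by the sample mean and the 1/n sample variance, a standard result (the data below are made up):

```python
def gaussian_mle(xs):
    """Closed-form maximum likelihood estimates for a normal sample:
    the sample mean and the biased (1/n) sample variance."""
    n = len(xs)
    mu = sum(xs) / n
    var = sum((x - mu) ** 2 for x in xs) / n
    return mu, var

mu, var = gaussian_mle([4.9, 5.1, 5.0, 4.8, 5.2])
print(mu, var)
```

The 1/n divisor, rather than the unbiased 1/(n-1), is itself a useful teaching point about what "maximum likelihood" does and does not optimise.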
... and Statistics: Plague in the United States. Plague was first introduced ... per year in the United States: 1900-2012. Plague Worldwide: Plague epidemics have occurred in Africa, Asia, ...
Cooperative Learning in Statistics.
ERIC Educational Resources Information Center
Keeler, Carolyn M.; And Others
1994-01-01
Formal use of cooperative learning techniques proved effective in improving student performance and retention in a freshman level statistics course. Lectures interspersed with group activities proved effective in increasing conceptual understanding and overall class performance. (11 references) (Author)
Purposeful Statistical Investigations
ERIC Educational Resources Information Center
Day, Lorraine
2014-01-01
Lorraine Day provides us with a great range of statistical investigations using various resources such as maths300 and TinkerPlots. Each of the investigations link mathematics to students' lives and provide engaging and meaningful contexts for mathematical inquiry.
Tuberculosis Data and Statistics
... United States publication. PDF [6 MB] Interactive TB Data Tool: Online Tuberculosis Information System (OTIS). OTIS is ...
Understanding Solar Flare Statistics
NASA Astrophysics Data System (ADS)
Wheatland, M. S.
2005-12-01
A review is presented of work aimed at understanding solar flare statistics, with emphasis on the well known flare power-law size distribution. Although avalanche models are perhaps the favoured model to describe flare statistics, their physical basis is unclear, and they are divorced from developing ideas in large-scale reconnection theory. An alternative model, aimed at reconciling large-scale reconnection models with solar flare statistics, is revisited. The solar flare waiting-time distribution has also attracted recent attention. Observed waiting-time distributions are described, together with what they might tell us about the flare phenomenon. Finally, a practical application of flare statistics to flare prediction is described in detail, including the results of a year of automated (web-based) predictions from the method.
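The power-law size distribution at the centre of this review is usually quantified by its exponent, for which the maximum likelihood (Hill) estimator has a one-line form. The synthetic sample below, drawn with an assumed exponent of 1.8, is for illustration only:

```python
import math, random

def powerlaw_alpha_mle(xs, xmin):
    """Maximum likelihood (Hill) estimator for the exponent of
    p(x) ~ x**(-alpha) on x >= xmin."""
    tail = [x for x in xs if x >= xmin]
    return 1.0 + len(tail) / sum(math.log(x / xmin) for x in tail)

# synthetic "flare sizes" from a pure power law, via inverse-CDF sampling
random.seed(42)
alpha_true, xmin = 1.8, 1.0
sample = [xmin * (1.0 - random.random()) ** (-1.0 / (alpha_true - 1.0))
          for _ in range(20000)]
print(round(powerlaw_alpha_mle(sample, xmin), 2))
```

Estimating the exponent by MLE rather than by fitting a line to a log-log histogram avoids a well-known source of bias in flare statistics.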
Which statistics should tropical biologists learn?
Loaiza Velásquez, Natalia; González Lutz, María Isabel; Monge-Nájera, Julián
2011-09-01
Tropical biologists study the richest and most endangered biodiversity on the planet, and in these times of climate change and mega-extinctions the need for efficient, good-quality research is more pressing than in the past. However, the statistical component of research published by tropical authors sometimes suffers from poor data collection, mediocre or bad experimental design, and a rigid and outdated view of data analysis. To suggest improvements in their statistical education, we listed all the statistical tests and other quantitative analyses used over one year in two leading tropical journals, the Revista de Biología Tropical and Biotropica. The 12 most frequent tests in the articles were: Analysis of Variance (ANOVA), Chi-Square Test, Student's T Test, Linear Regression, Pearson's Correlation Coefficient, Mann-Whitney U Test, Kruskal-Wallis Test, Shannon's Diversity Index, Tukey's Test, Cluster Analysis, Spearman's Rank Correlation Test and Principal Component Analysis. We conclude that statistical education for tropical biologists must abandon the old syllabus based on the mathematical side of statistics and concentrate on the correct selection of these and other procedures and tests, on their biological interpretation and on the use of reliable and friendly freeware. We think that their time will be better spent understanding and protecting tropical ecosystems than trying to learn the mathematical foundations of statistics: in most cases, a well designed one-semester course should be enough for their basic requirements.
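Of the twelve procedures listed, Shannon's Diversity Index is the simplest to show end to end, and it illustrates the biological interpretation the authors favour over mathematical derivation (the species counts below are invented):

```python
import math

def shannon_index(counts):
    """Shannon's Diversity Index H' = -sum(p_i * ln p_i) over species counts."""
    n = sum(counts)
    return -sum((c / n) * math.log(c / n) for c in counts if c > 0)

# four equally abundant species reach the maximum, H' = ln(4)
print(round(shannon_index([25, 25, 25, 25]), 3))  # → 1.386
```

The biological reading is the point: a community dominated by one species (say, counts of 97, 1, 1, 1) scores far lower than an even one, regardless of the total number of individuals.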
Oakland, J.S.
1986-01-01
Addressing the increasing need for firms to have a thorough knowledge of statistically based quality control procedures, this book presents the fundamentals of statistical process control (SPC) in a non-mathematical, practical way. It provides real-life examples and data drawn from a wide variety of industries. The foundations of good quality management and process control, and the control of conformance and consistency during production, are covered. The book offers clear guidance to those who wish to understand and implement modern SPC techniques.
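The control-limit arithmetic at the heart of SPC fits in a few lines. This individuals-chart sketch uses the sample standard deviation for simplicity (production charts typically estimate sigma from moving ranges), and the welding-current readings are invented:

```python
def control_limits(samples, sigma_mult=3.0):
    """Shewhart-style limits: center line at the mean, control limits at
    +/- 3 standard deviations of the in-control data."""
    n = len(samples)
    mean = sum(samples) / n
    sd = (sum((x - mean) ** 2 for x in samples) / (n - 1)) ** 0.5
    return mean - sigma_mult * sd, mean, mean + sigma_mult * sd

def out_of_control(samples, lcl, ucl):
    """Indices of points falling outside the control limits."""
    return [i for i, x in enumerate(samples) if not lcl <= x <= ucl]

# welding-current readings (amps, illustrative) with one obvious excursion
readings = [200, 202, 199, 201, 200, 198, 203, 201, 240, 200]
lcl, cl, ucl = control_limits(readings[:8])   # limits from in-control data
print(out_of_control(readings, lcl, ucl))     # → [8]
```

Flagging out-of-limit points like this is exactly the trigger that a rule-based interpreter, such as the expert system described above for weld monitoring, would then map to candidate equipment or material problems.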
NASA Astrophysics Data System (ADS)
Kardar, Mehran
2006-06-01
While many scientists are familiar with fractals, fewer are familiar with the concepts of scale invariance and universality which underlie the ubiquity of their shapes. These properties may emerge from the collective behaviour of simple fundamental constituents, and are studied using statistical field theories. Based on lectures for a course in statistical mechanics taught by Professor Kardar at the Massachusetts Institute of Technology, this textbook demonstrates how such theories are formulated and studied. Perturbation theory, exact solutions, renormalization groups, and other tools are employed to demonstrate the emergence of scale invariance and universality, and the non-equilibrium dynamics of interfaces and directed paths in random media are discussed. Ideal for advanced graduate courses in statistical physics, it contains an integrated set of 65 problems, with solutions to selected problems at the end of the book, and features a thorough introduction to the methods of statistical field theory. A complete set of solutions is available to lecturers on a password-protected website at www.cambridge.org/9780521873413.
Statistical Physics of Particles
NASA Astrophysics Data System (ADS)
Kardar, Mehran
2006-06-01
Statistical physics has its origins in attempts to describe the thermal properties of matter in terms of its constituent particles, and has played a fundamental role in the development of quantum mechanics. Based on lectures for a course in statistical mechanics taught by Professor Kardar at the Massachusetts Institute of Technology, this textbook introduces the central concepts and tools of statistical physics. It contains a chapter on probability and related issues such as the central limit theorem and information theory, and covers interacting particles, with an extensive description of the van der Waals equation and its derivation by mean field approximation. It also contains an integrated set of 89 problems, with solutions to selected problems at the end of the book. It will be invaluable for graduate and advanced undergraduate courses in statistical physics. A complete set of solutions is available to lecturers on a password-protected website at www.cambridge.org/9780521873420.
Transportation Statistics Annual Report 1997
Fenn, M.
1997-01-01
This document is the fourth Transportation Statistics Annual Report (TSAR) prepared by the Bureau of Transportation Statistics (BTS) for the President and Congress. As in previous years, it reports on the state of the U.S. transportation system at two levels. First, in Part I, it provides a statistical and interpretive survey of the system—its physical characteristics, its economic attributes, aspects of its use and performance, and the scale and severity of unintended consequences of transportation, such as fatalities and injuries, oil import dependency, and environmental impacts. Part I also explores the state of transportation statistics, and the new needs of the rapidly changing world of transportation. Second, Part II of the report, as in prior years, explores in detail the performance of the U.S. transportation system from the perspective of desired social outcomes or strategic goals. This year, the performance aspect of transportation chosen for thematic treatment is “Mobility and Access,” which complements past TSAR theme sections on “The Economic Performance of Transportation” (1995) and “Transportation and the Environment” (1996). Mobility and access are at the heart of the transportation system’s performance from the user’s perspective. In what ways and to what extent does the geographic freedom provided by transportation enhance personal fulfillment of the nation’s residents and contribute to economic advancement of people and businesses? This broad question underlies many of the topics examined in Part II: What is the current level of personal mobility in the United States, and how does it vary by sex, age, income level, urban or rural location, and over time? What factors explain variations? Has transportation helped improve people’s access to work, shopping, recreational facilities, and medical services, and in what ways and in what locations? How have barriers, such as age, disabilities, or lack of an automobile, affected these
Siméon, Fabrice G.; Wendahl, Matthew T.; Pike, Victor W.
2010-01-01
2-Fluoro-1,3-thiazoles were rapidly and efficiently labeled with no-carrier-added fluorine-18 (t1/2 = 109.7 min) by treatment of readily prepared 2-halo precursors with cyclotron-produced [18F]fluoride ion. The [18F]2-fluoro-1,3-thiazolyl moiety constitutes a new and easily labeled structural motif for prospective molecular imaging radiotracers. PMID:21057601
Selvam, Ashok
2012-01-30
Catholic Healthcare West is now rechristened Dignity Health. Freed from its formal ties with the Roman Catholic Church, it's seeking to expand east by more easily adding hospitals that may have previously been apprehensive about adopting Catholic ethical directives. "I would say our vision has not changed and neither has our mission as being a voice for the voiceless," says Lloyd Dean, left, the system's president and CEO.
Misuse of statistical tests in three decades of psychotherapy research.
Dar, R; Serlin, R C; Omer, H
1994-02-01
This article reviews the misuse of statistical tests in psychotherapy research studies published in the Journal of Consulting and Clinical Psychology in the years 1967-1968, 1977-1978, and 1987-1988. It focuses on 3 major problems in statistical practice: inappropriate uses of null hypothesis tests and p values, neglect of effect size, and inflation of Type I error rate. The impressive frequency of these problems is documented, and changes in statistical practices over the past 3 decades are interpreted in light of trends in psychotherapy research. The article concludes with practical suggestions for rational application of statistical tests.
Analytical interpretation of feed-forward nets outputs after training.
Garrido, L; Gómez, S
1996-03-01
The minimization of the quadratic error criterion which gives rise to the back-propagation algorithm is studied using functional analysis techniques. With them, we easily recover the well-known statistical result which states that the global minimum sought is a function that assigns, to each input pattern, the expected value of its corresponding output patterns. Its application to classification tasks shows that only certain output class representations can be used to obtain the optimal Bayesian decision rule. Finally, our method permits the study of other error criteria, finding, for instance, that absolute-value errors lead to medians instead of mean values.
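The abstract's closing observation, that quadratic error is minimized by the mean while absolute error is minimized by a median, can be checked numerically. The sketch below is a minimal illustration with hypothetical data and a simple grid search, not the paper's functional-analysis argument.

```python
data = [1.0, 2.0, 3.0, 10.0]
candidates = [i * 0.5 for i in range(21)]  # grid over 0.0 .. 10.0

# Constant predictor minimizing the sum of squared errors: the mean
best_sq = min(candidates, key=lambda c: sum((x - c) ** 2 for x in data))
# Constant predictor minimizing the sum of absolute errors: a median
best_abs = min(candidates, key=lambda c: sum(abs(x - c) for x in data))

print(best_sq)   # 4.0, the mean of the data
print(best_abs)  # lies between 2 and 3, the two middle observations
```

Note the outlier 10.0 pulls the squared-error minimizer upward but leaves the absolute-error minimizer in the middle of the sample, which is exactly the mean-versus-median distinction the paper derives.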
A new statistical tool for NOAA local climate studies
NASA Astrophysics Data System (ADS)
Timofeyeva, M. M.; Meyers, J. C.; Hollingshead, A.
2011-12-01
The National Weather Service (NWS) Local Climate Analysis Tool (LCAT) is evolving out of a need to support and enhance the ability of National Oceanic and Atmospheric Administration (NOAA) NWS field offices to efficiently access, manipulate, and interpret local climate data and characterize climate variability and change impacts. LCAT will enable NOAA's staff to conduct regional and local climate studies using state-of-the-art station and reanalysis gridded data and various statistical techniques for climate analysis. The analysis results will be used for climate services to guide local decision makers in weather- and climate-sensitive actions and to deliver information to the general public. LCAT will augment current climate reference materials with information pertinent to the local and regional levels as they apply to diverse variables appropriate to each locality. The LCAT main emphasis is to enable studies of extreme meteorological and hydrological events such as tornadoes, floods, droughts, and severe storms. LCAT will close a very critical gap in NWS local climate services because it will make it possible to address climate variables beyond average temperature and total precipitation. NWS external partners and government agencies will benefit from LCAT outputs that could be easily incorporated into their own analysis and/or delivery systems. To date, we have identified five requirements for local climate: (1) Local impacts of climate change; (2) Local impacts of climate variability; (3) Drought studies; (4) Attribution of severe meteorological and hydrological events; and (5) Climate studies for water resources. The methodologies for the first three requirements will be included in the LCAT first-phase implementation. The local rate of climate change is defined as the slope of the mean trend estimated from an ensemble of three trend techniques: (1) hinge, (2) Optimal Climate Normals (running mean for optimal time periods), (3) exponentially
Interpreting Sky-Averaged 21-cm Measurements
NASA Astrophysics Data System (ADS)
Mirocha, Jordan
2015-01-01
Within the first ~billion years after the Big Bang, the intergalactic medium (IGM) underwent a remarkable transformation, from a uniform sea of cold neutral hydrogen gas to a fully ionized, metal-enriched plasma. Three milestones during this epoch of reionization -- the emergence of the first stars, black holes (BHs), and full-fledged galaxies -- are expected to manifest themselves as extrema in sky-averaged ("global") measurements of the redshifted 21-cm background. However, interpreting these measurements will be complicated by the presence of strong foregrounds and non-trivialities in the radiative transfer (RT) modeling required to make robust predictions. I have developed numerical models that efficiently solve the frequency-dependent radiative transfer equation, which has led to two advances in studies of the global 21-cm signal. First, frequency-dependent solutions facilitate studies of how the global 21-cm signal may be used to constrain the detailed spectral properties of the first stars, BHs, and galaxies, rather than just the timing of their formation. Second, the speed of these calculations allows one to search vast expanses of a currently unconstrained parameter space, while simultaneously characterizing the degeneracies between parameters of interest. I find principally that (1) physical properties of the IGM, such as its temperature and ionization state, can be constrained robustly from observations of the global 21-cm signal without invoking models for the astrophysical sources themselves; (2) translating IGM properties to galaxy properties is challenging, in large part due to frequency-dependent effects. For instance, evolution in the characteristic spectrum of accreting BHs can modify the 21-cm absorption signal at levels accessible to first-generation instruments, but could easily be confused with evolution in the X-ray luminosity-star-formation rate relation. Finally, (3) the independent constraints most likely to aid in the interpretation
R.A. Fisher's contributions to genetical statistics.
Thompson, E A
1990-12-01
R. A. Fisher (1890-1962) was a professor of genetics, and many of his statistical innovations found expression in the development of methodology in statistical genetics. However, whereas his contributions in mathematical statistics are easily identified, in population genetics he shares his preeminence with Sewall Wright (1889-1988) and J. B. S. Haldane (1892-1965). This paper traces some of Fisher's major contributions to the foundations of statistical genetics, and his interactions with Wright and with Haldane which contributed to the development of the subject. With modern technology, both statistical methodology and genetic data are changing. Nonetheless much of Fisher's work remains relevant, and may even serve as a foundation for future research in the statistical analysis of DNA data. For Fisher's work reflects his view of the role of statistics in scientific inference, expressed in 1949: There is no wide or urgent demand for people who will define methods of proof in set theory in the name of improving mathematical statistics. There is a widespread and urgent demand for mathematicians who understand that branch of mathematics known as theoretical statistics, but who are capable also of recognising situations in the real world to which such mathematics is applicable. In recognising features of the real world to which his models and analyses should be applicable, Fisher laid a lasting foundation for statistical inference in genetic analyses. PMID:2085639
College Students' Interpretation of Research Reports on Group Differences: The Tall-Tale Effect
ERIC Educational Resources Information Center
Hogan, Thomas P.; Zaboski, Brian A.; Perry, Tiffany R.
2015-01-01
How does the student untrained in advanced statistics interpret results of research that reports a group difference? In two studies, statistically untrained college students were presented with abstracts or professional associations' reports and asked for estimates of scores obtained by the original participants in the studies. These estimates…
Statistical Physics of Fracture
Alava, Mikko; Nukala, Phani K; Zapperi, Stefano
2006-05-01
Disorder and long-range interactions are two of the key components that make material failure an interesting playing field for the application of statistical mechanics. The cornerstone in this respect has been lattice models of fracture, in which a network of elastic beams, bonds, or electrical fuses with random failure thresholds is subjected to an increasing external load. These models describe on a qualitative level the failure processes of real, brittle, or quasi-brittle materials. This has been particularly important in solving the classical engineering problems of material strength: the size dependence of maximum stress and its sample-to-sample statistical fluctuations. At the same time, lattice models pose many new fundamental questions in statistical physics, such as the relation between fracture and phase transitions. Experimental results point to the existence of an intriguing crackling noise in the acoustic emission and of self-affine fractals in the crack surface morphology. Recent advances in computer power have enabled considerable progress in the understanding of such models. Among these partly still controversial issues are the scaling and size effects in material strength and accumulated damage, the statistics of avalanches or bursts of microfailures, and the morphology of the crack surface. Here we present an overview of the results obtained with lattice models for fracture, highlighting the relations with statistical physics theories and more conventional fracture mechanics approaches.
Interpreting Recoil for Undergraduate Students
NASA Astrophysics Data System (ADS)
Elsayed, Tarek A.
2012-04-01
The phenomenon of recoil is usually explained to students in the context of Newton's third law. Typically, when a projectile is fired, the recoil of the launch mechanism is interpreted as a reaction to the ejection of the smaller projectile. The same phenomenon is also interpreted in the context of the conservation of linear momentum, which is closely related to Newton's third law. Since the actual microscopic causes of recoil differ from one problem to another, some students (and teachers) may not be satisfied with understanding recoil through the principles of conservation of linear momentum and Newton's third law. For these students, the origin of the recoil motion should be presented in more depth.
Clinical Interpretation of Genomic Variations.
Sayitoğlu, Müge
2016-09-01
Novel high-throughput sequencing technologies generate large-scale genomic data and are used extensively for disease mapping of monogenic and/or complex disorders, personalized treatment, and pharmacogenomics. Next-generation sequencing is rapidly becoming a routine tool for diagnosis and molecular monitoring of patients to evaluate therapeutic efficiency. The next-generation sequencing platforms generate huge amounts of genetic variation data, and it remains a challenge to interpret the variations that are identified. Such data interpretation needs close collaboration among bioinformaticians, clinicians, and geneticists. There are several problems that must be addressed, such as the generation of new algorithms for mapping and annotation, harmonization of the terminology, correct use of nomenclature, reference genomes for different populations, rare disease variant databases, and clinical reports. PMID:27507302
Paleomicrobiology Data: Authentification and Interpretation.
Drancourt, Michel
2016-06-01
The authenticity of some of the very first works in the field of paleopathology has been questioned, and standards have been progressively established for the experiments and the interpretation of data. Whereas most problems initially arose from the contamination of ancient specimens with modern human DNA, the situation is different in the field of paleomicrobiology, in which the risk for contamination is well-known and adequately managed by any laboratory team with expertise in the routine diagnosis of modern-day infections. Indeed, the exploration of ancient microbiota and pathogens is best done by such laboratory teams, with research directed toward the discovery and implementation of new techniques and the interpretation of data. PMID:27337456
Modelling Metamorphism by Abstract Interpretation
NASA Astrophysics Data System (ADS)
Dalla Preda, Mila; Giacobazzi, Roberto; Debray, Saumya; Coogan, Kevin; Townsend, Gregg M.
Metamorphic malware apply semantics-preserving transformations to their own code in order to foil detection systems based on signature matching. In this paper we consider the problem of automatically extracting metamorphic signatures from such malware. We introduce a semantics for self-modifying code, called phase semantics, and prove its correctness by showing that it is an abstract interpretation of the standard trace semantics. Phase semantics precisely models the metamorphic code behavior by providing a set of traces of programs which correspond to the possible evolutions of the metamorphic code during execution. We show that metamorphic signatures can be automatically extracted by abstract interpretation of the phase semantics, and that regular metamorphism can be modelled as a finite-state automata abstraction of the phase semantics.
Phonological Interpretation into Preordered Algebras
NASA Astrophysics Data System (ADS)
Kubota, Yusuke; Pollard, Carl
We propose a novel architecture for categorial grammar that clarifies the relationship between semantically relevant combinatoric reasoning and semantically inert reasoning that only affects surface-oriented phonological form. To this end, we employ a level of structured phonology that mediates between syntax (abstract combinatorics) and phonology proper (strings). To notate structured phonologies, we employ a lambda calculus analogous to the φ-terms of [8]. However, unlike Oehrle's purely equational φ-calculus, our phonological calculus is inequational, in a way that is strongly analogous to the functional programming language LCF [10]. Like LCF, our phonological terms are interpreted into a Henkin frame of posets, with degree of definedness ('height' in the preorder that interprets the base type) corresponding to degree of pronounceability; only maximal elements are actual strings and therefore fully pronounceable. We illustrate with an analysis (also new) of some complex constituent-order phenomena in Japanese.
Helping Alleviate Statistical Anxiety with Computer Aided Statistical Classes
ERIC Educational Resources Information Center
Stickels, John W.; Dobbs, Rhonda R.
2007-01-01
This study, Helping Alleviate Statistical Anxiety with Computer Aided Statistics Classes, investigated whether undergraduate students' anxiety about statistics changed when statistics was taught using computers compared to the traditional method. Two groups of students were questioned concerning their anxiety about statistics. One group was taught…
Interpreting geological structure using kriging
Mao, N.
1985-07-01
We applied kriging (geostatistics) to interpret the structure of basement rock in Yucca Flat, NTS, from borehole data. The estimation error for 118 data points is 81 m, comparable with that based on both gravity and borehole data. Using digitized topographic data, we tested the kriging results and found that the model validation process (Thomas option) applied to the data gave a fair representation of the overall uncertainty of the kriged values. 5 refs., 6 figs., 2 tabs.
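The abstract's method is geostatistical kriging: estimating a value at an unsampled location as a covariance-weighted combination of nearby observations. The sketch below is a toy one-dimensional simple-kriging example with two hypothetical "boreholes" and an assumed exponential covariance model; it is not the authors' actual Yucca Flat model.

```python
import math

def cov(h, rng=1.0):
    """Exponential covariance model, assumed here purely for illustration."""
    return math.exp(-abs(h) / rng)

# Two known values z at locations x; estimate at x0 = 0.5 between them
x, z, x0 = [0.0, 1.0], [1.0, 2.0], 0.5

# Simple-kriging system C w = c0, solved for two points by Cramer's rule
a, b = cov(0.0), cov(x[1] - x[0])        # data-data covariances
c1, c2 = cov(x0 - x[0]), cov(x0 - x[1])  # data-to-target covariances
det = a * a - b * b
w1 = (a * c1 - b * c2) / det
w2 = (a * c2 - b * c1) / det

estimate = w1 * z[0] + w2 * z[1]
print(round(w1, 4), round(w2, 4))  # symmetric weights, by symmetry of the setup
```

With the target midway between two equally correlated data points, the weights come out equal; irregular borehole geometry, a fitted variogram, and the ordinary-kriging unbiasedness constraint are what a real study such as this one would add.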
University Interpreting: Linguistic Issues for Consideration.
ERIC Educational Resources Information Center
Napier, Jemina
2002-01-01
A study investigated 10 Auslan/English interpreters' use of translation style when interpreting for a university lecture. Results found the interpreters predominantly used a free or literal interpretation approach, but switched between translation styles at particular points of a text, leading to the suggestion of the concept of translational…
What Does It Mean to Teach "Interpretively"?
ERIC Educational Resources Information Center
Dodge, Jennifer; Holtzman, Richard; van Hulst, Merlijn; Yanow, Dvora
2016-01-01
The "interpretive turn" has gained traction as a research approach in recent decades in the empirical social sciences. While the contributions of interpretive research and interpretive research methods are clear, we wonder: Does an interpretive perspective lend itself to--or even demand--a particular style of teaching? This question was…
Code of Federal Regulations, 2012 CFR
2012-01-01
... PROCEEDINGS TO DETERMINE REMOVABILITY OF ALIENS IN THE UNITED STATES Removal Proceedings § 1240.5 Interpreter. Any person acting as an interpreter in a hearing before an immigration judge under this part shall be sworn to interpret and translate accurately, unless the interpreter is an employee of the United...
Code of Federal Regulations, 2014 CFR
2014-01-01
... PROCEEDINGS TO DETERMINE REMOVABILITY OF ALIENS IN THE UNITED STATES Removal Proceedings § 1240.5 Interpreter. Any person acting as an interpreter in a hearing before an immigration judge under this part shall be sworn to interpret and translate accurately, unless the interpreter is an employee of the United...
Code of Federal Regulations, 2011 CFR
2011-01-01
... PROCEEDINGS TO DETERMINE REMOVABILITY OF ALIENS IN THE UNITED STATES Removal Proceedings § 1240.5 Interpreter. Any person acting as an interpreter in a hearing before an immigration judge under this part shall be sworn to interpret and translate accurately, unless the interpreter is an employee of the United...
Code of Federal Regulations, 2010 CFR
2010-01-01
... PROCEEDINGS TO DETERMINE REMOVABILITY OF ALIENS IN THE UNITED STATES Removal Proceedings § 1240.5 Interpreter. Any person acting as an interpreter in a hearing before an immigration judge under this part shall be sworn to interpret and translate accurately, unless the interpreter is an employee of the United...
Code of Federal Regulations, 2013 CFR
2013-01-01
... PROCEEDINGS TO DETERMINE REMOVABILITY OF ALIENS IN THE UNITED STATES Removal Proceedings § 1240.5 Interpreter. Any person acting as an interpreter in a hearing before an immigration judge under this part shall be sworn to interpret and translate accurately, unless the interpreter is an employee of the United...
Two Factors Related to Effective Voice Interpreting.
ERIC Educational Resources Information Center
Hurwitz, T. Alan
1986-01-01
Thirty-two interpreters for the deaf were measured on accuracy and quality of voice interpreting of the same story in two different sign language types: Pidgin Signed English and American Sign Language. Results indicated that previous experience interpreting was significantly related to the effectiveness of voice interpreting both languages.…
Interpreting Inexplicit Language during Courtroom Examination
ERIC Educational Resources Information Center
Lee, Jieun
2009-01-01
Court interpreters are required to provide accurate renditions of witnesses' utterances during courtroom examinations, but the accuracy of interpreting may be compromised for a number of reasons, among which is the effect on interpretation of the limited contextual information available to court interpreters. Based on the analysis of the discourse…
Consistent interpretations of quantum mechanics
Omnes, R.
1992-04-01
Within the last decade, significant progress has been made towards a consistent and complete reformulation of the Copenhagen interpretation (an interpretation consisting of a formulation of the experimental aspects of physics in terms of the basic formalism; it is consistent if free from internal contradiction and complete if it provides precise predictions for all experiments). The main steps involved decoherence (the transition from linear superpositions of macroscopic states to a mixture), Griffiths histories describing the evolution of quantum properties, a convenient logical structure for dealing with histories, and also some progress in semiclassical physics, which was made possible by new methods. The main outcome is a theory of phenomena, viz., the classically meaningful properties of a macroscopic system. It shows in particular how and when determinism is valid. This theory can be used to give a deductive form to measurement theory, which now covers some cases that were initially devised as counterexamples against the Copenhagen interpretation. These theories are described, together with their applications to some key experiments and some of their consequences concerning epistemology.
Suite versus composite statistics
Balsillie, J.H.; Tanner, W.F.
1999-01-01
Suite and composite methodologies, two statistically valid approaches for producing statistical descriptive measures, are investigated for sample groups representing a probability distribution where, in addition, each sample is itself a probability distribution. Suite and composite means (first-moment measures) are always equivalent. Composite standard deviations (second-moment measures) are always larger than suite standard deviations. Suite and composite values for higher-moment measures have more complex relationships. Very seldom, however, are they equivalent, and they normally yield statistically significant but different results. Multiple samples are preferable to single samples (including composites) because they permit the investigator to examine sample-to-sample variability. These and other relationships for suite and composite probability distribution analyses are investigated and reported using granulometric data.
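The mean and standard-deviation relationships described above can be illustrated numerically. The sketch below uses two equal-size hypothetical samples: suite statistics are computed per sample and then averaged, while composite statistics pool all observations, so the composite standard deviation also absorbs the between-sample spread.

```python
import math

def mean(xs):
    return sum(xs) / len(xs)

def pstd(xs):
    """Population standard deviation."""
    m = mean(xs)
    return math.sqrt(sum((x - m) ** 2 for x in xs) / len(xs))

samples = [[1.0, 2.0, 3.0], [7.0, 8.0, 9.0]]

# Suite statistics: compute per sample, then average across samples
suite_mean = mean([mean(s) for s in samples])
suite_std = mean([pstd(s) for s in samples])

# Composite statistics: pool all observations into one distribution
pooled = [x for s in samples for x in s]
comp_mean, comp_std = mean(pooled), pstd(pooled)

print(suite_mean == comp_mean)  # True: the means coincide
print(comp_std > suite_std)     # True: pooling adds between-sample spread
```

Here the two sample means (2 and 8) are far apart, so the composite standard deviation is several times the suite value even though each individual sample is tightly clustered.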
Candidate Assembly Statistical Evaluation
1998-07-15
The Savannah River Site (SRS) receives aluminum-clad spent Material Test Reactor (MTR) fuel from all over the world for storage and eventual reprocessing. There are hundreds of different kinds of MTR fuels, and these fuels will continue to be received at SRS for approximately ten more years. SRS's current criticality evaluation methodology requires the modeling of all MTR fuels utilizing Monte Carlo codes, which is extremely time consuming and resource intensive. Now that a significant number of MTR calculations have been conducted, it is feasible to consider building statistical models that will provide reasonable estimations of MTR behavior. These statistical models can be incorporated into a standardized model homogenization spreadsheet package to provide analysts with a means of performing routine MTR fuel analyses with a minimal commitment of time and resources. This became the purpose for development of the Candidate Assembly Statistical Evaluation (CASE) program at SRS.
NASA Astrophysics Data System (ADS)
Inomata, Akira
1997-03-01
To understand possible physical consequences of quantum deformation, we investigate the statistical behavior of a quon gas. The quon is an object which obeys the minimally deformed commutator (or q-mutator): a a† - q a†a = 1 with -1 ≤ q ≤ 1. Although q = 1 and q = -1 appear to correspond to boson and fermion statistics, respectively, it is not easy to create a gas which unifies the boson gas and the fermion gas. We present a model that interpolates between the two limits. The quon gas shows Bose-Einstein condensation near the boson limit in two dimensions.
Statistical Data Analyses of Trace Chemical, Biochemical, and Physical Analytical Signatures
Udey, Ruth Norma
2013-01-01
Analytical and bioanalytical chemistry measurement results are most meaningful when interpreted using rigorous statistical treatments of the data. The same data set may provide many dimensions of information depending on the questions asked through the applied statistical methods. Three principal projects illustrated the wealth of information gained through the application of statistical data analyses to diverse problems.
ERIC Educational Resources Information Center
Nolan, Meaghan M.; Beran, Tanya; Hecker, Kent G.
2012-01-01
Students with positive attitudes toward statistics are likely to show strong academic performance in statistics courses. Multiple surveys measuring students' attitudes toward statistics exist; however, a comparison of the validity and reliability of interpretations based on their scores is needed. A systematic review of relevant electronic…
At 11 months, prosody still outranks statistics.
Johnson, Elizabeth K; Seidl, Amanda H
2009-01-01
English-learning 7.5-month-olds are heavily biased to perceive stressed syllables as word onsets. By 11 months, however, infants begin segmenting non-initially stressed words from speech. Using the same artificial language methodology as Johnson and Jusczyk (2001), we explored the possibility that the emergence of this ability is linked to a decreased reliance on prosodic cues to word boundaries accompanied by an increased reliance on syllable distribution cues. In a baseline study, where only statistical cues to word boundaries were present, infants exhibited a familiarity preference for statistical words. When conflicting stress cues were added to the speech stream, infants exhibited a familiarity preference for stress as opposed to statistical words. This was interpreted as evidence that 11-month-olds weight stress cues to word boundaries more heavily than statistical cues. Experiment 2 further investigated these results with a language containing convergent cues to word boundaries. The results of Experiment 2 were not conclusive. A third experiment using new stimuli and a different experimental design supported the conclusion that 11-month-olds rely more heavily on prosodic than statistical cues to word boundaries. We conclude that the emergence of the ability to segment non-initially stressed words from speech is not likely to be tied to an increased reliance on syllable distribution cues relative to stress cues, but instead may emerge due to an increased reliance on and integration of a broad array of segmentation cues. PMID:19120421
Pocock, Stuart J; McMurray, John J V; Collier, Tim J
2015-12-15
This paper tackles several statistical controversies that are commonly faced when reporting a major clinical trial. Topics covered include: multiplicity of data, interpreting secondary endpoints and composite endpoints, the value of covariate adjustment, the traumas of subgroup analysis, assessing individual benefits and risks, alternatives to analysis by intention to treat, interpreting surprise findings (good and bad), and the overall quality of clinical trial reports. All is put in the context of topical cardiology trial examples and is geared to help trialists steer a wise course in their statistical reporting, thereby giving readers a balanced account of trial findings. PMID:26670066
FIR statistics of paired galaxies
NASA Technical Reports Server (NTRS)
Sulentic, Jack W.
1990-01-01
Much progress has been made in understanding the effects of interaction on galaxies (see reviews in this volume by Heckman and Kennicutt). Evidence for enhanced emission from galaxies in pairs first emerged in the radio (Sulentic 1976) and optical (Larson and Tinsley 1978) domains. Results in the far infrared (FIR) lagged behind until the advent of the Infrared Astronomy Satellite (IRAS). The last five years have seen numerous FIR studies of optical and IR selected samples of interacting galaxies (e.g., Cutri and McAlary 1985; Joseph and Wright 1985; Kennicutt et al. 1987; Haynes and Herter 1988). Despite all of this work, there are still contradictory ideas about the level and, even, the reality of an FIR enhancement in interacting galaxies. Much of the confusion originates in differences between the galaxy samples that were studied (i.e., optical morphology and redshift coverage). Here, the authors report on a study of the FIR detection properties for a large sample of interacting galaxies and a matching control sample. They focus on the distance independent detection fraction (DF) statistics of the sample. The results prove useful in interpreting the previously published work. A clarification of the phenomenology provides valuable clues about the physics of the FIR enhancement in galaxies.
Röper, Stefanie; Franz, M Heiko; Wartchow, Rudolf; Hoffmann, H Martin R
2003-06-13
Solvolysis of the C9-mesylated cinchonidine 1-OMs and cinchonine 2-OMs in MeOH, EtOH, and CF(3)CH(2)OH as solvents affords ring-expanded 1-azabicyclo[3.2.2]nonanes oxygenated at carbon C3 ("second Cinchona rearrangement"). The newly introduced substituents at C3 and the neighboring quinolyl group Q' at C2 adopt quasiequatorial positions. The derived 1-azabicyclo[3.2.2]nonan-3-ones 5 and 6 are easily equilibrated. On contact with MeOD, uptake of deuterium takes place at room temperature.
Statistical model with a standard Γ distribution
NASA Astrophysics Data System (ADS)
Patriarca, Marco; Chakraborti, Anirban; Kaski, Kimmo
2004-07-01
We study a statistical model consisting of N basic units which interact with each other by exchanging a physical entity, according to a given microscopic random law depending on a parameter λ. We focus on the equilibrium or stationary distribution of the entity exchanged and verify through numerical fitting of the simulation data that the final form of the equilibrium distribution is that of a standard Gamma distribution. The model can be interpreted as a simple closed economy in which economic agents trade money and a saving criterion is fixed by the saving propensity λ. Alternatively, from the nature of the equilibrium distribution, we show that the model can also be interpreted as a perfect gas at an effective temperature T(λ), where particles exchange energy in a space with an effective dimension D(λ).
Statistical insight: a review.
Vardell, Emily; Garcia-Barcena, Yanira
2012-01-01
Statistical Insight is a database that offers the ability to search across multiple sources of data, including the federal government, private organizations, research centers, and international intergovernmental organizations in one search. Two sample searches on the same topic, a basic and an advanced, were conducted to evaluate the database.
Pilot Class Testing: Statistics.
ERIC Educational Resources Information Center
Washington Univ., Seattle. Washington Foreign Language Program.
Statistics derived from test score data from the pilot classes participating in the Washington Foreign Language Program are presented in tables in this report. An index accompanies the tables, itemizing the classes by level (FLES, middle, and high school), grade, test, language skill, and school. MLA-Coop test performances for each class were…
Statistical Reasoning over Lunch
ERIC Educational Resources Information Center
Selmer, Sarah J.; Bolyard, Johnna J.; Rye, James A.
2011-01-01
Students in the 21st century are exposed daily to a staggering amount of numerically infused media. In this era of abundant numeric data, students must be able to engage in sound statistical reasoning when making life decisions after exposure to varied information. The context of nutrition can be used to engage upper elementary and middle school…
Selected Outdoor Recreation Statistics.
ERIC Educational Resources Information Center
Bureau of Outdoor Recreation (Dept. of Interior), Washington, DC.
In this recreational information report, 96 tables are compiled from Bureau of Outdoor Recreation programs and surveys, other governmental agencies, and private sources. Eight sections comprise the document: (1) The Bureau of Outdoor Recreation, (2) Federal Assistance to Recreation, (3) Recreation Surveys for Planning, (4) Selected Statistics of…
ASURV: Astronomical SURVival Statistics
NASA Astrophysics Data System (ADS)
Feigelson, E. D.; Nelson, P. I.; Isobe, T.; LaValley, M.
2014-06-01
ASURV (Astronomical SURVival Statistics) provides astronomy survival analysis for right- and left-censored data including the maximum-likelihood Kaplan-Meier estimator and several univariate two-sample tests, bivariate correlation measures, and linear regressions. ASURV is written in FORTRAN 77, and is stand-alone and does not call any specialized libraries.
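The Kaplan-Meier product-limit estimator at the core of ASURV's univariate routines is compact enough to sketch. The following is a minimal Python illustration for right-censored data, not a translation of the FORTRAN 77 code:

```python
def kaplan_meier(times, observed):
    """Product-limit survival estimate; observed[i] is False for censored points."""
    data = sorted(zip(times, observed))
    n_at_risk = len(data)
    surv, curve = 1.0, []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = at_t = 0
        while i < len(data) and data[i][0] == t:
            at_t += 1
            deaths += data[i][1]           # True (an observed event) counts as 1
            i += 1
        if deaths:                         # censored-only times leave the curve flat
            surv *= 1 - deaths / n_at_risk
            curve.append((t, surv))
        n_at_risk -= at_t
    return curve

# toy data: the observations at t = 2 and t = 4 include censored points
curve = kaplan_meier([1, 2, 2, 3, 4, 5], [True, True, False, True, False, True])
```

Each factor 1 - d/n discounts the survival probability by the fraction of at-risk objects that "died" at that time; censored points only shrink the risk set.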
Statistics for Learning Genetics
ERIC Educational Resources Information Center
Charles, Abigail Sheena
2012-01-01
This study investigated the knowledge and skills that biology students may need to help them understand statistics/mathematics as it applies to genetics. The data are based on analyses of current representative genetics texts, practicing genetics professors' perspectives, and more directly, students' perceptions of, and performance in,…
Spitball Scatterplots in Statistics
ERIC Educational Resources Information Center
Wagaman, John C.
2012-01-01
This paper describes an active learning idea that I have used in my applied statistics class as a first lesson in correlation and regression. Students propel spitballs from various standing distances from the target and use the recorded data to determine if the spitball accuracy is associated with standing distance and review the algebra of lines…
Geopositional Statistical Methods
NASA Technical Reports Server (NTRS)
Ross, Kenton
2006-01-01
RMSE-based methods distort circular error estimates (up to 50% overestimation). The empirical approach is the only statistically unbiased estimator offered. The Ager modification to the Shultz approach is nearly unbiased but cumbersome. All methods hover around 20% uncertainty (at 95% confidence) for low geopositional bias error estimates. This requires careful consideration in the assessment of higher-accuracy products.
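The contrast between RMSE-based and empirical circular error estimates can be illustrated with a toy Monte Carlo. This is a sketch, not the estimators compared in the talk: it applies the standard circular-normal factor sqrt(-2 ln(1-p)) to the pooled RMSE and compares it with the direct percentile of the radial errors.

```python
import math, random

def empirical_ce(errors, p=0.95):
    """p-th percentile of the radial error: the empirical circular error."""
    r = sorted(math.hypot(x, y) for x, y in errors)
    return r[int(p * len(r)) - 1]

def rmse_ce(errors, p=0.95):
    """Circular-normal formula applied to pooled RMSE (assumes isotropic errors)."""
    n = len(errors)
    rmse_x = math.sqrt(sum(x * x for x, _ in errors) / n)
    rmse_y = math.sqrt(sum(y * y for _, y in errors) / n)
    sigma_c = math.sqrt((rmse_x ** 2 + rmse_y ** 2) / 2)
    return math.sqrt(-2 * math.log(1 - p)) * sigma_c

random.seed(0)
circular = [(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(200_000)]
elongated = [(random.gauss(0, 1), random.gauss(0, 0.2)) for _ in range(200_000)]
```

For the circular cloud the two estimates agree; for the elongated cloud the RMSE formula misestimates the 95% circle, while the empirical percentile remains correct by construction.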
ERIC Educational Resources Information Center
Akram, Muhammad; Siddiqui, Asim Jamal; Yasmeen, Farah
2004-01-01
In order to learn the concepts of statistical techniques, one needs to run real experiments that generate reliable data. In practice, data from a well-defined process or system are very costly and time consuming to obtain. It is difficult to run real experiments during the teaching period in the university. To overcome these difficulties, statisticians…
Education Statistics Quarterly, 2003.
ERIC Educational Resources Information Center
Marenus, Barbara; Burns, Shelley; Fowler, William; Greene, Wilma; Knepper, Paula; Kolstad, Andrew; McMillen Seastrom, Marilyn; Scott, Leslie
2003-01-01
This publication provides a comprehensive overview of work done across all parts of the National Center for Education Statistics (NCES). Each issue contains short publications, summaries, and descriptions that cover all NCES publications and data products released in a 3-month period. Each issue also contains a message from the NCES on a timely…
Analogies for Understanding Statistics
ERIC Educational Resources Information Center
Hocquette, Jean-Francois
2004-01-01
This article describes a simple way to explain the limitations of statistics to scientists and students to avoid the publication of misleading conclusions. Biologists examine their results extremely critically and carefully choose the appropriate analytic methods depending on their scientific objectives. However, no such close attention is usually…
Statistical Significance Testing.
ERIC Educational Resources Information Center
McLean, James E., Ed.; Kaufman, Alan S., Ed.
1998-01-01
The controversy about the use or misuse of statistical significance testing has become the major methodological issue in educational research. This special issue contains three articles that explore the controversy, three commentaries on these articles, an overall response, and three rejoinders by the first three authors. They are: (1)…
A basic introduction to statistics for the orthopaedic surgeon.
Bertrand, Catherine; Van Riet, Roger; Verstreken, Frederik; Michielsen, Jef
2012-02-01
Orthopaedic surgeons should review the orthopaedic literature in order to keep pace with the latest insights and practices. A good understanding of basic statistical principles is of crucial importance to the ability to read articles critically, to interpret results and to arrive at correct conclusions. This paper explains some of the key concepts in statistics, including hypothesis testing, Type I and Type II errors, testing of normality, sample size and p values. PMID:22523921
Interpretation of a compositional time series
NASA Astrophysics Data System (ADS)
Tolosana-Delgado, R.; van den Boogaart, K. G.
2012-04-01
Common methods for multivariate time series analysis use linear operations, from the definition of a time-lagged covariance/correlation to the prediction of new outcomes. However, when the time series response is a composition (a vector of positive components showing the relative importance of a set of parts in a total, like percentages and proportions), linear operations are afflicted by several problems. For instance, it has long been recognised that (auto/cross-)correlations between raw percentages are spurious, depending more on which other components are being considered than on any natural link between the components of interest. Also, a long-term forecast of a composition in models with a linear trend will ultimately predict negative components. In general terms, compositional data should not be treated on a raw scale, but after a log-ratio transformation (Aitchison, 1986: The Statistical Analysis of Compositional Data. Chapman and Hall). This is so because the information conveyed by compositional data is relative, as stated in their definition. The principle of working in coordinates allows one to apply any sort of multivariate analysis to a log-ratio transformed composition, as long as this transformation is invertible. This principle is fully applicable to time series analysis. We discuss how results (both auto/cross-correlation functions and predictions) can be back-transformed, viewed and interpreted in a meaningful way. One view is to use the exhaustive set of all possible pairwise log-ratios, which allows one to express the results as D(D - 1)/2 separate, interpretable sets of one-dimensional models showing the behaviour of each possible pairwise log-ratio. Another view is the interpretation of estimated coefficients or correlations back-transformed in terms of compositions. These two views are compatible and complementary. These issues are illustrated with time series of seasonal precipitation patterns at different rain gauges of the USA.
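The pairwise log-ratio view is simple to sketch. In the snippet below (part names are invented for illustration), the key property is that the log-ratios are unaffected by closure or rescaling of the composition, which is exactly why the spurious-correlation problem of raw percentages disappears in this representation.

```python
import math
from itertools import combinations

def pairwise_logratios(comp):
    """All D(D-1)/2 pairwise log-ratios: the scale-invariant view of a composition."""
    return {(a, b): math.log(comp[a] / comp[b]) for a, b in combinations(comp, 2)}

comp = {'sand': 60.0, 'silt': 30.0, 'clay': 10.0}        # raw percentages
total = sum(comp.values())
closed = {k: v / total for k, v in comp.items()}          # re-closed to sum to 1

lr_raw = pairwise_logratios(comp)
lr_closed = pairwise_logratios(closed)
# identical: log-ratios ignore the arbitrary closure constant
```

For D = 3 parts there are exactly 3 one-dimensional log-ratio series, each of which can be modelled and back-transformed independently.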
Method of representation of remote sensing data that facilitates visual interpretation
NASA Astrophysics Data System (ADS)
Sheremetyeva, T. A.
2004-06-01
We present a method of visualizing remote sensing data that allows quick synthesis of heterogeneous data for interpretation by a human operator. The method is suitable for processing images from a single optical band as well as multizonal and hyperspectral aerial imagery. It allows a priori knowledge to be used in the visualization, and different methods of preliminary image processing can also easily be included in the model. As a result, a number of alternative visualizations of the same dataset can be obtained, depending on the interpretation objectives. The method is particularly efficient for the interpretation of barely visible objects: it makes it possible to reduce the influence of the particular remote sensing conditions, such as illumination and the optical receiver, on the results of visual interpretation.
Weighted order statistic classifiers with large rank-order margin.
Porter, R. B.; Hush, D. R.; Theiler, J. P.; Gokhale, M.
2003-01-01
We describe how Stack Filter and Weighted Order Statistic function classes can be used for classification problems. This leads to a new design criterion for linear classifiers when inputs are binary-valued and weights are positive. We present a rank-based measure of margin that can be directly optimized as a standard linear program, and investigate its effect on generalization error experimentally. Our approach can robustly combine large numbers of base hypotheses and easily implement known priors through regularization.
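The connection the abstract relies on can be checked directly: a weighted order statistic applied to binary inputs with positive integer weights is exactly a linear threshold function. The brute-force sketch below (weights and trial counts are illustrative) verifies the equivalence.

```python
import random

def wos(x, weights, rank):
    """Weighted order statistic: the rank-th largest value after replicating
    each input x[i] weights[i] times (positive integer weights)."""
    expanded = [xi for xi, wi in zip(x, weights) for _ in range(wi)]
    return sorted(expanded, reverse=True)[rank - 1]

def linear_threshold(x, weights, rank):
    """For binary inputs: output 1 iff the weighted sum of ones reaches the rank."""
    return 1 if sum(wi for xi, wi in zip(x, weights) if xi) >= rank else 0

random.seed(7)
weights = [3, 1, 4, 2, 5]
agree = all(
    wos(x, weights, rank) == linear_threshold(x, weights, rank)
    for _ in range(500)
    for x in [[random.randint(0, 1) for _ in weights]]
    for rank in range(1, sum(weights) + 1)
)
```

Since the replicated binary vector sorts all ones to the front, the rank-th largest entry is 1 precisely when the weighted count of ones is at least the rank, which is the linear-classifier reading of the same function.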
Feed analyses and their interpretation.
Hall, Mary Beth
2014-11-01
Compositional analysis is central to determining the nutritional value of feedstuffs for use in ration formulation. The utility of the values and how they should be used depends on how representative the feed subsample is, the nutritional relevance and analytical variability of the assays, and whether an analysis is suitable to be applied to a particular feedstuff. Commercial analyses presently available for carbohydrates, protein, and fats have improved nutritionally pertinent description of feed fractions. Factors affecting interpretation of feed analyses and the nutritional relevance and application of currently available analyses are discussed.
NASA Astrophysics Data System (ADS)
Maccone, C.
In this paper is provided the statistical generalization of the Fermi paradox. The statistics of habitable planets may be based on a set of ten (and possibly more) astrobiological requirements first pointed out by Stephen H. Dole in his book Habitable planets for man (1964). The statistical generalization of the original and by now too simplistic Dole equation is provided by replacing a product of ten positive numbers by the product of ten positive random variables. This is denoted the SEH, an acronym standing for “Statistical Equation for Habitables”. The proof in this paper is based on the Central Limit Theorem (CLT) of Statistics, stating that the sum of any number of independent random variables, each of which may be ARBITRARILY distributed, approaches a Gaussian (i.e. normal) random variable (Lyapunov form of the CLT). It is then shown that: 1. The new random variable NHab, yielding the number of habitables (i.e. habitable planets) in the Galaxy, follows the log-normal distribution. By construction, the mean value of this log-normal distribution is the total number of habitable planets as given by the statistical Dole equation. 2. The ten (or more) astrobiological factors are now positive random variables. The probability distribution of each random variable may be arbitrary. The CLT in the so-called Lyapunov or Lindeberg forms (that both do not assume the factors to be identically distributed) allows for that. In other words, the CLT "translates" into the SEH by allowing an arbitrary probability distribution for each factor. This is both astrobiologically realistic and useful for any further investigations. 3. By applying the SEH it is shown that the (average) distance between any two nearby habitable planets in the Galaxy may be shown to be inversely proportional to the cubic root of NHab. This distance is denoted by new random variable D. The relevant probability density function is derived, which was named the "Maccone distribution" by Paul Davies in
The Statistical Drake Equation
NASA Astrophysics Data System (ADS)
Maccone, Claudio
2010-12-01
We provide the statistical generalization of the Drake equation. From a simple product of seven positive numbers, the Drake equation is now turned into the product of seven positive random variables. We call this "the Statistical Drake Equation". The mathematical consequences of this transformation are then derived. The proof of our results is based on the Central Limit Theorem (CLT) of Statistics. In loose terms, the CLT states that the sum of any number of independent random variables, each of which may be ARBITRARILY distributed, approaches a Gaussian (i.e. normal) random variable. This is called the Lyapunov Form of the CLT, or the Lindeberg Form of the CLT, depending on the mathematical constraints assumed on the third moments of the various probability distributions. In conclusion, we show that: The new random variable N, yielding the number of communicating civilizations in the Galaxy, follows the LOGNORMAL distribution. Then, as a consequence, the mean value of this lognormal distribution is the ordinary N in the Drake equation. The standard deviation, mode, and all the moments of this lognormal N are also found. The seven factors in the ordinary Drake equation now become seven positive random variables. The probability distribution of each random variable may be ARBITRARY. The CLT in the so-called Lyapunov or Lindeberg forms (that both do not assume the factors to be identically distributed) allows for that. In other words, the CLT "translates" into our statistical Drake equation by allowing an arbitrary probability distribution for each factor. This is both physically realistic and practically very useful, of course. An application of our statistical Drake equation then follows. The (average) DISTANCE between any two neighboring and communicating civilizations in the Galaxy may be shown to be inversely proportional to the cubic root of N. Then, in our approach, this distance becomes a new random variable. We derive the relevant probability density
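The CLT argument can be illustrated numerically: the log of a product of independent positive factors is a sum of independent terms, so it approaches a Gaussian even when the factors follow different distributions. The sketch below uses seven arbitrarily chosen factor distributions (invented for illustration, not Drake-equation estimates) and compares skewness before and after taking logs.

```python
import math, random

random.seed(42)

# seven positive factors -- deliberately arbitrary and NOT identically distributed
factors = [
    lambda: random.uniform(1, 10),
    lambda: random.uniform(0.1, 1),
    lambda: random.expovariate(1.0) + 0.01,
    lambda: random.betavariate(2, 5) + 0.01,
    lambda: random.lognormvariate(0, 0.5),
    lambda: random.uniform(0.5, 2),
    lambda: random.triangular(0.1, 5, 1),
]

def draw_product():
    p = 1.0
    for f in factors:
        p *= f()
    return p

samples = [draw_product() for _ in range(100_000)]
logs = [math.log(s) for s in samples]

def skewness(xs):
    n = len(xs)
    m = sum(xs) / n
    s2 = sum((x - m) ** 2 for x in xs) / n
    s3 = sum((x - m) ** 3 for x in xs) / n
    return s3 / s2 ** 1.5

# log N is near-Gaussian (small skewness), while N itself is strongly
# right-skewed: the signature of a lognormal variable
log_skew, raw_skew = skewness(logs), skewness(samples)
```

The near-symmetric log together with the heavily right-skewed raw product is exactly the lognormal behaviour the paper derives for N.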
Ducci, Daniela; de Melo, M Teresa Condesso; Preziosi, Elisabetta; Sellerino, Mariangela; Parrone, Daniele; Ribeiro, Luis
2016-11-01
The natural background level (NBL) concept is revisited and combined with indicator kriging method to analyze the spatial distribution of groundwater quality within a groundwater body (GWB). The aim is to provide a methodology to easily identify areas with the same probability of exceeding a given threshold (which may be a groundwater quality criteria, standards, or recommended limits for selected properties and constituents). Three case studies with different hydrogeological settings and located in two countries (Portugal and Italy) are used to derive NBL using the preselection method and validate the proposed methodology illustrating its main advantages over conventional statistical water quality analysis. Indicator kriging analysis was used to create probability maps of the three potential groundwater contaminants. The results clearly indicate the areas within a groundwater body that are potentially contaminated because the concentrations exceed the drinking water standards or even the local NBL, and cannot be justified by geogenic origin. The combined methodology developed facilitates the management of groundwater quality because it allows for the spatial interpretation of NBL values. PMID:27371772
Statistical Physics of Hard Optimization Problems
NASA Astrophysics Data System (ADS)
Zdeborová, Lenka
2008-06-01
Optimization is fundamental in many areas of science, from computer science and information theory to engineering and statistical physics, as well as to biology or social sciences. It typically involves a large number of variables and a cost function depending on these variables. Optimization problems in the NP-complete class are particularly difficult: it is believed that the number of operations required to minimize the cost function is, in the most difficult cases, exponential in the system size. However, even in an NP-complete problem the practically arising instances might, in fact, be easy to solve. The principal question we address in this thesis is: How do we recognize whether an NP-complete constraint satisfaction problem is typically hard, and what are the main reasons for this? We adopt approaches from the statistical physics of disordered systems, in particular the cavity method developed originally to describe glassy systems. We describe new properties of the space of solutions in two of the most studied constraint satisfaction problems - random satisfiability and random graph coloring. We suggest a relation between the existence of the so-called frozen variables and the algorithmic hardness of a problem. Based on these insights, we introduce a new class of problems which we named "locked" constraint satisfaction, where the statistical description is easily solvable, but from the algorithmic point of view they are even more challenging than the canonical satisfiability.
Statistical physics of hard optimization problems
NASA Astrophysics Data System (ADS)
Zdeborová, Lenka
2009-06-01
Optimization is fundamental in many areas of science, from computer science and information theory to engineering and statistical physics, as well as to biology or social sciences. It typically involves a large number of variables and a cost function depending on these variables. Optimization problems in the non-deterministic polynomial (NP)-complete class are particularly difficult: it is believed that the number of operations required to minimize the cost function is, in the most difficult cases, exponential in the system size. However, even in an NP-complete problem the practically arising instances might, in fact, be easy to solve. The principal question we address in this article is: How do we recognize whether an NP-complete constraint satisfaction problem is typically hard, and what are the main reasons for this? We adopt approaches from the statistical physics of disordered systems, in particular the cavity method developed originally to describe glassy systems. We describe new properties of the space of solutions in two of the most studied constraint satisfaction problems - random satisfiability and random graph coloring. We suggest a relation between the existence of the so-called frozen variables and the algorithmic hardness of a problem. Based on these insights, we introduce a new class of problems which we named "locked" constraint satisfaction, where the statistical description is easily solvable, but from the algorithmic point of view they are even more challenging than the canonical satisfiability.
Statistical aspects of food safety sampling.
Jongenburger, I; den Besten, H M W; Zwietering, M H
2015-01-01
In food safety management, sampling is an important tool for verifying control. Sampling by nature is a stochastic process. However, uncertainty regarding results is made even greater by the uneven distribution of microorganisms in a batch of food. This article reviews statistical aspects of sampling and describes the impact of distributions on the sampling results. Five different batch contamination scenarios are illustrated: a homogeneous batch, a heterogeneous batch with high- or low-level contamination, and a batch with localized high- or low-level contamination. These batch contamination scenarios showed that sampling results have to be interpreted carefully, especially when heterogeneous and localized contamination in food products is expected.
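When sample units are drawn at random, the effect of the batch scenarios above reduces to simple binomial arithmetic. The contamination fractions below are illustrative, not taken from the article.

```python
import math

def p_detect(defect_fraction, n_samples):
    """Chance that at least one of n randomly drawn sample units is contaminated."""
    return 1 - (1 - defect_fraction) ** n_samples

def n_for_confidence(defect_fraction, confidence=0.95):
    """Sample units needed to detect the contamination with the given confidence."""
    return math.ceil(math.log(1 - confidence) / math.log(1 - defect_fraction))

# same batch size, contamination spread differently through the batch:
widespread = p_detect(0.05, 10)    # heterogeneous low-level: 5% of units affected
localized = p_detect(0.001, 10)    # localized hot spot: 0.1% of units affected
```

With 10 samples, the widespread case is caught about 40% of the time but the localized hot spot only about 1% of the time, and nearly 3000 samples would be needed for 95% confidence in the localized case. This is the sense in which sampling results from heterogeneous or localized contamination must be interpreted carefully.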
Statistics for the clinician: diagnostic questionnaires.
Ross, Frederick J
2011-01-01
The goal of this column is to help working clinicians understand the statistical calculations involved in interpreting the results of diagnostic questionnaires. Using the Patient Health Questionnaire-9 as an example, the author explains how to determine the most appropriate cutoffs to choose, depending on the population involved, the probabilities of error (α and β), and the expected losses associated with each kind of error in the context in which the test is being administered. (Journal of Psychiatric Practice. 2011;17:57-60). PMID:21266896
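The cutoff choice the column describes can be sketched as an expected-loss minimization. The sensitivity/specificity table below is invented for illustration and does not reproduce published PHQ-9 operating characteristics.

```python
# hypothetical operating characteristics per candidate cutoff:
# cutoff -> (sensitivity, specificity); NOT published PHQ-9 values
operating = {8: (0.95, 0.75), 10: (0.88, 0.85), 12: (0.78, 0.92), 15: (0.60, 0.97)}

def expected_loss(sens, spec, prevalence, loss_fn, loss_fp):
    """Average cost per person screened: missed cases plus false alarms."""
    return (prevalence * (1 - sens) * loss_fn
            + (1 - prevalence) * (1 - spec) * loss_fp)

def best_cutoff(prevalence, loss_fn, loss_fp):
    """Cutoff minimizing the expected loss for this population and loss structure."""
    return min(operating,
               key=lambda c: expected_loss(*operating[c], prevalence, loss_fn, loss_fp))
```

In a high-prevalence clinic where a missed case costs ten times a false positive, the most sensitive (lowest) cutoff wins; in a low-prevalence screening setting with equal losses, the most specific (highest) cutoff wins.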
ERIC Educational Resources Information Center
van Dijk, Rick; Boers, Eveline; Christoffels, Ingrid; Hermans, Daan
2011-01-01
The quality of interpretations produced by sign language interpreters was investigated. Twenty-five experienced interpreters were instructed to interpret narratives from (a) spoken Dutch to Sign Language of the Netherlands (SLN), (b) spoken Dutch to Sign Supported Dutch (SSD), and (c) SLN to spoken Dutch. The quality of the interpreted narratives…
Festing, Michael F. W.
2014-01-01
The safety of chemicals, drugs, novel foods and genetically modified crops is often tested using repeat-dose sub-acute toxicity tests in rats or mice. It is important to avoid misinterpretation of the results, as these tests are used to help determine safe exposure levels in humans. Treated and control groups are compared for a range of haematological, biochemical and other biomarkers which may indicate tissue damage or other adverse effects. However, the statistical analysis and presentation of such data pose problems due to the large number of statistical tests involved. Often, it is not clear whether a “statistically significant” effect is real or a false positive (type I error) due to sampling variation. Authors' conclusions appear to be reached somewhat subjectively from the pattern of statistical significances, discounting those they judge to be type I errors and ignoring any biomarker where the p-value is greater than p = 0.05. However, by using standardised effect sizes (SESs), a range of graphical methods and an overall assessment of the mean absolute response can be made. The approach is an extension, not a replacement, of existing methods, intended to assist toxicologists and regulators in interpreting the results. Here, the SES analysis has been applied to data from nine published sub-acute toxicity tests in order to compare the findings with those of the original authors. Line plots, box plots and bar plots show the pattern of response, and dose-response relationships are easily seen. A “bootstrap” test compares the mean absolute differences across dose groups. In four out of seven papers where the no observed adverse effect level (NOAEL) was estimated by the authors, it was set too high according to the bootstrap test, suggesting that possible toxicity is under-estimated. PMID:25426843
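The SES computation itself is straightforward: each biomarker's group difference is expressed in pooled-standard-deviation units, and the mean absolute SES gives an overall measure of response. The sketch below runs on synthetic data; the biomarker names, group sizes and the simulated shift are invented.

```python
import math, random

def ses(control, treated):
    """Standardised effect size: group mean difference in pooled-SD units."""
    n1, n2 = len(control), len(treated)
    m1, m2 = sum(control) / n1, sum(treated) / n2
    v1 = sum((x - m1) ** 2 for x in control) / (n1 - 1)
    v2 = sum((x - m2) ** 2 for x in treated) / (n2 - 1)
    pooled_sd = math.sqrt(((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2))
    return (m2 - m1) / pooled_sd

random.seed(3)
n = 50
control = {'ALT': [random.gauss(40, 5) for _ in range(n)],
           'AST': [random.gauss(80, 10) for _ in range(n)],
           'glucose': [random.gauss(5.5, 0.5) for _ in range(n)]}
# simulated treatment effect: ALT shifted by one SD, the others unchanged
treated = {'ALT': [random.gauss(45, 5) for _ in range(n)],
           'AST': [random.gauss(80, 10) for _ in range(n)],
           'glucose': [random.gauss(5.5, 0.5) for _ in range(n)]}

effects = {k: ses(control[k], treated[k]) for k in control}
mean_abs_ses = sum(abs(v) for v in effects.values()) / len(effects)
```

Because every biomarker is on the same dimensionless scale, the effects can be plotted side by side and summarised by a single mean absolute response, which is the quantity the bootstrap test compares across dose groups.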
Conflicting Interpretations of Scientific Pedagogy
NASA Astrophysics Data System (ADS)
Galamba, Arthur
2016-05-01
Not surprisingly historical studies have suggested that there is a distance between concepts of teaching methods, their interpretations and their actual use in the classroom. This issue, however, is not always pitched to the personal level in historical studies, which may provide an alternative insight on how teachers conceptualise and engage with concepts of teaching methods. This article provides a case study on this level of conceptualisation by telling the story of Rómulo de Carvalho, an educator from mid-twentieth century Portugal, who for over 40 years engaged with the heuristic and Socratic methods. The overall argument is that concepts of teaching methods are open to different interpretations and are conceptualised within the melting pot of external social pressures and personal teaching preferences. The practice and thoughts of Carvalho about teaching methods are scrutinised to unveil his conflicting stances: Carvalho was a man able to question the tenets of heurism, but who publicly praised the heurism-like "discovery learning" method years later. The first part of the article contextualises the arrival of heurism in Portugal and how Carvalho attacked its philosophical tenets. In the second part, it dwells on his conflicting positions in relation to pupil-centred approaches. The article concludes with an appreciation of the embedded conflicting nature of the appropriation of concepts of teaching methods, and of Carvalho's contribution to the development of the philosophy of practical work in school science.
The interpretation of selection coefficients.
Barton, N H; Servedio, M R
2015-05-01
Evolutionary biologists have an array of powerful theoretical techniques that can accurately predict changes in the genetic composition of populations. Changes in gene frequencies and genetic associations between loci can be tracked as they respond to a wide variety of evolutionary forces. However, it is often less clear how to decompose these various forces into components that accurately reflect the underlying biology. Here, we present several issues that arise in the definition and interpretation of selection and selection coefficients, focusing on insights gained through the examination of selection coefficients in multilocus notation. Using this notation, we discuss how its flexibility-which allows different biological units to be identified as targets of selection-is reflected in the interpretation of the coefficients that the notation generates. In many situations, it can be difficult to agree on whether loci can be considered to be under "direct" versus "indirect" selection, or to quantify this selection. We present arguments for what the terms direct and indirect selection might best encompass, considering a range of issues, from viability and sexual selection to kin selection. We show how multilocus notation can discriminate between direct and indirect selection, and describe when it can do so. PMID:25790030
Smart Interpretation - Application of Machine Learning in Geological Interpretation of AEM Data
NASA Astrophysics Data System (ADS)
Bach, T.; Gulbrandsen, M. L.; Jacobsen, R.; Pallesen, T. M.; Jørgensen, F.; Høyer, A. S.; Hansen, T. M.
2015-12-01
When using airborne geophysical measurements in e.g. groundwater mapping, an overwhelming amount of data is collected. Increasingly larger survey areas, denser data collection and limited resources combine into a growing problem: building geological models that use all the available data in a manner consistent with the geologist's knowledge about the geology of the survey area. In the ERGO project, funded by The Danish National Advanced Technology Foundation, we address this problem by developing new, usable tools enabling the geologist to utilize her geological knowledge directly in the interpretation of the AEM data, and thereby handle the large amount of data. In the project we have developed the mathematical basis for capturing geological expertise in a statistical model, and on this basis we have implemented new algorithms that have been operationalized and embedded in user-friendly software. In this software, the machine learning algorithm, Smart Interpretation, enables the geologist to use the system as an assistant in the geological modelling process. As the software 'learns' the geology from the geologist, the system suggests new modelling features in the data. In this presentation we demonstrate the application of the results from the ERGO project, including the proposed modelling workflow, on a variety of data examples.
Nock, Richard; Nielsen, Frank
2004-11-01
This paper explores a statistical basis for a process often described in computer vision: image segmentation by region merging following a particular order in the choice of regions. We exhibit a particular blend of algorithmics and statistics whose segmentation error is, as we show, bounded from both the qualitative and quantitative standpoints. This approach can be efficiently approximated in linear time/space, leading to a fast segmentation algorithm tailored to processing images described using the most common numerical pixel attribute spaces. The conceptual simplicity of the approach makes it easy to modify so as to cope with hard noise corruption, handle occlusion, allow control of the segmentation scale, and process unconventional data such as spherical images. Experiments on gray-level and color images, obtained with short, readily available C code, demonstrate the quality of the segmentations obtained.
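As a rough illustration of the order-driven region-merging idea described in this abstract (not the paper's actual statistically derived merging predicate), the following sketch merges adjacent regions of a 1-D gray-level signal. Pixel pairs are visited in order of increasing intensity difference, regions are tracked with union-find, and two regions merge when their mean intensities differ by at most a tolerance; the 1-D signal, the mean-difference test, and the `tol` parameter are simplifications for illustration:

```python
# Sketch of order-driven region merging on a 1-D gray-level "image".
# Regions are tracked with union-find; adjacent pixel pairs are visited
# in order of increasing intensity difference, and two regions merge
# when their mean intensities differ by no more than a tolerance.

def region_merge(pixels, tol):
    n = len(pixels)
    parent = list(range(n))
    total = [float(p) for p in pixels]   # sum of intensities per region
    size = [1] * n

    def find(i):                         # union-find with path halving
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    # visit adjacent pairs in order of increasing local gradient
    pairs = sorted(range(n - 1), key=lambda i: abs(pixels[i + 1] - pixels[i]))
    for i in pairs:
        a, b = find(i), find(i + 1)
        if a != b and abs(total[a] / size[a] - total[b] / size[b]) <= tol:
            parent[b] = a                # merge region b into region a
            total[a] += total[b]
            size[a] += size[b]

    return [find(i) for i in range(n)]

labels = region_merge([10, 11, 12, 100, 101, 99], tol=5)
# the dark run and the bright run end up in two distinct regions
```

Processing merges in gradient order is what keeps low-contrast regions from being absorbed across a strong edge: the strong edge pair is examined last, by which time the two sides have grown into regions whose means differ far more than the tolerance.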
Modeling cosmic void statistics
NASA Astrophysics Data System (ADS)
Hamaus, Nico; Sutter, P. M.; Wandelt, Benjamin D.
2016-10-01
Understanding the internal structure and spatial distribution of cosmic voids is crucial when considering them as probes of cosmology. We present recent advances in modeling void density and velocity profiles in real space, as well as void two-point statistics in redshift space, by examining voids identified via the watershed transform in state-of-the-art ΛCDM N-body simulations and mock galaxy catalogs. The simple and universal characteristics that emerge from these statistics indicate the self-similarity of large-scale structure and suggest cosmic voids to be among the most pristine objects to consider for future studies on the nature of dark energy, dark matter and modified gravity.
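The "simple and universal" density characteristics mentioned here are often summarized, in related work by the same authors, by an empirical profile with a central underdensity, a scale radius, and two power-law slopes. The parameterization and the symbol names below (`delta_c`, `r_s`, `r_v`, `alpha`, `beta`) are an assumption for illustration, not a quote from this abstract:

```python
# Hypothetical sketch of an empirical void density-contrast profile:
#   rho(r)/rho_mean - 1 = delta_c * (1 - (r/r_s)**alpha) / (1 + (r/r_v)**beta)
# delta_c < 0 is the central underdensity, r_v the void radius,
# r_s a scale radius, and alpha, beta power-law slopes.

def void_density_contrast(r, r_v, r_s, delta_c, alpha, beta):
    return delta_c * (1.0 - (r / r_s) ** alpha) / (1.0 + (r / r_v) ** beta)

# deep underdensity at the void centre, approaching the mean far outside
centre = void_density_contrast(0.01, 1.0, 0.9, -0.9, 2.0, 8.0)   # close to -0.9
far = void_density_contrast(3.0, 1.0, 0.9, -0.9, 2.0, 8.0)       # close to 0
```

The two limits capture the qualitative shape: a deep underdensity inside the void and a contrast that decays toward the cosmic mean (through a compensating overdense wall) at large radii.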
Statistical evaluation of forecasts
NASA Astrophysics Data System (ADS)
Mader, Malenka; Mader, Wolfgang; Gluckman, Bruce J.; Timmer, Jens; Schelter, Björn
2014-08-01
Reliable forecasts of extreme but rare events, such as earthquakes, financial crashes, and epileptic seizures, would render interventions and precautions possible. Therefore, forecasting methods have been developed that aim to raise an alarm if an extreme event is about to occur. In order to statistically validate the performance of a prediction system, it must be compared to the performance of a random predictor, which raises alarms independent of the events. Such a random predictor can be obtained by bootstrapping or analytically. We propose an analytic statistical framework which, in contrast to conventional methods, allows for validating the sensitivity and the specificity of a forecasting method independently. Moreover, our method accounts for the periods during which an event has to remain absent or occur after a respective forecast.
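The comparison against a random predictor can be sketched in a few lines. The toy below computes sensitivity and specificity of a binary alarm series against a binary event series and compares sensitivity to the analytic chance level: a random predictor that raises alarms in a fraction p of all time bins detects each event with probability p, so its expected sensitivity is p. This is a simplification of the framework in the abstract, which additionally handles event-free waiting periods that we omit here:

```python
# Toy validation of a forecasting method against an analytic random
# predictor. A random predictor with alarm rate p has expected
# sensitivity p, so the method should achieve sensitivity > p.

def sensitivity_specificity(alarms, events):
    tp = sum(a and e for a, e in zip(alarms, events))           # hits
    fn = sum((not a) and e for a, e in zip(alarms, events))     # misses
    tn = sum((not a) and (not e) for a, e in zip(alarms, events))
    fp = sum(a and (not e) for a, e in zip(alarms, events))     # false alarms
    return tp / (tp + fn), tn / (tn + fp)

alarms = [1, 0, 0, 1, 1, 0, 0, 1, 0, 0]
events = [1, 0, 0, 1, 0, 0, 0, 1, 0, 0]

sens, spec = sensitivity_specificity(alarms, events)
p = sum(alarms) / len(alarms)   # alarm rate = chance-level sensitivity
# the method beats chance if sens > p
```

Validating sensitivity and specificity separately, as the abstract proposes, matters because a predictor can trade one for the other: raising alarms constantly gives perfect sensitivity at zero specificity, and only the comparison to the matched random predictor exposes this.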
Data analysis using the Gnu R system for statistical computation
Simone, James (Fermilab)
2011-07-01
R is a language and system for statistical computation. It is widely used in statistics, bioinformatics, machine learning, data mining, quantitative finance, and the analysis of clinical drug trials. Among the advantages of R: it has become the standard language for developing statistical techniques; it is actively developed by a large and growing global user community; it is open-source software; it is highly portable (Linux, OS X, and Windows); it has a built-in documentation system; it produces high-quality graphics; and it is easily extensible, with over four thousand extension library packages available covering statistics and applications. This report gives a very brief introduction to R with some examples using lattice QCD simulation results. It then discusses the development of R packages designed for chi-square minimization fits for lattice n-pt correlation functions.
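The chi-square minimization fits mentioned for lattice correlation functions reduce, in the simplest linear case, to weighted least squares with weights 1/sigma^2. As an illustration (in Python rather than R, and for a straight line rather than an n-pt correlator), the closed-form fit looks like this:

```python
# Chi-square (weighted least-squares) fit of a line y = a + b*x with
# per-point errors sigma_i: minimize sum_i ((y_i - a - b*x_i)/sigma_i)**2.
# For a linear model the minimum has a closed form.

def chi2_line_fit(x, y, sigma):
    w = [1.0 / s ** 2 for s in sigma]                 # weights 1/sigma^2
    S = sum(w)
    Sx = sum(wi * xi for wi, xi in zip(w, x))
    Sy = sum(wi * yi for wi, yi in zip(w, y))
    Sxx = sum(wi * xi * xi for wi, xi in zip(w, x))
    Sxy = sum(wi * xi * yi for wi, xi, yi in zip(w, x, y))
    d = S * Sxx - Sx ** 2
    a = (Sxx * Sy - Sx * Sxy) / d                     # intercept
    b = (S * Sxy - Sx * Sy) / d                       # slope
    chi2 = sum(wi * (yi - a - b * xi) ** 2
               for wi, xi, yi in zip(w, x, y))
    return a, b, chi2

a, b, chi2 = chi2_line_fit([0, 1, 2, 3], [1.0, 3.0, 5.0, 7.0], [0.1] * 4)
# exact line y = 1 + 2x, so chi2 is numerically zero
```

Fits to lattice correlators are typically nonlinear (sums of exponentials) and use correlated errors, so in practice the minimization is done numerically; the closed-form linear case above just shows what quantity is being minimized.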
Canonical ensemble in non-extensive statistical mechanics, q > 1
NASA Astrophysics Data System (ADS)
Ruseckas, Julius
2016-09-01
The non-extensive statistical mechanics has been used to describe a variety of complex systems. The maximization of entropy, often used to introduce the non-extensive statistical mechanics, is a formal procedure and does not easily lead to physical insight. In this article we investigate the canonical ensemble in the non-extensive statistical mechanics by considering a small system interacting with a large reservoir via short-range forces and assuming equal probabilities for all available microstates. We concentrate on the situation when the reservoir is characterized by generalized entropy with non-extensivity parameter q > 1. We also investigate the problem of divergence in the non-extensive statistical mechanics occurring when q > 1 and show that there is a limit on the growth of the number of microstates of the system that is given by the same expression for all values of q.
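The generalized entropy with non-extensivity parameter q referred to in this abstract is the Tsallis entropy, S_q = (1 - sum_i p_i^q)/(q - 1) (in units where the Boltzmann constant is 1), which recovers the Boltzmann-Gibbs/Shannon entropy -sum_i p_i ln p_i in the limit q -> 1. A minimal numerical sketch:

```python
# Tsallis (generalized) entropy S_q = (1 - sum_i p_i**q) / (q - 1),
# with the q -> 1 limit handled explicitly as the Shannon entropy.
import math

def tsallis_entropy(p, q):
    if abs(q - 1.0) < 1e-12:                      # q -> 1 limit
        return -sum(pi * math.log(pi) for pi in p if pi > 0)
    return (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)

p = [0.5, 0.25, 0.25]
s_bg = tsallis_entropy(p, 1.0)      # Boltzmann-Gibbs/Shannon value
s_q = tsallis_entropy(p, 1.0001)    # approaches s_bg as q -> 1
```

For q > 1, the regime studied in the article, the power p_i^q suppresses the contribution of low-probability microstates relative to the q = 1 case, which is one way to see why the number of accessible microstates cannot grow without bound there.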
Journey Through Statistical Mechanics
NASA Astrophysics Data System (ADS)
Yang, C. N.
2013-05-01
My first involvement with statistical mechanics and the many body problem was when I was a student at The National Southwest Associated University in Kunming during the war. At that time Professor Wang Zhu-Xi had just come back from Cambridge, England, where he was a student of Fowler, and his thesis was on phase transitions, a hot topic at that time, and still a very hot topic today...