Statistical mechanics and the ontological interpretation
NASA Astrophysics Data System (ADS)
Bohm, D.; Hiley, B. J.
1996-06-01
To complete our ontological interpretation of quantum theory we have to include a treatment of quantum statistical mechanics. The basic concepts in the ontological approach are the particle and the wave function. The density matrix cannot play a fundamental role here. Therefore quantum statistical mechanics will require a further statistical distribution over wave functions in addition to the distribution of particles that have a specified wave function. Ultimately the wave function of the universe will be required, but we show that if the universe is not in thermodynamic equilibrium then it can be treated in terms of weakly interacting large-scale constituents that are very nearly independent of each other. In this way we obtain the same results as those of the usual approach within the framework of the ontological interpretation.
Statistical weld process monitoring with expert interpretation
Cook, G.E.; Barnett, R.J.; Strauss, A.M.; Thompson, F.M. Jr.
1996-12-31
A statistical weld process monitoring system is described. Using data of voltage, current, wire feed speed, gas flow rate, travel speed, and elapsed arc time collected while welding, the welding statistical process control (SPC) tool provides weld process quality control by implementing techniques of data trending analysis, tolerance analysis, and sequential analysis. For purposes of quality control, the control limits required for acceptance are specified in the weld procedure acceptance specifications. The control charts then provide quality assurance documentation for each weld. The statistical data trending analysis performed by the SPC program is not only valuable as a quality assurance monitoring and documentation system, it is also valuable in providing diagnostic assistance in troubleshooting equipment and material problems. Possible equipment/process problems are identified and matched with features of the SPC control charts. To aid in interpreting the voluminous statistical output generated by the SPC system, a large number of If-Then rules have been devised for providing computer-based expert advice for pinpointing problems based on out-of-limit variations of the control charts. The paper describes the SPC monitoring tool and the rule-based expert interpreter that has been developed for relating control chart trends to equipment/process problems.
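The abstract names control limits and out-of-limit variations but not the chart formulas themselves; a minimal sketch of the generic Shewhart-style limit check such a system might apply (function names and voltage data are hypothetical, not from the paper):

```python
def control_limits(samples, n_sigma=3.0):
    """Shewhart-style control limits: mean +/- n_sigma sample
    standard deviations, the usual default for SPC charts."""
    n = len(samples)
    mean = sum(samples) / n
    std = (sum((s - mean) ** 2 for s in samples) / (n - 1)) ** 0.5
    return mean - n_sigma * std, mean + n_sigma * std

def out_of_limits(value, limits):
    """An out-of-limit point is the kind of event the If-Then
    expert rules key on when diagnosing equipment problems."""
    lo, hi = limits
    return value < lo or value > hi

# Hypothetical arc-voltage readings from one weld pass:
volts = [23.1, 23.4, 22.9, 23.2, 23.0, 23.3]
print(out_of_limits(26.0, control_limits(volts)))  # True: flag the weld
```

In the system described, limits would come from the weld procedure acceptance specifications rather than from the sample itself; the check against them is the same.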
NASA Astrophysics Data System (ADS)
Tadaki, Kohtaro
2010-12-01
The statistical mechanical interpretation of algorithmic information theory (AIT, for short) was introduced and developed by our former works [K. Tadaki, Local Proceedings of CiE 2008, pp. 425-434, 2008] and [K. Tadaki, Proceedings of LFCS'09, Springer's LNCS, vol. 5407, pp. 422-440, 2009], where we introduced the notion of thermodynamic quantities, such as partition function Z(T), free energy F(T), energy E(T), statistical mechanical entropy S(T), and specific heat C(T), into AIT. We then discovered that, in the interpretation, the temperature T equals the partial randomness of the values of all these thermodynamic quantities, where the notion of partial randomness is a stronger representation of the compression rate by means of program-size complexity. Furthermore, we showed that this situation holds for the temperature T itself, which is one of the most typical thermodynamic quantities. Namely, we showed that, for each of the thermodynamic quantities Z(T), F(T), E(T), and S(T) above, the computability of its value at temperature T gives a sufficient condition for T ∈ (0,1) to satisfy the condition that the partial randomness of T equals T. In this paper, based on a physical argument on the same level of mathematical strictness as normal statistical mechanics in physics, we develop a total statistical mechanical interpretation of AIT which actualizes a perfect correspondence to normal statistical mechanics. We do this by identifying a microcanonical ensemble in the framework of AIT. As a result, we clarify the statistical mechanical meaning of the thermodynamic quantities of AIT.
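As a reference point for the quantities named above, the temperature-deformed partition function of this line of work (as defined in the cited 2008/2009 papers; notation paraphrased here) sums over the halting programs p of an optimal prefix-free machine U:

```latex
Z(T) \;=\; \sum_{p \,\in\, \mathrm{dom}\,U} 2^{-|p|/T}, \qquad 0 < T \le 1,
```

so that at T = 1 it reduces to Chaitin's halting probability Ω, and F(T), E(T), S(T), and C(T) follow by the usual thermodynamic identities applied to Z(T).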
Interpreting Educational Research Using Statistical Software.
ERIC Educational Resources Information Center
Evans, Elizabeth A.
A live demonstration of how a typical set of educational data can be examined using quantitative statistical software was conducted. The topic of tutorial support was chosen. Setting up a hypothetical research scenario, the researcher created 300 cases from random data generation adjusted to correct obvious error. Each case represented a student…
The Statistical Interpretation of Entropy: An Activity
ERIC Educational Resources Information Center
Timberlake, Todd
2010-01-01
The second law of thermodynamics, which states that the entropy of an isolated macroscopic system can increase but will not decrease, is a cornerstone of modern physics. Ludwig Boltzmann argued that the second law arises from the motion of the atoms that compose the system. Boltzmann's statistical mechanics provides deep insight into the…
The Statistical Interpretation of Entropy: An Activity
NASA Astrophysics Data System (ADS)
Timberlake, Todd
2010-11-01
The second law of thermodynamics, which states that the entropy of an isolated macroscopic system can increase but will not decrease, is a cornerstone of modern physics. Ludwig Boltzmann argued that the second law arises from the motion of the atoms that compose the system. Boltzmann's statistical mechanics provides deep insight into the functioning of the second law and also provided evidence for the existence of atoms at a time when many scientists (like Ernst Mach and Wilhelm Ostwald) were skeptical.
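Boltzmann's argument can be made concrete with a toy counting model (not from the activity itself; the two-state system and k_B = 1 units are our illustrative assumptions):

```python
from math import comb, log

def entropy_two_state(n_total, n_up, k_b=1.0):
    """Boltzmann's S = k_B ln W for a toy system of n_total
    two-state particles, where W = C(n_total, n_up) counts the
    microstates realizing the macrostate with n_up particles 'up'."""
    w = comb(n_total, n_up)  # multiplicity of the macrostate
    return k_b * log(w)

# The evenly split macrostate has vastly more microstates, so an
# isolated system overwhelmingly drifts toward it; that drift is
# the statistical reading of the second law.
print(entropy_two_state(100, 50) > entropy_two_state(100, 10))  # True
```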
For a statistical interpretation of Helmholtz' thermal displacement
NASA Astrophysics Data System (ADS)
Podio-Guidugli, Paolo
2016-05-01
On moving from the classic papers by Einstein and Langevin on Brownian motion, two consistent statistical interpretations are given for the thermal displacement, a scalar field formally introduced by Helmholtz, whose time derivative is by definition the absolute temperature.
The Statistical Interpretation of Classical Thermodynamic Heating and Expansion Processes
ERIC Educational Resources Information Center
Cartier, Stephen F.
2011-01-01
A statistical model has been developed and applied to interpret thermodynamic processes typically presented from the macroscopic, classical perspective. Through this model, students learn and apply the concepts of statistical mechanics, quantum mechanics, and classical thermodynamics in the analysis of the (i) constant volume heating, (ii)…
On Interpreting Test Scores as Social Indicators: Statistical Considerations.
ERIC Educational Resources Information Center
Spencer, Bruce D.
1983-01-01
Because test scores are ordinal, not cardinal, attributes, the average test score is often a misleading way to summarize the scores of a group of individuals. Similarly, correlation coefficients may be misleading summary measures of association between test scores. Proper, readily interpretable, summary statistics are developed from a theory of…
Integrating statistical rock physics and sedimentology for quantitative seismic interpretation
NASA Astrophysics Data System (ADS)
Avseth, Per; Mukerji, Tapan; Mavko, Gary; Gonzalez, Ezequiel
This paper presents an integrated approach for seismic reservoir characterization that can be applied both in petroleum exploration and in hydrological subsurface analysis. We integrate fundamental concepts and models of rock physics, sedimentology, statistical pattern recognition, and information theory, with seismic inversions and geostatistics. Rock physics models enable us to link seismic amplitudes to geological facies and reservoir properties. Seismic imaging brings indirect, noninvasive, but nevertheless spatially exhaustive information about the reservoir properties that are not available from well data alone. Classification and estimation methods based on computational statistical techniques such as nonparametric Bayesian classification, Monte Carlo simulations and bootstrap, help to quantitatively measure the interpretation uncertainty and the mis-classification risk at each spatial location. Geostatistical stochastic simulations incorporate the spatial correlation and the small scale variability which is hard to capture with only seismic information because of the limits of resolution. Combining deterministic physical models with statistical techniques has provided us with a successful way of performing quantitative interpretation and estimation of reservoir properties from seismic data. These formulations identify not only the most likely interpretation but also the uncertainty of the interpretation, and serve as a guide for quantitative decision analysis. The methodology shown in this article is applied successfully to map petroleum reservoirs, and the examples are from relatively deeply buried oil fields. However, we suggest that this approach can also be carried out for improved characterization of shallow hydrologic aquifers using shallow seismic or GPR data.
Comparing survival curves using an easy to interpret statistic.
Hess, Kenneth R
2010-10-15
Here, I describe a statistic for comparing two survival curves that has a clear and obvious meaning and has a long history in biostatistics. Suppose we are comparing survival times associated with two treatments A and B. The statistic operates in such a way that if it takes on the value 0.95, then the interpretation is that a randomly chosen patient treated with A has a 95% chance of surviving longer than a randomly chosen patient treated with B. This statistic was first described in the 1950s, and was generalized in the 1960s to work with right-censored survival times. It is a useful and convenient measure for assessing differences between survival curves. Software for computing the statistic is readily available on the Internet. PMID:20732962
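The statistic described is a Mann-Whitney-type concordance probability, P(a randomly chosen A patient outlives a randomly chosen B patient). A minimal sketch for uncensored data (the paper's censoring-aware generalization is not shown; data are hypothetical):

```python
import itertools

def concordance_probability(times_a, times_b):
    """Estimate P(T_A > T_B) over all cross-pairs of observed
    survival times; ties count as 1/2. Uncensored data only."""
    pairs = list(itertools.product(times_a, times_b))
    wins = sum(1.0 if a > b else 0.5 if a == b else 0.0 for a, b in pairs)
    return wins / len(pairs)

a = [14, 20, 25, 31, 40]  # hypothetical survival times (months), treatment A
b = [8, 12, 15, 22, 30]   # treatment B
print(concordance_probability(a, b))  # 0.76: A outlives B 76% of the time
```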
Use and interpretation of statistics in wildlife journals
Tacha, Thomas C.; Warde, William D.; Burnham, Kenneth P.
1982-01-01
Use and interpretation of statistics in wildlife journals are reviewed, and suggestions for improvement are offered. Populations from which inferences are to be drawn should be clearly defined, and conclusions should be limited to the range of the data analyzed. Authors should be careful to avoid improper methods of plotting data and should clearly define the use of estimates of variance, standard deviation, standard error, or confidence intervals. Biological and statistical significance are often confused by authors and readers. Statistical hypothesis testing is a tool, and not every question should be answered by hypothesis testing. Meeting assumptions of hypothesis tests is the responsibility of authors, and assumptions should be reviewed before a test is employed. The use of statistical tools should be considered carefully both before and after gathering data.
Adapting internal statistical models for interpreting visual cues to depth
Seydell, Anna; Knill, David C.; Trommershäuser, Julia
2010-01-01
The informativeness of sensory cues depends critically on statistical regularities in the environment. However, statistical regularities vary between different object categories and environments. We asked whether and how the brain changes the prior assumptions about scene statistics used to interpret visual depth cues when stimulus statistics change. Subjects judged the slants of stereoscopically presented figures by adjusting a virtual probe perpendicular to the surface. In addition to stereoscopic disparities, the aspect ratio of the stimulus in the image provided a “figural compression” cue to slant, whose reliability depends on the distribution of aspect ratios in the world. As we manipulated this distribution from regular to random and back again, subjects’ reliance on the compression cue relative to stereoscopic cues changed accordingly. When we randomly interleaved stimuli from shape categories (ellipses and diamonds) with different statistics, subjects gave less weight to the compression cue for figures from the category with more random aspect ratios. Our results demonstrate that relative cue weights vary rapidly as a function of recently experienced stimulus statistics, and that the brain can use different statistical models for different object categories. We show that subjects’ behavior is consistent with that of a broad class of Bayesian learning models. PMID:20465321
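The cue-reweighting the subjects exhibited matches the standard Bayesian account, in which each cue's weight is its share of the total reliability (inverse variance). A minimal sketch (function names and numbers are illustrative, not from the paper):

```python
def combine_cues(estimates, reliabilities):
    """Reliability-weighted cue combination: the combined estimate
    is the weighted mean with weights proportional to each cue's
    reliability (inverse variance)."""
    total = sum(reliabilities)
    return sum(r / total * e for e, r in zip(estimates, reliabilities))

# Stereo suggests 30 deg of slant (reliable); the compression cue
# suggests 40 deg. Down-weighting compression, as subjects did for
# shape categories with random aspect ratios, pulls the estimate
# toward the stereo cue:
print(combine_cues([30.0, 40.0], [0.8, 0.2]))  # 32.0
```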
Pass-Fail Testing: Statistical Requirements and Interpretations
Gilliam, David; Leigh, Stefan; Rukhin, Andrew; Strawderman, William
2009-01-01
Performance standards for detector systems often include requirements for probability of detection and probability of false alarm at a specified level of statistical confidence. This paper reviews the accepted definitions of confidence level and of critical value. It describes the testing requirements for establishing either of these probabilities at a desired confidence level. These requirements are computable in terms of functions that are readily available in statistical software packages and general spreadsheet applications. The statistical interpretations of the critical values are discussed. A table is included for illustration, and a plot is presented showing the minimum required numbers of pass-fail tests. The results given here are applicable to one-sided testing of any system with performance characteristics conforming to a binomial distribution. PMID:27504221
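One common special case of the testing requirements the paper tabulates is the zero-failure design: how many consecutive successful detections demonstrate a detection probability of at least p0 at one-sided confidence C. Under the binomial model this reduces to requiring p0**n <= 1 - C (a sketch of that special case only; the paper's tables also cover designs with allowed failures):

```python
import math

def min_trials_zero_failures(p0, confidence):
    """Smallest number n of consecutive passes (zero failures)
    demonstrating detection probability >= p0 at the given
    one-sided confidence level: solve p0**n <= 1 - confidence."""
    return math.ceil(math.log(1.0 - confidence) / math.log(p0))

# Demonstrating 95% detection probability at 95% confidence:
print(min_trials_zero_failures(0.95, 0.95))  # 59 consecutive passes
```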
Interpreting health statistics for policymaking: the story behind the headlines.
Walker, Neff; Bryce, Jennifer; Black, Robert E
2007-03-17
Politicians, policymakers, and public-health professionals make complex decisions on the basis of estimates of disease burden from different sources, many of which are "marketed" by skilled advocates. To help people who rely on such statistics make more informed decisions, we explain how health estimates are developed, and offer basic guidance on how to assess and interpret them. We describe the different levels of estimates used to quantify disease burden and its correlates; understanding how closely linked a type of statistic is to disease and death rates is crucial in designing health policies and programmes. We also suggest questions that people using such statistics should ask and offer tips to help separate advocacy from evidence-based positions. Global health agencies have a key role in communicating robust estimates of disease, as do policymakers at national and subnational levels where key public-health decisions are made. A common framework and standardised methods, building on the work of Child Health Epidemiology Reference Group (CHERG) and others, are urgently needed. PMID:17368157
Workplace statistical literacy for teachers: interpreting box plots
NASA Astrophysics Data System (ADS)
Pierce, Robyn; Chick, Helen
2013-06-01
As a consequence of the increased use of data in workplace environments, there is a need to understand the demands that are placed on users to make sense of such data. In education, teachers are being increasingly expected to interpret and apply complex data about student and school performance, and, yet it is not clear that they always have the appropriate knowledge and experience to interpret the graphs, tables and other data that they receive. This study examined the statistical literacy demands placed on teachers, with a particular focus on box plot representations. Although box plots summarise the data in a way that makes visual comparisons possible across sets of data, this study showed that teachers do not always have the necessary fluency with the representation to describe correctly how the data are distributed in the representation. In particular, a significant number perceived the size of the regions of the box plot to be depicting frequencies rather than density, and there were misconceptions associated with outlying data that were not displayed on the plot. As well, teachers' perceptions of box plots were found to relate to three themes: attitudes, perceived value and misconceptions.
NASA Astrophysics Data System (ADS)
Nyblade, Andrew A.; Pollack, Henry N.
1993-03-01
We address the extent to which regional variations in continental heat flow can be interpreted, making use of a heat flow data set from east and southern Africa. The first-order observation deriving from these heat flow measurements is a common pattern characterized in both regions by low heat flow in Archean cratons and higher heat flow in younger mobile belts. Two regional differences between east and southern Africa are superimposed on this common heat flow pattern: (1) heat flow in the Tanzania Craton is about 13 mW m⁻² lower than in the Kalahari Craton, and (2) heat flow in the Mozambique Belt in east Africa is about 9 mW m⁻² lower than in the southern African mobile belts, within about 250 km of the respective Archean cratons. The differences in heat flow between east and southern Africa suggest that the thermal structure of the lithosphere beneath these regions differs somewhat, and we attempt to resolve these differences in lithospheric thermal structure by examining four explanations that could account for the heat flow observations: (1) diminished heat flow in shallow boreholes in east Africa; (2) less crustal heat production in the regions of lower heat flow; (3) thicker lithosphere beneath the regions of lower heat flow; (4) cooler mantle beneath the areas of lower heat flow. We find it difficult to interpret uniquely the heat flow differences between east and southern Africa because available constraints on crustal heat production, crustal structure, lithospheric thickness and mantle temperatures are insufficient to discriminate among the possible explanations. Hence, extracting significant information about lithospheric thermal structure from regional heat flow variations requires more ancillary geochemical and geophysical information than Africa presently offers.
Statistical Interpretation of Natural and Technological Hazards in China
NASA Astrophysics Data System (ADS)
Borthwick, Alistair, ,, Prof.; Ni, Jinren, ,, Prof.
2010-05-01
China is prone to catastrophic natural hazards from floods, droughts, earthquakes, storms, cyclones, landslides, epidemics, extreme temperatures, forest fires, avalanches, and even tsunami. This paper will list statistics related to the six worst natural disasters in China over the past 100 or so years, ranked according to number of fatalities. The corresponding data for the six worst natural disasters in China over the past decade will also be considered. [The data are abstracted from the International Disaster Database, Centre for Research on the Epidemiology of Disasters (CRED), Université Catholique de Louvain, Brussels, Belgium, http://www.cred.be/ where a disaster is defined as occurring if one of the following criteria is fulfilled: 10 or more people reported killed; 100 or more people reported affected; a call for international assistance; or declaration of a state of emergency.] The statistics include the number of occurrences of each type of natural disaster, the number of deaths, the number of people affected, and the cost in billions of US dollars. Over the past hundred years, the largest disasters may be related to the overabundance or scarcity of water, and to earthquake damage. However, there has been a substantial relative reduction in fatalities due to water related disasters over the past decade, even though the overall numbers of people affected remain huge, as does the economic damage. This change is largely due to the efforts put in by China's water authorities to establish effective early warning systems, the construction of engineering countermeasures for flood protection, the implementation of water pricing and other measures for reducing excessive consumption during times of drought. It should be noted that the dreadful death toll due to the Sichuan Earthquake dominates recent data. Joint research has been undertaken between the Department of Environmental Engineering at Peking University and the Department of Engineering Science at Oxford
Workplace Statistical Literacy for Teachers: Interpreting Box Plots
ERIC Educational Resources Information Center
Pierce, Robyn; Chick, Helen
2013-01-01
As a consequence of the increased use of data in workplace environments, there is a need to understand the demands that are placed on users to make sense of such data. In education, teachers are being increasingly expected to interpret and apply complex data about student and school performance, and, yet it is not clear that they always have the…
A novel statistical analysis and interpretation of flow cytometry data
Banks, H.T.; Kapraun, D.F.; Thompson, W. Clayton; Peligero, Cristina; Argilaguet, Jordi; Meyerhans, Andreas
2013-01-01
A recently developed class of models incorporating the cyton model of population generation structure into a conservation-based model of intracellular label dynamics is reviewed. Statistical aspects of the data collection process are quantified and incorporated into a parameter estimation scheme. This scheme is then applied to experimental data for PHA-stimulated CD4+ T and CD8+ T cells collected from two healthy donors. This novel mathematical and statistical framework is shown to form the basis for accurate, meaningful analysis of cellular behaviour for a population of cells labelled with the dye carboxyfluorescein succinimidyl ester and stimulated to divide. PMID:23826744
Statistical Interpretation of the Local Field Inside Dielectrics.
ERIC Educational Resources Information Center
Barrera, Ruben G.; Mello, P. A.
1982-01-01
Compares several derivations of the Clausius-Mossotti relation to analyze consistently the nature of approximations used and their range of applicability. Also presents a statistical-mechanical calculation of the local field for classical system of harmonic oscillators interacting via the Coulomb potential. (Author/SK)
Confounded Statistical Analyses Hinder Interpretation of the NELP Report
ERIC Educational Resources Information Center
Paris, Scott G.; Luo, Serena Wenshu
2010-01-01
The National Early Literacy Panel (2008) report identified early predictors of reading achievement as good targets for instruction, and many of those skills are related to decoding. In this article, the authors suggest that the developmental trajectories of rapidly developing skills pose problems for traditional statistical analyses. Rapidly…
Statistical characteristics of MST radar echoes and its interpretation
NASA Technical Reports Server (NTRS)
Woodman, Ronald F.
1989-01-01
Two concepts of fundamental importance are reviewed: the autocorrelation function and the frequency power spectrum. In addition, some turbulence concepts, the relationship between radar signals and atmospheric medium statistics, partial reflection, and the characteristics of noise and clutter interference are discussed.
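The two reviewed concepts are linked: the full set of autocorrelation lags and the frequency power spectrum form a Fourier pair (the Wiener-Khinchin theorem), which is why radar processing moves freely between the two descriptions. A minimal sketch of the lag-domain side, on a toy series rather than radar data:

```python
def autocorrelation(x, lag):
    """Biased sample autocorrelation of series x at one lag,
    normalized so that autocorrelation(x, 0) == 1."""
    n = len(x)
    mean = sum(x) / n
    var = sum((v - mean) ** 2 for v in x) / n
    cov = sum((x[i] - mean) * (x[i + lag] - mean) for i in range(n - lag)) / n
    return cov / var

signal = [1.0, -1.0] * 4  # toy alternating series
print(autocorrelation(signal, 1))  # -0.875: strong anticorrelation at lag 1
```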
Interpretation of gamma-ray burst source count statistics
NASA Technical Reports Server (NTRS)
Petrosian, Vahe
1993-01-01
Ever since the discovery of gamma-ray bursts, the so-called log N-log S relation has been used for determination of their distances and distribution. This task has not been straightforward because of varying thresholds for the detection of bursts. Most of the current analyses of these data are couched in terms of ambiguous distributions, such as the distribution of Cp/Clim, the ratio of peak to threshold photon count rates, or the distribution of V/Vmax = (Cp/Clim)^(-3/2). It is shown that these distributions are not always a true reflection of the log N-log S relation. Some kind of deconvolution is required for obtaining the true log N-log S. Therefore, care is required in the interpretation of results of such analyses. A new method of analysis of these data is described, whereby the bivariate distribution of Cp and Clim is obtained directly from the data.
Need for Caution in Interpreting Extreme Weather Statistics
NASA Astrophysics Data System (ADS)
Sardeshmukh, P. D.; Compo, G. P.; Penland, M. C.
2011-12-01
Given the substantial anthropogenic contribution to 20th century global warming, it is tempting to seek an anthropogenic component in any unusual recent weather event, or more generally in any observed change in the statistics of extreme weather. This study cautions that such detection and attribution efforts may, however, very likely lead to wrong conclusions if the non-Gaussian aspects of the probability distributions of observed daily atmospheric variations, especially their skewness and heavy tails, are not explicitly taken into account. Departures of three or more standard deviations from the mean, although rare, are far more common in such a non-Gaussian world than they are in a Gaussian world. This exacerbates the already difficult problem of establishing the significance of changes in extreme value probabilities from historical climate records of limited length, using either raw histograms or Generalized Extreme Value (GEV) distributions fitted to the sample extreme values. A possible solution is suggested by the fact that the non-Gaussian aspects of the observed distributions are well captured by a general class of "Stochastically Generated Skewed distributions" (SGS distributions) recently introduced in the meteorological literature by Sardeshmukh and Sura (J. Climate 2009). These distributions arise from simple modifications to a red noise process and reduce to Gaussian distributions under appropriate limits. As such, they represent perhaps the simplest physically based non-Gaussian prototypes of the distributions of daily atmospheric variations. Fitting such SGS distributions to all (not just the extreme) values in 25, 50, or 100-yr daily records also yields corresponding extreme value distributions that are much less prone to sampling uncertainty than GEV distributions. For both of the above reasons, SGS distributions provide an attractive alternative for assessing the significance of changes in extreme weather statistics (including changes in the
Interpreting the flock algorithm from a statistical perspective.
Anderson, Eric C; Barry, Patrick D
2015-09-01
We show that the algorithm in the program flock (Duchesne & Turgeon 2009) can be interpreted as an estimation procedure based on a model essentially identical to the structure (Pritchard et al. 2000) model with no admixture and without correlated allele frequency priors. Rather than using MCMC, the flock algorithm searches for the maximum a posteriori estimate of this structure model via a simulated annealing algorithm with a rapid cooling schedule (namely, the exponent on the objective function →∞). We demonstrate the similarities between the two programs in a two-step approach. First, to enable rapid batch processing of many simulated data sets, we modified the source code of structure to use the flock algorithm, producing the program flockture. With simulated data, we confirmed that results obtained with flock and flockture are very similar (though flockture is some 200 times faster). Second, we simulated multiple large data sets under varying levels of population differentiation for both microsatellite and SNP genotypes. We analysed them with flockture and structure and assessed each program on its ability to cluster individuals to their correct subpopulation. We show that flockture yields results similar to structure albeit with greater variability from run to run. flockture did perform better than structure when genotypes were composed of SNPs and differentiation was moderate (FST= 0.022-0.032). When differentiation was low, structure outperformed flockture for both marker types. On large data sets like those we simulated, it appears that flock's reliance on inference rules regarding its 'plateau record' is not helpful. Interpreting flock's algorithm as a special case of the model in structure should aid in understanding the program's output and behaviour. PMID:25913195
A statistical model for interpreting computerized dynamic posturography data
NASA Technical Reports Server (NTRS)
Feiveson, Alan H.; Metter, E. Jeffrey; Paloski, William H.
2002-01-01
Computerized dynamic posturography (CDP) is widely used for assessment of altered balance control. CDP trials are quantified using the equilibrium score (ES), which ranges from zero to 100, as a decreasing function of peak sway angle. The problem of how best to model and analyze ESs from a controlled study is considered. The ES often exhibits a skewed distribution in repeated trials, which can lead to incorrect inference when applying standard regression or analysis of variance models. Furthermore, CDP trials are terminated when a patient loses balance. In these situations, the ES is not observable, but is assigned the lowest possible score--zero. As a result, the response variable has a mixed discrete-continuous distribution, further compromising inference obtained by standard statistical methods. Here, we develop alternative methodology for analyzing ESs under a stochastic model extending the ES to a continuous latent random variable that always exists, but is unobserved in the event of a fall. Loss of balance occurs conditionally, with probability depending on the realized latent ES. After fitting the model by a form of quasi-maximum-likelihood, one may perform statistical inference to assess the effects of explanatory variables. An example is provided, using data from the NIH/NIA Baltimore Longitudinal Study on Aging.
Impact of Equity Models and Statistical Measures on Interpretations of Educational Reform
ERIC Educational Resources Information Center
Rodriguez, Idaykis; Brewe, Eric; Sawtelle, Vashti; Kramer, Laird H.
2012-01-01
We present three models of equity and show how these, along with the statistical measures used to evaluate results, impact interpretation of equity in education reform. Equity can be defined and interpreted in many ways. Most equity education reform research strives to achieve equity by closing achievement gaps between groups. An example is given…
Statistical Interpretation of Key Comparison Reference Value and Degrees of Equivalence
Kacker, R. N.; Datla, R. U.; Parr, A. C.
2003-01-01
Key comparisons carried out by the Consultative Committees (CCs) of the International Committee of Weights and Measures (CIPM) or the Bureau International des Poids et Mesures (BIPM) are referred to as CIPM key comparisons. The outputs of a statistical analysis of the data from a CIPM key comparison are the key comparison reference value, the degrees of equivalence, and their associated uncertainties. The BIPM publications do not discuss statistical interpretation of these outputs. We discuss their interpretation under the following three statistical models: nonexistent laboratory-effects model, random laboratory-effects model, and systematic laboratory-effects model.
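Under the first of the three models (nonexistent laboratory-effects), a natural reference value is the inverse-variance weighted mean of the laboratory results, with each degree of equivalence defined as the deviation from it. A minimal sketch of that one case (uncertainty propagation for the degrees of equivalence, and the other two models, are omitted; numbers are hypothetical):

```python
def kcrv_and_degrees(values, uncertainties):
    """Inverse-variance weighted mean as a candidate key comparison
    reference value (KCRV), plus each laboratory's degree of
    equivalence d_i = x_i - KCRV."""
    weights = [1.0 / u ** 2 for u in uncertainties]
    kcrv = sum(w * x for w, x in zip(weights, values)) / sum(weights)
    return kcrv, [x - kcrv for x in values]

# Three hypothetical laboratory results with standard uncertainties:
kcrv, degrees = kcrv_and_degrees([10.0, 12.0, 11.5], [1.0, 1.0, 0.5])
print(kcrv)  # the most precise laboratory dominates the weighted mean
```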
NASA Astrophysics Data System (ADS)
Kuić, Domagoj
2016-05-01
In this paper an alternative approach to statistical mechanics based on the maximum information entropy principle (MaxEnt) is examined, specifically its close relation with the Gibbs method of ensembles. It is shown that the MaxEnt formalism is the logical extension of the Gibbs formalism of equilibrium statistical mechanics that is entirely independent of the frequentist interpretation of probabilities only as factual (i.e. experimentally verifiable) properties of the real world. Furthermore, we show that, consistently with the law of large numbers, the relative frequencies of the ensemble of systems prepared under identical conditions (i.e. identical constraints) actually correspond to the MaxEnt probabilities in the limit of a large number of systems in the ensemble. This result implies that the probabilities in statistical mechanics can be interpreted, independently of the frequency interpretation, on the basis of the maximum information entropy principle.
Statistical Tools for the Interpretation of Enzootic West Nile virus Transmission Dynamics.
Caillouët, Kevin A; Robertson, Suzanne
2016-01-01
Interpretation of enzootic West Nile virus (WNV) surveillance indicators requires little advanced mathematical skill, but greatly enhances the ability of public health officials to prescribe effective WNV management tactics. Stepwise procedures for the calculation of mosquito infection rates (IR) and vector index (VI) are presented alongside statistical tools that require additional computation. A brief review of advantages and important considerations for each statistic's use is provided. PMID:27188561
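The two indicators named above have simple point estimates; a minimal sketch follows (the numbers are invented, and the bias-corrected maximum-likelihood variants the paper also covers are not shown). The minimum infection rate is positive pools per 1000 mosquitoes tested, and the vector index sums abundance times infection proportion over vector species.

```python
# Illustrative sketch of two enzootic WNV surveillance indicators.

def minimum_infection_rate(positive_pools, mosquitoes_tested):
    """MIR: positive pools per 1000 mosquitoes tested."""
    return 1000.0 * positive_pools / mosquitoes_tested

def vector_index(species_data):
    """VI: sum over species of (avg females per trap night) x (infection proportion)."""
    return sum(abundance * rate for abundance, rate in species_data)

mir = minimum_infection_rate(positive_pools=3, mosquitoes_tested=1500)
vi = vector_index([(12.5, 0.002), (4.0, 0.005)])
```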
The broad topic of biomarker research has an often-overlooked component: the documentation and interpretation of the surrounding chemical environment and other meta-data, especially from visualization, analytical, and statistical perspectives (Pleil et al. 2014; Sobus et al. 2011...
Interpreting Statistical Significance Test Results: A Proposed New "What If" Method.
ERIC Educational Resources Information Center
Kieffer, Kevin M.; Thompson, Bruce
As the 1994 publication manual of the American Psychological Association emphasized, "p" values are affected by sample size. As a result, it can be helpful to interpret the results of statistical significance tests in a sample size context by conducting so-called "what if" analyses. However, these methods can be inaccurate unless "corrected" effect…
Dotto, G L; Pinto, L A A; Hachicha, M A; Knani, S
2015-03-15
In this work, a statistical physics treatment was employed to study the adsorption of food dyes onto chitosan films, in order to obtain new physicochemical interpretations at the molecular level. Experimental equilibrium curves were obtained for the adsorption of four dyes (FD&C red 2, FD&C yellow 5, FD&C blue 2, Acid Red 51) at different temperatures (298, 313 and 328 K). A statistical physics formula was used to interpret these curves, and parameters such as the number of adsorbed dye molecules per site (n), the anchorage number (n'), the receptor site density (NM), the adsorbed quantity at saturation (Nasat), the steric hindrance (τ), the concentration at half saturation (c1/2) and the molar adsorption energy (ΔEa) were estimated. The relation of the above-mentioned parameters with the chemical structure of the dyes and with temperature was evaluated and interpreted. PMID:25308634
On the Interpretation of Running Trends as Summary Statistics for Time Series Analysis
NASA Astrophysics Data System (ADS)
Vigo, Isabel M.; Trottini, Mario; Belda, Santiago
2016-04-01
In recent years, running trends analysis (RTA) has been widely used in climate applied research as summary statistics for time series analysis. There is no doubt that RTA might be a useful descriptive tool, but despite its general use in applied research, precisely what it reveals about the underlying time series is unclear and, as a result, its interpretation is unclear too. This work contributes to such interpretation in two ways: 1) an explicit formula is obtained for the set of time series with a given series of running trends, making it possible to show that running trends, alone, perform very poorly as summary statistics for time series analysis; and 2) an equivalence is established between RTA and the estimation of a (possibly nonlinear) trend component of the underlying time series using a weighted moving average filter. Such equivalence provides a solid ground for RTA implementation and interpretation/validation.
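As a concrete reading of RTA (a minimal sketch; the window length and series below are illustrative assumptions), running trends are simply the ordinary-least-squares slopes computed over sliding windows of the series:

```python
# Minimal running-trends sketch: OLS slope over each sliding window.

def ols_slope(y):
    """Least-squares slope of y against 0, 1, ..., len(y)-1."""
    n = len(y)
    xbar = (n - 1) / 2.0
    ybar = sum(y) / n
    num = sum((i - xbar) * (yi - ybar) for i, yi in enumerate(y))
    den = sum((i - xbar) ** 2 for i in range(n))
    return num / den

def running_trends(series, window):
    return [ols_slope(series[i:i + window])
            for i in range(len(series) - window + 1)]

trends = running_trends([0.0, 1.0, 2.0, 3.0, 2.0, 1.0], window=3)
```

The paper's first result can be read directly off this construction: many different series share the same list of window slopes, so the trends alone do not pin down the underlying series.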
Statistical issues in the design, analysis and interpretation of animal carcinogenicity studies.
Haseman, J K
1984-01-01
Statistical issues in the design, analysis and interpretation of animal carcinogenicity studies are discussed. In the area of experimental design, issues that must be considered include randomization of animals, sample size considerations, dose selection and allocation of animals to experimental groups, and control of potentially confounding factors. In the analysis of tumor incidence data, survival differences among groups should be taken into account. It is important to try to distinguish between tumors that contribute to the death of the animal and "incidental" tumors discovered at autopsy in an animal dying of an unrelated cause. Life table analyses (appropriate for lethal tumors) and incidental tumor tests (appropriate for nonfatal tumors) are described, and the utilization of these procedures by the National Toxicology Program is discussed. Although past interpretations of carcinogenicity data have tended to focus on pairwise comparisons in general and high-dose effects in particular, the importance of trend tests should not be overlooked, since these procedures are more sensitive than pairwise comparisons for detecting carcinogenic effects. No rigid statistical "decision rule" should be employed in the interpretation of carcinogenicity data. Although the statistical significance of an observed tumor increase is perhaps the single most important piece of evidence used in the evaluation process, a number of biological factors must also be taken into account. The use of historical control data, the false-positive issue and the interpretation of negative trends are also discussed. PMID:6525993
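One common trend procedure of the kind recommended above is the Cochran-Armitage test for a linear trend in tumor incidence across dose groups. The sketch below is an assumed illustration with invented counts, not the National Toxicology Program's implementation, and it ignores the survival adjustments the abstract emphasizes.

```python
# Hedged sketch of a Cochran-Armitage trend test for dose-response data.
import math

def cochran_armitage_z(tumors, animals, doses):
    """Z statistic for a linear trend in tumor proportions across dose groups."""
    N = sum(animals)
    T = sum(tumors)
    p = T / N
    dbar = sum(d * n for d, n in zip(doses, animals)) / N
    num = sum(t * (d - dbar) for t, d in zip(tumors, doses))
    var = p * (1 - p) * sum(n * (d - dbar) ** 2 for n, d in zip(animals, doses))
    return num / math.sqrt(var)

# invented counts: 1/50, 3/50, 7/50 tumors at dose scores 0, 1, 2
z = cochran_armitage_z(tumors=[1, 3, 7], animals=[50, 50, 50], doses=[0, 1, 2])
```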
de Irala, J; Fernandez-Crehuet Navajas, R; Serrano del Castillo, A
1997-03-01
This study describes the behavior of eight statistical programs (BMDP, EGRET, JMP, SAS, SPSS, STATA, STATISTIX, and SYSTAT) when performing a logistic regression with a simulated data set that contains a numerical problem created by the presence of a cell value equal to zero. The programs respond in different ways to this problem. Most of them give a warning, although many simultaneously present incorrect results, among which are confidence intervals that tend toward infinity. Such results can mislead the user. Various guidelines are offered for detecting these problems in actual analyses, and users are reminded of the importance of critical interpretation of the results of statistical programs. PMID:9162592
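A minimal sketch of the diagnostic habit the authors recommend (an assumed illustration, not code from the study): before trusting a fitted logistic regression, check the exposure-by-outcome table for empty cells, since a zero cell implies an infinite odds-ratio estimate and can produce the absurd confidence intervals described above.

```python
# Sketch: detect empty cells in a contingency table before logistic regression.

def zero_cells(exposure, outcome):
    """Return the (exposure, outcome) level combinations never observed."""
    seen = set(zip(exposure, outcome))
    levels_e = sorted(set(exposure))
    levels_o = sorted(set(outcome))
    return [(e, o) for e in levels_e for o in levels_o if (e, o) not in seen]

# exposed cases never occur, so a separation warning is warranted
missing = zero_cells(exposure=[0, 0, 1, 1], outcome=[0, 1, 0, 0])
```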
Impact of equity models and statistical measures on interpretations of educational reform
NASA Astrophysics Data System (ADS)
Rodriguez, Idaykis; Brewe, Eric; Sawtelle, Vashti; Kramer, Laird H.
2012-12-01
We present three models of equity and show how these, along with the statistical measures used to evaluate results, impact interpretation of equity in education reform. Equity can be defined and interpreted in many ways. Most equity education reform research strives to achieve equity by closing achievement gaps between groups. An example is given by the study by Lorenzo et al. that shows that interactive engagement methods lead to increased gender equity. In this paper, we reexamine the results of Lorenzo et al. through three models of equity. We find that interpretation of the results strongly depends on the model of equity chosen. Further, we argue that researchers must explicitly state their model of equity as well as use effect size measurements to promote clarity in education reform.
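The effect-size reporting argued for above can be as simple as Cohen's d for the gap between two groups. The sketch below uses invented scores purely for illustration.

```python
# Illustrative sketch: Cohen's d for a between-group achievement gap.
import math

def cohens_d(group_a, group_b):
    """Standardized mean difference using the pooled standard deviation."""
    na, nb = len(group_a), len(group_b)
    ma = sum(group_a) / na
    mb = sum(group_b) / nb
    va = sum((x - ma) ** 2 for x in group_a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in group_b) / (nb - 1)
    pooled = math.sqrt(((na - 1) * va + (nb - 1) * vb) / (na + nb - 2))
    return (ma - mb) / pooled

d = cohens_d([70, 75, 80, 85], [60, 65, 70, 75])  # made-up scores
```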
Menzerath-Altmann Law: Statistical Mechanical Interpretation as Applied to a Linguistic Organization
NASA Astrophysics Data System (ADS)
Eroglu, Sertac
2014-10-01
The distribution behavior described by the empirical Menzerath-Altmann law is frequently encountered during the self-organization of linguistic and non-linguistic natural organizations at various structural levels. This study presents a statistical mechanical derivation of the law based on the analogy between the classical particles of a statistical mechanical organization and the distinct words of a textual organization. The derived model, a transformed (generalized) form of the Menzerath-Altmann model, was termed the statistical mechanical Menzerath-Altmann model. The derived model allows the model parameters to be interpreted in terms of physical concepts. We also propose that many organizations exhibiting Menzerath-Altmann law behavior, whether linguistic or not, can be methodically examined with the transformed distribution model through a properly defined structure-dependent parameter and the associated energy states.
NASA Technical Reports Server (NTRS)
Shewhart, Mark
1991-01-01
Statistical Process Control (SPC) charts are one of several tools used in quality control. Other tools include flow charts, histograms, cause and effect diagrams, check sheets, Pareto diagrams, graphs, and scatter diagrams. A control chart is simply a graph which indicates process variation over time. The purpose of drawing a control chart is to detect any changes in the process signalled by abnormal points or patterns on the graph. The Artificial Intelligence Support Center (AISC) of the Acquisition Logistics Division has developed a hybrid machine learning expert system prototype which automates the process of constructing and interpreting control charts.
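Two common Shewhart control-chart signals of the kind such an expert system might encode are sketched below. These rules (a point beyond the 3-sigma limits, and a run of eight consecutive points on one side of the center line) are standard examples assumed for illustration; the AISC prototype's actual rule set is not given in the abstract.

```python
# Sketch of two standard Shewhart control-chart signal rules.

def control_chart_signals(values, center, sigma):
    """Return (index, rule) pairs for out-of-control signals."""
    signals = []
    side_run = 0
    prev_side = 0
    for i, v in enumerate(values):
        if abs(v - center) > 3 * sigma:
            signals.append((i, "beyond 3-sigma"))
        side = 1 if v > center else -1
        side_run = side_run + 1 if side == prev_side else 1
        prev_side = side
        if side_run == 8:  # flag when a one-sided run first reaches 8
            signals.append((i, "run of 8 on one side"))
    return signals

signals = control_chart_signals(
    [0.2, 4.1, 0.5, -0.1, 0.3, 0.2, 0.4, 0.1, 0.6, 0.2],
    center=0.0, sigma=1.0)
```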
Misuse of statistics in the interpretation of data on low-level radiation
Hamilton, L.D.
1982-01-01
Four misuses of statistics in the interpretation of data of low-level radiation are reviewed: (1) post-hoc analysis and aggregation of data leading to faulty conclusions in the reanalysis of genetic effects of the atomic bomb, and premature conclusions on the Portsmouth Naval Shipyard data; (2) inappropriate adjustment for age and ignoring differences between urban and rural areas leading to potentially spurious increase in incidence of cancer at Rocky Flats; (3) hazard of summary statistics based on ill-conditioned individual rates leading to spurious association between childhood leukemia and fallout in Utah; and (4) the danger of prematurely published preliminary work with inadequate consideration of epidemiological problems - censored data - leading to inappropriate conclusions, needless alarm at the Portsmouth Naval Shipyard, and diversion of scarce research funds.
Two Easily Made Astronomical Telescopes.
ERIC Educational Resources Information Center
Hill, M.; Jacobs, D. J.
1991-01-01
The directions and diagrams for making a reflecting telescope and a refracting telescope are presented. These telescopes can be made by students out of plumbing parts and easily obtainable, inexpensive, optical components. (KR)
Soil VisNIR chemometric performance statistics should be interpreted as random variables
NASA Astrophysics Data System (ADS)
Brown, David J.; Gasch, Caley K.; Poggio, Matteo; Morgan, Cristine L. S.
2015-04-01
Chemometric models are normally evaluated using performance statistics such as the Standard Error of Prediction (SEP) or the Root Mean Squared Error of Prediction (RMSEP). These statistics are used to evaluate the quality of chemometric models relative to other published work on a specific soil property or to compare the results from different processing and modeling techniques (e.g. Partial Least Squares Regression or PLSR and random forest algorithms). Claims are commonly made about the overall success of an application or the relative performance of different modeling approaches assuming that these performance statistics are fixed population parameters. While most researchers would acknowledge that small differences in performance statistics are not important, rarely are performance statistics treated as random variables. Given that we are usually comparing modeling approaches for general application, and given that the intent of VisNIR soil spectroscopy is to apply chemometric calibrations to larger populations than are included in our soil-spectral datasets, it is more appropriate to think of performance statistics as random variables with variation introduced through the selection of samples for inclusion in a given study and through the division of samples into calibration and validation sets (including spiking approaches). Here we look at the variation in VisNIR performance statistics for the following soil-spectra datasets: (1) a diverse US Soil Survey soil-spectral library with 3768 samples from all 50 states and 36 different countries; (2) 389 surface and subsoil samples taken from US Geological Survey continental transects; (3) the Texas Soil Spectral Library (TSSL) with 3000 samples; (4) intact soil core scans of Texas soils with 700 samples; (5) approximately 400 in situ scans from the Pacific Northwest region; and (6) miscellaneous local datasets. We find the variation in performance statistics to be surprisingly large. This has important
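The point about treating performance statistics as random variables can be made concrete with a bootstrap over the validation set (a generic sketch with stand-in numbers, not the soil-spectral datasets above): resampling the validation pairs yields a distribution of RMSEP values rather than a single fixed number.

```python
# Sketch: bootstrap distribution of RMSEP over a validation set.
import math
import random

def rmsep(observed, predicted):
    return math.sqrt(sum((o - p) ** 2 for o, p in zip(observed, predicted))
                     / len(observed))

def bootstrap_rmsep(observed, predicted, n_boot=1000, seed=42):
    rng = random.Random(seed)
    pairs = list(zip(observed, predicted))
    draws = []
    for _ in range(n_boot):
        sample = [rng.choice(pairs) for _ in pairs]
        draws.append(rmsep([o for o, _ in sample], [p for _, p in sample]))
    return draws

obs = [2.1, 3.4, 1.8, 4.0, 2.9, 3.3, 2.5, 3.8]    # validation values (made up)
pred = [2.0, 3.0, 2.0, 4.4, 2.5, 3.5, 2.2, 3.6]   # model predictions (made up)
draws = bootstrap_rmsep(obs, pred)
spread = max(draws) - min(draws)   # nonzero: RMSEP varies across resamples
```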
NASA Astrophysics Data System (ADS)
Jha, Sanjeev Kumar; Comunian, Alessandro; Mariethoz, Gregoire; Kelly, Bryce F. J.
2014-10-01
We develop a stochastic approach to construct channelized 3-D geological models constrained to borehole measurements as well as geological interpretation. The methodology is based on simple 2-D geologist-provided sketches of fluvial depositional elements, which are extruded in the 3rd dimension. Multiple-point geostatistics (MPS) is used to impart horizontal variability to the structures by introducing geometrical transformation parameters. The sketches provided by the geologist are used as elementary training images, whose statistical information is expanded through randomized transformations. We demonstrate the applicability of the approach by applying it to modeling a fluvial valley filling sequence in the Maules Creek catchment, Australia. The facies models are constrained to borehole logs, spatial information borrowed from an analogue and local orientations derived from the present-day stream networks. The connectivity in the 3-D facies models is evaluated using statistical measures and transport simulations. Comparison with a statistically equivalent variogram-based model shows that our approach is more suited for building 3-D facies models that contain structures specific to the channelized environment and which have a significant influence on the transport processes.
NASA Astrophysics Data System (ADS)
Karuppiah, R.; Faldi, A.; Laurenzi, I.; Usadi, A.; Venkatesh, A.
2014-12-01
An increasing number of studies are focused on assessing the environmental footprint of different products and processes, especially using life cycle assessment (LCA). This work shows how combining statistical methods and Geographic Information Systems (GIS) with environmental analyses can help improve the quality of results and their interpretation. Most environmental assessments in the literature yield single numbers that characterize the environmental impact of a process/product - typically global or country averages, often unchanging in time. In this work, we show how statistical analysis and GIS can help address these limitations. For example, we demonstrate a method to separately quantify uncertainty and variability in the results of LCA models using a power generation case study. This is important for rigorous comparisons between the impacts of different processes. Another challenge is the lack of data, which can affect the rigor of LCAs. We have developed an approach to estimate the environmental impacts of incompletely characterized processes using predictive statistical models. This method is applied to estimate unreported coal power plant emissions in several world regions. There is also a general lack of spatio-temporal characterization of results in environmental analyses. For instance, studies that focus on water usage do not put in context where and when the water is withdrawn. Through the use of hydrological modeling combined with GIS, we quantify water stress on a regional and seasonal basis to understand water supply and demand risks for multiple users. Another example where it is important to consider the regional dependency of impacts is when characterizing how agricultural land occupation affects biodiversity in a region. We developed a data-driven methodology used in conjunction with GIS to determine if there is a statistically significant difference between the impacts of growing different crops on different species in various biomes of the world.
Shafieloo, Arman
2012-05-01
By introducing Crossing functions and hyper-parameters I show that the Bayesian interpretation of the Crossing Statistics [1] can be used trivially for the purpose of model selection among cosmological models. In this approach, to falsify a cosmological model there is no need to compare it with other models or to assume any particular parametrization of cosmological quantities such as the luminosity distance, the Hubble parameter or the equation of state of dark energy. Instead, the hyper-parameters of the Crossing functions act as discriminators between correct and wrong models. Using this approach one can falsify any assumed cosmological model without putting priors on the underlying actual model of the universe and its parameters; hence the issue of dark energy parametrization is resolved. It is also shown that the sensitivity of the method to the intrinsic dispersion of the data is small, which is another important characteristic of the method when testing cosmological models against data with high uncertainties.
Parameter Interpretation and Reduction for a Unified Statistical Mechanical Surface Tension Model.
Boyer, Hallie; Wexler, Anthony; Dutcher, Cari S
2015-09-01
Surface properties of aqueous solutions are important for environments as diverse as atmospheric aerosols and biocellular membranes. Previously, we developed a surface tension model for both electrolyte and nonelectrolyte aqueous solutions across the entire solute concentration range (Wexler and Dutcher, J. Phys. Chem. Lett. 2013, 4, 1723-1726). The model differentiated between adsorption of solute molecules in the bulk and surface of solution using the statistical mechanics of multilayer sorption solution model of Dutcher et al. (J. Phys. Chem. A 2013, 117, 3198-3213). The parameters in the model had physicochemical interpretations, but remained largely empirical. In the current work, these parameters are related to solute molecular properties in aqueous solutions. For nonelectrolytes, sorption tendencies suggest a strong relation with molecular size and functional group spacing. For electrolytes, surface adsorption of ions follows ion surface-bulk partitioning calculations by Pegram and Record (J. Phys. Chem. B 2007, 111, 5411-5417). PMID:26275040
Barber, Chris; Cayley, Alex; Hanser, Thierry; Harding, Alex; Heghes, Crina; Vessey, Jonathan D; Werner, Stephane; Weiner, Sandy K; Wichard, Joerg; Giddings, Amanda; Glowienke, Susanne; Parenty, Alexis; Brigo, Alessandro; Spirkl, Hans-Peter; Amberg, Alexander; Kemper, Ray; Greene, Nigel
2016-04-01
The relative wealth of bacterial mutagenicity data available in the public literature means that in silico quantitative/qualitative structure activity relationship (QSAR) systems can readily be built for this endpoint. A good means of evaluating the performance of such systems is to use private unpublished data sets, which generally represent a more distinct chemical space than publicly available test sets and, as a result, provide a greater challenge to the model. However, raw performance metrics should not be the only factor considered when judging this type of software since expert interpretation of the results obtained may allow for further improvements in predictivity. Enough information should be provided by a QSAR to allow the user to make general, scientifically-based arguments in order to assess and overrule predictions when necessary. With all this in mind, we sought to validate the performance of the statistics-based in vitro bacterial mutagenicity prediction system Sarah Nexus (version 1.1) against private test data sets supplied by nine different pharmaceutical companies. The results of these evaluations were then analysed in order to identify findings presented by the model which would be useful for the user to take into consideration when interpreting the results and making their final decision about the mutagenic potential of a given compound. PMID:26708083
NASA Technical Reports Server (NTRS)
Dean, W. T.; Stringer, E. J.
1979-01-01
Crimp-type connectors reduce assembly and disassembly time. With this design, no switch preparation is necessary: socket contacts are crimped to wires, which are inserted in a module attached to the back of the toggle switch, engaging pins inside the module to make the electrical connections. Wires are easily removed with a standard detachment tool. The design can accommodate wires of any gage, and as many terminals can be placed on the switch as the wire gage and switch dimensions will allow.
Design of easily testable systems
Rawat, S.S.
1988-01-01
This thesis presents structured testability techniques that can be applied to systolic arrays. Systolic arrays for signal processing have produced processing rates far in excess of general-purpose architectures. Fast testing is considered one of the design criteria. The main goal is to derive test vectors for one- and two-dimensional systolic arrays. The author seeks to keep the number of test vectors independent of the size of the array under a generic fault model. The testable design is based on pseudo-exhaustive testing. Conventional testing uses Level Sensitive Scan Design (LSSD) techniques, which are very time consuming for an array of systolic processors. By making the testability analysis early, the logic designer will be able to make early (and repeated) design trade-offs that make design for testability a simple extension of the design process. The author shows how one-dimensional sequential systolic arrays can be designed so that faults can be easily detected and isolated. He also considers unilateral two-dimensional sequential arrays and suggests modifications to make them easily testable. Finally, he shows how a modified carry look-ahead adder of arbitrary size can be tested with just 136 test vectors. Comparisons are made against the standard LSSD technique.
ERIC Educational Resources Information Center
Boysen, Guy A.
2015-01-01
Student evaluations of teaching are among the most accepted and important indicators of college teachers' performance. However, faculty and administrators can overinterpret small variations in mean teaching evaluations. The current research examined the effect of including statistical information on the interpretation of teaching evaluations.…
Yu, Victoria; Kishan, Amar U.; Cao, Minsong; Low, Daniel; Lee, Percy; Ruan, Dan
2014-03-15
Purpose: To demonstrate a new method of evaluating dose response of treatment-induced lung radiographic injury post-SBRT (stereotactic body radiotherapy) treatment and the discovery of bimodal dose behavior within clinically identified injury volumes. Methods: Follow-up CT scans at 3, 6, and 12 months were acquired from 24 patients treated with SBRT for stage-1 primary lung cancers or oligometastic lesions. Injury regions in these scans were propagated to the planning CT coordinates by performing deformable registration of the follow-ups to the planning CTs. A bimodal behavior was repeatedly observed from the probability distribution for dose values within the deformed injury regions. Based on a mixture-Gaussian assumption, an Expectation-Maximization (EM) algorithm was used to obtain characteristic parameters for such distribution. Geometric analysis was performed to interpret such parameters and infer the critical dose level that is potentially inductive of post-SBRT lung injury. Results: The Gaussian mixture obtained from the EM algorithm closely approximates the empirical dose histogram within the injury volume with good consistency. The average Kullback-Leibler divergence values between the empirical differential dose volume histogram and the EM-obtained Gaussian mixture distribution were calculated to be 0.069, 0.063, and 0.092 for the 3, 6, and 12 month follow-up groups, respectively. The lower Gaussian component was located at approximately 70% prescription dose (35 Gy) for all three follow-up time points. The higher Gaussian component, contributed by the dose received by planning target volume, was located at around 107% of the prescription dose. Geometrical analysis suggests the mean of the lower Gaussian component, located at 35 Gy, as a possible indicator for a critical dose that induces lung injury after SBRT. Conclusions: An innovative and improved method for analyzing the correspondence between lung radiographic injury and SBRT treatment dose has
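A compact sketch of the kind of fit described in the Methods: a two-component one-dimensional Gaussian-mixture EM run on synthetic "dose" values. The data below are generated, not patient data, and the initialization and stopping rule are simplifying assumptions rather than the authors' exact algorithm.

```python
# Sketch: two-component 1-D Gaussian mixture fit by Expectation-Maximization.
import math
import random

def em_two_gaussians(data, iters=100):
    # crude initialization: means at the data extremes, broad common spread
    mu1, mu2 = min(data), max(data)
    spread = (max(data) - min(data)) / 4.0
    s1 = s2 = spread if spread > 0 else 1.0
    w = 0.5

    def pdf(x, mu, s):
        return math.exp(-0.5 * ((x - mu) / s) ** 2) / (s * math.sqrt(2 * math.pi))

    for _ in range(iters):
        # E-step: responsibility of component 1 for each point
        r = [w * pdf(x, mu1, s1) / (w * pdf(x, mu1, s1) + (1 - w) * pdf(x, mu2, s2))
             for x in data]
        # M-step: reweighted means, spreads, and mixing weight
        n1 = sum(r)
        n2 = len(data) - n1
        mu1 = sum(ri * x for ri, x in zip(r, data)) / n1
        mu2 = sum((1 - ri) * x for ri, x in zip(r, data)) / n2
        s1 = max(math.sqrt(sum(ri * (x - mu1) ** 2 for ri, x in zip(r, data)) / n1), 1e-6)
        s2 = max(math.sqrt(sum((1 - ri) * (x - mu2) ** 2 for ri, x in zip(r, data)) / n2), 1e-6)
        w = n1 / len(data)
    return (mu1, s1), (mu2, s2), w

# synthetic bimodal "doses": modes near 35 Gy and 53 Gy, echoing the abstract
rng = random.Random(0)
doses = ([rng.gauss(35.0, 2.0) for _ in range(200)]
         + [rng.gauss(53.0, 2.0) for _ in range(200)])
low, high, weight = em_two_gaussians(doses)
```

The lower component's mean plays the role of the candidate critical-dose indicator discussed in the abstract.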
A Decision Tree Approach to the Interpretation of Multivariate Statistical Techniques.
ERIC Educational Resources Information Center
Fok, Lillian Y.; And Others
1995-01-01
Discusses the nature, power, and limitations of four multivariate techniques: factor analysis, multiple analysis of variance, multiple regression, and multiple discriminant analysis. Shows how decision trees assist in interpreting results. (SK)
"What If" Analyses: Ways to Interpret Statistical Significance Test Results Using EXCEL or "R"
ERIC Educational Resources Information Center
Ozturk, Elif
2012-01-01
The present paper aims to review two motivations to conduct "what if" analyses using Excel and "R" to understand the statistical significance tests through the sample size context. "What if" analyses can be used to teach students what statistical significance tests really do and in applied research either prospectively to estimate what sample size…
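A minimal version of such a "what if" analysis in plain Python rather than Excel or "R" (a normal approximation to the one-sample test is assumed for brevity; the t-based calculations the papers describe will differ slightly): hold the observed effect size fixed and vary the sample size to see where significance appears.

```python
# Sketch: "what if" analysis - same effect size, varying sample size.
from statistics import NormalDist

def p_value_at_n(effect_size_d, n):
    """Two-sided p-value for a one-sample test, normal approximation."""
    z = effect_size_d * n ** 0.5
    return 2 * (1 - NormalDist().cdf(abs(z)))

# the same modest effect crosses p < .05 purely by raising n
p_small = p_value_at_n(0.2, 25)    # n = 25: not significant
p_large = p_value_at_n(0.2, 400)   # n = 400: highly significant
```

This is exactly the instructional point: the p-value moved while the effect stayed fixed, so the effect size, not the p-value, is the stable quantity to report.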
Boyle temperature as a point of ideal gas in Gentile statistics and its economic interpretation
NASA Astrophysics Data System (ADS)
Maslov, V. P.; Maslova, T. V.
2014-07-01
Boyle temperature is interpreted as the temperature at which the formation of dimers becomes impossible. To Irving Fisher's correspondence principle we assign two more quantities: the number of degrees of freedom, and credit. We determine the danger level of the mass of money M when the mutual trust between economic agents begins to fall.
2014-01-01
Background A new algorithm has been developed to enable the interpretation of black box models. The developed algorithm is agnostic to the learning algorithm and open to all structure-based descriptors such as fragments, keys and hashed fingerprints. The algorithm has provided meaningful interpretation of Ames mutagenicity predictions from both random forest and support vector machine models built on a variety of structural fingerprints. A fragmentation algorithm is utilised to investigate the model’s behaviour on specific substructures present in the query. An output is formulated summarising causes of activation and deactivation. The algorithm is able to identify multiple causes of activation or deactivation, in addition to identifying localised deactivations where the prediction for the query is active overall. No loss in performance is seen, as there is no change in the prediction; the interpretation is produced directly from the model’s behaviour for the specific query. Results Models have been built using multiple learning algorithms, including support vector machine and random forest. The models were built on public Ames mutagenicity data and a variety of fingerprint descriptors were used. These models produced good performance in both internal and external validation, with accuracies around 82%. The models were used to evaluate the interpretation algorithm. The interpretations produced were found to link closely with understood mechanisms for Ames mutagenicity. Conclusion This methodology allows for a greater utilisation of the predictions made by black box models and can expedite further study based on the output of a (quantitative) structure-activity model. Additionally, the algorithm could be utilised for chemical dataset investigation and knowledge extraction/human SAR development. PMID:24661325
MacKinnon, David P.; Pirlott, Angela G.
2016-01-01
Statistical mediation methods provide valuable information about underlying mediating psychological processes, but the ability to infer that the mediator variable causes the outcome variable is more complex than widely known. Researchers have recently emphasized how violating assumptions about confounder bias severely limits causal inference of the mediator to dependent variable relation. Our article describes and addresses these limitations by drawing on new statistical developments in causal mediation analysis. We first review the assumptions underlying causal inference and discuss three ways to examine the effects of confounder bias when assumptions are violated. We then describe four approaches to address the influence of confounding variables and enhance causal inference, including comprehensive structural equation models, instrumental variable methods, principal stratification, and inverse probability weighting. Our goal is to further the adoption of statistical methods to enhance causal inference in mediation studies. PMID:25063043
Statistics Translated: A Step-by-Step Guide to Analyzing and Interpreting Data
ERIC Educational Resources Information Center
Terrell, Steven R.
2012-01-01
Written in a humorous and encouraging style, this text shows how the most common statistical tools can be used to answer interesting real-world questions, presented as mysteries to be solved. Engaging research examples lead the reader through a series of six steps, from identifying a researchable problem to stating a hypothesis, identifying…
Patton, Charles J.; Gilroy, Edward J.
1999-01-01
Data on which this report is based, including nutrient concentrations in synthetic reference samples determined concurrently with those in real samples, are extensive (greater than 20,000 determinations) and have been published separately. In addition to confirming the well-documented instability of nitrite in acidified samples, this study also demonstrates that when biota are removed from samples at collection sites by 0.45-micrometer membrane filtration, subsequent preservation with sulfuric acid or mercury (II) provides no statistically significant improvement in nutrient concentration stability during storage at 4 degrees Celsius for 30 days. Biocide preservation had no statistically significant effect on the 30-day stability of phosphorus concentrations in whole-water splits from any of the 15 stations, but did stabilize Kjeldahl nitrogen concentrations in whole-water splits from three data-collection stations where ammonium accounted for at least half of the measured Kjeldahl nitrogen.
Eide, I; Zahlsen, K
1996-01-01
The paper describes experimental and statistical methods for toxicokinetic evaluation of mixtures in inhalation experiments. Synthetic mixtures of three C9 n-paraffinic, naphthenic and aromatic hydrocarbons (n-nonane, trimethylcyclohexane and trimethylbenzene, respectively) were studied in the rat after inhalation for 12 h. The hydrocarbons were mixed according to principles of statistical experimental design, using mixture design at four vapour levels (75, 150, 300 and 450 ppm) to support an empirical model with linear, interaction and quadratic terms (Taylor polynomial). Immediately after exposure, concentrations of hydrocarbons were measured by head space gas chromatography in blood, brain, liver, kidneys and perirenal fat. Multivariate data analysis and modelling were performed with PLS (projections to latent structures). The best models were obtained after removing all interaction terms, suggesting that there were no interactions between the hydrocarbons with respect to absorption and distribution. Uptake of the paraffins and particularly the aromatics is best described by quadratic models, whereas the uptake of the naphthenic hydrocarbons is nearly linear. All models are good, with high correlation (r2) and prediction properties (Q2), the latter after cross validation. The concentrations of aromatics in blood were high compared to the other hydrocarbons. At concentrations below 250 ppm, the naphthene reached higher concentrations in the brain than the paraffin and the aromatic. Statistical experimental design, multivariate data analysis and modelling have proved useful for the evaluation of synthetic mixtures. The principles may also be used in the design of liquid mixtures, which may be evaporated partially or completely. PMID:8740533
Logical, epistemological and statistical aspects of nature-nurture data interpretation.
Kempthorne, O
1978-03-01
In this paper the nature of the reasoning processes applied to the nature-nurture question is discussed in general and with particular reference to mental and behavioral traits. The nature of data analysis and analysis of variance is discussed. Necessarily, the nature of causation is considered. The notion that mere data analysis can establish "real" causation is attacked. Logic of quantitative genetic theory is reviewed briefly. The idea that heritability is meaningful in the human mental and behavioral arena is attacked. The conclusion is that the heredity-IQ controversy has been a "tale full of sound and fury, signifying nothing". To suppose that one can establish effects of an intervention process when it does not occur in the data is plainly ludicrous. Mere observational studies can easily lead to stupidities, and it is suggested that this has happened in the heredity-IQ arena. The idea that there are racial-genetic differences in mental abilities and behavioral traits of humans is, at best, no more than idle speculation. PMID:637918
Statistical interpretation of joint multiplicity distributions of neutrons and charged particles
NASA Astrophysics Data System (ADS)
Tõke, J.; Agnihotri, D. K.; Skulski, W.; Schröder, W. U.
2001-02-01
Experimental joint multiplicity distributions of neutrons and charged particles provide a striking signal of the characteristic decay processes of nuclear systems following energetic nuclear reactions. They present, therefore, a valuable tool for testing theoretical models for such decay processes. The power of this experimental tool is demonstrated by a comparison of an experimental joint multiplicity distribution to the predictions of different theoretical models of statistical decay of excited nuclear systems. It is shown that, while generally phase-space based models offer a quantitative description of the observed correlation pattern of such an experimental multiplicity distribution, some models of nuclear multifragmentation fail to account for salient features of the observed correlation.
Double precision errors in the logistic map: statistical study and dynamical interpretation.
Oteo, J A; Ros, J
2007-09-01
The nature of the round-off errors that occur in the usual double precision computation of the logistic map is studied in detail. Different iterative regimes from the whole panoply of behaviors exhibited in the bifurcation diagram are examined, histograms of errors in trajectories are given, and for the case of fully developed chaos an explicit formula is found. It is shown that the statistics of the largest double precision error as a function of the map parameter is characterized by jumps whose locations are determined by certain boundary crossings in the bifurcation diagram. Both jumps and locations seem to present geometric convergence characterized by the first two Feigenbaum constants. Finally, a comparison with Benford's law for the distribution of the leading digit in compilations of numbers is discussed. PMID:17930330
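The double-precision divergence described here is easy to reproduce with a minimal sketch (not the authors' code): iterate the logistic map once in ordinary floats and once in 50-digit decimal arithmetic, and record the gap. The parameter value and iteration count below are illustrative assumptions.

```python
from decimal import Decimal, getcontext

def logistic_errors(r=4.0, x0=0.1, n=30):
    """Iterate x -> r*x*(1-x) in double precision and in 50-digit
    decimal arithmetic; return the absolute error at each step."""
    getcontext().prec = 50
    xd = x0                    # double-precision trajectory
    xe = Decimal(str(x0))      # high-precision reference trajectory
    rd = Decimal(str(r))
    errors = []
    for _ in range(n):
        xd = r * xd * (1.0 - xd)
        xe = rd * xe * (1 - xe)
        errors.append(abs(Decimal(repr(xd)) - xe))
    return errors

errs = logistic_errors()
```

At r = 4 (fully developed chaos) the error grows roughly geometrically, consistent with a positive Lyapunov exponent, until it saturates at the size of the attractor.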
Rapp, J.B.
1991-01-01
Q-mode factor analysis was used to quantitate the distribution of the major aliphatic hydrocarbon (n-alkanes, pristane, phytane) systems in sediments from a variety of marine environments. The compositions of the pure end members of the systems were obtained from factor scores and the distribution of the systems within each sample was obtained from factor loadings. All the data, from the diverse environments sampled (estuarine (San Francisco Bay), fresh-water (San Francisco Peninsula), polar-marine (Antarctica) and geothermal-marine (Gorda Ridge) sediments), were reduced to three major systems: a terrestrial system (mostly high molecular weight aliphatics with odd-numbered-carbon predominance), a mature system (mostly low molecular weight aliphatics without predominance) and a system containing mostly high molecular weight aliphatics with even-numbered-carbon predominance. With this statistical approach, it is possible to assign the percentage contribution from various sources to the observed distribution of aliphatic hydrocarbons in each sediment sample. © 1991.
Statistical interpretation of transient current power-law decay in colloidal quantum dot arrays
NASA Astrophysics Data System (ADS)
Sibatov, R. T.
2011-08-01
A new statistical model of charge transport in colloidal quantum dot arrays is proposed. It takes into account the Coulomb blockade forbidding multiple occupancy of nanocrystals and the influence of energetic disorder of the interdot space. The model explains power-law current transients and the presence of the memory effect. A fractional differential analogue of Ohm's law is found phenomenologically for nanocrystal arrays. The model combines ideas that were considered as conflicting by other authors: the Scher-Montroll idea about the power-law distribution of waiting times in localized states for disordered semiconductors is applied taking into account the Coulomb blockade; Novikov's condition about the asymptotic power-law distribution of time intervals between successful current pulses in conduction channels is fulfilled; and the carrier injection blocking predicted by Ginger and Greenham (2000 J. Appl. Phys. 87 1361) takes place.
A Comprehensive Statistically-Based Method to Interpret Real-Time Flowing Measurements
Keita Yoshioka; Pinan Dawkrajai; Analis A. Romero; Ding Zhu; A. D. Hill; Larry W. Lake
2007-01-15
With the recent development of temperature measurement systems, continuous temperature profiles can be obtained with high precision. Small temperature changes can be detected by modern temperature measuring instruments such as the fiber optic distributed temperature sensor (DTS) in intelligent completions, and will potentially aid the diagnosis of downhole flow conditions. In vertical wells, since elevational geothermal changes make the wellbore temperature sensitive to the amount and the type of fluids produced, temperature logs can be used successfully to diagnose downhole flow conditions. In horizontal wells, however, geothermal temperature changes along the wellbore are small, and interpretation of a temperature log becomes difficult. The primary temperature differences for each phase (oil, water, and gas) are caused by frictional effects. Therefore, in developing a thermal model for a horizontal wellbore, subtle temperature changes must be accounted for. In this project, we have rigorously derived governing equations for a producing horizontal wellbore and developed a model that predicts temperature and pressure by coupling the wellbore and reservoir equations. Also, we applied Ramey's model (1962) to the build section and used an energy balance to infer the temperature profile at the junction. The multilateral wellbore temperature model was applied to a wide range of cases with varying fluid thermal properties, absolute values of temperature and pressure, geothermal gradients, flow rates from each lateral, and trajectories of each build section. With the prediction models developed, we present inversion studies of synthetic and field examples. These results are essential to identify water or gas entry, to guide flow control devices in intelligent completions, and to decide whether reservoir stimulation is needed in particular horizontal sections. This study will complete and validate these inversion studies.
A Comprehensive Statistically-Based Method to Interpret Real-Time Flowing Measurements
Pinan Dawkrajai; Keita Yoshioka; Analis A. Romero; Ding Zhu; A.D. Hill; Larry W. Lake
2005-10-01
This project is motivated by the increasing use of distributed temperature sensors for real-time monitoring of complex wells (horizontal, multilateral and multi-branching wells) to infer the profiles of oil, gas, and water entry. Measured information can be used to interpret flow profiles along the wellbore, including the junction and build section. In this second project year, we have completed a forward model to predict temperature and pressure profiles in complex wells. As a comprehensive temperature model, we have developed an analytical reservoir flow model that takes into account Joule-Thomson effects in the near-well vicinity and a multiphase non-isothermal producing-wellbore model, and coupled these models, accounting for mass and heat transfer between them. For further inferences such as water coning or gas evaporation, we will need a numerical non-isothermal reservoir simulator; unlike existing (thermal recovery, geothermal) simulators, it should capture the subtle temperature changes occurring during normal production. We will show results from the analytical coupled model (analytical reservoir solution coupled with a numerical multi-segment well model) used to infer anomalous temperature or pressure profiles under various conditions, and preliminary results from the numerical coupled reservoir model, which solves the full matrix including wellbore grids. We applied Ramey's model to the build section and used an enthalpy balance to infer the temperature profile at the junction. The multilateral wellbore temperature model was applied to a wide range of cases varying fluid thermal properties, absolute values of temperature and pressure, geothermal gradients, flow rates from each lateral, and the trajectories of each build section.
A COMPREHENSIVE STATISTICALLY-BASED METHOD TO INTERPRET REAL-TIME FLOWING MEASUREMENTS
Pinan Dawkrajai; Analis A. Romero; Keita Yoshioka; Ding Zhu; A.D. Hill; Larry W. Lake
2004-10-01
In this project, we are developing new methods for interpreting measurements in complex wells (horizontal, multilateral and multi-branching wells) to determine the profiles of oil, gas, and water entry. These methods are needed to take full advantage of ''smart'' well instrumentation, a technology that is rapidly evolving to provide the ability to continuously and permanently monitor downhole temperature, pressure, volumetric flow rate, and perhaps other fluid flow properties at many locations along a wellbore; and hence, to control and optimize well performance. In this first year, we have made considerable progress in the development of the forward model of temperature and pressure behavior in complex wells. In this period, we have progressed on three major parts of the forward problem of predicting the temperature and pressure behavior in complex wells. These three parts are the temperature and pressure behaviors in the reservoir near the wellbore, in the wellbore or laterals in the producing intervals, and in the build sections connecting the laterals, respectively. Many models exist to predict pressure behavior in reservoirs and wells, but these are almost always isothermal models. To predict temperature behavior we derived general mass, momentum, and energy balance equations for these parts of the complex well system. Analytical solutions for the reservoir and wellbore parts for certain special conditions show the magnitude of thermal effects that could occur. Our preliminary sensitivity analyses show that thermal effects caused by near-wellbore reservoir flow can cause temperature changes that are measurable with smart well technology. This is encouraging for the further development of the inverse model.
Statistical information of ASAR observations over wetland areas: An interaction model interpretation
NASA Astrophysics Data System (ADS)
Grings, F.; Salvia, M.; Karszenbaum, H.; Ferrazzoli, P.; Perna, P.; Barber, M.; Jacobo Berlles, J.
2010-01-01
This paper presents the results obtained after studying the relation between the statistical parameters that describe the backscattering distribution of junco marshes and their biophysical variables. The results are based on texture analysis of a time series of Envisat ASAR C-band data (APP mode, VV+HH polarizations) acquired between October 2003 and January 2005 over the Lower Paraná River Delta, Argentina. The image power distributions were analyzed, and we show that the K distribution provides a good fit to SAR data extracted from wetland observations for both polarizations. We also show that the estimated values of the order parameter of the K distribution can be explained using fieldwork and reasonable assumptions. To explore these results, we introduce a radiative transfer based interaction model to simulate the junco marsh σ0 distribution. After analyzing model simulations, we found evidence that the order parameter is related to the junco plant density distribution inside the junco marsh patch. It is concluded that the order parameter of the K distribution could be a useful parameter for estimating junco plant density. This result is important for basin hydrodynamic modeling, since marsh plant density is the most important parameter for estimating marsh water conductance.
Hysteresis model and statistical interpretation of energy losses in non-oriented steels
NASA Astrophysics Data System (ADS)
Mănescu (Păltânea), Veronica; Păltânea, Gheorghe; Gavrilă, Horia
2016-04-01
In this paper the hysteresis energy losses in two non-oriented industrial steels (M400-65A and M800-65A) were determined by means of an efficient classical Preisach model, based on the Pescetti-Biorci method for the identification of the Preisach density. The excess and the total energy losses were also determined, using a statistical framework based on magnetic object theory. The hysteresis energy losses in a non-oriented steel alloy depend on the peak magnetic polarization, and they can be computed using a Preisach model, due to the fact that in these materials there is a direct link between the elementary rectangular loops and the discontinuous character of the magnetization process (Barkhausen jumps). To determine the Preisach density it was necessary to measure the normal magnetization curve and the saturation hysteresis cycle. A system of equations was deduced and the Preisach density was calculated for a magnetic polarization of 1.5 T; the hysteresis cycle was then reconstructed. Using the same Preisach distribution, the hysteresis cycle for 1 T was computed. The classical losses were calculated using a well-known formula and the excess energy losses were determined by means of the magnetic object theory. The total energy losses were mathematically reconstructed and compared with those measured experimentally.
NASA Astrophysics Data System (ADS)
Bouzid, Mohamed; Sellaoui, Lotfi; Khalfaoui, Mohamed; Belmabrouk, Hafedh; Lamine, Abdelmottaleb Ben
2016-02-01
In this work, we studied the adsorption of ethanol on three types of activated carbon, namely parent Maxsorb III and two chemically modified activated carbons (H2-Maxsorb III and KOH-H2-Maxsorb III). This investigation was conducted on the basis of the grand canonical formalism of statistical physics and simplifying assumptions, leading to a three-parameter equation describing the adsorption of ethanol onto the three types of activated carbon. There was good correlation between the experimental data and the results obtained with the newly proposed equation. The parameters characterizing the adsorption isotherm were the number of adsorbed molecules per site n, the density of receptor sites per unit mass of adsorbent Nm, and the energetic parameter p1/2. They were estimated for the studied systems by nonlinear least-squares regression. The results show that the ethanol molecules were adsorbed in a perpendicular (or non-parallel) position to the adsorbent surface. The magnitude of the calculated adsorption energies reveals that ethanol is physisorbed onto activated carbon; both van der Waals and hydrogen interactions were involved in the adsorption process. The calculated values of the specific surface AS prove that the three types of activated carbon have a highly microporous surface.
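The abstract names the three parameters but not the equation itself; a commonly used monolayer form in this statistical-physics framework is Q(p) = n·Nm/(1 + (p1/2/p)^n), which is assumed here as a sketch. All numeric parameter values below are hypothetical, not fitted values from the paper.

```python
def adsorbed_quantity(p, n, n_m, p_half):
    """Assumed three-parameter isotherm: Q(p) = n * Nm / (1 + (p_half/p)**n)."""
    return n * n_m / (1.0 + (p_half / p) ** n)

# Hypothetical parameters: n molecules per site, Nm sites per gram, p_half in kPa.
n, n_m, p_half = 1.5, 4.0, 2.0
isotherm = [(p, adsorbed_quantity(p, n, n_m, p_half)) for p in (0.5, 2.0, 8.0)]
# At p = p_half the receptor sites are half saturated, so Q = n * Nm / 2.
```

The exponent n is what encodes the molecules-per-site geometry discussed in the abstract: n > 1 steepens the isotherm, n < 1 flattens it.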
Tsitouridou, Roxani; Papazova, Petia; Simeonova, Pavlina; Simeonov, Vasil
2013-01-01
The size distribution of aerosol particles (PM0.015-PM18) in relation to their soluble inorganic species and total water soluble organic compounds (WSOC) was investigated at an urban site of Thessaloniki, Northern Greece. The sampling period was from February to July 2007. The determined compounds were compared with mass concentrations of the PM fractions for nano (N: 0.015 < Dp < 0.06), ultrafine (UFP: 0.015 < Dp < 0.125), fine (FP: 0.015 < Dp < 2.0) and coarse particles (CP: 2.0 < Dp < 8.0) in order to perform mass closure of the water soluble content for the respective fractions. Electrolytes were the dominant species in all fractions (24-27%), followed by WSOC (16-23%). The water soluble inorganic and organic content was found to account for 53% of the nanoparticle, 48% of the ultrafine particle, 45% of the fine particle and 44% of the coarse particle mass. Correlations between the analyzed species were performed and the effect of local and long-range transported emissions was examined by wind direction and backward air mass trajectories. Multivariate statistical analysis (cluster analysis and principal components analysis) of the collected data was performed in order to reveal the specific data structure. Possible sources of air pollution were identified and an attempt is made to find patterns of similarity between the different sized aerosols and the seasons of monitoring. It was proven that several major latent factors are responsible for the data structure despite the size of the aerosols - mineral (soil) dust, sea sprays, secondary emissions, combustion sources and industrial impact. The seasonal separation proved to be not very specific. PMID:24007436
NASA Astrophysics Data System (ADS)
Lee, J.; Chang, H.
2001-12-01
In this research, we investigate the reciprocal influence between groundwater flow and its salinization at two underground cavern sites, using major ion chemistry, PCA of the chemical analysis data, and cross-correlation of various hydraulic data. The study areas are two underground LPG storage facilities constructed in the South Sea coastal region (Yosu) and the West Sea coastal region (Pyeongtaek) of Korea. Considerably high concentrations of major cations and anions showed that groundwaters at both sites were of brackish or saline water types. At the Yosu site, a marked chemical difference between groundwater samples from the rainy and dry seasons was caused by temporal intrusion of high-saline water into the propane and butane cavern zone; this was not observed at the Pyeongtaek site. Cl/Br ratios and the δ18O-δD distribution, used to trace the source water of salinization at both sites, revealed that two kinds of saline water (seawater and halite-dissolved solution) could influence groundwater salinization at the Yosu site, whereas only seawater intrusion could affect the groundwater chemistry of the observation wells at the Pyeongtaek site. PCA performed with 8 and 10 chemical ions as statistical variables at the two sites showed that intensive intrusion of seawater through the butane cavern occurred at the Yosu site, while seawater-groundwater mixing was observed at some observation wells located in the marginal part of the Pyeongtaek site. Cross-correlation results revealed that the positive relationship between hydraulic head and cavern operating pressure was far more conspicuous in the propane cavern zone at both sites (correlation coefficients of 65-90%). According to the cross-correlation results for the Yosu site, a small change of head could provoke massive influx of halite-dissolved solution from the surface through vertically developed fracture networks. At the Pyeongtaek site, however, the pressure-sensitive observation wells are not completely consistent with the seawater-mixed wells, and the hydraulic change of heads at these wells related to the
NASA Astrophysics Data System (ADS)
Dralle, D.; Karst, N.; Thompson, S. E.
2015-12-01
Multiple competing theories suggest that power law behavior governs the observed first-order dynamics of streamflow recessions - the important process by which catchments dry out via the stream network, altering the availability of surface water resources and in-stream habitat. Frequently modeled as dq/dt = -aq^b, recessions typically exhibit a high degree of variability, even within a single catchment, as revealed by significant shifts in the values of "a" and "b" across recession events. One potential source of this variability lies in underlying, hard-to-observe fluctuations in how catchment water storage is partitioned amongst distinct storage elements, each having different discharge behaviors. Testing this and competing hypotheses with widely available streamflow timeseries, however, has been hindered by a power law scaling artifact that obscures meaningful covariation between the recession parameters "a" and "b". Here we briefly outline a technique that removes this artifact, revealing intriguing new patterns in the joint distribution of recession parameters. Using long-term flow data from catchments in Northern California, we explore temporal variations, and find that the "a" parameter varies strongly with catchment wetness. Then we explore how the "b" parameter changes with "a", and find that measures of its variation are maximized at intermediate "a" values. We propose an interpretation of this pattern based on statistical mechanics, in which "b" can be viewed as an indicator of the catchment "microstate" - i.e., the partitioning of storage - and "a" as a measure of the catchment macrostate (i.e., the total storage). In statistical mechanics, entropy (i.e., microstate variance, that is, the variance of "b") is maximized for intermediate values of extensive variables (i.e., wetness, "a"), as observed in the recession data. This interpretation of "a" and "b" was supported by model runs using a multiple-reservoir catchment toy model, and lends support to the
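The recession relation dq/dt = -aq^b can be estimated from data by the standard log-log regression of -dq/dt against q (a Brutsaert-Nieber style fit, not the authors' artifact-removal technique). A minimal sketch on a synthetic recession:

```python
import math

# Synthetic recession generated by explicit Euler steps of dq/dt = -a * q**b.
a_true, b_true, dt = 0.1, 1.5, 0.1
q = [10.0]
for _ in range(200):
    q.append(q[-1] - a_true * q[-1] ** b_true * dt)

# Regress log(-dq/dt) on log(q), using the left endpoint to match the Euler scheme.
xs = [math.log(v) for v in q[:-1]]
ys = [math.log((q1 - q2) / dt) for q1, q2 in zip(q, q[1:])]  # (q1-q2)/dt = -dq/dt > 0
mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
b_est = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))     # slope -> b
a_est = math.exp(my - b_est * mx)              # exp(intercept) -> a
```

Because the synthetic series is generated by the same left-endpoint scheme, the regression recovers a and b essentially exactly; on real streamflow data the points scatter, which is where the scaling artifact discussed in the abstract enters.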
An Easily Constructed Trigonal Prism Model.
ERIC Educational Resources Information Center
Yamana, Shukichi
1984-01-01
A model of a trigonal prism which is useful for teaching stereochemistry (especially of the neodymium enneahydrate ion), can be made easily by using a sealed, empty envelope. The steps necessary to accomplish this task are presented. (JN)
Huang, Ruili; Southall, Noel; Xia, Menghang; Cho, Ming-Hsuang; Jadhav, Ajit; Nguyen, Dac-Trung; Inglese, James; Tice, Raymond R.; Austin, Christopher P.
2009-01-01
In support of the U.S. Tox21 program, we have developed a simple and chemically intuitive model we call weighted feature significance (WFS) to predict the toxicological activity of compounds, based on the statistical enrichment of structural features in toxic compounds. We trained and tested the model on the following: (1) data from quantitative high–throughput screening cytotoxicity and caspase activation assays conducted at the National Institutes of Health Chemical Genomics Center, (2) data from Salmonella typhimurium reverse mutagenicity assays conducted by the U.S. National Toxicology Program, and (3) hepatotoxicity data published in the Registry of Toxic Effects of Chemical Substances. Enrichments of structural features in toxic compounds are evaluated for their statistical significance and compiled into a simple additive model of toxicity and then used to score new compounds for potential toxicity. The predictive power of the model for cytotoxicity was validated using an independent set of compounds from the U.S. Environmental Protection Agency tested also at the National Institutes of Health Chemical Genomics Center. We compared the performance of our WFS approach with classical classification methods such as Naive Bayesian clustering and support vector machines. In most test cases, WFS showed similar or slightly better predictive power, especially in the prediction of hepatotoxic compounds, where WFS appeared to have the best performance among the three methods. The new algorithm has the important advantages of simplicity, power, interpretability, and ease of implementation. PMID:19805409
Electronic modules easily separated from heat sink
NASA Technical Reports Server (NTRS)
1965-01-01
Metal heat sink and electronic modules bonded to a thermal bridge can be easily cleaved for removal of the modules for replacement or repair. A thin film of grease between a fluorocarbon polymer film on the metal heat sink and an adhesive film on the modules acts as the cleavage plane.
Links to sources of cancer-related statistics, including the Surveillance, Epidemiology and End Results (SEER) Program, SEER-Medicare datasets, cancer survivor prevalence data, and the Cancer Trends Progress Report.
ERIC Educational Resources Information Center
Barner, David; Snedeker, Jesse
2008-01-01
Four experiments investigated 4-year-olds' understanding of adjective-noun compositionality and their sensitivity to statistics when interpreting scalar adjectives. In Experiments 1 and 2, children selected "tall" and "short" items from 9 novel objects called "pimwits" (1-9 in. in height) or from this array plus 4 taller or shorter distractor…
ACECARD. Acquire Commodities Easily Card
Soler, E.E.
1996-09-01
Acquire Commodities Easily Card (AceCard) provides an automated end-user method to distribute company credit card charges to internal charge numbers. AceCard will allow cardholders to record card purchases in an on-line order log, enter multiple account distributions per order that can be posted to the General Ledger, track orders, and receipt information, and provide a variety of cardholder and administrative reports. Please note: Customers must contact Ed Soler (423)-576-6151, Lockheed Martin Energy Systems, for help with the installation of the package. The fee for this installation help will be coordinated by the customer and Lockheed Martin and is in addition to cost of the package from ESTSC. Customers should contact Sandy Presley (423)-576-4708 for user help.
Acquire Commodities Easily Card
Soler, E. E.
1998-05-29
Acquire Commodities Easily Card (AceCard) provides an automated end-user method to distribute company credit card charges to internal charge numbers. AceCard will allow cardholders to record card purchases in an on-line order log, enter multiple account distributions per order that can be posted to the General Ledger, track orders, and receipt information, and provide a variety of cardholder and administrative reports. Please note: Customers must contact Ed Soler (423)-576-6151, Lockheed Martin Energy Systems, for help with the installation of the package. The fee for this installation help will be coordinated by the customer and Lockheed Martin and is in addition to cost of the package from ESTSC. Customers should contact Sandy Presley (423)-576-4708 for user help.
Quantum of area ΔA = 8πl_P² and a statistical interpretation of black hole entropy
Ropotenko, Kostiantyn
2010-08-15
In contrast to alternative values, the quantum of area ΔA = 8πl_P² does not follow from the usual statistical interpretation of black hole entropy; on the contrary, a statistical interpretation follows from it. This interpretation is based on two concepts: nonadditivity of black hole entropy and Landau quantization. Using nonadditivity, a microcanonical distribution for a black hole is found and it is shown that the statistical weight of a black hole should be proportional to its area. By analogy with conventional Landau quantization, it is shown that quantization of a black hole is nothing but Landau quantization. The Landau levels of a black hole and their degeneracy are found. The degree of degeneracy is equal to the number of ways to distribute a patch of area 8πl_P² over the horizon. Taking these results into account, it is argued that the black hole entropy should be of the form S_bh = 2π·ΔΓ, where the number of microstates is ΔΓ = A/(8πl_P²). The nature of the degrees of freedom responsible for black hole entropy is elucidated. Applications of the new interpretation are presented. The effect of noncommuting coordinates is discussed.
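The counting above reduces to simple arithmetic: with ΔΓ = A/(8πl_P²) and S = 2π·ΔΓ, the entropy coincides identically with the Bekenstein-Hawking value A/(4l_P²). A quick numerical check for a solar-mass black hole (a sketch using standard rounded constants, not values from the paper):

```python
import math

G, c, hbar = 6.674e-11, 2.998e8, 1.055e-34   # SI units (rounded)
l_p2 = hbar * G / c ** 3                     # Planck length squared, ~2.6e-70 m^2

M = 1.989e30                                 # one solar mass, kg
r_s = 2 * G * M / c ** 2                     # Schwarzschild radius, ~3 km
A = 4 * math.pi * r_s ** 2                   # horizon area, m^2

n_micro = A / (8 * math.pi * l_p2)           # number of microstates, ΔΓ = A/(8π l_P²)
S = 2 * math.pi * n_micro                    # entropy S = 2π ΔΓ, in units of k_B
S_bh = A / (4 * l_p2)                        # Bekenstein-Hawking entropy for comparison
```

For a solar mass this gives S on the order of 10^77 k_B, and S equals S_bh by construction, since 2π · A/(8πl_P²) = A/(4l_P²).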
Easily Missed Fractures in the Lower Extremity.
Yu, Joseph S
2015-07-01
As long as radiography remains cheap and provides value in patient care, it will continue to be widely used as a front-line imaging technique. There are limitations to what a radiograph can depict, however. It is imperative to understand the limitations of radiography to avoid pitfalls owing to the overlap of numerous osseous structures. This article reminds the reader of the association between certain radiographic abnormalities and the anatomic relevance in the patient. Although interpretive errors occur in fast-paced, high-volume emergency settings, meticulous attention to changes in the cortex and medullary bone may help to keep errors to a minimum. PMID:26046508
Easily retrievable objects among the NEO population
NASA Astrophysics Data System (ADS)
García Yárnoz, D.; Sanchez, J. P.; McInnes, C. R.
2013-08-01
Asteroids and comets are of strategic importance for science in an effort to understand the formation, evolution and composition of the Solar System. Near-Earth Objects (NEOs) are of particular interest because of their accessibility from Earth, but also because of their speculated wealth of material resources. The exploitation of these resources has long been discussed as a means to lower the cost of future space endeavours. In this paper, we consider the currently known NEO population and define a family of so-called Easily Retrievable Objects (EROs), objects that can be transported from accessible heliocentric orbits into the Earth's neighbourhood at affordable costs. The asteroid retrieval transfers are sought from the continuum of low energy transfers enabled by the dynamics of invariant manifolds; specifically, the retrieval transfers target planar, vertical Lyapunov and halo orbit families associated with the collinear equilibrium points of the Sun-Earth Circular Restricted Three Body problem. The judicious use of these dynamical features provides the best opportunity to find extremely low energy Earth transfers for asteroid material. A catalogue of asteroid retrieval candidates is then presented. Despite the highly incomplete census of very small asteroids, the ERO catalogue can already be populated with 12 different objects retrievable with less than 500 m/s of Δv. Moreover, the approach proposed represents a robust search and ranking methodology for future retrieval candidates that can be automatically applied to the growing survey of NEOs.
ERIC Educational Resources Information Center
Cruce, Ty M.
2009-01-01
This methodological note illustrates how a commonly used calculation of the Delta-p statistic is inappropriate for categorical independent variables, and this note provides users of logistic regression with a revised calculation of the Delta-p statistic that is more meaningful when studying the differences in the predicted probability of an…
NASA Astrophysics Data System (ADS)
Irving, J.; Knight, R.; Holliger, K.
2007-12-01
The distribution of subsurface water content can be an excellent indicator of soil texture, which strongly influences the unsaturated hydraulic properties controlling vadose zone contaminant transport. Characterizing the heterogeneity in subsurface water content for use in numerical transport models, however, is an extremely difficult task as conventional hydrological measurement techniques do not offer the combined high spatial resolution and coverage required for accurate simulations. A number of recent studies have shown that ground-penetrating radar (GPR) reflection images may contain useful information regarding the statistical structure of subsurface water content. Comparisons of the horizontal correlation structures of radar images and those obtained from water content measurements have shown that, in some cases, the statistical characteristics are remarkably similar. However, a key issue in these studies is that a reflection GPR image is primarily related to changes in subsurface water content, and not the water content distribution directly. As a result, statistics gathered on the reflection image have a very complex relationship with the statistics of the underlying water content distribution, this relationship depending on a number of factors including the frequency of the GPR antennas used. In this work, we attempt to address the above issue by posing the estimation of the statistical structure of water content from reflection GPR data as an inverse problem. Using a simple convolution model for a radar image, we first derive a forward model relating the statistical structure of a radar image to that of the underlying water content distribution. We then use this forward model to invert for the spatial statistics of the water content distribution, given the spatial statistics of the GPR reflection image as data. We do this within a framework of uncertainty, such that realistic statistical bounds can be placed on the information that is inferred. In other
Asfahani, Jamal
2014-02-01
A factor analysis technique is proposed in this research for interpreting the combination of nuclear well logs (natural gamma ray, density, and neutron porosity) and electrical well logs (long and short normal) in order to characterize the large extended basaltic areas of southern Syria. Kodana well logging data are used for testing and applying the proposed technique. The four resulting score logs make it possible to establish a lithological score cross-section of the studied well. The established cross-section clearly shows the distribution and identification of four kinds of basalt: hard massive basalt, hard basalt, pyroclastic basalt, and clay, the alteration product of basalt. The factor analysis technique was successfully applied to the Kodana well logging data in southern Syria and can be used efficiently when several wells and large volumes of well logging data with a high number of variables need to be interpreted. PMID:24296157
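As a rough illustration of the score-log idea (not the authors' actual data, factor count, or software), a rotated factor analysis can be run on synthetic stand-ins for the five logs; the per-depth factor scores then play the role of the score logs used to build the lithological cross-section.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(1)

# Synthetic stand-ins for the five logs: natural gamma, bulk density,
# neutron porosity, long normal, short normal (300 depth points)
depth_pts = 300
latent = rng.standard_normal((depth_pts, 2))       # two hidden lithology factors
loadings = np.array([[0.9, 0.1],    # gamma
                     [-0.8, 0.3],   # density
                     [0.7, -0.5],   # neutron porosity
                     [0.1, 0.9],    # long normal
                     [0.2, 0.8]])   # short normal
logs = latent @ loadings.T + 0.3 * rng.standard_normal((depth_pts, 5))

# Four score logs, one row per depth, as in the abstract
fa = FactorAnalysis(n_components=4, rotation="varimax")
scores = fa.fit_transform(logs)
print(scores.shape)
```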
ERIC Educational Resources Information Center
Mickler, J. Ernest
This 60th annual report on collegiate enrollments in the United States is based on data received from 1,635 four-year institutions in the U.S., Puerto Rico, and the U.S. Territories. General notes, survey methodology notes, and a summary of findings are presented. Detailed statistical charts present institutional data on men and women students and…
Pomeau, Yves; Louët, Sabine
2016-06-01
During the StatPhys Conference on 20th July 2016 in Lyon, France, Yves Pomeau and Daan Frenkel will be awarded the most important prize in the field of Statistical Mechanics: the 2016 Boltzmann Medal, named after the Austrian physicist and philosopher Ludwig Boltzmann. The award recognises Pomeau's key contributions to the Statistical Physics of non-equilibrium phenomena in general and, in particular, his role in developing our modern understanding of fluid mechanics, instabilities, pattern formation and chaos. He is recognised as an outstanding theorist bridging disciplines from applied mathematics to statistical physics, with a profound impact on the neighbouring fields of turbulence and mechanics. In this article Sabine Louët interviews Pomeau, who is an Editor of the European Physical Journal Special Topics. He shares his views and tells how he experienced the rise of Statistical Mechanics over the past few decades. He also touches upon the need to provide funding to people who have the rare ability to discover new things and ideas, and not just those who are good at filling in grant application forms. PMID:27349556
Bruni, Aline Thaís; Velho, Jesus Antonio; Ferreira, Arthur Serra Lopes; Tasso, Maria Júlia; Ferrari, Raíssa Santos; Yoshida, Ricardo Luís; Dias, Marcos Salvador; Leite, Vitor Barbanti Pereira
2014-08-01
This study uses statistical techniques to evaluate reports on suicide scenes; it utilizes 80 reports from different locations in Brazil, randomly collected from both federal and state jurisdictions. We aimed to assess a heterogeneous group of cases in order to obtain an overall perspective of the problem. We evaluated variables regarding the characteristics of the crime scene, such as the traces that were detected (blood, instruments and clothes), and we addressed the methodology employed by the experts. A qualitative approach using basic statistics revealed a wide variation in how the issue was addressed in the documents. We then examined a quantitative approach involving an empirical equation and used multivariate procedures to validate the quantitative methodology proposed for this equation. The methodology successfully identified the main differences in the information presented in the reports, showing that there is no standardized method of analyzing evidence. PMID:25066170
Julià, Olga; Vidal-Mas, Jaume; Panikov, Nicolai S.; Vives-Rego, Josep
2010-01-01
We report a skew-Laplace statistical analysis of both flow cytometry scatters and cell size from microbial strains primarily grown in batch cultures, others in chemostat cultures and bacterial aquatic populations. Cytometry scatters best fit the skew-Laplace distribution while cell size as assessed by an electronic particle analyzer exhibited a moderate fitting. Unlike the cultures, the aquatic bacterial communities clearly do not fit to a skew-Laplace distribution. Due to its versatile nature, the skew-Laplace distribution approach offers an easy, efficient, and powerful tool for distribution of frequency analysis in tandem with the flow cytometric cell sorting. PMID:20592754
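A hedged sketch of fitting a skew-Laplace distribution by maximum likelihood: the parameterization below (location mu with separate left/right tail scales a and b, i.e. the asymmetric Laplace form) is an assumption of this sketch, and the data are synthetic stand-ins for cytometry scatter values, not the study's measurements.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(2)

# Negative log-likelihood of the skew-(asymmetric-)Laplace distribution:
# f(x) = exp((x-mu)/a)/(a+b) for x < mu, exp(-(x-mu)/b)/(a+b) for x >= mu
def neg_loglik(params, x):
    mu, a, b = params
    if a <= 0 or b <= 0:
        return np.inf
    z = x - mu
    logf = np.where(z < 0, z / a, -z / b) - np.log(a + b)
    return -logf.sum()

# Synthetic "scatter" sample: an asymmetric Laplace variate is the
# difference of two exponentials (true mu=10, a=0.5, b=2.0)
x = 10.0 + rng.exponential(2.0, 10000) - rng.exponential(0.5, 10000)

res = minimize(neg_loglik, x0=[np.median(x), 1.0, 1.0], args=(x,),
               method="Nelder-Mead")
mu_hat, a_hat, b_hat = res.x
print(f"mu={mu_hat:.2f}  a={a_hat:.2f}  b={b_hat:.2f}")
```

Nelder-Mead is used because the likelihood is not differentiable at mu; the fitted scales recover the asymmetry of the two tails.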
Nash, J. Thomas; Frishman, David
1983-01-01
Analytical results for 61 elements in 370 samples from the Ranger Mine area are reported. Most of the rocks come from drill core in the Ranger No. 1 and Ranger No. 3 deposits, but 20 samples are from unmineralized drill core more than 1 km from ore. Statistical tests show that the elements Mg, Fe, F, Be, Co, Li, Ni, Pb, Sc, Th, Ti, V, Cl, As, Br, Au, Ce, Dy, La, Sc, Eu, Tb, Yb, and Tb have positive association with uranium, and Si, Ca, Na, K, Sr, Ba, Ce, and Cs have negative association. For most lithologic subsets Mg, Fe, Li, Cr, Ni, Pb, V, Y, Sm, Sc, Eu, and Yb are significantly enriched in ore-bearing rocks, whereas Ca, Na, K, Sr, Ba, Mn, Ce, and Cs are significantly depleted. These results are consistent with petrographic observations on altered rocks. Lithogeochemistry can aid exploration, but for these rocks requires methods that are expensive and not amenable to routine use.
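Positive/negative association tests of this kind can be illustrated with rank correlation on invented lognormal stand-ins; the element names in the comments are only examples drawn from the lists above, and none of the numbers reflect the Ranger Mine data.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(3)

# Hypothetical stand-in data: 370 samples, uranium plus two trace elements,
# one built to co-vary with U, one built to vary against it
u = rng.lognormal(2.0, 1.0, 370)
v = u * rng.lognormal(0.0, 0.3, 370)           # positive association (e.g. V)
ca = rng.lognormal(3.0, 0.5, 370) / u**0.3     # negative association (e.g. Ca)

for name, el in [("V", v), ("Ca", ca)]:
    rho, p = spearmanr(u, el)
    print(f"{name}: rho={rho:+.2f}  p={p:.1e}")
```

Spearman's rho is a natural choice for skewed geochemical concentrations since it is invariant to monotone transforms (log-concentration vs. concentration gives the same rank correlation).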
Kim, Kyoung-Ho; Yun, Seong-Taek; Choi, Byoung-Young; Chae, Gi-Tak; Joo, Yongsung; Kim, Kangjoo; Kim, Hyoung-Soo
2009-07-21
Hydrochemical and multivariate statistical interpretations of 16 physicochemical parameters of 45 groundwater samples from a riverside alluvial aquifer underneath an agricultural area in Osong, central Korea, were performed in this study to understand the spatial controls of nitrate concentrations in terms of biogeochemical processes occurring near oxbow lakes within a fluvial plain. Nitrate concentrations in groundwater showed a large variability from 0.1 to 190.6 mg/L (mean=35.0 mg/L) with significantly lower values near oxbow lakes. The evaluation of hydrochemical data indicated that the groundwater chemistry (especially, the degree of nitrate contamination) is mainly controlled by two competing processes: 1) agricultural contamination and 2) redox processes. In addition, results of factorial kriging, consisting of two steps (i.e., co-regionalization and factor analysis), reliably showed a spatial control of the concentrations of nitrate and other redox-sensitive species; in particular, significant denitrification was observed only near oxbow lakes. The results of this study indicate that sub-oxic conditions in an alluvial groundwater system are developed geologically and geochemically in and near oxbow lakes, which can effectively enhance the natural attenuation of nitrate before the groundwater discharges to nearby streams. This study also demonstrates the usefulness of multivariate statistical analysis in groundwater study as a supplementary tool for interpretation of complex hydrochemical data sets. PMID:19524319
Fadda, Valeria; Maratea, Dario; Trippoli, Sabrina; Gatto, Roberta; De Rosa, Mauro; Marinai, Claudio
2014-01-01
Background: No equivalence analysis has yet been conducted on the effectiveness of biologics in rheumatoid arthritis. Equivalence testing has a specific scientific interest, but can also be useful for deciding whether acquisition tenders are feasible for the pharmacological agents being compared. Methods: Our search covered the literature up to August 2014. Our methodology was a combination of standard pairwise meta-analysis, Bayesian network meta-analysis and equivalence testing. The agents examined for their potential equivalence were etanercept, adalimumab, golimumab, certolizumab, and tocilizumab, each in combination with methotrexate (MTX). The reference treatment was MTX monotherapy. The endpoint was ACR50 achievement at 12 months. Odds ratio was the outcome measure. The equivalence margins were established by analyzing the statistical power data of the trials. Results: Our search identified seven randomized controlled trials (2846 patients). No study was retrieved for tocilizumab, and so only four biologics were evaluable. The equivalence range was set at odds ratio from 0.56 to 1.78. There were 10 head-to-head comparisons (4 direct, 6 indirect). Bayesian network meta-analysis estimated the odds ratio (with 90% credible intervals) for each of these comparisons. Between-trial heterogeneity was marked. According to our results, all credible intervals of the 10 comparisons were wide and none of them satisfied the equivalence criterion. A superiority finding was confirmed for the treatment with MTX plus adalimumab or certolizumab in comparison with MTX monotherapy, but not for the other two biologics. Conclusion: Our results indicate that these four biologics improved the rates of ACR50 achievement, but there was an evident between-study heterogeneity. The head-to-head indirect comparisons between individual biologics showed no significant difference, but failed to demonstrate the proof of no difference (i.e. equivalence). This body of evidence presently
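The equivalence criterion described above reduces to an interval check: a head-to-head comparison counts as equivalent only if the entire 90% credible interval for the odds ratio falls inside the pre-set margins (0.56 to 1.78). A minimal sketch with invented intervals, not the trial results:

```python
# Equivalence margins from the abstract (odds-ratio scale)
EQ_LO, EQ_HI = 0.56, 1.78

def classify(ci_lo, ci_hi):
    # whole credible interval inside the margins -> equivalence demonstrated
    if EQ_LO <= ci_lo and ci_hi <= EQ_HI:
        return "equivalent"
    # interval entirely above/below 1.0 -> a difference is demonstrated
    if ci_lo > 1.0:
        return "superior"
    if ci_hi < 1.0:
        return "inferior"
    return "inconclusive (interval too wide)"

# Hypothetical 90% credible intervals for three comparisons
for ci in [(0.70, 1.60), (0.40, 2.10), (1.10, 3.00)]:
    print(ci, "->", classify(*ci))
```

Note the asymmetry the abstract highlights: "no significant difference" (interval straddling 1.0) is not the same as demonstrated equivalence (interval inside the margins); a wide interval can fail both tests.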
NASA Astrophysics Data System (ADS)
Alpert, P. A.; Knopf, D. A.
2014-12-01
Ice nucleation is the initial step in forming mixed-phase and cirrus clouds, and is well established as an important influence on global climate. Laboratory studies investigate the cloud-relevant conditions of temperature (T) and relative humidity (RH) at which ice nucleation occurs, and as a result, numerous fundamentally different ice nucleation descriptions have been proposed for implementation in cloud and climate models. We introduce a new immersion freezing model based on first principles of statistics to simulate individual droplet freezing, requiring only three experimental parameters: the total number of droplets, the uncertainty of the applied surface area per droplet, and the heterogeneous ice nucleation rate coefficient, Jhet, as a function of T and water activity (aw), where in equilibrium RH=aw. Previous studies reporting frozen fractions (f) or Jhet for a droplet population are described by our model for mineral, inorganic, organic, and biological ice nuclei and for different techniques including cold stage, oil-immersion, continuous flow diffusion chamber, flow tube, cloud chamber, acoustic levitation and wind levitation experiments. Taking advantage of the physically based parameterization of Jhet by Knopf and Alpert (Faraday Discuss., 165, 513-534, 2013), our model can predict immersion freezing for the entire atmospherically relevant range of T, RH, particle surface area, and time scales, even for conditions unattainable in a laboratory setting. Lastly, we present a rigorous experimental uncertainty analysis of laboratory-derived Jhet and f using a Monte Carlo method. These results imply that classical nucleation theory is universal for immersion freezing. In combination with an aw-based description of Jhet, this approach allows for a physically based and computationally undemanding implementation in climate and cloud models.
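The droplet-freezing picture lends itself to a small Monte Carlo sketch: each unfrozen droplet freezes with probability 1 - exp(-Jhet·A·Δt) per time step, with a lognormal spread on surface area standing in for the stated area uncertainty. All numbers are illustrative and do not reproduce the Knopf and Alpert parameterization of Jhet.

```python
import numpy as np

rng = np.random.default_rng(4)

n_drops = 1000
jhet = 1e2                                        # nucleation rate coeff. [cm^-2 s^-1] (assumed)
area = 1e-5 * rng.lognormal(0.0, 0.3, n_drops)    # surface area per droplet [cm^2], with spread
dt, n_steps = 1.0, 60                             # 1 s steps over 60 s

unfrozen = np.ones(n_drops, dtype=bool)
frozen_frac = []
for _ in range(n_steps):
    # per-droplet freezing probability this step (Poisson nucleation)
    p = 1.0 - np.exp(-jhet * area * dt)
    freeze = rng.random(n_drops) < p
    unfrozen &= ~freeze
    frozen_frac.append(1.0 - unfrozen.mean())

print(f"frozen fraction after {n_steps} s: {frozen_frac[-1]:.2f}")
```

The same loop, repeated many times over resampled areas, is the shape of the Monte Carlo uncertainty analysis the abstract mentions for laboratory-derived Jhet and f.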
Voltage controlled oscillator is easily aligned, has low phase noise
NASA Technical Reports Server (NTRS)
Sydnor, R. L.
1965-01-01
Voltage Controlled Oscillator /VCO/, represented by an equivalent RF circuit, is easily adjusted for optimum performance by varying the circuit parameter. It contains a crystal drive level which is also easily adjusted to obtain minimum phase noise.
Dziurkowska, Ewelina; Wesolowski, Marek
2015-01-01
Multivariate statistical analysis is widely used in medical studies as a profitable tool facilitating diagnosis of some diseases, for instance, cancer, allergy, pneumonia, or Alzheimer's and psychiatric diseases. Taking this into consideration, the aim of this study was to use two multivariate techniques, hierarchical cluster analysis (HCA) and principal component analysis (PCA), to disclose the relationship between the drugs used in the therapy of major depressive disorder, the salivary cortisol level, and the period of hospitalization. The cortisol content in the saliva of depressed women was quantified by HPLC with UV detection day-to-day during the whole period of hospitalization. A data set with 16 variables (e.g., the patients' age, multiplicity and period of hospitalization, initial and final cortisol level, highest and lowest hormone level, mean contents, and medians) characterizing 97 subjects was used for HCA and PCA calculations. Multivariate statistical analysis reveals that the various groups of antidepressants affect the salivary cortisol level to varying degrees. The SSRIs, SNRIs, and polypragmasy reduce the hormone secretion most effectively. Thus, both unsupervised pattern recognition methods, HCA and PCA, can be used as complementary tools for interpretation of the results obtained by laboratory diagnostic methods. PMID:26380376
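A hedged sketch of the HCA/PCA pairing on an invented 97 x 16 matrix (random stand-ins for the real variables such as cortisol levels and hospitalization periods): PCA projects the subjects onto two components for inspection, while Ward-linkage HCA cuts the same data into two clusters.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(5)

# Invented stand-in for the 97-patient x 16-variable data set:
# two loose groups, e.g. different antidepressant regimens
X = np.vstack([rng.normal(0.0, 1.0, (50, 16)),
               rng.normal(1.5, 1.0, (47, 16))])

# PCA via SVD of the centered data: scores on the first two components
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:2].T

# HCA: Ward linkage, dendrogram cut into two clusters
labels = fcluster(linkage(X, method="ward"), t=2, criterion="maxclust")

print(scores.shape, np.unique(labels))
```

In practice one would standardize the 16 variables first (they mix ages, days, and hormone concentrations); that step is omitted here because the stand-in columns share a scale.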
Teaching the Assessment of Normality Using Large Easily-Generated Real Data Sets
ERIC Educational Resources Information Center
Kulp, Christopher W.; Sprechini, Gene D.
2016-01-01
A classroom activity is presented, which can be used in teaching students statistics with an easily generated, large, real world data set. The activity consists of analyzing a video recording of an object. The colour data of the recorded object can then be used as a data set to explore variation in the data using graphs including histograms,…
Guyonvarch, Estelle; Ramin, Elham; Kulahci, Murat; Plósz, Benedek Gy
2015-10-15
The present study aims at using statistically designed computational fluid dynamics (CFD) simulations as numerical experiments for the identification of one-dimensional (1-D) advection-dispersion models - computationally light tools, used e.g., as sub-models in systems analysis. The objective is to develop a new 1-D framework, referred to as interpreted CFD (iCFD) models, in which statistical meta-models are used to calculate the pseudo-dispersion coefficient (D) as a function of design and flow boundary conditions. The method - presented in a straightforward and transparent way - is illustrated using the example of a circular secondary settling tank (SST). First, the significant design and flow factors are screened out by applying the statistical method of two-level fractional factorial design of experiments. Second, based on the number of significant factors identified through the factor screening study and system understanding, 50 different sets of design and flow conditions are selected using Latin Hypercube Sampling (LHS). The boundary condition sets are imposed on a 2-D axi-symmetrical CFD simulation model of the SST. In the framework, to degenerate the 2-D model structure, CFD model outputs are approximated by the 1-D model through the calibration of three different model structures for D. Correlation equations for the D parameter then are identified as a function of the selected design and flow boundary conditions (meta-models), and their accuracy is evaluated against D values estimated in each numerical experiment. The evaluation and validation of the iCFD model structure is carried out using scenario simulation results obtained with parameters sampled from the corners of the LHS experimental region. For the studied SST, additional iCFD model development was carried out in terms of (i) assessing different density current sub-models; (ii) implementation of a combined flocculation, hindered, transient and compression settling velocity function; and (iii
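The sampling step above can be sketched directly with SciPy's quasi-Monte Carlo module; the three factors and their ranges below are assumptions for illustration, not the factors actually retained after the fractional-factorial screening in the study.

```python
import numpy as np
from scipy.stats import qmc

# Hypothetical design/flow factors surviving the screening step
factors = {"inlet_flow_m3h": (50.0, 400.0),
           "settling_vel_m_h": (0.5, 3.0),
           "tank_depth_m": (2.5, 5.0)}

# 50 boundary-condition sets by Latin Hypercube Sampling, as in the abstract
sampler = qmc.LatinHypercube(d=len(factors), seed=6)
unit = sampler.random(n=50)                      # 50 points in [0, 1)^3
lo = np.array([v[0] for v in factors.values()])
hi = np.array([v[1] for v in factors.values()])
designs = qmc.scale(unit, lo, hi)                # rescale to factor ranges

print(designs.shape)
print(designs.min(axis=0).round(1), designs.max(axis=0).round(1))
```

Each row of `designs` would then be imposed as a boundary-condition set on the 2-D CFD model, and the resulting D values used to fit the meta-models.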
Easily constructed mini-sextant demonstrates optical principles
NASA Astrophysics Data System (ADS)
Nenninger, Garet G.
2000-04-01
An easily constructed optical instrument for measuring the angle between the Sun and the horizon is described. The miniature sextant relies on multiple reflections to produce multiple images of the sun at fixed angles away from the true Sun.
Description of the Experimental Avionics Systems Integration Laboratory (EASILY)
NASA Technical Reports Server (NTRS)
Outlaw, Bruce K. E.
1994-01-01
The Experimental Avionics Systems Integration Laboratory (EASILY) is a comprehensive facility used for development, integration, and preflight validation of hardware and software systems for the Terminal Area Productivity (TAP) Program's Transport Systems Research Vehicle (TSRV) experimental transport aircraft. This report describes the history, capabilities, and subsystems of EASILY. A functional description of the many subsystems is provided to give potential users the necessary knowledge of the capabilities of this facility.
Easily disassembled electrical connector for high voltage, high frequency connections
Milner, J.R.
1994-05-10
An easily accessible electrical connector capable of rapid assembly and disassembly is described wherein a wide metal conductor sheet may be evenly contacted over the entire width of the conductor sheet by opposing surfaces on the connector which provide an even clamping pressure against opposite surfaces of the metal conductor sheet using a single threaded actuating screw. 13 figures.
Easily disassembled electrical connector for high voltage, high frequency connections
Milner, Joseph R.
1994-01-01
An easily accessible electrical connector capable of rapid assembly and disassembly wherein a wide metal conductor sheet may be evenly contacted over the entire width of the conductor sheet by opposing surfaces on the connector which provide an even clamping pressure against opposite surfaces of the metal conductor sheet using a single threaded actuating screw.
Micromanipulation tool is easily adapted to many uses
NASA Technical Reports Server (NTRS)
Shlichta, P. J.
1967-01-01
A special micromanipulation tool equipped with a plunger mounted in a small tube can be easily adapted to such work operations as cutting, precision clamping, and spot welding of microscopic filaments or other parts. This tool is valuable where extreme steadiness at high magnification is required.
Modular thermoelectric cell is easily packaged in various arrays
NASA Technical Reports Server (NTRS)
Epstein, J.
1965-01-01
Modular thermoelectric cells are easily packaged in various arrays to form power supplies and have desirable voltage and current output characteristics. The cells employ two pairs of thermoelectric elements, each pair being connected in parallel between two sets of aluminum plates. They can be used as solar energy conversion devices.
The advent of new higher throughput analytical instrumentation has put a strain on interpreting and explaining the results from complex studies. Contemporary human, environmental, and biomonitoring data sets are comprised of tens or hundreds of analytes, multiple repeat measures...
A method for easily customizable gradient gel electrophoresis.
Miller, Andrew J; Roman, Brandon; Norstrom, Eric
2016-09-15
Gradient polyacrylamide gel electrophoresis is a powerful tool for the resolution of polypeptides by relative mobility. Here, we present a simplified method for generating polyacrylamide gradient gels for routine analysis without the need for specialized mixing equipment. The method allows for easily customizable gradients which can be optimized for specific polypeptide resolution requirements. Moreover, the method eliminates the possibility of buffer cross contamination in mixing equipment, and the time and resources saved with this method in place of traditional gradient mixing, or the purchase of pre-cast gels, are noteworthy given the frequency with which many labs use gradient gel SDS-PAGE. PMID:27393767
Easily Transported CCD Systems for Use in Astronomy Labs
NASA Astrophysics Data System (ADS)
Meisel, D.
1992-12-01
Relatively inexpensive CCD cameras and portable computers are now easily obtained as commercially available products. I will describe a prototype system that can be used by introductory astronomy students, even in urban environments, to obtain useful observations of the night sky. It is based on the ST-4 CCDs made by Santa Barbara Instrument Group and Macintosh PowerBook 145 computers. Students take outdoor images directly from the college campus, bring the exposures back into the lab, and download the images into our networked server. These stored images can then be processed (at a later time) using a variety of image processing programs, including a new astronomical version of the popular "freeware" NIH Image package that is currently under development at Geneseo. The prototype of this system will be demonstrated and available for hands-on use during the meeting. This work is supported by NSF ILI Demonstration Grant USE9250493 and grants from SUNY-Geneseo.
Easily installable behavioral monitoring system with electric field sensor.
Tsukamoto, Sosuke; Machida, Yuichiro; Kameda, Noriyuki; Hoshino, Hiroshi; Tamura, Toshiyo
2007-01-01
This paper describes a wireless behavioral monitoring system equipped with an electric field sensor. The sensor unit was designed to obtain information regarding the usage of home electric appliances such as the television, microwave oven, coffee maker, etc. by measuring the electric field surrounding them. It is assumed that these usage statistics could provide information regarding the indoor behavior of a subject. Since the sensor can be used by simply attaching it to an appliance and does not require any wiring for its installation, this system can be temporarily installed in any ordinary house. A simple interface for selecting the threshold value of appliances' power on/off states was introduced. The experimental results reveal that the proposed system can be installed by individuals in their residences in a short time and the usage statistics of home appliances can be gathered. PMID:18002891
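The threshold idea described above reduces to comparing the field reading against a user-chosen cut-off and accumulating on/off statistics. A minimal sketch on a synthetic signal (all values, including the threshold, are illustrative):

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic electric-field readings around an appliance, 1 sample per second:
# low while off, high while running (e.g. a microwave oven for 60 s)
signal = np.concatenate([rng.normal(0.1, 0.02, 120),   # off
                         rng.normal(0.8, 0.05, 60),    # on
                         rng.normal(0.1, 0.02, 120)])  # off
threshold = 0.4                                        # user-selected cut-off

on = signal > threshold
# off->on transitions give usage counts; samples above threshold give on-time
switch_on = np.flatnonzero(np.diff(on.astype(int)) == 1) + 1
print("uses:", len(switch_on), " on-time [s]:", int(on.sum()))
```

This is the kind of per-appliance usage statistic the paper aggregates to infer indoor behavior; the simple interface the authors describe is essentially a way of choosing `threshold` per appliance.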
ERIC Educational Resources Information Center
BARGMANN, ROLF E.
THE STUDIES EMBODIED IN THIS REPORT PROPOSE SOME STATISTICAL METHODS OF ORDERING AND ATTAINING RELEVANCY TO HELP THE EDUCATIONAL RESEARCHER CHOOSE AMONG SUCH VARIABLES AS TESTS AND BEHAVIOR RATINGS. CONSTRUCTION OF A MODEL FOR THE ANALYSIS OF CONTINGENCY TABLES, DETERMINATION OF THE MOST APPROPRIATE ORDERING PRINCIPLE IN STEP-DOWN ANALYSIS FOR THE…
ERIC Educational Resources Information Center
BARNETT, F.C.; SAW, J.G.
A WORKING MODEL CAPABLE OF RANKING INDIVIDUALS IN A RANDOM SAMPLE FROM A MULTIVARIATE POPULATION BY SOME CRITERION OF INTEREST WAS DEVELOPED. THE MULTIPLE CORRELATION COEFFICIENT OF RANKS WITH MEASURED VARIATES AS A STATISTIC IN TESTING WHETHER RANKS ARE ASSOCIATED WITH MEASUREMENTS WAS EMPLOYED AND DUBBED "QUASI-RANK MULTIPLE CORRELATION…
Design of easily testable and reconfigurable systolic arrays
Kim, J.H.
1987-01-01
Systolic arrays are considered to be preferred architectures for executing linear algebraic operations. In this thesis, easily testable and reconfigurable (ETAR) systolic arrays are studied to achieve yield enhancement. New 2-D systolic arrays that lend themselves to easy reconfiguration as well as efficient implementations of algorithms are proposed. The proposed 2-D bidirectional and unidirectional systolic arrays are often better architectures than the rectangular and hexagonal systolic arrays proposed earlier, if one considers area, time and reconfigurability. Methods to design linear and 2-D ETAR systolic arrays are proposed, and procedures to design linear and 2-D unidirectional and bidirectional systolic arrays are given. The main feature of the proposed designs is that the COMUs of the PEs in the linear array can all be tested simultaneously. Another feature is that the throughputs of the reconfigured linear unidirectional and bidirectional arrays can remain equal to those of the fault-free linear arrays. A reconfiguration algorithm for 2-D systolic arrays is also proposed.
A highly versatile and easily configurable system for plant electrophysiology.
Gunsé, Benet; Poschenrieder, Charlotte; Rankl, Simone; Schröder, Peter; Rodrigo-Moreno, Ana; Barceló, Juan
2016-01-01
In this study we present a highly versatile and easily configurable system for measuring plant electrophysiological parameters and ionic flow rates, connected to a computer-controlled, highly accurate positioning device. The modular software used allows easily customizable configurations for the measurement of electrophysiological parameters. Both the operational tests and the experiments already performed have been fully successful and rendered a low-noise and highly stable signal. Assembly, programming and configuration examples are discussed. The system is a powerful technique that not only gives precise measurement of plant electrophysiological status, but also allows easy development of ad hoc configurations that are not constrained to plant studies.
• We developed a highly modular system for electrophysiology measurements that can be used either on organs or cells and performs either steady or dynamic intra- and extracellular measurements, taking advantage of the ease of visual object-oriented programming.
• High-precision data acquisition under electrically noisy conditions allows the system to run even in a laboratory close to electrical equipment that produces electrical noise.
• The system improves on currently used setups for monitoring and controlling high-precision measurements and micromanipulation, providing an open and customizable environment for multiple experimental needs. PMID:27298766
Triazolophthalazines: Easily Accessible Compounds with Potent Antitubercular Activity.
Veau, Damien; Krykun, Serhii; Mori, Giorgia; Orena, Beatrice S; Pasca, Maria R; Frongia, Céline; Lobjois, Valérie; Chassaing, Stefan; Lherbet, Christian; Baltas, Michel
2016-05-19
Tuberculosis (TB) remains one of the major causes of death worldwide, in particular because of the emergence of multidrug-resistant TB. Herein we explored the potential of an alternative class of molecules as anti-TB agents. Thus, a series of novel 3-substituted triazolophthalazines was quickly and easily prepared from commercial hydralazine hydrochloride as starting material and further evaluated for antimycobacterial activity and cytotoxicity. Four of the synthesized compounds were found to effectively inhibit the Mycobacterium tuberculosis (M.tb) H37Rv strain with minimum inhibitory concentration (MIC) values <10 μg/mL, whereas no compounds displayed cytotoxicity against HCT116 human cell lines (IC50 >100 μM). More remarkably, the most potent compounds proved to be active to a similar extent against various multidrug-resistant M.tb strains, thus uncovering a mode of action distinct from that of standard antitubercular agents. Overall, their ease of preparation, combined with their attractive antimycobacterial activities, makes such triazolophthalazine-based derivatives promising leads for further development. PMID:27097919
NASA Technical Reports Server (NTRS)
Feiveson, Alan H.; Foy, Millennia; Ploutz-Snyder, Robert; Fiedler, James
2014-01-01
Do you have elevated p-values? Is the data analysis process getting you down? Do you experience anxiety when you need to respond to criticism of statistical methods in your manuscript? You may be suffering from Insufficient Statistical Support Syndrome (ISSS). For symptomatic relief of ISSS, come for a free consultation with JSC biostatisticians at our help desk during the poster sessions at the HRP Investigators Workshop. Get answers to common questions about sample size, missing data, multiple testing, when to trust the results of your analyses and more. Side effects may include sudden loss of statistics anxiety, improved interpretation of your data, and increased confidence in your results.
Metview and VAPOR: Exploring ECMWF forecasts easily in four dimensions
NASA Astrophysics Data System (ADS)
Siemen, Stephan; Kertesz, Sandor; Carver, Glenn
2014-05-01
The European Centre for Medium-Range Weather Forecasts (ECMWF) is an international organisation providing its member states and co-operating states with forecasts in the medium time range of up to 15 days, as well as other forecasts and analyses. As part of its mission, ECMWF generates an increasing number of forecast data products for its users. To support the work of forecasters and researchers and to let them make best use of ECMWF forecasts, the Centre also provides tools and interfaces to visualise its products. This allows users to make use of and explore forecasts without having to transfer large amounts of raw data. This is especially true for products based on ECMWF's 50-member ensemble forecast. Users can choose to explore ECMWF's forecasts from the web or through visualisation tools installed locally or at ECMWF. In co-operation with INPE, Brazil, ECMWF also develops the Metview meteorological workstation and batch system. Metview enables users to easily analyse and visualise forecasts, and is routinely used by scientists and forecasters at ECMWF and other institutions. While Metview offers high-quality visualisation in two-dimensional plots and animations, it uses external tools to visualise data in four dimensions. VAPOR is the Visualization and Analysis Platform for Ocean, Atmosphere, and Solar Researchers. VAPOR provides an interactive 3D visualisation environment that runs on most UNIX and Windows systems equipped with modern 3D graphics cards. VAPOR development is led by the National Center for Atmospheric Research's Scientific Computing Division in collaboration with U.C. Davis and Ohio State University. In this paper we will give an overview of how users, with Metview and access to ECMWF's archive, can visualise forecast data in four dimensions within VAPOR. The process of preparing the data in Metview is the key step and is described in detail. The benefits to researchers are highlighted with a case study analysing a given weather scenario.
Bogen, K; Hamilton, T F; Brown, T A; Martinelli, R E; Marchetti, A A; Kehl, S R; Langston, R G
2007-05-01
We have developed refined statistical and modeling techniques to assess low-level uptake and urinary excretion of plutonium from different population groups in the northern Marshall Islands. Urinary excretion rates of plutonium from the resident population on Enewetak Atoll and from resettlement workers living on Rongelap Atoll range from <1 to 8 μBq per day and are well below action levels established under the latest Department of Energy regulation 10 CFR 835 in the United States for in vitro bioassay monitoring of 239Pu. However, our statistical analyses show that urinary excretion of plutonium-239 (239Pu) from both cohort groups is significantly positively associated with volunteer age, especially for the resident population living on Enewetak Atoll. Urinary excretion of 239Pu from the Enewetak cohort was also found to be positively associated with estimates of cumulative exposure to worldwide fallout. Consequently, the age-related trends in urinary excretion of plutonium from Marshallese populations can be described by either a long-term component from residual systemic burdens acquired from previous exposures to worldwide fallout, or a prompt (and eventual long-term) component acquired from low-level systemic intakes of plutonium associated with resettlement of the northern Marshall Islands, or some combination of both.
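The reported age association can be illustrated with a simple linear regression on invented data in the quoted <1 to 8 μBq/day range; the slope, noise level, and cohort size below are assumptions, not the actual Marshallese cohort measurements.

```python
import numpy as np
from scipy.stats import linregress

rng = np.random.default_rng(9)

# Invented cohort: urinary 239Pu excretion [uBq/day] rising with age
age = rng.uniform(20, 80, 120)
excretion = 0.05 * age + rng.normal(0.0, 1.0, 120)   # assumed trend + scatter
excretion = np.clip(excretion, 0.1, None)            # excretion cannot be negative

fit = linregress(age, excretion)
print(f"slope={fit.slope:.3f} uBq/day per year of age, p={fit.pvalue:.1e}")
```

A significantly positive slope is the statistical signature described in the abstract; distinguishing the fallout-burden and resettlement-intake explanations requires the additional covariates (cumulative fallout exposure) the authors bring in.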
Making large amounts of meteorological plots easily accessible to users
NASA Astrophysics Data System (ADS)
Lamy-Thepaut, Sylvie; Siemen, Stephan; Sahin, Cihan; Raoult, Baudouin
2015-04-01
It presents the user's products in a single interface, with fast access to the original product and the possibility of synchronous animations between products. Its functionalities are being extended to give users the freedom to collect not only ecCharts' 2D maps and graphs but also other ECMWF web products, such as monthly and seasonal products, scores, and observation monitoring. The dashboard will play a key role in helping users interpret the large amount of information that ECMWF provides. This talk will present examples of how the new user interface can organise complex meteorological maps and graphs, and will show the new possibilities users have gained by using the web as a medium.
Spirakis, C.S.; Pierson, C.T.; Santos, E.S.; Fishman, N.S.
1983-01-01
Statistical treatment of analytical data from 106 samples of uranium-mineralized and unmineralized or weakly mineralized rocks of the Morrison Formation from the northeastern part of the Church Rock area of the Grants uranium region indicates that along with uranium, the deposits in the northeast Church Rock area are enriched in barium, sulfur, sodium, vanadium and equivalent uranium. Selenium and molybdenum are sporadically enriched in the deposits and calcium, manganese, strontium, and yttrium are depleted. Unlike the primary deposits of the San Juan Basin, the deposits in the northeast part of the Church Rock area contain little organic carbon and several elements that are characteristically enriched in the primary deposits are not enriched or are enriched to a much lesser degree in the Church Rock deposits. The suite of elements associated with the deposits in the northeast part of the Church Rock area is also different from the suite of elements associated with the redistributed deposits in the Ambrosia Lake district. This suggests that the genesis of the Church Rock deposits is different, at least in part, from the genesis of the primary deposits of the San Juan Basin or the redistributed deposits at Ambrosia Lake.
NASA Astrophysics Data System (ADS)
Tema, E.; Zanella, E.; Pavón-Carrasco, F. J.; Kondopoulou, D.; Pavlides, S.
2015-10-01
We present the results of palaeomagnetic analysis of Late Bronze Age pottery from Santorini, carried out in order to estimate the thermal effect of the Minoan eruption on the pre-Minoan habitation level. A total of 170 specimens from 108 ceramic fragments have been studied. The ceramics were collected from the surface of the pre-Minoan palaeosol at six different sites, including samples from the Akrotiri archaeological site. The deposition temperatures of the first pyroclastic products have been estimated from the maximum overlap of the re-heating temperature intervals given by the individual fragments at site level. A new statistical elaboration of the temperature data is also proposed, calculating the re-heating temperatures at each site at the 95 per cent probability level. The results show that the precursor tephra layer and the first pumice fall of the eruption were hot enough to re-heat the underlying ceramics to temperatures of 160-230 °C at the non-inhabited sites, while the temperatures recorded inside the Akrotiri village are slightly lower, varying from 130 to 200 °C. The lower temperatures registered in the human settlements suggest that there was some interaction between the buildings and the pumice fallout deposits, while the building debris layer caused by the preceding and syn-eruption earthquakes probably also contributed to the decrease of the recorded re-heating temperatures.
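The site-level procedure described above, taking the temperature range shared by the largest number of per-fragment re-heating intervals, can be sketched as a simple sweep over interval endpoints (the fragment intervals below are hypothetical, not data from the study):

```python
def max_overlap_interval(intervals):
    """Return (count, lo, hi): the temperature range shared by the
    largest number of per-fragment re-heating intervals."""
    events = []
    for lo, hi in intervals:
        events.append((lo, 1))   # interval opens
        events.append((hi, -1))  # interval closes
    events.sort(key=lambda e: (e[0], -e[1]))  # at ties, opens before closes
    best = depth = 0
    best_lo = best_hi = None
    for i, (temp, delta) in enumerate(events):
        depth += delta
        if depth > best:
            best = depth
            best_lo = temp
            best_hi = events[i + 1][0]  # overlap persists until next event
    return best, best_lo, best_hi

# Hypothetical per-fragment re-heating intervals (degrees C) at one site:
frags = [(150, 230), (160, 240), (140, 220), (170, 260)]
print(max_overlap_interval(frags))  # -> (4, 170, 220)
```

The returned range is the narrowest band consistent with the largest number of fragments, which is the site-level estimate the abstract refers to.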
Linda Stetzenbach; Lauren Nemnich; Davor Novosel
2009-08-31
Three independent tasks were performed (Stetzenbach 2008, Stetzenbach 2008b, Stetzenbach 2009) to measure a variety of parameters in normative buildings across the United States. For each of these tasks, 10 buildings were selected as normative indoor environments. Task 1 focused on office buildings, Task 13 focused on public schools, and Task 0606 focused on high-performance buildings. To perform this task it was necessary to restructure the database for the Indoor Environmental Quality (IEQ) data and the sound measurements, as several issues were identified and resolved prior to and during the transfer of these data sets into SPSS. During overview discussions with the statistician engaged for this task, it was determined that because the indoor zones (1-6) were selected independently within each task, zones were not related by location across tasks. Therefore, no comparison would be valid across zones for the 30 buildings, so the by-location (zone) data were limited to three analysis sets, one for the buildings within each task. In addition, different collection procedures for lighting were used in Task 0606 compared with Tasks 01 & 13, to improve sample collection. Therefore, these data sets could not be merged and compared, so by-day data were run separately for Task 0606 and only Task 01 & 13 data were merged. Results of the statistical analysis of the IEQ parameters show that statistically significant differences were found among days and zones for all tasks, although no differences were found by day for Draft Rate data from Task 0606 (p>0.05). Thursday measurements of IEQ parameters were significantly different from Tuesday, and from most Wednesday measures, for all variables of Tasks 1 & 13. Data for all three days appeared to vary for Operative Temperature, whereas only Tuesday and Thursday differed for Draft Rate 1m. Although no Draft Rate measures within Task 0606 were found to significantly differ by day, Temperature measurements for Tuesday and
NASA Astrophysics Data System (ADS)
Xu, C.; Shyu, J. B. H.; Xu, X.-W.
2014-02-01
The 12 January 2010 Port-au-Prince, Haiti, earthquake (Mw 7.0) triggered tens of thousands of landslides. The purpose of this study is to investigate the correlations of the occurrence of landslides and their erosion thicknesses with topographic factors, seismic parameters, and their distance from roads. A total of 30 828 landslides triggered by the earthquake covered a total area of 15.736 km2, distributed in an area of more than 3000 km2, and the volume of landslide accumulation materials is estimated to be about 29 700 000 m3. These landslides are of various types, mostly shallow disrupted landslides and rock falls, but they also include coherent deep-seated landslides and rock slides. The landslides were delineated using pre- and post-earthquake high-resolution satellite images. Spatial distribution maps and contour maps of landslide number density, landslide area percentage, and landslide erosion thickness were constructed in order to analyze the spatial distribution patterns of co-seismic landslides. Statistics of size distribution and morphometric parameters of co-seismic landslides were compiled and compared with those of other earthquake events in the world. Four proxies of co-seismic landslide abundance, namely landslide centroid number density (LCND), landslide top number density (LTND), landslide area percentage (LAP), and landslide erosion thickness (LET), were used to correlate co-seismic landslides with various landslide controlling parameters. These controlling parameters include elevation, slope angle, slope aspect, slope curvature, topographic position, distance from drainages, lithology, distance from the epicenter, distance from the Enriquillo-Plantain Garden fault, distance along the fault, and peak ground acceleration (PGA). A comparison of these impact parameters on co-seismic landslides shows that slope angle is the strongest impact parameter on co-seismic landslide occurrence. Our co-seismic landslide inventory is much more
NASA Astrophysics Data System (ADS)
Xu, C.; Shyu, J. B. H.; Xu, X.
2014-07-01
The 12 January 2010 Port-au-Prince, Haiti, earthquake (Mw = 7.0) triggered tens of thousands of landslides. The purpose of this study is to investigate the correlations of the occurrence of landslides and the thicknesses of their erosion with topographic, geologic, and seismic parameters. A total of 30 828 landslides triggered by the earthquake covered a total area of 15.736 km2, distributed in an area of more than 3000 km2, and the volume of landslide accumulation materials is estimated to be about 29 700 000 m3. These landslides are of various types, mostly shallow disrupted landslides and rock falls, but they also include coherent deep-seated landslides and rock slides. The landslides were delineated using pre- and post-earthquake high-resolution satellite images. Spatial distribution maps and contour maps of landslide number density, landslide area percentage, and landslide erosion thickness were constructed in order to analyze the spatial distribution patterns of co-seismic landslides. Statistics of size distribution and morphometric parameters of co-seismic landslides were compiled and compared with those of other earthquake events in the world. Four proxies of co-seismic landslide abundance, namely landslide centroid number density (LCND), landslide top number density (LTND), landslide area percentage (LAP), and landslide erosion thickness (LET), were used to correlate co-seismic landslides with various environmental parameters. These parameters include elevation, slope angle, slope aspect, slope curvature, topographic position, distance from drainages, lithology, distance from the epicenter, distance from the Enriquillo-Plantain Garden fault, distance along the fault, and peak ground acceleration (PGA). A comparison of these impact parameters on co-seismic landslides shows that slope angle is the strongest impact parameter on co-seismic landslide occurrence. Our co-seismic landslide inventory is much more detailed than other inventories in several
Tsukamoto, S; Hoshino, H; Tamura, T
2008-01-01
This paper describes an indoor behavioral monitoring system for improving the quality of life in ordinary houses. It employs a device that uses weak radio waves for transmitting the obtained data, and it is designed so that it can be installed by a user without any technical knowledge or extra construction work. This study focuses on determining the usage statistics of home electric appliances by using an electromagnetic field sensor as the detection device. The usage of a home appliance is determined by measuring the electromagnetic field that can be observed in an area near the appliance. It is assumed that these usage statistics can provide information regarding the indoor behavior of a subject. Since the sensor is not direction-sensitive and does not require precise positioning or wiring, it can be easily installed in ordinary houses by the end users. To evaluate the practicability of the sensor unit, several simple tests were performed. The results indicate that the proposed system could be useful for collecting the usage statistics of home appliances. PMID:19415135
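The step from raw field readings near an appliance to usage statistics can be sketched as a simple threshold pass over the sensor series. The readings, threshold, and minimum-duration filter below are illustrative assumptions; the paper's actual detection logic is not specified in the abstract:

```python
def usage_intervals(samples, threshold, min_len=2):
    """Turn a series of EMF sensor readings into (start, end) index
    intervals during which the appliance is considered 'on'."""
    intervals = []
    start = None
    for i, v in enumerate(samples):
        if v >= threshold and start is None:
            start = i                      # signal rose: usage begins
        elif v < threshold and start is not None:
            if i - start >= min_len:       # ignore momentary blips
                intervals.append((start, i))
            start = None
    if start is not None and len(samples) - start >= min_len:
        intervals.append((start, len(samples)))
    return intervals

# Hypothetical readings sampled once per minute near a kettle:
readings = [0.1, 0.2, 3.5, 3.8, 3.6, 0.2, 0.1, 4.0, 4.1, 4.0, 3.9, 0.1]
print(usage_intervals(readings, threshold=1.0))  # -> [(2, 5), (7, 11)]
```

Counting the intervals and summing their lengths per day would give exactly the kind of usage statistics the abstract describes as a proxy for indoor behavior.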
ERIC Educational Resources Information Center
DeHaan, Frank, Ed.
1977-01-01
Describes an interpretative experiment involving the application of symmetry and temperature-dependent proton and fluorine nmr spectroscopy to the solution of structural and kinetic problems in coordination chemistry. (MLH)
ERIC Educational Resources Information Center
Kothe, Elsa Lenz; Berard, Marie-France
2013-01-01
Utilizing a/r/tographic methodology to interrogate interpretive acts in museums, multiple areas of inquiry are raised in this paper, including: which knowledge is assigned the greatest value when preparing a gallery talk; what lies outside of disciplinary knowledge; how invitations to participate invite and disinvite in the same gesture; and what…
ERIC Educational Resources Information Center
Munsart, Craig A.
1993-01-01
Presents an activity that allows students to experience the type of discovery process that paleontologists necessarily followed during the early dinosaur explorations. Students are read parts of a story taken from the "American Journal of Science" and interpret the evidence leading to the discovery of Triceratops and Stegosaurus. (PR)
ERIC Educational Resources Information Center
Pankhurst, Anne
1994-01-01
This paper examines some of the problems associated with interpreting metonymy, a figure of speech in which an attribute or commonly associated feature is used to name or designate something. After defining metonymy and outlining the principles of metonymy, the paper explains the differences between metonymy, synecdoche, and metaphor. It is…
CAinterprTools: An R package to help interpreting Correspondence Analysis' results
NASA Astrophysics Data System (ADS)
Alberti, Gianmarco
2015-09-01
Correspondence Analysis (CA) is a statistical exploratory technique frequently used in many research fields to graphically visualize the structure of contingency tables. Many programs, both commercial and free, perform CA, but none of them as yet provides a visual aid to the interpretation of the results. The 'CAinterprTools' package, designed to be used in the free R statistical environment, aims at filling that gap. It targets novice-to-intermediate R users. Fifteen commands make it easy to obtain charts that help with (and are relevant to) the interpretation of the CA results, freeing the user from the need to inspect and scrutinize tabular CA output, and to look up values and statistics on which further calculations would be necessary. The package also implements tests to assess the significance of the input table's total inertia and of individual dimensions.
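The total inertia whose significance the package tests is the chi-squared statistic of the contingency table divided by its grand total; it is the quantity CA decomposes across its dimensions. A minimal sketch in Python (the package itself is written in R; the table below is hypothetical):

```python
def total_inertia(table):
    """Total inertia of a contingency table: the chi-squared statistic
    divided by the grand total (the quantity CA decomposes)."""
    n = float(sum(sum(row) for row in table))
    row_tot = [sum(row) for row in table]
    col_tot = [sum(col) for col in zip(*table)]
    chi2 = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            exp = row_tot[i] * col_tot[j] / n   # expected under independence
            chi2 += (obs - exp) ** 2 / exp
    return chi2 / n

# Hypothetical counts of three artefact types at three sites:
table = [[10, 20, 30], [15, 25, 20], [30, 10, 5]]
print(round(total_inertia(table), 4))
```

An inertia of zero means the rows and columns are exactly independent; the larger the inertia, the more structure there is for CA to display, which is why testing its significance is a sensible first step before interpreting any map.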
Reeve, Joanne
2010-01-01
Patient-centredness is a core value of general practice; it is defined as the interpersonal processes that support the holistic care of individuals. To date, efforts to demonstrate their relationship to patient outcomes have been disappointing, whilst some studies suggest values may be more rhetoric than reality. Contextual issues influence the quality of patient-centred consultations, impacting on outcomes. The legitimate use of knowledge, or evidence, is a defining aspect of modern practice, and has implications for patient-centredness. Based on a critical review of the literature, on my own empirical research, and on reflections from my clinical practice, I critique current models of the use of knowledge in supporting individualised care. Evidence-Based Medicine (EBM), and its implementation within health policy as Scientific Bureaucratic Medicine (SBM), define best evidence in terms of an epistemological emphasis on scientific knowledge over clinical experience. It provides objective knowledge of disease, including quantitative estimates of the certainty of that knowledge. Whilst arguably appropriate for secondary care, involving episodic care of selected populations referred in for specialist diagnosis and treatment of disease, application to general practice can be questioned given the complex, dynamic and uncertain nature of much of the illness that is treated. I propose that general practice is better described by a model of Interpretive Medicine (IM): the critical, thoughtful, professional use of an appropriate range of knowledges in the dynamic, shared exploration and interpretation of individual illness experience, in order to support the creative capacity of individuals in maintaining their daily lives. Whilst the generation of interpreted knowledge is an essential part of daily general practice, the profession does not have an adequate framework by which this activity can be externally judged to have been done well. Drawing on theory related to the
A flexible, interpretable framework for assessing sensitivity to unmeasured confounding.
Dorie, Vincent; Harada, Masataka; Carnegie, Nicole Bohme; Hill, Jennifer
2016-09-10
When estimating causal effects, unmeasured confounding and model misspecification are both potential sources of bias. We propose a method to simultaneously address both issues in the form of a semi-parametric sensitivity analysis. In particular, our approach incorporates Bayesian Additive Regression Trees into a two-parameter sensitivity analysis strategy that assesses sensitivity of posterior distributions of treatment effects to choices of sensitivity parameters. This results in an easily interpretable framework for testing for the impact of an unmeasured confounder that also limits the number of modeling assumptions. We evaluate our approach in a large-scale simulation setting and with high blood pressure data taken from the Third National Health and Nutrition Examination Survey. The model is implemented as open-source software, integrated into the treatSens package for the R statistical programming language. © 2016 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd. PMID:27139250
SLAR image interpretation keys for geographic analysis
NASA Technical Reports Server (NTRS)
Coiner, J. C.
1972-01-01
Interpretation keys, presented as an easily implemented interpretation model, are suggested as a means for side-looking airborne radar (SLAR) imagery to become a more widely used data source in geoscience and agriculture. Interpretation problems faced by researchers wishing to employ SLAR are described, and the use of various types of image interpretation keys to overcome these problems is suggested. With examples drawn from agriculture and vegetation mapping, direct and associate dichotomous image interpretation keys are discussed and methods of constructing keys are outlined. Initial testing of the keys, key-based automated decision rules, and the role of the keys in an information system for agriculture are developed.
NASA Astrophysics Data System (ADS)
Oldroyd, H. J.; Higgins, C. W.; Huwald, H.; Selker, J. S.; Parlange, M. B.
2011-12-01
Thermal diffusivity of snow is an important physical property associated with key hydrological phenomena such as snow melt and heat and water vapor exchange with the atmosphere. These phenomena have broad implications for studies of climate and of heat and water budgets on many scales. However, direct measurements of snow thermal diffusivity require coupled point measurements of thermal conductivity and density, which are nonstationary due to snow metamorphism. Furthermore, thermal conductivity measurements are typically obtained with specialized heating probes or plates, and snow density measurements require digging snow pits. Direct measurements are therefore difficult to obtain with high enough temporal resolution for direct comparisons with atmospheric conditions. This study uses highly resolved (7.5 to 10 cm in depth and 1 min in time) temperature measurements from the Plaine Morte glacier in Switzerland as initial and boundary conditions to numerically solve the 1D heat equation and iteratively optimize for thermal diffusivity. The method uses flux boundary conditions to constrain thermal diffusivity such that spuriously high values are eliminated. Additionally, a t-test ensuring statistical significance between solutions with varied thermal diffusivity yields further constraints that eliminate spuriously low values. The results show that time-resolved (1 min) thermal diffusivity can be determined from easily implemented and inexpensive temperature measurements of seasonal snow, with good agreement with widely used parameterizations based on snow density. This high time resolution further affords the ability to explore possible turbulence-induced enhancements to heat and mass transfer in the snow.
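The inverse procedure described above can be sketched as an explicit 1D heat-equation solver wrapped in a search over candidate diffusivities. The grid sizes, the Dirichlet (rather than flux) boundaries, and the brute-force grid search are illustrative simplifications; the study's actual method uses flux boundary conditions and iterative optimization:

```python
def solve_heat_1d(T0, top, bottom, alpha, dz, dt, nsteps):
    """Explicit FTCS solve of dT/dt = alpha * d2T/dz2, with the measured
    top/bottom temperature series imposed as boundary conditions."""
    T = list(T0)
    r = alpha * dt / dz ** 2
    assert r <= 0.5, "FTCS stability requires alpha*dt/dz^2 <= 0.5"
    for n in range(nsteps):
        new = T[:]
        new[0], new[-1] = top[n], bottom[n]
        for i in range(1, len(T) - 1):
            new[i] = T[i] + r * (T[i + 1] - 2 * T[i] + T[i - 1])
        T = new
    return T

def fit_diffusivity(T0, T_end, top, bottom, dz, dt, nsteps, candidates):
    """Pick the diffusivity whose simulated interior profile best matches
    the measured final profile (sum of squared errors)."""
    def misfit(a):
        sim = solve_heat_1d(T0, top, bottom, a, dz, dt, nsteps)
        return sum((s - m) ** 2 for s, m in zip(sim[1:-1], T_end[1:-1]))
    return min(candidates, key=misfit)

# Synthetic check: simulate with a known diffusivity, then recover it.
T0 = [0.0, 0.0, 1.0, 0.0, 0.0]   # initial temperature profile (deg C)
top = bottom = [0.0] * 50        # "measured" boundary series
T_end = solve_heat_1d(T0, top, bottom, 2e-7, 0.1, 1000.0, 50)
print(fit_diffusivity(T0, T_end, top, bottom, 0.1, 1000.0, 50,
                      [1e-7, 2e-7, 4e-7]))  # -> 2e-07
```

The stability constraint on the explicit scheme is one reason highly time-resolved temperature data (1 min here) makes this kind of inversion tractable.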
VCG Interpretation through Artificial Learning (VITAL)
Gustafson, D.E.; Womble, M.E.; Lancaster, M.C.
1981-01-01
A system for automated VCG interpretation using tools of artificial intelligence and statistical signal processing is presently under development. The system differs substantially from current programs in the extraction of features to be used for rhythm and morphology interpretation. These are found based on ideas of statistical data compression and sufficient statistics rather than the commonly-used waveform measurements. A relatively large data base is being collected to train and evaluate the statistical pattern recognition algorithms used for interpretation. Representative results are presented to illustrate the approach and system performance.
Januszyk, Michael; Gurtner, Geoffrey C
2011-01-01
The scope of biomedical research has expanded rapidly during the past several decades, and statistical analysis has become increasingly necessary to understand the meaning of large and diverse quantities of raw data. As such, a familiarity with this lexicon is essential for critical appraisal of medical literature. This article attempts to provide a practical overview of medical statistics, with an emphasis on the selection, application, and interpretation of specific tests. This includes a brief review of statistical theory and its nomenclature, particularly with regard to the classification of variables. A discussion of descriptive methods for data presentation is then provided, followed by an overview of statistical inference and significance analysis, and detailed treatment of specific statistical tests and guidelines for their interpretation. PMID:21200241
Invention Activities Support Statistical Reasoning
ERIC Educational Resources Information Center
Smith, Carmen Petrick; Kenlan, Kris
2016-01-01
Students' experiences with statistics and data analysis in middle school are often limited to little more than making and interpreting graphs. Although students may develop fluency in statistical procedures and vocabulary, they frequently lack the skills necessary to apply statistical reasoning in situations other than clear-cut textbook examples.…
ERIC Educational Resources Information Center
Scott, Leslie A.; Ingels, Steven J.
2007-01-01
The search for an understandable reporting format has led the National Assessment Governing Board to explore the possibility of measuring and interpreting student performance on the 12th-grade National Assessment of Educational Progress (NAEP), the Nation's Report Card, in terms of readiness for college, the workplace, and the military. This…
ERIC Educational Resources Information Center
Adams, Wendy K.; Alhadlaq, Hisham; Malley, Christopher V.; Perkins, Katherine K.; Olson, Jonathan; Alshaya, Fahad; Alabdulkareem, Saleh; Wieman, Carl E.
2012-01-01
The PhET Interactive Simulations Project partnered with the Excellence Research Center of Science and Mathematics Education at King Saud University with the joint goal of making simulations useable worldwide. One of the main challenges of this partnership is to make PhET simulations and the website easily translatable into any language. The PhET…
Statistical Reform in School Psychology Research: A Synthesis
ERIC Educational Resources Information Center
Swaminathan, Hariharan; Rogers, H. Jane
2007-01-01
Statistical reform in school psychology research is discussed in terms of research designs, measurement issues, statistical modeling and analysis procedures, interpretation and reporting of statistical results, and finally statistics education.
Summary and interpretive synthesis
1995-05-01
This chapter summarizes the major advances made through our integrated geological studies of the Lisburne Group in northern Alaska. The depositional history of the Lisburne Group is discussed in a framework of depositional sequence stratigraphy. Although individual parasequences (small-scale carbonate cycles) of the Wahoo Limestone cannot be correlated with certainty, parasequence sets can be interpreted as different systems tracts within the large-scale depositional sequences, providing insights on the paleoenvironments, paleogeography and platform geometry. Conodont biostratigraphy precisely established the position of the Mississippian-Pennsylvanian boundary within an important reference section, where established foraminiferal biostratigraphy is inconsistent with respect to conodont-based time-rock boundaries. However, existing Carboniferous conodont zonations are not readily applicable because most zonal indicators are absent, so a local zonation scheme was developed. Diagenetic studies of the Lisburne Group recognized nineteen subaerial exposure surfaces and developed a cement stratigraphy that includes: early cements associated with subaerial exposure surfaces in the Lisburne Group; cements associated with the sub-Permian unconformity; and later burial cements. Subaerial exposure surfaces in the Alapah Limestone are easily explained, being associated with peritidal environments at the boundaries of Sequence A. The Lisburne exposed in ANWR is generally tightly cemented and supermature, but could still be a good reservoir target in the adjacent subsurface of ANWR given the appropriate diagenetic, deformational and thermal history. Our ongoing research on the Lisburne Group will hopefully provide additional insights in future publications.
Cosmic statistics of statistics
NASA Astrophysics Data System (ADS)
Szapudi, István; Colombi, Stéphane; Bernardeau, Francis
1999-12-01
The errors on statistics measured in finite galaxy catalogues are exhaustively investigated. The theory of errors on factorial moments by Szapudi & Colombi is applied to cumulants via a series expansion method. All results are subsequently extended to the weakly non-linear regime. Together with previous investigations this yields an analytic theory of the errors for moments and connected moments of counts in cells from highly non-linear to weakly non-linear scales. For non-linear functions of unbiased estimators, such as the cumulants, the phenomenon of cosmic bias is identified and computed. Since it is subdued by the cosmic errors in the range of applicability of the theory, correction for it is inconsequential. In addition, the method of Colombi, Szapudi & Szalay concerning sampling effects is generalized, adapting the theory for inhomogeneous galaxy catalogues. While previous work focused on the variance only, the present article calculates the cross-correlations between moments and connected moments as well for a statistically complete description. The final analytic formulae representing the full theory are explicit but somewhat complicated. Therefore we have made available a Fortran program capable of calculating the described quantities numerically (for further details e-mail SC at colombi@iap.fr). An important special case is the evaluation of the errors on the two-point correlation function, for which this should be more accurate than any method put forward previously. This tool will be immensely useful in the future for assessing the precision of measurements from existing catalogues, as well as aiding the design of new galaxy surveys. To illustrate the applicability of the results and to explore the numerical aspects of the theory qualitatively and quantitatively, the errors and cross-correlations are predicted under a wide range of assumptions for the future Sloan Digital Sky Survey. The principal results concerning the cumulants ξ, Q3 and Q4 are that
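The factorial moments at the heart of this error theory are estimated directly from counts in cells, F_k = ⟨N(N-1)...(N-k+1)⟩ averaged over cells. A minimal estimator sketch (the cell counts below are made up, and no survey-specific corrections are applied):

```python
def factorial_moments(counts, kmax):
    """Estimate the factorial moments F_k = <N(N-1)...(N-k+1)>,
    k = 1..kmax, from a list of counts-in-cells."""
    m = len(counts)
    out = []
    for k in range(1, kmax + 1):
        tot = 0
        for N in counts:
            term = 1
            for j in range(k):
                term *= (N - j)   # falling factorial N(N-1)...(N-k+1)
            tot += term
        out.append(tot / m)       # average over cells
    return out

# Hypothetical galaxy counts in 8 cells:
counts = [0, 1, 2, 3, 1, 0, 2, 4]
print(factorial_moments(counts, 3))  # -> [1.625, 2.75, 3.75]
```

Cumulants and the connected moments discussed in the abstract are then non-linear functions of these F_k, which is precisely why their estimators pick up the cosmic bias the paper identifies.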
Revisiting the statistical analysis of pyroclast density and porosity data
NASA Astrophysics Data System (ADS)
Bernard, B.; Kueppers, U.; Ortiz, H.
2015-07-01
Explosive volcanic eruptions are commonly characterized through a thorough analysis of the deposits they generate. Among the characteristics studied in physical volcanology, the density and porosity of juvenile clasts are some of the most frequently used to constrain eruptive dynamics. In this study, we evaluate the sensitivity of density and porosity data to the statistical methods applied and introduce a weighting parameter to correct issues raised by the use of frequency analysis. Results of textural investigation can be biased by clast selection. Using statistical tools as presented here, the meaningfulness of a conclusion can easily be checked for any data set. This is necessary to determine whether or not a sample has met the requirements for statistical relevance, i.e. whether a data set is large enough to allow for reproducible results. Graphical statistics, similar to those used for grain-size analysis, are used to describe the density and porosity distributions; this approach helps with the interpretation of volcanic deposits. To illustrate the methodology, we chose two large data sets: (1) directed-blast deposits of the 3640-3510 BC eruption of Chachimbiro volcano (Ecuador) and (2) block-and-ash-flow deposits of the 1990-1995 eruption of Unzen volcano (Japan). We propose the incorporation of this analysis into future investigations to check the objectivity of results achieved by different working groups and to guarantee the meaningfulness of the interpretation.
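A weighting parameter corrects frequency analysis that counts every clast equally; one simple form of the idea is to weight each density measurement, for example by clast mass, when computing descriptive statistics. The sketch below uses that mass-weighting as an illustrative assumption (it is not necessarily the paper's exact parameter), with hypothetical sample values:

```python
def weighted_mean(values, weights):
    """Mean of values with per-sample weights."""
    return sum(v * w for v, w in zip(values, weights)) / sum(weights)

def weighted_percentile(values, weights, p):
    """Percentile (p in [0, 100]) of a weighted sample: walk the sorted
    values until the cumulative weight reaches the target fraction."""
    pairs = sorted(zip(values, weights))
    total = sum(w for _, w in pairs)
    cum = 0.0
    for v, w in pairs:
        cum += w
        if cum / total >= p / 100.0:
            return v
    return pairs[-1][0]

# Hypothetical clast densities (kg/m3) weighted by clast mass (kg), so a
# few large clasts are not swamped by many small ones:
dens = [800, 950, 1100, 1300, 2000]
mass = [5.0, 1.0, 1.0, 1.0, 0.5]
print(weighted_mean(dens, mass))
print(weighted_percentile(dens, mass, 50))  # -> 800
```

Comparing the weighted and unweighted versions of the same statistics is one quick way to see whether clast selection is biasing a density or porosity distribution.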
ERIC Educational Resources Information Center
Markus, Keith A.
2008-01-01
One can distinguish statistical models used in causal modeling from the causal interpretations that align them with substantive hypotheses. Causal modeling typically assumes an efficient causal interpretation of the statistical model. Causal modeling can also make use of mereological causal interpretations in which the state of the parts…
An easily regenerable enzyme reactor prepared from polymerized high internal phase emulsions.
Ruan, Guihua; Wu, Zhenwei; Huang, Yipeng; Wei, Meiping; Su, Rihui; Du, Fuyou
2016-04-22
A large-scale, high-efficiency enzyme reactor based on a polymerized high internal phase emulsion monolith (polyHIPE) was prepared. First, a porous cross-linked polyHIPE monolith was prepared by in-situ thermal polymerization of a high internal phase emulsion containing styrene, divinylbenzene and polyglutaraldehyde. The enzyme TPCK-trypsin was then immobilized on the monolithic polyHIPE. The performance of the resultant enzyme reactor was assessed by its ability to convert Nα-benzoyl-l-arginine ethyl ester to Nα-benzoyl-l-arginine, and by the digestibility of the proteins bovine serum albumin (BSA) and cytochrome c (Cyt-C). The results showed that the prepared enzyme reactor exhibited high enzyme immobilization efficiency and fast, easily controlled protein digestion. BSA and Cyt-C could be digested in 10 min with sequence coverages of 59% and 78%, respectively. The peptides and residual protein could be easily rinsed out of the reactor, and the reactor could be regenerated easily with 4 M HCl without any structural destruction. Its multiple interconnected chambers with good permeability, fast digestion, and easy regenerability indicate that the polyHIPE enzyme reactor is a promising candidate for application in proteomics and catalysis. PMID:26995089
Chaudhary, Anisha; Kumari, Saroj; Kumar, Rajeev; Teotia, Satish; Singh, Bhanu Pratap; Singh, Avanish Pratap; Dhawan, S K; Dhakate, Sanjay R
2016-04-27
A lightweight, easily foldable, highly conductive multiwalled carbon nanotube (MWCNT)-based mesocarbon microbead (MCMB) composite paper was prepared using a simple, efficient, and cost-effective strategy. The developed lightweight and conductive composite paper is reported for the first time as an efficient electromagnetic interference (EMI) shielding material in the X-band frequency region, with a low density of 0.26 g/cm(3). The investigation revealed that the composite paper shows an excellent absorption-dominated EMI shielding effectiveness (SE) of -31 to -56 dB at thicknesses of 0.15-0.6 mm, respectively. A specific EMI SE as high as -215 dB cm(3)/g exceeds the best values of metals and other low-density carbon-based composites. Additionally, the lightweight and easily foldable character of this composite paper helps it provide stable EMI shielding values even after repeated bending. Such intriguing performance opens a framework for designing lightweight and easily foldable composite papers as promising EMI shielding materials, especially for next-generation devices and the defense industry. PMID:27035889
A Graphical Interpretation of Probit Coefficients.
ERIC Educational Resources Information Center
Becker, William E.; Waldman, Donald M.
1989-01-01
Contends that, when discrete choice models are taught, particularly the probit model, it is the method rather than the interpretation of the results that is emphasized. This article provides a graphical technique for interpretation of an estimated probit coefficient that will be useful in statistics and econometrics courses. (GG)
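The article presents the interpretation graphically; as a rough numerical companion, the probit response curve is Φ(x'β) and the marginal effect of a regressor is φ(x'β)·β, so the effect of a coefficient depends on where the index sits. A minimal sketch with hypothetical coefficients (the values are illustrative, not from the article):

```python
import numpy as np
from scipy.stats import norm

# Hypothetical estimated probit coefficients (intercept, slope) -- illustration only
beta0, beta1 = -1.0, 0.5

x = np.linspace(-2, 6, 81)
index = beta0 + beta1 * x           # linear index x'beta
p = norm.cdf(index)                 # P(y = 1 | x): the probit response curve
marginal = norm.pdf(index) * beta1  # dP/dx: the marginal effect, which varies with x

# The marginal effect peaks where the index crosses zero, i.e. where p = 0.5
x_peak = x[np.argmax(marginal)]
print(round(x_peak, 2))  # 2.0, since beta0 + beta1*x = 0 at x = 2
```

Plotting `p` and `marginal` against `x` reproduces the kind of picture the article advocates: the coefficient alone is not the effect; the slope of Φ at the evaluation point is.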
Institute of Paper Science Technology
2004-01-30
In recent years, the world has expressed an increasing interest in recycling waste paper to supplement the use of virgin fiber as a way to protect the environment. Statistics show that major countries are increasing their use of recycled paper. For example, from 1991 to 1996, the U.S. increased its recovered paper utilization rate from 31% to 39%, Germany went from 50% to 60%, the UK from 60% to 70%, France from 46% to 49%, and China from 32% to 35% [1]. As recycled fiber levels and water system closures both increase, recycled product quality will need to improve in order for recycled products to compete with products made from virgin fiber [2]. The use of recycled fiber has introduced increasing levels of metal, plastic, and adhesive contamination into the papermaking process, adding to the complexity of the already overwhelming task of providing a uniform and clean recycled furnish. The most harmful of these contaminants is a mixture of adhesives and polymeric substances commonly known as stickies. Stickies, which enter the mill with the pulp furnish, are not easily removed in the repulper, and their removal becomes more difficult the further down the system they travel. This can be detrimental to final product quality. Stickies are hydrophobic, tacky, polymeric materials that are introduced into the papermaking system from a mixture of recycled fiber sources. Their properties are very similar to those of the fibers used in papermaking, viz. size, density, hydrophobicity, and electrokinetic charge. This reduces the probability of their removal by conventional separation processes, such as screening and cleaning, which are based on such properties. Their physical and chemical structure also allows them to extrude through screens and attach to fibers, process equipment, wires, and felts. Stickies can break down, reagglomerate, and appear at seemingly any place in the mill. When subjected to a number of factors including changes
NASA Astrophysics Data System (ADS)
Grimes, Holly; McMenemy, Karen R.; Ferguson, R. S.
2008-02-01
This paper details how simple PC software, a small network of consumer-level PCs, some do-it-yourself hardware, and four low-cost video projectors can be combined to form an easily configurable and transportable projection display with applications in virtual reality training. The paper offers observations on the practical difficulties of using such a system, its effectiveness in delivering a virtual environment (VE) for training, and the benefits that deploying a large number of these low-cost environments may offer.
Nicolotti, Orazio; Gillet, Valerie J; Fleming, Peter J; Green, Darren V S
2002-11-01
Deriving quantitative structure-activity relationship (QSAR) models that are accurate, reliable, and easily interpretable is a difficult task. In this study, two new methods have been developed that aim to find useful QSAR models that represent an appropriate balance between model accuracy and complexity. Both methods are based on genetic programming (GP). The first method, referred to as genetic QSAR (or GPQSAR), uses a penalty function to control model complexity. GPQSAR is designed to derive a single linear model that represents an appropriate balance between the variance and the number of descriptors selected for the model. The second method, referred to as multiobjective genetic QSAR (MoQSAR), is based on multiobjective GP and represents a new way of thinking of QSAR. Specifically, QSAR is considered as a multiobjective optimization problem that comprises a number of competitive objectives. Typical objectives include model fitting, the total number of terms, and the occurrence of nonlinear terms. MoQSAR results in a family of equivalent QSAR models where each QSAR represents a different tradeoff in the objectives. A practical consideration often overlooked in QSAR studies is the need for the model to promote an understanding of the biochemical response under investigation. To accomplish this, chemically intuitive descriptors are needed but do not always give rise to statistically robust models. This problem is addressed by the addition of a further objective, called chemical desirability, that aims to reward models that consist of descriptors that are easily interpretable by chemists. GPQSAR and MoQSAR have been tested on various data sets including the Selwood data set and two different solubility data sets. The study demonstrates that the MoQSAR method is able to find models that are at least as good as models derived using standard statistical approaches and also yields models that allow a medicinal chemist to trade statistical robustness for chemical
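The GPQSAR penalty-function idea — rewarding variance explained while charging for each extra descriptor — can be caricatured without any genetic machinery. A minimal sketch on synthetic data, with an exhaustive subset search standing in for the genetic search (descriptor count, penalty weight, and data are all hypothetical, not from the paper):

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(0)

# Toy data: activity depends on 2 of 5 hypothetical descriptors (illustrative only)
X = rng.normal(size=(40, 5))
y = 2.0 * X[:, 0] - 1.5 * X[:, 2] + 0.1 * rng.normal(size=40)

def penalized_score(cols, lam=0.05):
    """R^2 of a least-squares fit minus a per-descriptor complexity penalty,
    mimicking the penalty-function balance between accuracy and model size."""
    A = np.column_stack([X[:, list(cols)], np.ones(len(y))])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ coef
    r2 = 1.0 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))
    return r2 - lam * len(cols)

# Exhaustive search over descriptor subsets stands in for the GP search
best = max((c for k in range(1, 6) for c in combinations(range(5), k)),
           key=penalized_score)
print(best)  # (0, 2): the two informative descriptors beat larger, overfit models
```

MoQSAR replaces the single penalized score with a vector of objectives (fit, size, nonlinearity, chemical desirability) and returns the whole trade-off front instead of one winner.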
Shi, Runhua; McLarty, Jerry W
2009-10-01
In this article, we introduced basic concepts of statistics, types of distributions, and descriptive statistics. A few examples were also provided. The basic concepts presented herein are only a fraction of those related to descriptive statistics. Also, many commonly used distributions are not presented herein, such as the Poisson distribution for rare events and the exponential, F, and logistic distributions. More information can be found in many statistics books and publications. PMID:19891281
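As a small companion to the descriptive-statistics concepts reviewed, the Python standard library covers the basics directly (the data values below are made up for illustration):

```python
import statistics as st

# Hypothetical sample of seven measurements
data = [2.1, 3.4, 2.8, 5.0, 3.3, 2.9, 4.1]

mean = st.mean(data)      # arithmetic mean, pulled upward by the large value 5.0
median = st.median(data)  # middle value, robust to that outlier
sd = st.stdev(data)       # sample standard deviation (n - 1 denominator)

print(mean, median, sd)
```

Comparing the mean and median on skewed data is itself a quick descriptive diagnostic: a mean well above the median signals right skew.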
Safe, Effective and Easily Reproducible Fusion Technique for CV Junction Instability
Sannegowda, Raghavendra Bakki
2015-01-01
Introduction: The craniovertebral junction (CVJ) refers to the bony enclosure where the occipital bone surrounds the foramen magnum, together with the atlas and axis vertebrae. Because of the complexity of these structures, CVJ instability is associated with diagnostic and therapeutic problems. Posterior CV fusion procedures have evolved considerably over the last couple of decades, and there has been a search for a surgical procedure that is inherently safe, simple, easily reproducible, and biomechanically sound. In this study, we present our initial experience with cases of CV junction instrumentation using an O-C1-C2 screw and rod construct operated on by the author. Aims and Objectives: The current study is a descriptive analysis of the cases of CVJ instability treated by us with instrumentation using the O-C1-C2 screw and rod construct fusion technique. Materials and Methods: This is a retrospective, analytical study in which cases of CV junction instability operated on by the author between January 2010 and March 2014 were analysed using various clinical, radiological, and outcome parameters. Conclusion: CV junction instrumentation using the O-C1-C2 screw and rod construct fusion technique proved to be a safe, effective, easily reproducible, and biomechanically sound technique which can be adopted by surgeons at any stage of their learning curve. PMID:25954660
ERIC Educational Resources Information Center
Petocz, Peter; Sowey, Eric
2008-01-01
As a branch of knowledge, Statistics is ubiquitous and its applications can be found in (almost) every field of human endeavour. In this article, the authors track down the possible source of the link between the "Siren song" and applications of Statistics. Answers to their previous five questions and five new questions on Statistics are presented.
ERIC Educational Resources Information Center
Callamaras, Peter
1983-01-01
This buyer's guide to seven major types of statistics software packages for microcomputers reviews Edu-Ware Statistics 3.0; Financial Planning; Speed Stat; Statistics with DAISY; Human Systems Dynamics package of Stats Plus, ANOVA II, and REGRESS II; Maxistat; and Moore-Barnes' MBC Test Construction and MBC Correlation. (MBR)
ERIC Educational Resources Information Center
Meyer, Donald L.
Bayesian statistical methodology and its possible uses in the behavioral sciences are discussed in relation to the solution of problems in both the use and teaching of fundamental statistical methods, including confidence intervals, significance tests, and sampling. The Bayesian model explains these statistical methods and offers a consistent…
SOCR: Statistics Online Computational Resource
ERIC Educational Resources Information Center
Dinov, Ivo D.
2006-01-01
The need for hands-on computer laboratory experience in undergraduate and graduate statistics education has been firmly established in the past decade. As a result a number of attempts have been undertaken to develop novel approaches for problem-driven statistical thinking, data analysis and result interpretation. In this paper we describe an…
Motivating Play Using Statistical Reasoning
ERIC Educational Resources Information Center
Cross Francis, Dionne I.; Hudson, Rick A.; Lee, Mi Yeon; Rapacki, Lauren; Vesperman, Crystal Marie
2014-01-01
Statistical literacy is essential in everyone's personal lives as consumers, citizens, and professionals. To make informed life and professional decisions, students are required to read, understand, and interpret vast amounts of information, much of which is quantitative. To develop statistical literacy so students are able to make sense of…
Revisiting the statistical analysis of pyroclast density and porosity data
NASA Astrophysics Data System (ADS)
Bernard, B.; Kueppers, U.; Ortiz, H.
2015-03-01
Explosive volcanic eruptions are commonly characterized through a thorough analysis of the deposits they generate. Among other characteristics in physical volcanology, the density and porosity of juvenile clasts are some of the most frequently used to constrain eruptive dynamics. In this study, we evaluate the sensitivity of density and porosity data and introduce a weighting parameter to correct issues raised by the use of frequency analysis. Results of textural investigation can be biased by clast selection. Using the statistical tools presented here, the meaningfulness of a conclusion can easily be checked for any dataset. This is necessary to determine whether a sample has met the requirements for statistical relevance, i.e., whether a dataset is large enough to allow for reproducible results. Graphical statistics, similar to those used for grain-size analysis, are used to describe the density and porosity distributions. This approach helps with the interpretation of volcanic deposits. To illustrate the methodology we chose two large datasets: (1) directed blast deposits of the 3640-3510 BC eruption of Chachimbiro volcano (Ecuador) and (2) block-and-ash-flow deposits of the 1990-1995 eruption of Unzen volcano (Japan). We propose adding this analysis to future investigations to check the objectivity of results achieved by different working groups and to guarantee the meaningfulness of the interpretation.
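The abstract does not define the weighting parameter; one common choice is to weight each clast by its mass rather than counting every clast equally, so that preferential collection of certain clast sizes does not bias the distribution. A minimal sketch under that assumption (densities and masses are invented for illustration):

```python
import numpy as np

# Hypothetical clast densities (kg/m^3) and clast masses used as weights
density = np.array([900., 1100., 1300., 1500., 1700., 1900., 2100.])
mass    = np.array([5.,   20.,   40.,   60.,   40.,   20.,   5.])  # grams

w = mass / mass.sum()            # normalized weights replacing raw frequencies
mean_w = np.sum(w * density)     # mass-weighted mean density

def weighted_percentile(x, w, q):
    """Weighted percentile via the midpoint rule, analogous to the graphical
    statistics (median, sorting) used in grain-size analysis."""
    order = np.argsort(x)
    cdf = np.cumsum(w[order]) - 0.5 * w[order]
    return np.interp(q / 100.0, cdf, x[order])

median_w = weighted_percentile(density, w, 50)
print(mean_w, median_w)  # both 1500 for this symmetric toy distribution
```

With this in place, comparing weighted and unweighted statistics for a real dataset gives a quick check of whether clast selection biased the result.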
Statistics 101 for Radiologists.
Anvari, Arash; Halpern, Elkan F; Samir, Anthony E
2015-10-01
Diagnostic tests have wide clinical applications, including screening, diagnosis, measuring treatment effect, and determining prognosis. Interpreting diagnostic test results requires an understanding of key statistical concepts used to evaluate test efficacy. This review explains descriptive statistics and discusses probability, including mutually exclusive and independent events and conditional probability. In the inferential statistics section, a statistical perspective on study design is provided, together with an explanation of how to select appropriate statistical tests. Key concepts in recruiting study samples are discussed, including representativeness and random sampling. Variable types are defined, including predictor, outcome, and covariate variables, and the relationship of these variables to one another. In the hypothesis testing section, we explain how to determine if observed differences between groups are likely to be due to chance. We explain type I and II errors, statistical significance, and study power, followed by an explanation of effect sizes and how confidence intervals can be used to generalize observed effect sizes to the larger population. Statistical tests are explained in four categories: t tests and analysis of variance, proportion analysis tests, nonparametric tests, and regression techniques. We discuss sensitivity, specificity, accuracy, receiver operating characteristic analysis, and likelihood ratios. Measures of reliability and agreement, including κ statistics, intraclass correlation coefficients, and Bland-Altman graphs and analysis, are introduced. PMID:26466186
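Several of the diagnostic-test measures reviewed here reduce to simple arithmetic on a 2x2 table. A minimal sketch with hypothetical counts (not from the article):

```python
# Hypothetical 2x2 confusion matrix for a diagnostic test (illustrative numbers)
tp, fn = 90, 10    # diseased patients: test positive / test negative
fp, tn = 30, 170   # healthy patients:  test positive / test negative

sensitivity = tp / (tp + fn)                  # P(test+ | disease)
specificity = tn / (tn + fp)                  # P(test- | no disease)
accuracy    = (tp + tn) / (tp + fn + fp + tn)
lr_pos      = sensitivity / (1 - specificity) # positive likelihood ratio
lr_neg      = (1 - sensitivity) / specificity # negative likelihood ratio

print(sensitivity, specificity, round(lr_pos, 2))  # 0.9 0.85 6.0
```

The likelihood ratios are the clinically useful summary: a positive result here multiplies the pre-test odds of disease by 6, a negative result divides them by roughly 8.5 (1/lr_neg).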
A 2D zinc-organic network being easily exfoliated into isolated sheets
NASA Astrophysics Data System (ADS)
Yu, Guihong; Li, Ruiqing; Leng, Zhihua; Gan, Shucai
2016-08-01
A metal-organic aggregate, namely {Zn2Cl2(BBC)}n (BBC = 4,4‧,4‧‧-(benzene-1,3,5-triyl-tris(benzene-4,1-diyl))tribenzoate), was obtained by solvothermal synthesis. Its structure features Zn2(COO)3 paddle-wheels with two chloride anions in the axial positions and hexagonal pores in the layers. The exclusion of water from the precursor and the solvent plays a crucial role in the formation of the target compound. This compound can be easily dissolved in alkaline solution and exfoliated into isolated sheets, suggesting a novel route for the preparation of 2D materials.
Solving block linear systems with low-rank off-diagonal blocks is easily parallelizable
Menkov, V.
1996-12-31
An easily and efficiently parallelizable direct method is given for solving a block linear system Bx = y, where B = D + Q is the sum of a non-singular block diagonal matrix D and a matrix Q with low-rank blocks. This implicitly defines a new preconditioning method with an operation count close to the cost of calculating a matrix-vector product Qw for some w, plus at most twice the cost of calculating Qw for some w. When implemented on a parallel machine the processor utilization can be as good as that of those operations. Order estimates are given for the general case, and an implementation is compared to block SSOR preconditioning.
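The abstract does not spell out the algorithm, but a standard way to exploit B = D + Q with low-rank Q = UVᵀ is the Sherman-Morrison-Woodbury identity, which reduces the solve to independent solves with D (trivially parallel over blocks) plus one small r x r system. A dense sketch under that assumption, with a diagonal D and random data standing in for the block structure:

```python
import numpy as np

rng = np.random.default_rng(1)
n, r = 8, 2                                # system size, rank of Q

D = np.diag(rng.uniform(1.0, 2.0, n))      # non-singular (block-)diagonal part
U = rng.normal(size=(n, r))
V = rng.normal(size=(n, r))
Q = U @ V.T                                # low-rank part
y = rng.normal(size=n)

# Woodbury: (D + U V^T)^-1 y = D^-1 y - D^-1 U (I + V^T D^-1 U)^-1 V^T D^-1 y
Dinv_y = y / np.diag(D)                    # solve with D (parallel over blocks)
Dinv_U = U / np.diag(D)[:, None]
small = np.eye(r) + V.T @ Dinv_U           # r x r capacitance matrix
x = Dinv_y - Dinv_U @ np.linalg.solve(small, V.T @ Dinv_y)

print(np.allclose((D + Q) @ x, y))  # True
```

Beyond the D-solves, the dominant work is forming VᵀD⁻¹U and two products with U and Vᵀ, i.e. a few Qw-like matrix-vector costs, consistent with the operation count the abstract claims.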
An easily Prepared Fluorescent pH Probe Based on Dansyl.
Sha, Chunming; Chen, Yuhua; Chen, Yufen; Xu, Dongmei
2016-09-01
A novel fluorescent pH probe derived from dansyl chloride and thiosemicarbazide was easily prepared and fully characterized by (1)H NMR, (13)C NMR, LC-MS, infrared spectroscopy, and elemental analysis. The probe exhibited high selectivity and sensitivity to H(+), with a pKa value of 4.98. The fluorescence intensity at 510 nm was quenched by 99.5% when the pH dropped from 10.88 to 1.98. In addition, the dansyl-based probe responded quickly and reversibly to pH variation, and various common metal ions showed negligible interference. The recognition can be ascribed to intramolecular charge transfer caused by protonation of the nitrogen in the dimethylamino group. PMID:27333798
The study on development of easily chewable and swallowable foods for elderly
Kim, Soojeong
2015-01-01
BACKGROUND/OBJECTIVES: When the functions involved in the ingestion of food fail, the result is not only a loss of the enjoyment of eating but also protein-energy malnutrition. Difficulty in chewing and swallowing occurs in various diseases, but aging may be a major cause, and the number of elderly people with chewing and swallowing difficulties is expected to increase rapidly in an aging society. SUBJECTS/METHODS: In this study, we carried out a survey targeting nutritionists who work in elderly care facilities, and examined the characteristics of foods offered to the elderly and the degree of demand for the development of easily chewable and swallowable foods for elderly people who can crush food with their own tongues but sometimes have difficulty drinking water and tea. RESULTS: Elderly care facilities were found to provide finely chopped food, or food ground with water in a blender, to elderly residents with chewing difficulties. Elderly satisfaction with the provided foods appeared low overall. When the applicability of foods for the elderly, and willingness to reflect them in menus, were investigated, a gelification method from molecular gastronomy showed the highest response rate. Among foods frequently offered to the elderly, based on representative menus of beef, pork, white fish, anchovies, and spinach, the dishes Korean barbecue beef, hot-pepper-paste stir-fried pork, pan-fried white fish, stir-fried anchovy, and seasoned spinach had the highest offer frequencies. CONCLUSIONS: This study provides the fundamentals for the development of easily chewable and swallowable foods, via gelification, for the elderly. It also suggests that, in the elderly, foods that have undergone gelification will reduce the risk of food going down the wrong pipe and improve overall food preference. PMID:26244082
NASA Astrophysics Data System (ADS)
Julián-López, Beatriz; Gonell, Francisco; Lima, Patricia P.; Freitas, Vânia T.; André, Paulo S.; Carlos, Luis D.; Ferreira, Rute A. S.
2015-10-01
This manuscript reports the synthesis and characterization of the first organic-inorganic hybrid material exhibiting efficient multimodal spectral converting properties. The nanocomposite, made of Er3+, Yb3+ codoped zirconia nanoparticles (NPs) entrapped in a di-ureasil d-U(600) hybrid matrix, is prepared by an easy two-step sol-gel synthesis leading to homogeneous and transparent materials that can be very easily processed as monolith or film. Extensive structural characterization reveals that zirconia nanocrystals of 10-20 nm in size are efficiently dispersed into the hybrid matrix and that the local structure of the di-ureasil is not affected by the presence of the NPs. A significant enhancement in the refractive index of the di-ureasil matrix with the incorporation of the ZrO2 nanocrystals is observed. The optical study demonstrates that luminescent properties of both constituents are perfectly preserved in the final hybrid. Thus, the material displays a white-light photoluminescence from the di-ureasil component upon excitation at UV/visible radiation and also intense green and red emissions from the Er3+- and Yb3+-doped NPs after NIR excitation. The dynamics of the optical processes were also studied as a function of the lanthanide content and the thickness of the films. Our results indicate that these luminescent hybrids represent a low-cost, environmentally friendly, size-controlled, easily processed and chemically stable alternative material to be used in light harvesting devices such as luminescent solar concentrators, optical fibres and sensors. Furthermore, this synthetic approach can be extended to a wide variety of luminescent NPs entrapped in hybrid matrices, thus leading to multifunctional and versatile materials for efficient tuneable nonlinear optical nanodevices.
Estimation of nutrient values of pig slurries in Southeast Spain using easily determined properties.
Moral, R; Perez-Murcia, M D; Perez-Espinosa, A; Moreno-Caselles, J; Paredes, C
2005-01-01
The contents of available nutrients in pig slurries are not easy to quantify in situ without laboratory facilities, but chemical analyses using standard laboratory methods take time, are costly, and are not practical for most farms. Thus, when animal slurries are applied to land, their fertiliser potential is often unknown. In addition, in recent years, changes in the management of industrial piggeries have changed the nature of pig slurries, e.g., a decrease in dry matter content, and consequently the methods and equations used for estimating the nutrient contents of these residues must be checked. In our study, slurry samples were collected from the storage tanks of 36 commercial farms in Southeast Spain. Samples were analysed for pH, electrical conductivity (EC), redox potential (RP), specific density (D), total solids (TS), sedimentable solids (SS), biological oxygen demand (BOD(5)), chemical oxygen demand (COD), total nitrogen (TKN), ammonium nitrogen (AN), organic nitrogen (ON), and total contents of phosphorus, potassium, calcium and magnesium. Relationships between the major nutrient levels of pig slurries and a range of physical and chemical properties were investigated. We also analysed the variability of pig slurries according to production stage. TKN, AN and K were closely related to EC. The P content of slurries was related more closely to solids-derived parameters such as D. The use of multiple properties to estimate nutrient contents in pig slurries, especially for AN and K, seemed unnecessary owing to the limited improvement achieved with an additional property. Therefore, electrical conductivity seemed to be the most appropriate single, easily determined parameter for estimating total and ammonium nitrogen and potassium in pig slurries, with more than 83% of the variance explained. P seemed to be the nutrient least suited to estimation from any easily determined parameter. PMID:16009306
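The finding that EC alone explains more than 83% of the variance in TKN, AN and K corresponds to a one-variable least-squares calibration. A sketch with synthetic data (the slope, intercept, noise level, and units are hypothetical, not the paper's coefficients):

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical calibration data: electrical conductivity (dS/m) vs total
# Kjeldahl nitrogen (g/L) for slurry samples -- numbers are illustrative only
ec = rng.uniform(10, 40, 36)                     # 36 farms, as in the study design
tkn = 0.12 * ec + 0.5 + rng.normal(0, 0.3, 36)   # assumed linear relation + noise

# Ordinary least-squares fit of TKN on EC
slope, intercept = np.polyfit(ec, tkn, 1)
pred = slope * ec + intercept
ss_res = np.sum((tkn - pred) ** 2)
ss_tot = np.sum((tkn - tkn.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot

print(r2 > 0.83)  # the paper's criterion: EC alone explains > 83% of the variance
```

Once such a calibration is established for a region, a farm needs only a conductivity meter to estimate N and K before land application.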
Effects of easily ionizable elements on the liquid sampling atmospheric pressure glow discharge
NASA Astrophysics Data System (ADS)
Venzie, Jacob L.; Marcus, R. Kenneth
2006-06-01
A series of studies has been undertaken to determine the susceptibility of the liquid sampling-atmospheric pressure glow discharge (LS-APGD) atomic emission source to easily ionizable element (EIE) effects. The initial portions of the study involved monitoring the voltage drop across the plasma as a function of the pH to ascertain whether or not the conductivity of the liquid eluent alters the plasma energetics and subsequently the analyte signal strength. It was found that altering the pH (0.0 to 2.0) in the sample matrix did not significantly change the discharge voltage. The emission signal intensities for Cu(I) 327.4 nm, Mo(I) 344.7 nm, Sc(I) 326.9 nm and Hg(I) 253.6 nm were measured as a function of the easily ionizable element (sodium and calcium) concentration in the injection matrix. A range of 0.0 to 0.1% (w/v) EIE in the sample matrix did not cause a significant change in the Cu, Sc, and Mo signal-to-background ratios, with only a slight change noted for Hg. In addition to this test of analyte response, the plasma energetics as a function of EIE concentration are assessed using the ratio of Mg(II) to Mg(I) (280.2 nm and 285.2 nm, respectively) intensities. The Mg(II)/Mg(I) ratio showed that the plasma energetics did not change significantly over the same range of EIE addition. These results are best explained by the electrolytic nature of the eluent acting as an ionic (and perhaps spectrochemical) buffer.
Kogalovskii, M.R.
1995-03-01
This paper presents a review of problems related to statistical database systems, which are widespread in various fields of activity. Statistical databases (SDBs) are databases used for statistical analysis. Topics under consideration are: SDB peculiarities, properties of data models adequate for SDB requirements, metadata functions, null-value problems, SDB compromise protection problems, stored-data compression techniques, and statistical data representation means. Also examined is whether present database management systems (DBMS) satisfy SDB requirements. Some current research directions in SDB systems are considered.
Smith, Alwyn
1969-01-01
This paper is based on an analysis of questionnaires sent to the health ministries of Member States of WHO asking for information about the extent, nature, and scope of morbidity statistical information. It is clear that most countries collect some statistics of morbidity and many countries collect extensive data. However, few countries relate their collection to the needs of health administrators for information, and many countries collect statistics principally for publication in annual volumes which may appear anything up to 3 years after the year to which they refer. The desiderata of morbidity statistics may be summarized as reliability, representativeness, and relevance to current health problems. PMID:5306722
Comprehensive Interpretive Planning.
ERIC Educational Resources Information Center
Kohen, Richard; Sikoryak, Kim
1999-01-01
Discusses interpretive planning and provides information on how to maximize the sense of ownership shared by managers, staff, and other organizational stakeholders. Presents practical and effective plans for providing interpretive services. (CCM)
ERIC Educational Resources Information Center
Petocz, Peter; Sowey, Eric
2008-01-01
In this article, the authors focus on hypothesis testing--that peculiarly statistical way of deciding things. Statistical methods for testing hypotheses were developed in the 1920s and 1930s by some of the most famous statisticians, in particular Ronald Fisher, Jerzy Neyman and Egon Pearson, who laid the foundations of almost all modern methods of…
Binford, Greta J; Gillespie, Rosemary G; Maddison, Wayne P
2016-05-01
Spider venom composition typically differs between sexes. This pattern is anecdotally thought to reflect differences in adult feeding biology. We used a phylogenetic approach to compare intersexual venom dimorphism between species that differ in adult niche dimorphism. Male and female venoms were compared within and between related species of Hawaiian Tetragnatha, a mainland congener, and outgroups. In some species of Hawaiian Tetragnatha adult females spin orb-webs and adult males capture prey while wandering, while in other species both males and females capture prey by wandering. We predicted that, if venom sexual dimorphism is primarily explained by differences in adult feeding biology, species in which both sexes forage by wandering would have monomorphic venoms or venoms with reduced dimorphism relative to species with different adult feeding biology. However, we found striking sexual dimorphism in venoms of both wandering and orb-weaving Tetragnatha species with males having high molecular weight components in their venoms that were absent in females, and a reduced concentration of low molecular weight components relative to females. Intersexual differences in venom composition within Tetragnatha were significantly larger than in non-Tetragnatha species. Diet composition was not different between sexes. This striking venom dimorphism is not easily explained by differences in feeding ecology or behavior. Rather, we hypothesize that the dimorphism reflects male-specific components that play a role in mating biology possibly in sexual stimulation, nuptial gifts and/or mate recognition. PMID:26908290
Easily regenerable solid adsorbents based on polyamines for carbon dioxide capture from the air.
Goeppert, Alain; Zhang, Hang; Czaun, Miklos; May, Robert B; Prakash, G K Surya; Olah, George A; Narayanan, S R
2014-05-01
Adsorbents prepared easily by impregnation of fumed silica with polyethylenimine (PEI) are promising candidates for the capture of CO2 directly from the air. These inexpensive adsorbents have high CO2 adsorption capacity at ambient temperature and can be regenerated in repeated cycles under mild conditions. Despite the very low CO2 concentration, they are able to scrub efficiently all CO2 out of the air in the initial hours of the experiments. The influence of parameters such as PEI loading, adsorption and desorption temperature, particle size, and PEI molecular weight on the adsorption behavior were investigated. The mild regeneration temperatures required could allow the use of waste heat available in many industrial processes as well as solar heat. CO2 adsorption from the air has a number of applications. Removal of CO2 from a closed environment, such as a submarine or space vehicles, is essential for life support. The supply of CO2-free air is also critical for alkaline fuel cells and batteries. Direct air capture of CO2 could also help mitigate the rising concerns about atmospheric CO2 concentration and associated climatic changes, while, at the same time, provide the first step for an anthropogenic carbon cycle. PMID:24644023
Shaft seals with an easily removable cylinder holder for low-pressure steam turbines
NASA Astrophysics Data System (ADS)
Zakharov, A. E.; Rodionov, D. A.; Pimenov, E. V.; Sobolev, A. S.
2016-01-01
The article is devoted to problems that occur during the operation of the LPC shaft seals (SS) of turbines, particularly their bearings. The problems arising from the deterioration of the oil-protecting rings of the SS and bearings, and the consequences they can lead to, are considered. The existing types of SS housing construction are reviewed and their operational features noted. A new SS construction type with an easily removable holder is presented, and the construction of its main elements is described. A sequence of operations is proposed for repair personnel restoring the spacings of the new SS type. A comparative analysis of the new and existing SS construction types is carried out. Assessments of the efficiency, operational convenience, and economic effect of installing the new seal type are given. Conclusions about the prospects of the proposed construction are drawn from the comparative analysis and the assessment. The main advantage of this design is that spacings can be restored both in the SS and in the oil-protecting rings during a short-term stop of a turbine, even without cooling it. The construction was successfully tested on a working K-300-23.5 LMP turbine, and its adaptation to other turbines is quite possible.
Open Window: When Easily Identifiable Genomes and Traits Are in the Public Domain
Angrist, Misha
2014-01-01
“One can't be of an enquiring and experimental nature, and still be very sensible.” - Charles Fort [1] As the costs of personal genetic testing “self-quantification” fall, publicly accessible databases housing people's genotypic and phenotypic information are gradually increasing in number and scope. The latest entrant is openSNP, which allows participants to upload their personal genetic/genomic and self-reported phenotypic data. I believe the emergence of such open repositories of human biological data is a natural reflection of inquisitive and digitally literate people's desires to make genomic and phenotypic information more easily available to a community beyond the research establishment. Such unfettered databases hold the promise of contributing mightily to science, science education and medicine. That said, in an age of increasingly widespread governmental and corporate surveillance, we would do well to be mindful that genomic DNA is uniquely identifying. Participants in open biological databases are engaged in a real-time experiment whose outcome is unknown. PMID:24647311
Easily Regenerable Solid Adsorbents Based on Polyamines for Carbon Dioxide Capture from the Air
Goeppert, A; Zhang, H; Czaun, M; May, RB; Prakash, GKS; Olah, GA; Narayanan, SR
2014-03-18
Adsorbents prepared easily by impregnation of fumed silica with polyethylenimine (PEI) are promising candidates for the capture of CO2 directly from the air. These inexpensive adsorbents have a high CO2 adsorption capacity at ambient temperature and can be regenerated in repeated cycles under mild conditions. Despite the very low CO2 concentration, they are able to scrub all CO2 out of the air efficiently in the initial hours of the experiments. The influence of parameters such as PEI loading, adsorption and desorption temperature, particle size, and PEI molecular weight on the adsorption behavior was investigated. The mild regeneration temperatures required could allow the use of waste heat available in many industrial processes, as well as solar heat. CO2 adsorption from the air has a number of applications. Removal of CO2 from a closed environment, such as a submarine or a space vehicle, is essential for life support. The supply of CO2-free air is also critical for alkaline fuel cells and batteries. Direct air capture of CO2 could also help mitigate rising concerns about atmospheric CO2 concentration and the associated climatic changes while, at the same time, providing the first step of an anthropogenic carbon cycle.
Zhang, Huai-Zhi; Zhang, Chang; Zeng, Guang-Ming; Gong, Ji-Lai; Ou, Xiao-Ming; Huan, Shuang-Yan
2016-06-01
Silver nanoparticle-decorated magnetic graphene oxide (MGO-Ag) was synthesized by doping silver and Fe3O4 nanoparticles onto the surface of GO and was used as an antibacterial agent. MGO-Ag was characterized by scanning electron microscopy (SEM), transmission electron microscopy (TEM), energy-dispersive X-ray spectroscopy (EDS), X-ray diffraction (XRD), Raman spectroscopy, and magnetic property tests. The magnetic iron oxide nanoparticles and nano-Ag were well dispersed on the graphene oxide, and MGO-Ag exhibited excellent antibacterial activity against Escherichia coli and Staphylococcus aureus. Several factors affecting the antibacterial performance of MGO-Ag were investigated, such as temperature, time, pH, and bacterial concentration. We also found that MGO-Ag maintained high inactivation rates after six uses and can be separated easily after the antibacterial process. Moreover, the antibacterial mechanism is discussed; the synergistic effect of GO, Fe3O4 nanoparticles, and nano-Ag accounts for the high inactivation achieved by MGO-Ag. PMID:26994349
Journalists as Interpretive Communities.
ERIC Educational Resources Information Center
Zelizer, Barbie
1993-01-01
Proposes viewing journalists as members of an interpretive community (not a profession) united by its shared discourse and collective interpretations of key public events. Applies the frame of the interpretive community to journalistic discourse about two events central for American journalists--Watergate and McCarthyism. (SR)
Interpreting. NETAC Teacher Tipsheet.
ERIC Educational Resources Information Center
Darroch, Kathy; Marshall, Liza
This tipsheet explains that an interpreter's role is to facilitate communication and convey all auditory and signed information so that individuals with and without hearing may fully interact. It outlines the common types of services provided by interpreters, and discusses principles guiding the professional behaviors of interpreters. When working…
Hart, Dionne; Bowen, Juan; DeJesus, Ramona; Maldonado, Alejandro; Jiwa, Fatima
2010-04-01
Research has demonstrated that appropriate use of interpreters in clinical encounters improves outcomes and decreases adverse events. This article reviews both the medical reasons for working with trained medical interpreters and the related laws, and offers practical tips for working effectively with interpreters. PMID:20481167
ERIC Educational Resources Information Center
Darroch, Kathleen
2010-01-01
An interpreter's role is to facilitate communication and convey all auditory and signed information so that both hearing and deaf individuals may fully interact. The common types of services provided by interpreters are: (1) American Sign Language (ASL) Interpretation--a visual-gestural language with its own linguistic features; (2) Sign Language…
A Note on a Geometric Interpretation of the Correlation Coefficient.
ERIC Educational Resources Information Center
Marks, Edmond
1982-01-01
An alternate geometric interpretation of the correlation coefficient to that given in most statistics texts for psychology and education is presented. This interpretation is considered to be more consistent with the statistical model for the data, and richer in geometric meaning. (Author)
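A common version of this geometric interpretation identifies the correlation coefficient with the cosine of the angle between the two mean-centered data vectors; whether that is exactly the construction Marks proposes, the abstract does not say. The identity r = cos(theta) can be checked numerically:

```python
import math

def pearson_r(x, y):
    """Pearson correlation via the textbook formula."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return sxy / (sx * sy)

def cos_angle_centered(x, y):
    """Cosine of the angle between the mean-centered data vectors."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    u = [a - mx for a in x]
    v = [b - my for b in y]
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v)))

x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.0, 1.0, 4.0, 3.0, 6.0]
# The two computations agree: r = cos(theta) between centered vectors
```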
Enhancing Table Interpretation Skills via Training in Table Creation
ERIC Educational Resources Information Center
Karazsia, Bryan T.
2013-01-01
Quantitative and statistical literacy are core domains in the undergraduate psychology curriculum. An important component of such literacy includes interpretation of visual aids, such as tables containing results from statistical analyses. This article presents a new technique for enhancing student interpretation of American Psychological…
The emergent Copenhagen interpretation of quantum mechanics
NASA Astrophysics Data System (ADS)
Hollowood, Timothy J.
2014-05-01
We introduce a new and conceptually simple interpretation of quantum mechanics based on reduced density matrices of sub-systems from which the standard Copenhagen interpretation emerges as an effective description of macroscopically large systems. This interpretation describes a world in which definite measurement results are obtained with probabilities that reproduce the Born rule. Wave function collapse is seen to be a useful but fundamentally unnecessary piece of prudent bookkeeping that is valid only for macro-systems. The new interpretation belongs to the class of modal interpretations in that it applies to quantum systems that interact with a much larger environment. However, we show that it does not suffer from the problems that have plagued similar modal interpretations, such as macroscopic superpositions and rapid flipping between macroscopically distinct states. We describe how the interpretation fits neatly together with fully quantum formulations of statistical mechanics, and how a measurement process can be viewed as a process of ergodicity breaking analogous to a phase transition. The key feature of the new interpretation is that joint probabilities for the ergodic subsets of states of disjoint macro-systems only arise as emergent quantities. Finally, we give an account of the EPR-Bohm thought experiment and show that the interpretation implies the violation of the Bell inequality characteristic of quantum mechanics, but in a rather novel way. The final conclusion is that the Copenhagen interpretation gives a completely satisfactory phenomenology of macro-systems interacting with micro-systems.
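The interpretation is built on reduced density matrices of sub-systems. As a minimal sketch (not taken from the paper), the partial trace over a second sub-system and the Born-rule probabilities it yields can be computed for a two-qubit Bell state:

```python
import math

def reduced_density_matrix_A(psi):
    """Partial trace over subsystem B for a bipartite pure state:
    rho_A[i][k] = sum_j psi[i][j] * conj(psi[k][j])."""
    dim_a = len(psi)
    return [[sum(psi[i][j] * psi[k][j].conjugate() for j in range(len(psi[0])))
             for k in range(dim_a)] for i in range(dim_a)]

# Bell state (|00> + |11>)/sqrt(2); psi[i][j] is the amplitude of |i>_A |j>_B
s = 1 / math.sqrt(2)
psi = [[s, 0.0], [0.0, s]]
rho_A = reduced_density_matrix_A(psi)

# Born-rule probabilities for measuring A in the computational basis:
# the diagonal of rho_A; each is approximately 0.5 (maximally mixed)
probs = [rho_A[i][i].real for i in range(2)]
```

The reduced state of either half of a Bell pair is maximally mixed, which is why definite outcomes for a sub-system can only be statistical.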
Dynamical interpretation of conditional patterns
NASA Technical Reports Server (NTRS)
Adrian, R. J.; Moser, R. D.; Moin, P.
1988-01-01
While great progress is being made in characterizing the 3-D structure of organized turbulent motions using conditional averaging analysis, there is a lack of theoretical guidance regarding the interpretation and utilization of such information. Questions concerning the significance of the structures, their contributions to various transport properties, and their dynamics cannot be answered without recourse to appropriate dynamical governing equations. One approach which addresses some of these questions uses the conditional fields as initial conditions and calculates their evolution from the Navier-Stokes equations, yielding valuable information about stability, growth, and longevity of the mean structure. To interpret statistical aspects of the structures, a different type of theory which deals with the structures in the context of their contributions to the statistics of the flow is needed. As a first step toward this end, an effort was made to integrate the structural information from the study of organized structures with a suitable statistical theory. This is done by stochastically estimating the two-point conditional averages that appear in the equation for the one-point probability density function, and relating the structures to the conditional stresses. Salient features of the estimates are identified, and the structure of the one-point estimates in channel flow is defined.
Clearly written, easily comprehended? The readability of websites providing information on epilepsy.
Brigo, Francesco; Otte, Willem M; Igwe, Stanley C; Tezzon, Frediano; Nardone, Raffaele
2015-03-01
There is a general need for high-quality, easily accessible, and comprehensive health-care information on epilepsy to better inform the general population about this highly stigmatized neurological disorder. The aim of this study was to evaluate the health literacy level of eight popular English-language websites that provide information on epilepsy, in quantitative terms of readability. Educational epilepsy material on these websites, including 41 Wikipedia articles, was analyzed for its overall level of readability and the corresponding academic grade level needed to comprehend the published texts on a first reading. The Flesch Reading Ease (FRE) was used to assess ease of comprehension, while the Gunning Fog Index, Coleman-Liau Index, Flesch-Kincaid Grade Level, Automated Readability Index, and Simple Measure of Gobbledygook scales estimated the corresponding academic grade level needed for comprehension. The average readability of the websites was indicative of a difficult-to-fairly-difficult readability level (FRE results: 44.0±8.2), with text readability corresponding to an 11th academic grade level (11.3±1.9). The average FRE score of the Wikipedia articles was indicative of a difficult readability level (25.6±9.5), with the other readability scales yielding results corresponding to a 14th grade level (14.3±1.7). Popular websites providing information on epilepsy, including Wikipedia, often demonstrate a low level of readability. This can be ameliorated by increasing access to clear and concise online information on epilepsy and health in general. Short "basic" summaries targeted to patients and nonmedical users should be added to articles published in specialist websites and Wikipedia to ease readability. PMID:25601720
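The Flesch Reading Ease score used in the study is a fixed formula over word, sentence, and syllable counts: FRE = 206.835 - 1.015*(words/sentences) - 84.6*(syllables/words). A rough sketch follows; the vowel-group syllable counter is a crude heuristic, not the exact counter used by published readability tools:

```python
import re

def count_syllables(word):
    """Approximate syllables as groups of consecutive vowels (heuristic only)."""
    groups = re.findall(r"[aeiouy]+", word.lower())
    return max(1, len(groups))

def flesch_reading_ease(text):
    """FRE = 206.835 - 1.015*(words/sentences) - 84.6*(syllables/words).
    Higher scores mean easier text (90+ very easy, below 30 very difficult)."""
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return 206.835 - 1.015 * (len(words) / sentences) - 84.6 * (syllables / len(words))

sample = "Epilepsy is a brain disorder. Seizures can often be controlled with medication."
score = flesch_reading_ease(sample)
```

Longer sentences and more polysyllabic words both push the score down, which is why specialist articles tend toward the "difficult" band reported above.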
Kokal, Idil; Engel, Annerose; Kirschner, Sebastian; Keysers, Christian
2011-01-01
Why does chanting, drumming or dancing together make people feel united? Here we investigate the neural mechanisms underlying interpersonal synchrony and its subsequent effects on prosocial behavior among synchronized individuals. We hypothesized that areas of the brain associated with the processing of reward would be active when individuals experience synchrony during drumming, and that these reward signals would increase prosocial behavior toward this synchronous drum partner. 18 female non-musicians were scanned with functional magnetic resonance imaging while they drummed a rhythm, in alternating blocks, with two different experimenters: one drumming in-synchrony and the other out-of-synchrony relative to the participant. In the last scanning part, which served as the experimental manipulation for the following prosocial behavioral test, one of the experimenters drummed with one half of the participants in-synchrony and with the other out-of-synchrony. After scanning, this experimenter “accidentally” dropped eight pencils, and the number of pencils collected by the participants was used as a measure of prosocial commitment. Results revealed that participants who mastered the novel rhythm easily before scanning showed increased activity in the caudate during synchronous drumming. The same area also responded to monetary reward in a localizer task with the same participants. The activity in the caudate during experiencing synchronous drumming also predicted the number of pencils the participants later collected to help the synchronous experimenter of the manipulation run. In addition, participants collected more pencils to help the experimenter when she had drummed in-synchrony than out-of-synchrony during the manipulation run. By showing an overlap in activated areas during synchronized drumming and monetary reward, our findings suggest that interpersonal synchrony is related to the brain's reward system. PMID:22110623
Hijnen, W A M; Biraud, D; Cornelissen, E R; van der Kooij, D
2009-07-01
One of the major impediments in the application of spiral-wound membranes in water treatment or desalination is clogging of the feed channel by biofouling, which is induced by nutrients in the feedwater. Organic carbon is, under most conditions, limiting the microbial growth. The objective of this study is to assess the relationship between the concentration of an easily assimilable organic compound such as acetate in the feedwater and the pressure drop increase in the feed channel. For this purpose the membrane fouling simulator (MFS) was used as a model for the feed channel of a spiral-wound membrane. This MFS unit was supplied with drinking water enriched with acetate at concentrations ranging from 1 to 1000 µg C·L⁻¹. The pressure drop (PD) in the feed channel increased at all tested concentrations but not with the blank. The PD increase could be described by a first-order process based on theoretical considerations concerning the biofilm formation rate and porosity decline. The relationship between the first-order fouling rate constant R_f and the acetate concentration is described by a saturation function corresponding to the growth kinetics of bacteria. Under the applied conditions the maximum R_f (0.555 d⁻¹) was reached at 25 µg acetate-C·L⁻¹ and the half-saturation constant k_f was estimated at 15 µg acetate-C·L⁻¹. This value is higher than k_s values for suspended bacteria grown on acetate, which is attributed to substrate-limited growth conditions in the biofilm. The threshold concentration for biofouling of the feed channel is about 1 µg acetate-C·L⁻¹. PMID:19673281
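The saturation function for the fouling rate constant can be written out directly with the fitted constants from the abstract. The Monod-type form below is an assumption consistent with the stated "growth kinetics of bacteria", not a formula quoted from the paper:

```python
def fouling_rate(c_acetate, r_max=0.555, k_f=15.0):
    """Saturation (Monod-type) fouling rate constant.
    c_acetate and k_f in µg acetate-C per litre; r_max in d^-1.
    r_max and k_f are the values reported in the abstract."""
    return r_max * c_acetate / (k_f + c_acetate)

# At the half-saturation concentration the rate is exactly half of r_max
half = fouling_rate(15.0)   # 0.5 * 0.555 = 0.2775 d^-1
```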
Farcau, Cosmin; Potara, Monica; Leordean, Cosmin; Boca, Sanda; Astilean, Simion
2013-01-21
The ability to easily prepare Surface Enhanced Raman Scattering (SERS) substrates by the assembly of chemically synthesized gold nanocolloids is of great interest for the advancement of SERS-based optical detection and identification of molecular species of biological or chemical interest, pollutants or warfare agents. In this work we employ three very simple strategies, which can be implemented in any laboratory without the need for specialized equipment, to prepare assemblies of citrate-stabilized spherical gold colloids: (i) drop-coating, which induces the assembly of colloids in so-called coffee rings; (ii) a simplified variant of convective self-assembly (CSA), based on water evaporation in a constrained geometry, which yields highly uniform strips of nanoparticles (NP); (iii) assembly onto chemically functionalized glass surfaces which yields randomly assembled colloids and colloidal clusters. The SERS properties of the resulting colloidal assemblies are comparatively evaluated under multiple excitation lines with p-aminothiophenol (pATP) as a model Raman scatterer. The NP strips obtained by CSA prove to be SERS-active both in the visible and NIR and possess a highly uniform SERS response as demonstrated by spectra at individually selected sites and by confocal SERS mapping. Further it is shown that these NP strips are effective for the detection of cytosine, a DNA component, and for multi-analyte SERS detection. These results, showing how an efficient SERS substrate can be obtained by a very simple assembly method from easy-to-synthesize colloidal gold NP, can have an impact on the development of analytical SERS applications. PMID:23171872
Denion, Eric; Lux, Anne-Laure; Mouriaux, Frédéric; Béraud, Guillaume
2016-01-01
Introduction: We aimed to determine the limbal lighting illuminance thresholds (LLITs) required to trigger perception of sclerotic scatter at the opposite non-illuminated limbus (i.e. perception of a light limbal scleral arc) under different levels of ambient lighting illuminance (ALI).
Material and Methods: Twenty healthy volunteers were enrolled. The iris shade (light or dark) was graded by retrieving the median value of the pixels of a pre-determined zone of a gray-level iris photograph. Mean keratometry and central corneal pachymetry were recorded. Each subject was asked to lie down, and the ALI at eye level was set to mesopic values (10, 20, 40 lux), then photopic values (60, 80, 100, 150, 200 lux). For each ALI level, a light beam of gradually increasing illuminance was applied to the right temporal limbus until the LLIT was reached, i.e. the level required to produce the faint light arc that is characteristic of sclerotic scatter at the nasal limbus.
Results: After log-log transformation, a linear relationship between the logarithm of ALI and the logarithm of the LLIT was found (p<0.001), a 10% increase in ALI being associated with an average increase in the LLIT of 28.9%. Higher keratometry values were associated with higher LLIT values (p = 0.008) under low ALI levels, but the coefficient of the interaction was very small, representing a very limited effect. Iris shade and central corneal thickness values were not significantly associated with the LLIT. We also developed a censored linear model for ALI values ≤ 40 lux, showing a linear relationship between ALI and the LLIT, in which the LLIT value was 34.4 times greater than the ALI value.
Conclusion: Sclerotic scatter is more easily elicited under mesopic conditions than under photopic conditions and requires the LLIT value to be much higher than the ALI value, i.e. it requires extreme contrast. PMID:26964096
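The log-log relationship reported here implies a constant elasticity, which can be recovered from the quoted "a 10% increase in ALI gives a 28.9% increase in the LLIT". The sketch below uses only numbers stated in the abstract; the functional forms are the standard readings of a log-log fit and a censored linear model:

```python
import math

# log(LLIT) = a + b*log(ALI)  =>  b = log(1.289) / log(1.10)  (about 2.66)
b = math.log(1.289) / math.log(1.10)

def llit_mesopic(ali_lux):
    """Censored linear model reported for mesopic levels: LLIT = 34.4 * ALI,
    stated only for ALI <= 40 lux."""
    if ali_lux > 40:
        raise ValueError("linear model only reported for ALI <= 40 lux")
    return 34.4 * ali_lux
```

The elasticity b near 2.7 makes the qualitative conclusion concrete: the required limbal illuminance grows much faster than the ambient illuminance.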
Rangel, Thomaz C; Michels, Alexandre F; Horowitz, Flávio; Weibel, Daniel E
2015-03-24
Textures that resemble typical fern or bracken plant species (dendrite structures) were fabricated for liquid repellency by dipping copper substrates in a single-step process in solutions containing AgNO3 or by a simple spray liquid application. Superhydrophobic surfaces were produced using a solution containing AgNO3 and trimethoxypropylsilane (TMPSi), and superomniphobic surfaces were produced by a two-step procedure, immersing the copper substrate in a AgNO3 solution and, after that, in a solution containing 1H,1H,2H,2H-perfluorodecyltriethoxysilane (PFDTES). The simple functionalization processes can also be reapplied when the superomniphobic surfaces are damaged by mechanical stress. By immersion of the wrecked surfaces in the above solutions, or by the spray method and soft heating, the copper substrates could be easily repaired, regenerating the surfaces' superrepellency to liquids. The micro- and nanoroughness structures generated on copper surfaces by the deposition of silver dendrites functionalized with TMPSi presented apparent contact angles greater than 150° with a contact angle hysteresis lower than 10° when water was used as the test liquid. To avoid total wettability with very low surface tension liquids, such as rapeseed oil and hexadecane, a thin perfluorinated coating of poly(tetrafluoroethylene) (PTFE), produced by physical vapor deposition, was used. A more efficient perfluorinated coating was obtained when PFDTES was used. The superomniphobic surfaces produced apparent contact angles above 150° with all of the tested liquids, including hexadecane, although the contact angle hysteresis with this liquid was above 10°. The coupling of dendritic structures with TMPSi/PTFE or directly with PFDTES coatings was responsible for the superrepellency of the as-prepared surfaces. These simple, fast, and reliable procedures allow the large-area, cost-effective fabrication of superrepellent surfaces on copper substrates for various industrial
Interpreting Abstract Interpretations in Membership Equational Logic
NASA Technical Reports Server (NTRS)
Fischer, Bernd; Rosu, Grigore
2001-01-01
We present a logical framework in which abstract interpretations can be naturally specified and then verified. Our approach is based on membership equational logic, which extends equational logics by membership axioms asserting that a term has a certain sort. We represent an abstract interpretation as a membership equational logic specification, usually as an overloaded order-sorted signature with membership axioms. It turns out that, for any term, its least sort over this specification corresponds to its most concrete abstract value. Maude implements membership equational logic and provides mechanisms to calculate the least sort of a term efficiently. We first show how Maude can be used to get prototyping of abstract interpretations "for free." Building on the meta-logic facilities of Maude, we further develop a tool that automatically checks an abstract interpretation against a set of user-defined properties. This can be used to select an appropriate abstract interpretation, to characterize the specified loss of information during abstraction, and to compare different abstractions with each other.
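Maude specifications aside, the underlying idea, mapping each expression to its most precise abstract value in a small domain, can be illustrated independently. A toy sign-domain abstract interpreter in Python (an illustration of the general technique, not the membership-equational encoding used in the paper):

```python
# Abstract domain of signs: NEG, ZERO, POS, plus TOP for "unknown".
# abstract() plays the role of mapping a concrete value to its most
# precise abstract value (the analogue of the least sort).
NEG, ZERO, POS, TOP = "neg", "zero", "pos", "top"

def abstract(n):
    """Most precise abstract value of a concrete integer."""
    return ZERO if n == 0 else (POS if n > 0 else NEG)

def abs_mul(a, b):
    """Abstract multiplication: zero annihilates, matching signs give POS."""
    if ZERO in (a, b):
        return ZERO
    if TOP in (a, b):
        return TOP
    return POS if a == b else NEG

def abs_add(a, b):
    """Abstract addition: pos + neg cannot be decided, so it goes to TOP."""
    if a == ZERO:
        return b
    if b == ZERO:
        return a
    return a if a == b else TOP

# (-3) * (-4) is abstractly POS; 5 + (-2) is TOP without concrete values
```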
Perception in statistical graphics
NASA Astrophysics Data System (ADS)
VanderPlas, Susan Ruth
There has been quite a bit of research on statistical graphics and visualization, generally focused on new types of graphics, new software to create graphics, interactivity, and usability studies. Our ability to interpret and use statistical graphics hinges on the interface between the graph itself and the brain that perceives and interprets it, and there is substantially less research on the interplay between graph, eye, brain, and mind than is sufficient to understand the nature of these relationships. The goal of the work presented here is to further explore the interplay between a static graph, the translation of that graph from paper to mental representation (the journey from eye to brain), and the mental processes that operate on that graph once it is transferred into memory (mind). Understanding the perception of statistical graphics should allow researchers to create more effective graphs which produce fewer distortions and viewer errors while reducing the cognitive load necessary to understand the information presented in the graph. Taken together, these experiments should lay a foundation for exploring the perception of statistical graphics. There has been considerable research into the accuracy of numerical judgments viewers make from graphs, and these studies are useful, but it is more effective to understand how errors in these judgments occur so that the root cause of the error can be addressed directly. Understanding how visual reasoning relates to the ability to make judgments from graphs allows us to tailor graphics to particular target audiences. In addition, understanding the hierarchy of salient features in statistical graphics allows us to clearly communicate the important message from data or statistical models by constructing graphics which are designed specifically for the perceptual system.
Basic statistics in cell biology.
Vaux, David L
2014-01-01
The physicist Ernest Rutherford said, "If your experiment needs statistics, you ought to have done a better experiment." Although this aphorism remains true for much of today's research in cell biology, a basic understanding of statistics can be useful to cell biologists to help in monitoring the conduct of their experiments, in interpreting the results, in presenting them in publications, and when critically evaluating research by others. However, training in statistics is often focused on the sophisticated needs of clinical researchers, psychologists, and epidemiologists, whose conclusions depend wholly on statistics, rather than the practical needs of cell biologists, whose experiments often provide evidence that is not statistical in nature. This review describes some of the basic statistical principles that may be of use to experimental biologists, but it does not cover the sophisticated statistics needed for papers that contain evidence of no other kind. PMID:25000992
The Surveillance, Epidemiology, and End Results (SEER) Program of the National Cancer Institute works to provide information on cancer statistics in an effort to reduce the burden of cancer among the U.S. population.
NASA Astrophysics Data System (ADS)
Hermann, Claudine
Statistical physics bridges the properties of a macroscopic system and the microscopic behavior of its constituent particles, a connection otherwise impossible to make because of the enormous magnitude of Avogadro's number. Numerous systems of today's key technologies - such as semiconductors or lasers - are macroscopic quantum objects; only statistical physics allows for an understanding of their fundamentals. Therefore, this graduate text also focuses on particular applications, such as the properties of electrons in solids, radiation thermodynamics, and the greenhouse effect.
Automatic interpretation of biological tests.
Boufriche-Boufaïda, Z
1998-03-01
In this article, an approach for the Automatic Interpretation of Biological Tests (AIBT) is described. The system is much needed in Preventive Medicine Centers (PMCs). It is designed as a self-sufficient system that can easily be used by trained nurses during the routine visit. The results that the system provides are not only useful for giving PMC physicians a preliminary diagnosis, but also allow them more time to focus on the serious cases, improving the quality of the clinical visit. On the other hand, because such a system is expected to be used for many years, its possibilities for future extension must be considered seriously. The methodology adopted combines the advantages of the two main approaches found in current diagnostic systems: production systems and object-oriented systems. From production rules, it retains the ability to capture the deductive processes of the expert in domains where causal mechanisms are often understood. The object-oriented approach guides the elicitation and engineering of knowledge in such a way that abstractions, categorizations and classifications are encouraged, while individual instances of objects of any type are recognized as separate, independent entities. PMID:9684093
NASA Astrophysics Data System (ADS)
Goodman, J. W.
This book is based on the thesis that some training in the area of statistical optics should be included as a standard part of any advanced optics curriculum. Random variables are discussed, taking into account definitions of probability and random variables, distribution functions and density functions, an extension to two or more random variables, statistical averages, transformations of random variables, sums of real random variables, Gaussian random variables, complex-valued random variables, and random phasor sums. Other subjects examined are related to random processes, some first-order properties of light waves, the coherence of optical waves, some problems involving high-order coherence, effects of partial coherence on imaging systems, imaging in the presence of randomly inhomogeneous media, and fundamental limits in photoelectric detection of light. Attention is given to deterministic versus statistical phenomena and models, the Fourier transform, and the fourth-order moment of the spectrum of a detected speckle image.
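Among the topics listed, random phasor sums have a compact numerical illustration: for n unit-amplitude phasors with independent uniform phases, the mean intensity of the sum approaches n. A quick Monte Carlo sketch (an illustration of the standard result, not an excerpt from the book):

```python
import cmath
import math
import random

def random_phasor_sum(n, rng):
    """Sum of n unit-amplitude phasors with uniformly random phases."""
    return sum(cmath.exp(1j * rng.uniform(0, 2 * math.pi)) for _ in range(n))

rng = random.Random(0)           # seeded for reproducibility
n, trials = 100, 2000
mean_intensity = sum(abs(random_phasor_sum(n, rng)) ** 2
                     for _ in range(trials)) / trials
# For large n the mean intensity approaches n (here, close to 100);
# the intensity itself is approximately exponentially distributed (speckle)
```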
ERIC Educational Resources Information Center
Christensen, Timothy J.; Labov, Jay B.
1997-01-01
Details the construction of a viewing chamber for fruit flies that connects to a dissecting microscope and features a design that enables students to easily move fruit flies in and out of the chamber. (DDR)
Misuse of statistics in surgical literature.
Thiese, Matthew S; Ronna, Brenden; Robbins, Riann B
2016-08-01
Statistical analyses are a key part of biomedical research. Traditionally surgical research has relied upon a few statistical methods for evaluation and interpretation of data to improve clinical practice. As research methods have increased in both rigor and complexity, statistical analyses and interpretation have fallen behind. Some evidence suggests that surgical research studies are being designed and analyzed improperly given the specific study question. The goal of this article is to discuss the complexities of surgical research analyses and interpretation, and provide some resources to aid in these processes. PMID:27621909
The Statistical Literacy Needed to Interpret School Assessment Data
ERIC Educational Resources Information Center
Chick, Helen; Pierce, Robyn
2013-01-01
State-wide and national testing in areas such as literacy and numeracy produces reports containing graphs and tables illustrating school and individual performance. These are intended to inform teachers, principals, and education organisations about student and school outcomes, to guide change and improvement. Given the complexity of the…
Interpretation of psychophysics response curves using statistical physics.
Knani, S; Khalfaoui, M; Hachicha, M A; Mathlouthi, M; Ben Lamine, A
2014-05-15
Experimental gustatory curves have been fitted for four sugars (sucrose, fructose, glucose and maltitol) using a double layer adsorption model. Three parameters of the model are fitted, namely the number of molecules per site n, the maximum response RM, and the concentration at half saturation C1/2. The behaviour of these parameters is discussed in relation to each molecule's characteristics. Starting from the double layer adsorption model, we additionally determined the adsorption energy of each molecule on taste receptor sites. The use of the threshold expression allowed us to gain information about the adsorption occupation rate of a receptor site that fires a minimal response at a gustatory nerve. Finally, by means of this model we could calculate the configurational entropy of the adsorption system, which can describe the order and disorder of the adsorbent surface. PMID:24423561
ERIC Educational Resources Information Center
Chicot, Katie; Holmes, Hilary
2012-01-01
The use, and misuse, of statistics is commonplace, yet in the printed format data representations can be either over simplified, supposedly for impact, or so complex as to lead to boredom, supposedly for completeness and accuracy. In this article the link to the video clip shows how dynamic visual representations can enliven and enhance the…
ERIC Educational Resources Information Center
Catley, Alan
2007-01-01
Following the announcement last year that there will be no more math coursework assessment at General Certificate of Secondary Education (GCSE), teachers will in the future be able to devote more time to preparing learners for formal examinations. One of the key things that the author has learned when teaching statistics is that it makes for far…
ERIC Educational Resources Information Center
Erekson, James A.
2010-01-01
Prosody is a means for "reading with expression" and is one aspect of oral reading competence. This theoretical inquiry asserts that prosody is central to interpreting text, and draws distinctions between "syntactic" prosody (for phrasing) and "emphatic" prosody (for interpretation). While reading with expression appears as a criterion in major…
Centralised interpretation of electrocardiograms.
Macfarlane, P W; Watts, M P; Lawrie, T D; Walker, R S
1977-01-01
A system was devised so that a peripheral hospital could transmit electrocardiograms (ECGs) to a central computer for interpretation. The link that transmits both ECGs and reports is provided by the telephone network. Initial results showed that telephone transmission did not significantly affect the accuracy of the ECG interpretation. The centralised computer programme could be much more widely used to provide ECG interpretations. A telephone link would not be justified in health centres, where the demand for ECGs is fairly small, but ECGs recorded at a health centre can be sent to the computer for interpretation and returned the next day. The most cost-effective method of providing computer interpretation for several health centres in a large city would be to have a portable electrocardiograph and transmission facilities, which could be moved from centre to centre. PMID:319866
Groen-Blokhuis, Maria M; Middeldorp, Christel M; M van Beijsterveldt, Catharina E; Boomsma, Dorret I
2011-10-01
In order to estimate the influence of genetic and environmental factors on 'crying without a cause' and 'being easily upset' in 2-year-old children, a large twin study was carried out. Prospective data were available for ~18,000 2-year-old twin pairs from the Netherlands Twin Register. A bivariate genetic analysis was performed using structural equation modeling in the Mx software package. The influence of maternal personality characteristics and demographic and lifestyle factors was tested to identify specific risk factors that may underlie the shared environment of twins. Furthermore, it was tested whether crying without a cause and being easily upset were predictive of later internalizing, externalizing and attention problems. Crying without a cause yielded a heritability estimate of 60% in boys and girls. For easily upset, the heritability was estimated at 43% in boys and 31% in girls. The variance explained by shared environment varied between 35% and 63%. The correlation between crying without a cause and easily upset (r = .36) was explained both by genetic and shared environmental factors. Birth cohort, gestational age, socioeconomic status, parental age, parental smoking behavior and alcohol use during pregnancy did not explain the shared environmental component. Neuroticism of the mother explained a small proportion of the additive genetic, but not of the shared environmental effects for easily upset. Crying without a cause and being easily upset at age 2 were predictive of internalizing, externalizing and attention problems at age 7, with effect sizes of .28-.42. A large influence of shared environmental factors on crying without a cause and easily upset was detected. Although these effects could be specific to these items, we could not explain them by personality characteristics of the mother or by demographic and lifestyle factors, and we recognize that these effects may reflect other maternal characteristics. A substantial influence of genetic factors
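The heritability estimates above come from structural equation modeling in the Mx package; the much cruder classical approximation can be sketched from twin correlations alone (Falconer's formula; the correlations below are illustrative, not the study's data):

```python
def falconer_ace(r_mz, r_dz):
    """Classical Falconer decomposition from twin correlations into
    additive genetics (A), shared environment (C), unique environment (E).
    A rough approximation to the structural-equation models (e.g. Mx)
    actually used in twin studies like the one above."""
    a2 = 2 * (r_mz - r_dz)   # heritability estimate
    c2 = 2 * r_dz - r_mz     # shared-environment estimate
    e2 = 1 - r_mz            # unique environment plus measurement error
    return a2, c2, e2

# Illustrative (made-up) monozygotic and dizygotic twin correlations:
a2, c2, e2 = falconer_ace(r_mz=0.75, r_dz=0.55)
print(round(a2, 2), round(c2, 2), round(e2, 2))  # 0.4 0.35 0.25
```

The three components sum to 1 by construction, mirroring the variance decomposition reported in the abstract.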
Statistics Poster Challenge for Schools
ERIC Educational Resources Information Center
Payne, Brad; Freeman, Jenny; Stillman, Eleanor
2013-01-01
The analysis and interpretation of data are important life skills. A poster challenge for schoolchildren provides an innovative outlet for these skills and demonstrates their relevance to daily life. We discuss our Statistics Poster Challenge and the lessons we have learned.
Fit Indices Versus Test Statistics
ERIC Educational Resources Information Center
Yuan, Ke-Hai
2005-01-01
Model evaluation is one of the most important aspects of structural equation modeling (SEM). Many model fit indices have been developed. It is not an exaggeration to say that nearly every publication using the SEM methodology has reported at least one fit index. Most fit indices are defined through test statistics. Studies and interpretation of…
NASA Technical Reports Server (NTRS)
Owre, Sam; Shankar, Natarajan; Butler, Ricky W. (Technical Monitor)
2001-01-01
The purpose of this task was to provide a mechanism for theory interpretations in a prototype verification system (PVS) so that it is possible to demonstrate the consistency of a theory by exhibiting an interpretation that validates the axioms. The mechanization makes it possible to show that one collection of theories is correctly interpreted by another collection of theories under a user-specified interpretation for the uninterpreted types and constants. A theory instance is generated and imported, while the axiom instances are generated as proof obligations to ensure that the interpretation is valid. Interpretations can be used to show that an implementation is a correct refinement of a specification, that an axiomatically defined specification is consistent, or that an axiomatically defined specification captures its intended models. In addition, the theory parameter mechanism has been extended with a notion of theory as parameter so that a theory instance can be given as an actual parameter to an imported theory. Theory interpretations can thus be used to refine an abstract specification or to demonstrate the consistency of an axiomatic theory. In this report we describe the mechanism in detail. This extension is a part of PVS version 3.0, which will be publicly released in mid-2001.
NASA Astrophysics Data System (ADS)
Goodman, Joseph W.
2000-07-01
The Wiley Classics Library consists of selected books that have become recognized classics in their respective fields. With these new unabridged and inexpensive editions, Wiley hopes to extend the life of these important works by making them available to future generations of mathematicians and scientists. Currently available in the Series: T. W. Anderson The Statistical Analysis of Time Series T. S. Arthanari & Yadolah Dodge Mathematical Programming in Statistics Emil Artin Geometric Algebra Norman T. J. Bailey The Elements of Stochastic Processes with Applications to the Natural Sciences Robert G. Bartle The Elements of Integration and Lebesgue Measure George E. P. Box & Norman R. Draper Evolutionary Operation: A Statistical Method for Process Improvement George E. P. Box & George C. Tiao Bayesian Inference in Statistical Analysis R. W. Carter Finite Groups of Lie Type: Conjugacy Classes and Complex Characters R. W. Carter Simple Groups of Lie Type William G. Cochran & Gertrude M. Cox Experimental Designs, Second Edition Richard Courant Differential and Integral Calculus, Volume I Richard Courant Differential and Integral Calculus, Volume II Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume I Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume II D. R. Cox Planning of Experiments Harold S. M. Coxeter Introduction to Geometry, Second Edition Charles W. Curtis & Irving Reiner Representation Theory of Finite Groups and Associative Algebras Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume I Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume II Cuthbert Daniel Fitting Equations to Data: Computer Analysis of Multifactor Data, Second Edition Bruno de Finetti Theory of Probability, Volume I Bruno de Finetti Theory of Probability, Volume II W. Edwards Deming Sample Design in Business Research
Interpretation of Bernoulli's Equation.
ERIC Educational Resources Information Center
Bauman, Robert P.; Schwaneberg, Rolf
1994-01-01
Discusses Bernoulli's equation with regards to: horizontal flow of incompressible fluids, change of height of incompressible fluids, gases, liquids and gases, and viscous fluids. Provides an interpretation, properties, terminology, and applications of Bernoulli's equation. (MVL)
Interpretation of Biosphere Reserves.
ERIC Educational Resources Information Center
Merriman, Tim
1994-01-01
Introduces the Man and the Biosphere Programme (MAB) to monitor the 193 biogeographical provinces of the Earth and the creation of biosphere reserves. Highlights the need for interpreters to become familiar or involved with MAB program activities. (LZ)
BIOMONITORING: INTERPRETATION AND USES
With advanced technologies, it is now possible to measure very low levels of many chemicals in biological fluids. However, the appropriate use and interpretation of biomarkers will depend upon many factors associated with the exposure, adsorption, deposition, metabolism, and eli...
NASA Astrophysics Data System (ADS)
Burns, T. J.; Swanson, E. S.
2016-09-01
A variety of options for interpreting the DØ state, X (5568), are examined. We find that threshold, cusp, molecular, and tetraquark models are all unfavoured. Several experimental tests for unravelling the nature of the signal are suggested.
Customizable tool for ecological data entry, assessment, monitoring, and interpretation
Technology Transfer Automated Retrieval System (TEKTRAN)
The Database for Inventory, Monitoring and Assessment (DIMA) is a highly customizable tool for data entry, assessment, monitoring, and interpretation. DIMA is a Microsoft Access database that can easily be used without Access knowledge and is available at no cost. Data can be entered for common, nat...
Interpreter-mediated dentistry.
Bridges, Susan; Drew, Paul; Zayts, Olga; McGrath, Colman; Yiu, Cynthia K Y; Wong, H M; Au, T K F
2015-05-01
The global movements of healthcare professionals and patient populations have increased the complexities of medical interactions at the point of service. This study examines interpreter-mediated talk in cross-cultural general dentistry in Hong Kong, where assisting para-professionals, in this case bilingual or multilingual Dental Surgery Assistants (DSAs), perform the dual capabilities of clinical assistant and interpreter. An initial language use survey was conducted with Polyclinic DSAs (n = 41) using a logbook approach to provide self-report data on language use in clinics. Frequencies of mean scores using a 10-point visual analogue scale (VAS) indicated that the majority of DSAs spoke mainly Cantonese in clinics and interpreted for postgraduates and professors. Conversation Analysis (CA) examined recipient design across a corpus (n = 23) of video-recorded review consultations between non-Cantonese-speaking expatriate dentists and their Cantonese L1 patients. Three patterns of mediated interpreting were identified: dentist-designated expansions; dentist-initiated interpretations; and assistant-initiated interpretations to both the dentist and patient. The third, rather than being perceived as negative, was found to be framed either in response to patient difficulties or within the specific task routines of general dentistry. The findings illustrate trends in dentistry towards personalized care and patient empowerment as a reaction to product delivery approaches to patient management. Implications are indicated for both treatment adherence and the education of dental professionals. PMID:25828074
NASA Astrophysics Data System (ADS)
Altarelli, Fabrizio; Monasson, Rémi; Zamponi, Francesco
2007-02-01
For large clause-to-variable ratios, typical K-SAT instances drawn from the uniform distribution have no solution. We argue, based on statistical mechanics calculations using the replica and cavity methods, that rare satisfiable instances from the uniform distribution are very similar to typical instances drawn from the so-called planted distribution, where instances are chosen uniformly between the ones that admit a given solution. It then follows, from a recent article by Feige, Mossel and Vilenchik (2006 Complete convergence of message passing algorithms for some satisfiability problems Proc. Random 2006 pp 339-50), that these rare instances can be easily recognized (in O(log N) time and with probability close to 1) by a simple message-passing algorithm.
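The planted distribution described above — instances drawn uniformly among those satisfied by a fixed hidden assignment — can be sampled by simple rejection. A minimal sketch (parameter choices are illustrative):

```python
import random

def planted_ksat(n_vars, n_clauses, k=3, seed=0):
    """Draw a k-SAT instance uniformly among those satisfied by a fixed
    hidden assignment (the 'planted' distribution described above):
    draw clauses uniformly and reject any the hidden assignment violates."""
    rng = random.Random(seed)
    hidden = [rng.choice([True, False]) for _ in range(n_vars)]
    clauses = []
    while len(clauses) < n_clauses:
        vars_ = rng.sample(range(n_vars), k)
        clause = [(v, rng.choice([True, False])) for v in vars_]  # (var, sign)
        # Keep only clauses the hidden assignment satisfies.
        if any(hidden[v] == sign for v, sign in clause):
            clauses.append(clause)
    return hidden, clauses

hidden, clauses = planted_ksat(n_vars=50, n_clauses=250)
ok = all(any(hidden[v] == sign for v, sign in c) for c in clauses)
print(ok)  # True: the planted assignment satisfies every clause
```

For k = 3 a uniform random clause is satisfied with probability 7/8, so the rejection loop terminates quickly even at large clause-to-variable ratios.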
1986-01-01
Official population data for the USSR are presented for 1985 and 1986. Part 1 (pp. 65-72) contains data on capitals of union republics and cities with over one million inhabitants, including population estimates for 1986 and vital statistics for 1985. Part 2 (p. 72) presents population estimates by sex and union republic, 1986. Part 3 (pp. 73-6) presents data on population growth, including birth, death, and natural increase rates, 1984-1985; seasonal distribution of births and deaths; birth order; age-specific birth rates in urban and rural areas and by union republic; marriages; age at marriage; and divorces. PMID:12178831
NASA Technical Reports Server (NTRS)
Firstenberg, H.
1971-01-01
The statistics of the Monte Carlo method are considered relative to the interpretation of the NUGAM2 and NUGAM3 computer code results. A numerical experiment using the NUGAM2 code is presented and the results are statistically interpreted.
NASA Astrophysics Data System (ADS)
Tarolli, Paolo; Prosdocimi, Massimo; Sofia, Giulia; Dalla Fontana, Giancarlo
2015-04-01
A real opportunity and challenge for hazard mapping is offered by smartphones combined with the low-cost, flexible photogrammetric technique 'Structure-from-Motion' (SfM). Unlike other traditional photogrammetric methods, SfM allows three-dimensional geometries (Digital Surface Models, DSMs) to be reconstructed from randomly acquired images. The images can be acquired by standalone digital cameras (compact or reflex), or even by smartphone built-in cameras. This represents a "revolutionary" advance compared with more expensive technologies and applications (e.g. Terrestrial Laser Scanner TLS, airborne lidar) (Tarolli, 2014). Through fast, simple and consecutive field surveys, anyone with a smartphone can take numerous pictures of the same study area. In this way, high-resolution and multi-temporal DSMs may be obtained and used to better monitor and understand erosion and deposition processes. Furthermore, these topographic data can also help quantify the volumes of material eroded by landslides and reveal the major critical issues that usually occur during a natural hazard (e.g. river bank erosion and/or collapse due to floods). In this work we considered different case studies located in different environmental contexts of Italy, where extensive photosets were obtained using smartphones. TLS data were also included in the analysis as a benchmark for comparison with the SfM data. Digital Surface Models (DSMs) derived from SfM at centimetre grid-cell resolution proved effective for automatically recognizing areas subject to surface instabilities and for estimating erosion and deposition volumes quantitatively. Morphometric indices such as landform curvature and surface roughness, and statistical thresholds (e.g. standard deviation) of these indices, served as the basis for the proposed analyses. The results indicate that the SfM technique through smartphones offers a fast, simple and affordable alternative to lidar
NASA Technical Reports Server (NTRS)
Mc Crae, A. W., Jr.
1967-01-01
Multiconductor instrumentation cable in which the conducting wires are routed through two concentric copper tube sheaths, employing a compressed insulator between the conductors and between the inner and outer sheaths, is durable and easily installed in high thermal or nuclear radiation area. The double sheath is a barrier against moisture, abrasion, and vibration.
SOCR: Statistics Online Computational Resource
Dinov, Ivo D.
2011-01-01
The need for hands-on computer laboratory experience in undergraduate and graduate statistics education has been firmly established in the past decade. As a result a number of attempts have been undertaken to develop novel approaches for problem-driven statistical thinking, data analysis and result interpretation. In this paper we describe an integrated educational web-based framework for: interactive distribution modeling, virtual online probability experimentation, statistical data analysis, visualization and integration. Following years of experience in statistical teaching at all college levels using established licensed statistical software packages, like STATA, S-PLUS, R, SPSS, SAS, Systat, etc., we have attempted to engineer a new statistics education environment, the Statistics Online Computational Resource (SOCR). This resource performs many of the standard types of statistical analysis, much like other classical tools. In addition, it is designed in a plug-in object-oriented architecture and is completely platform independent, web-based, interactive, extensible and secure. Over the past 4 years we have tested, fine-tuned and reanalyzed the SOCR framework in many of our undergraduate and graduate probability and statistics courses and have evidence that SOCR resources build students' intuition and enhance their learning. PMID:21451741
Hold My Calls: An Activity for Introducing the Statistical Process
ERIC Educational Resources Information Center
Abel, Todd; Poling, Lisa
2015-01-01
Working with practicing teachers, this article demonstrates, through the facilitation of a statistical activity, how to introduce and investigate the unique qualities of the statistical process including: formulate a question, collect data, analyze data, and interpret data.
NASA Astrophysics Data System (ADS)
Fink, Thomas
2015-03-01
We introduce a simple class of distribution networks which withstand damage by being repairable instead of redundant. Instead of asking how hard it is to disconnect nodes through damage, we ask how easy it is to reconnect nodes after damage. We prove that optimal networks on regular lattices have an expected cost of reconnection proportional to the lattice length, and that such networks have exactly three levels of structural hierarchy. We extend our results to networks subject to repeated attacks, in which the repairs themselves must be repairable. We find that, in exchange for a modest increase in repair cost, such networks are able to withstand any number of attacks. We acknowledge support from the Defense Threat Reduction Agency, BCG and EU FP7 (Growthcom).
Hospitals as interpretation systems.
Thomas, J B; McDaniel, R R; Anderson, R A
1991-01-01
In this study of 162 hospitals, it was found that the chief executive officer's (CEO's) interpretation of strategic issues is related to the existing hospital strategy and the hospital's information processing structure. Strategy was related to interpretation in terms of the extent to which a given strategic issue was perceived as controllable or uncontrollable. Structure was related to the extent to which an issue was defined as positive or negative, was labeled as controllable or uncontrollable, and was perceived as leading to a gain or a loss. Together, strategy and structure accounted for a significant part of the variance in CEO interpretations of strategic events. The theoretical and managerial implications of these findings are discussed. PMID:1991677
Copenhagen and Transactional Interpretations
NASA Astrophysics Data System (ADS)
Görnitz, Th.; von Weizsäcker, C. F.
1988-02-01
The Copenhagen interpretation (CI) never received an authoritative codification. It was a “minimum semantics” of quantum mechanics. We assume that it expresses a theory identical with the Transactional Interpretation (TI) when the observer is included into the system described by the theory. A theory consists of a mathematical structure with a physical semantics. Now, CI rests on an implicit description of the modes of time which is also presupposed by the Second Law of Thermodynamics. Essential is the futuric meaning of probability as a prediction of a relative frequency. CI can be shown to be fully consistent on this basis. The TI and CI can be translated into each other by a simple “dictionary.” The TI describes all events as CI describes past events; CI calls future events possibilities, which TI treats like facts. All predictions of both interpretations agree; we suppose the difference to be linguistic.
Statistics in fusion experiments
NASA Astrophysics Data System (ADS)
McNeill, D. H.
1997-11-01
Since the reasons for the variability in data from plasma experiments are often unknown or uncontrollable, statistical methods must be applied. Reliable interpretation and public accountability require full data sets. Two examples of data misrepresentation at PPPL are analyzed: (1) Te > 100 eV on the S-1 spheromak (M. Yamada, Nucl. Fusion 25, 1327 (1985); reports to DoE; etc.). The reported high values (statistical artifacts of Thomson scattering measurements) were selected from a mass of data with an average of 40 eV or less. "Correlated" spectroscopic data were meaningless. (2) Extrapolation to Q >= 0.5 for DT in TFTR (D. Meade et al., IAEA Baltimore (1990), V. 1, p. 9; H. P. Furth, Statements to U.S. Congress (1989)). The DD yield used there was the highest through 1990 (>= 50% above average) and the DT to DD power ratio used was about twice any published value. Average DD yields and published yield ratios scale to Q < 0.15 for DT, in accord with the observed performance over the last 3 1/2 years. Press reports of outlier data from TFTR have obscured the fact that the DT behavior follows from trivial scaling of the DD data. Good practice in future fusion research would include confidence intervals and other descriptive statistics accompanying reported numerical values (cf. JAMA).
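The closing recommendation — that reported values carry confidence intervals and other descriptive statistics — can be illustrated with a few lines of standard-library Python (the readings below are invented, not S-1 or TFTR data):

```python
import statistics

def mean_with_ci(xs, z=1.96):
    """Report a mean with a normal-approximation 95% confidence interval,
    the kind of descriptive statistic the abstract argues should accompany
    headline numbers such as a reported electron temperature."""
    m = statistics.mean(xs)
    se = statistics.stdev(xs) / len(xs) ** 0.5  # standard error of the mean
    return m, (m - z * se, m + z * se)

# Illustrative (made-up) temperature readings in eV:
readings = [38, 42, 35, 41, 39, 44, 37, 40, 43, 36]
m, (lo, hi) = mean_with_ci(readings)
print(round(m, 1), round(lo, 1), round(hi, 1))
```

Reporting the interval alongside the mean makes it immediately visible when a single outlier reading lies far outside the range supported by the full data set.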
The National Lakes Assessment (NLA) and other lake survey and monitoring efforts increasingly rely upon biological assemblage data to define lake condition. Information concerning the multiple dimensions of physical and chemical habitat is necessary to interpret this biological ...
The ADAMS interactive interpreter
Rietscha, E.R.
1990-12-17
The ADAMS (Advanced DAta Management System) project is exploring next generation database technology. Database management does not follow the usual programming paradigm. Instead, the database dictionary provides an additional name space environment that should be interactively created and tested before writing application code. This document describes the implementation and operation of the ADAMS Interpreter, an interactive interface to the ADAMS data dictionary and runtime system. The Interpreter executes individual statements of the ADAMS Interface Language, providing a fast, interactive mechanism to define and access persistent databases. 5 refs.
How to spot a statistical problem: advice for a non-statistical reviewer.
Greenwood, Darren C; Freeman, Jennifer V
2015-01-01
Statistical analyses presented in general medical journals are becoming increasingly sophisticated. BMC Medicine relies on subject reviewers to indicate when a statistical review is required. We consider this policy and provide guidance on when to recommend a manuscript for statistical evaluation. Indicators for statistical review include insufficient detail in methods or results, some common statistical issues and interpretation not based on the presented evidence. Reviewers are required to ensure that the manuscript is methodologically sound and clearly written. Within that context, they are expected to provide constructive feedback and opinion on the statistical design, analysis, presentation and interpretation. If reviewers lack the appropriate background to positively confirm the appropriateness of any of the manuscript's statistical aspects, they are encouraged to recommend it for expert statistical review. PMID:26521808
INCREASING SCIENTIFIC POWER WITH STATISTICAL POWER
A brief survey of basic ideas in statistical power analysis demonstrates the advantages and ease of using power analysis throughout the design, analysis, and interpretation of research. he power of a statistical test is the probability of rejecting the null hypothesis of the test...
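The power calculation the survey describes — the probability of rejecting a false null hypothesis — can be sketched for a two-sided, two-sample z-test using only the standard library (a normal approximation; exact t-based power differs slightly for small samples):

```python
from statistics import NormalDist

def ztest_power(effect_size, n_per_group, alpha=0.05):
    """Power of a two-sided, two-sample z-test as a function of the
    standardized effect size (Cohen's d) and per-group sample size."""
    z_crit = NormalDist().inv_cdf(1 - alpha / 2)
    # Mean of the test statistic under the alternative hypothesis.
    shift = effect_size * (n_per_group / 2) ** 0.5
    return (1 - NormalDist().cdf(z_crit - shift)
            + NormalDist().cdf(-z_crit - shift))

# Power grows with sample size for a medium effect (d = 0.5):
for n in (20, 50, 100):
    print(n, round(ztest_power(0.5, n), 3))
```

Running the sweep in reverse — solving for the n that reaches a target power such as 0.8 — is the design-stage use of power analysis that the survey advocates.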
Screencast Tutorials Enhance Student Learning of Statistics
ERIC Educational Resources Information Center
Lloyd, Steven A.; Robertson, Chuck L.
2012-01-01
Although the use of computer-assisted instruction has rapidly increased, there is little empirical research evaluating these technologies, specifically within the context of teaching statistics. The authors assessed the effect of screencast tutorials on learning outcomes, including statistical knowledge, application, and interpretation. Students…
Listening and Message Interpretation
ERIC Educational Resources Information Center
Edwards, Renee
2011-01-01
Message interpretation, the notion that individuals assign meaning to stimuli, is related to listening presage, listening process, and listening product. As a central notion of communication, meaning includes (a) denotation and connotation, and (b) content and relational meanings, which can vary in ambiguity and vagueness. Past research on message…
Social Maladjustment: An Interpretation.
ERIC Educational Resources Information Center
Center, David B.
The exclusionary term "social maladjustment," part of the Public Law 94-142 (the Education for All Handicapped Children Act) definition of serious emotional disturbance, has been an enigma for special education. This paper attempts to limit the interpretation of social maladjustment in order to counter effects of such decisions as "Honig vs. Doe" in…
Explaining the Interpretive Mind.
ERIC Educational Resources Information Center
Brockmeier, Jens
1996-01-01
Examines two prominent positions in the epistemological foundations of psychology--Piaget's causal explanatory claims and Vygotsky's interpretive understanding; contends that they need to be placed in their wider philosophical contexts. Argues that the danger of causally explaining cultural practices through which human beings construct and…
Interpreting the Constitution.
ERIC Educational Resources Information Center
Brennan, William J., Jr.
1987-01-01
Discusses constitutional interpretations relating to capital punishment and protection of human dignity. Points out the document's effectiveness in creating a new society by adapting its principles to current problems and needs. Considers two views of the Constitution that lead to controversy over the legitimacy of judicial decisions. (PS)
Interpreting & Biomechanics. PEPNet Tipsheet
ERIC Educational Resources Information Center
PEPNet-Northeast, 2001
2001-01-01
Cumulative trauma disorder (CTD) refers to a collection of disorders associated with nerves, muscles, tendons, bones, and the neurovascular (nerves and related blood vessels) system. CTD symptoms may involve the neck, back, shoulders, arms, wrists, or hands. Interpreters with CTD may experience a variety of symptoms including: pain, joint…
Abstract Interpreters for Free
NASA Astrophysics Data System (ADS)
Might, Matthew
In small-step abstract interpretations, the concrete and abstract semantics bear an uncanny resemblance. In this work, we present an analysis-design methodology that both explains and exploits that resemblance. Specifically, we present a two-step method to convert a small-step concrete semantics into a family of sound, computable abstract interpretations. The first step re-factors the concrete state-space to eliminate recursive structure; this refactoring of the state-space simultaneously determines a store-passing-style transformation on the underlying concrete semantics. The second step uses inference rules to generate an abstract state-space and a Galois connection simultaneously. The Galois connection allows the calculation of the "optimal" abstract interpretation. The two-step process is unambiguous, but nondeterministic: at each step, analysis designers face choices. Some of these choices ultimately influence properties such as flow-, field- and context-sensitivity. Thus, under the method, we can give the emergence of these properties a graph-theoretic characterization. To illustrate the method, we systematically abstract the continuation-passing style lambda calculus to arrive at two distinct families of analyses. The first is the well-known k-CFA family of analyses. The second consists of novel "environment-centric" abstract interpretations, none of which appear in the literature on static analysis of higher-order programs.
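The paper's construction targets the continuation-passing-style lambda calculus; the underlying idea — a sound, computable abstraction of a concrete semantics — can be illustrated far more modestly with the textbook sign domain (a toy example, not the paper's actual method):

```python
# Abstract interpretation over the sign domain {NEG, ZERO, POS, TOP}:
# a concrete semantics (integer arithmetic) paired with a sound,
# finite abstraction, in the spirit of the methodology sketched above.

NEG, ZERO, POS, TOP = "neg", "zero", "pos", "top"

def alpha(n):
    """Abstraction: map a concrete integer to its sign."""
    return NEG if n < 0 else POS if n > 0 else ZERO

def abs_mul(a, b):
    """Abstract multiplication, sound w.r.t. concrete multiplication."""
    if ZERO in (a, b):
        return ZERO
    if TOP in (a, b):
        return TOP
    return POS if a == b else NEG

def abs_add(a, b):
    """Abstract addition; mixed signs lose precision (TOP)."""
    if a == ZERO:
        return b
    if b == ZERO:
        return a
    return a if a == b else TOP

print(abs_mul(alpha(-3), alpha(7)))  # neg
print(abs_add(alpha(-3), alpha(7)))  # top: the sum's sign is unknown
```

Soundness here means alpha(x * y) is always covered by abs_mul(alpha(x), alpha(y)); a Galois connection, as in the paper, is the systematic way of stating and deriving such guarantees.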
ERIC Educational Resources Information Center
Layton, Lyn; Miller, Carol
2004-01-01
The National Literacy Strategy (NLS) was introduced into schools in England in 1998 with the aim of raising the literacy attainments of primary-aged children. The Framework for Teaching the Literacy Hour, a key component of the NLS, proposes an interpretation of literacy that emphasises reading, writing and spelling skills. An investigation of the…
Kim, Youngwoo; Woo, Kyoohee; Kim, Inhyuk; Cho, Yong Soo; Jeong, Sunho; Moon, Jooho
2013-11-01
Among various candidate materials, Cu2ZnSnS4 (CZTS) is a promising earth-abundant semiconductor for low-cost thin film solar cells. We report a facile, less toxic, highly concentrated synthetic method utilizing the heretofore unrecognized, easily decomposable capping ligand of triphenylphosphate, where phase-pure, single-crystalline, and well-dispersed colloidal CZTS nanocrystals were obtained. The favorable influence of the easily decomposable capping ligand on the microstructural evolution of device-quality CZTS absorber layers was clarified based on a comparative study with commonly used oleylamine-capped CZTS nanoparticles. The resulting CZTS nanoparticles enabled us to produce a dense and crack-free absorbing layer through annealing under a N2 + H2S (4%) atmosphere, demonstrating a solar cell with an efficiency of 3.6% under AM 1.5 illumination. PMID:24057000
Linking numbers, spin, and statistics of solitons
NASA Technical Reports Server (NTRS)
Wilczek, F.; Zee, A.
1983-01-01
The spin and statistics of solitons in the (2 + 1)- and (3 + 1)-dimensional nonlinear sigma models is considered. For the (2 + 1)-dimensional case, there is the possibility of fractional spin and exotic statistics; for 3 + 1 dimensions, the usual spin-statistics relation is demonstrated. The linking-number interpretation of the Hopf invariant and the use of suspension considerably simplify the analysis.
Ranald Macdonald and statistical inference.
Smith, Philip T
2009-05-01
Ranald Roderick Macdonald (1945-2007) was an important contributor to mathematical psychology in the UK, as a referee and action editor for British Journal of Mathematical and Statistical Psychology and as a participant and organizer at the British Psychological Society's Mathematics, statistics and computing section meetings. This appreciation argues that his most important contribution was to the foundations of significance testing, where his concern about what information was relevant in interpreting the results of significance tests led him to be a persuasive advocate for the 'Weak Fisherian' form of hypothesis testing. PMID:19351454
Evaluation of Psychotherapeutic Interpretations
POGGE, DAVID L.; DOUGHER, MICHAEL J.
1992-01-01
If much psychotherapy literature goes unread and unused by therapists, one reason may be the apparent irrelevance of theory-derived hypotheses to actual practice. Methods that uncover tacit knowledge that practicing therapists already possess can provide the empirical basis for more relevant theories and the testing of more meaningful hypotheses. This study demonstrates application of the phenomenological method to the question of evaluating psychotherapy. To discover how experienced psychotherapists evaluate interpretations made in actual psychotherapy sessions, therapists were asked to evaluate such interpretations from videotapes; analysis of responses yielded a set of 10 dimensions of evaluation. Such methods offer both practical utility and a source of theoretical growth anchored in the real world of the practicing therapist. PMID:22700101
Semantic interpretation of nominalizations
Hull, R.D.; Gomez, F.
1996-12-31
A computational approach to the semantic interpretation of nominalizations is described. Interpretation of nominalizations involves three tasks: deciding whether the nominalization is being used in a verbal or non-verbal sense; disambiguating the nominalized verb when a verbal sense is used; and determining the fillers of the thematic roles of the verbal concept or predicate of the nominalization. A verbal sense can be recognized by the presence of modifiers that represent the arguments of the verbal concept. It is these same modifiers which provide the semantic clues to disambiguate the nominalized verb. In the absence of explicit modifiers, heuristics are used to discriminate between verbal and non-verbal senses. A correspondence between verbs and their nominalizations is exploited so that only a small amount of additional knowledge is needed to handle the nominal form. These methods are tested in the domain of encyclopedic texts and the results are presented.
Interpreting uncertainty terms.
Holtgraves, Thomas
2014-08-01
Uncertainty terms (e.g., some, possible, good, etc.) are words that do not have a fixed referent and hence are relatively ambiguous. A model is proposed that specifies how, from the hearer's perspective, recognition of facework as a potential motive for the use of an uncertainty term results in a calibration of the intended meaning of that term. Four experiments are reported that examine the impact of face threat, and the variables that affect it (e.g., power), on the manner in which a variety of uncertainty terms (probability terms, quantifiers, frequency terms, etc.) are interpreted. Overall, the results demonstrate that increased face threat in a situation will result in a more negative interpretation of an utterance containing an uncertainty term. That the interpretation of so many different types of uncertainty terms is affected in the same way suggests the operation of a fundamental principle of language use, one with important implications for the communication of risk, subjective experience, and so on. PMID:25090127
NASA Astrophysics Data System (ADS)
Kim, Youngwoo; Woo, Kyoohee; Kim, Inhyuk; Cho, Yong Soo; Jeong, Sunho; Moon, Jooho
2013-10-01
Among various candidate materials, Cu2ZnSnS4 (CZTS) is a promising earth-abundant semiconductor for low-cost thin film solar cells. We report a facile, less toxic, highly concentrated synthetic method utilizing the heretofore unrecognized, easily decomposable capping ligand of triphenylphosphate, where phase-pure, single-crystalline, and well-dispersed colloidal CZTS nanocrystals were obtained. The favorable influence of the easily decomposable capping ligand on the microstructural evolution of device-quality CZTS absorber layers was clarified based on a comparative study with commonly used oleylamine-capped CZTS nanoparticles. The resulting CZTS nanoparticles enabled us to produce a dense and crack-free absorbing layer through annealing under a N2 + H2S (4%) atmosphere, demonstrating a solar cell with an efficiency of 3.6% under AM 1.5 illumination. Electronic supplementary information (ESI) available: Experimental methods for CZTS nanocrystal synthesis, device fabrication, and characterization; the size distribution and energy dispersive X-ray (EDX) spectra of the synthesized CZTS nanoparticles; UV-vis spectra of the
Luo,Y.; Tepikian, S.; Fischer, W.; Robert-Demolaize, G.; Trbojevic, D.
2009-01-02
Based on the contributions of the chromatic sextupole families to the half-integer resonance driving terms, we discuss how to sort the chromatic sextupoles in the arcs of the Relativistic Heavy Ion Collider (RHIC) to easily and effectively correct the second order chromaticities. We propose a method with 4 knobs corresponding to 4 pairs of chromatic sextupole families to online correct the second order chromaticities. Numerical simulation justifies this method, showing that this method reduces the unbalance in the correction strengths of sextupole families and avoids the reversal of sextupole polarities. Therefore, this method yields larger dynamic apertures for the proposed RHIC 2009 100 GeV polarized proton run lattices.
1999-09-30
The Institute studied the adsorption of cationic pressure-sensitive adhesive (PSA) on wood fiber, and the buildup of PSA in a closed water system during paper recycling; the results are presented. Georgia Tech worked to develop an environmentally friendly polymerization process to synthesize a novel re-dispersible PSA by co-polymerizing an oil-soluble monomer (butyl acrylate) and a cationic monomer MAEPTAC; results are presented. At the University of Georgia at Athens the project focused on the synthesis of water-soluble and easily removable cationic polymer PSAs.
Cancer survival: an overview of measures, uses, and interpretation.
Mariotto, Angela B; Noone, Anne-Michelle; Howlader, Nadia; Cho, Hyunsoon; Keel, Gretchen E; Garshell, Jessica; Woloshin, Steven; Schwartz, Lisa M
2014-11-01
Survival statistics are of great interest to patients, clinicians, researchers, and policy makers. Although seemingly simple, survival can be confusing: there are many different survival measures, with a plethora of names, and statistical methods developed to answer different questions. This paper aims to describe and disseminate the different survival measures and their interpretation in less technical language. In addition, we introduce templates to summarize cancer survival statistics organized by their specific purpose: research and policy versus prognosis and clinical decision making. PMID:25417231
Cosmetic Plastic Surgery Statistics
2014 Cosmetic Plastic Surgery Statistics Cosmetic Procedure Trends 2014 Plastic Surgery Statistics Report Please credit the AMERICAN SOCIETY OF PLASTIC SURGEONS when citing statistical data or using ...
FIDEA: a server for the functional interpretation of differential expression analysis.
D'Andrea, Daniel; Grassi, Luigi; Mazzapioda, Mariagiovanna; Tramontano, Anna
2013-07-01
The results of differential expression analyses provide scientists with hundreds to thousands of differentially expressed genes that need to be interpreted in light of the biology of the specific system under study. This requires mapping the genes to functional classifications that can be, for example, the KEGG pathways or InterPro families they belong to, their GO Molecular Function, Biological Process or Cellular Component. A statistically significant overrepresentation of one or more category terms in the set of differentially expressed genes is an essential step for the interpretation of the biological significance of the results. Ideally, the analysis should be performed by scientists who are well acquainted with the biological problem, as they have a wealth of knowledge about the system and can, more easily than a bioinformatician, discover less obvious and, therefore, more interesting relationships. To allow experimentalists to explore their data in an easy and at the same time exhaustive fashion within a single tool and to test their hypotheses quickly and effortlessly, we developed FIDEA. The FIDEA server is located at http://www.biocomputing.it/fidea; it is free and open to all users, and there is no login requirement. PMID:23754850
Autoadaptivity and optimization in distributed ECG interpretation.
Augustyniak, Piotr
2010-03-01
This paper addresses principal issues of ECG interpretation adaptivity in a distributed surveillance network. In the age of pervasive access to wireless digital communication, distributed biosignal interpretation networks may not only optimally solve difficult medical cases, but also adapt the data acquisition, interpretation, and transmission to the variable patient's status and the availability of technical resources. The background of such adaptivity is the innovative use of results from automatic ECG analysis for seamless remote modification of the interpreting software. Since the medical relevance of issued diagnostic data depends on the patient's status, interpretation adaptivity implies flexibility of report content and frequency. The proposed solutions are based on research on the behavior of human experts, procedure reliability, and usage statistics. Despite the limited scale of our prototype client-server application, the tests yielded very promising results: transmission channel occupation was reduced by 2.6 to 5.6 times compared to the rigid reporting mode, and the remotely computed diagnostic outcome improved in over 80% of software adaptation attempts. PMID:20064764
Computerized interpretation of solitary pulmonary nodules
NASA Astrophysics Data System (ADS)
Suzuki, Hideo; Takabatake, Hirotsugu; Mori, Masaki; Mitani, Masanobu; Natori, Hiroshi
1998-04-01
In physicians' interpretation, morphologic characteristics of pulmonary nodules are not only important signs for discrimination but also important features for diagnosis with a reasonable degree of confidence. This paper describes a computerized interpretation system developed to analyze the relation between measured values and morphologic characteristics, and to clarify the logic of physicians' diagnosis. We consider that four basic morphologic characteristics underlie the discriminative diagnosis between benign and malignant nodules: (1) density; (2) homogeneity; (3) definition; and (4) convergence. To obtain grades for each of these parameters, we developed an interpretation system; to obtain digital feature values, we used our computer-aided diagnosis system. Interpretation experiments were performed using 15 benign and 19 malignant cases of chest x-ray CT images. Statistical analysis showed that some digital features differ significantly between benign and malignant nodules, and that the morphologic characteristics also show differences. The computerized system is therefore feasible for helping physicians distinguish between malignant and benign nodules by presenting digital feature values as references.
How to use and interpret hormone ratios.
Sollberger, Silja; Ehlert, Ulrike
2016-01-01
Hormone ratios have become increasingly popular throughout the neuroendocrine literature since they offer a straightforward way to simultaneously analyze the effects of two interdependent hormones. However, the analysis of ratios is associated with statistical and interpretational concerns which have not been sufficiently considered in the context of endocrine research. The aim of this article, therefore, is to demonstrate and discuss these issues, and to suggest suitable ways to address them. In a first step, we use exemplary testosterone and cortisol data to illustrate that one major concern of ratios lies in their distribution and inherent asymmetry. As a consequence, results of parametric statistical analyses are affected by the ultimately arbitrary decision of which way around the ratio is computed (i.e., A/B or B/A). We suggest the use of non-parametric methods as well as the log-transformation of hormone ratios as appropriate methods to deal with these statistical problems. However, in a second step, we also discuss the complicated interpretation of ratios, and propose moderation analysis as an alternative and oftentimes more insightful approach to ratio analysis. In conclusion, we suggest that researchers carefully consider which statistical approach is best suited to investigate reciprocal hormone effects. With regard to the hormone ratio method, further research is needed to specify what exactly this index reflects on the biological level and in which cases it is a meaningful variable to analyze. PMID:26521052
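The asymmetry problem described in this abstract can be seen with a few lines of code. The sketch below, using purely hypothetical testosterone (A) and cortisol (B) values, shows that the mean of A/B is not the reciprocal of the mean of B/A, so parametric results depend on which way the ratio is computed, whereas the log-transformed ratio is antisymmetric and direction-independent:

```python
import math

def ratio_stats(a_values, b_values):
    """Return (mean A/B, mean B/A, mean log(A/B)) to illustrate
    the asymmetry of raw hormone ratios versus log-ratios."""
    ab = [a / b for a, b in zip(a_values, b_values)]
    ba = [b / a for a, b in zip(a_values, b_values)]
    log_ab = [math.log(r) for r in ab]
    n = len(ab)
    return sum(ab) / n, sum(ba) / n, sum(log_ab) / n

# Hypothetical testosterone (A) and cortisol (B) measurements:
A = [12.0, 15.0, 9.0, 20.0]
B = [3.0, 5.0, 6.0, 2.0]
mean_ab, mean_ba, mean_log = ratio_stats(A, B)
# The raw means are not reciprocals of each other (1/mean(A/B) != mean(B/A)),
# so which ratio is analyzed matters for parametric statistics.
# The log-ratio is antisymmetric: mean(log(A/B)) == -mean(log(B/A)).
```

This is only a numerical illustration of the distributional point; the article's substantive recommendation (non-parametric methods, log-transformation, or moderation analysis) is unchanged.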
Statistical Modeling of SAR Images: A Survey
Gao, Gui
2010-01-01
Statistical modeling is essential to SAR (Synthetic Aperture Radar) image interpretation. It aims to describe SAR images through statistical methods and to reveal the characteristics of these images. Moreover, statistical modeling can provide technical support for a comprehensive understanding of terrain scattering mechanisms, which helps in developing algorithms for effective image interpretation and credible image simulation. Numerous statistical models have been developed to describe SAR image data, and the purpose of this paper is to categorize and evaluate these models. We first summarize the development history and the current state of research in statistical modeling, and then discuss in detail the SAR image models developed from the product model. Relevant issues are also discussed. Finally, several promising directions for future research are outlined. PMID:22315568
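The product model discussed in the survey factorizes the observed intensity into a texture term and a speckle term. A minimal simulation sketch of this idea, assuming unit-mean gamma texture and unit-mean exponential (single-look) speckle, whose product is K-distributed; the parameter values are illustrative only:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_product_model(n, texture_order=4.0):
    """Simulate SAR intensity under the product model:
    intensity = texture * speckle, with unit-mean gamma texture
    and unit-mean exponential speckle (single-look case)."""
    texture = rng.gamma(shape=texture_order, scale=1.0 / texture_order, size=n)
    speckle = rng.exponential(scale=1.0, size=n)
    return texture * speckle

intensity = simulate_product_model(200_000)
# Both factors have unit mean, so the product has mean ~1, but its
# variance exceeds that of pure speckle (variance 1): the texture term
# adds the extra heterogeneity that heavy-tailed SAR models capture.
```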
Phillips, Craig B; Iline, Ilia I; Richards, Nicola K; Novoselov, Max; McNeill, Mark R
2013-10-01
Quickly, accurately, and easily assessing the efficacy of treatments to control sessile arthropods (e.g., scale insects) and stationary immature life stages (e.g., eggs and pupae) is problematic because it is difficult to tell whether treated organisms are alive or dead. Current approaches usually involve either maintaining organisms in the laboratory to observe them for development, gauging their response to physical stimulation, or assessing morphological characters such as turgidity and color. These can be slow, technically difficult, or subjective, and the validity of methods other than laboratory rearing has seldom been tested. Here, we describe the development and validation of a quick, easily used biochemical colorimetric assay for measuring the viability of arthropods that is sufficiently sensitive to test even very small organisms such as whitefly eggs. The assay was adapted from a technique for staining the enzyme hexokinase to signal the presence of adenosine triphosphate in viable specimens by reducing a tetrazolium salt to formazan. Basic laboratory facilities and skills are required for production of the stain, but no specialist equipment, expertise, or facilities are needed for its use. PMID:24224241
Rago, Angela; Latagliata, Roberto; Montanaro, Marco; Montefusco, Enrico; Andriani, Alessandro; Crescenzi, Sabrina Leonetti; Mecarocci, Sergio; Spirito, Francesca; Spadea, Antonio; Recine, Umberto; Cicconi, Laura; Avvisati, Giuseppe; Cedrone, Michele; Breccia, Massimo; Porrini, Raffaele; Villivà, Nicoletta; De Gregoris, Cinzia; Alimena, Giuliana; D'Arcangelo, Enzo; Guglielmelli, Paola; Lo-Coco, Francesco; Vannucchi, Alessandro; Cimino, Giuseppe
2015-03-01
To predict leukemic transformation (LT), we evaluated easily detectable diagnostic parameters in 338 patients with primary myelofibrosis (PMF) followed in the Latium region (Italy) between 1981 and 2010. Forty patients (11.8%) progressed to leukemia, with a resulting 10-year leukemia-free survival (LFS) rate of 72%. Hb (<10 g/dL) and circulating blasts (≥1%) were the only two independent prognostic factors for LT at multivariate analysis. Two hundred fifty patients with both parameters available were grouped as follows: low risk (no or one factor) = 216 patients; high risk (both factors) = 31 patients. The median LFS times were 269 and 45 months for the low- and high-risk groups, respectively (P<.0001). The LT predictive power of these two parameters was confirmed in an external series of 270 PMF patients from Tuscany, in whom the median LFS was not reached and 61 months for the low- and high-risk groups, respectively (P<.0001). These results establish anemia and circulating blasts, two easily and universally available parameters, as strong predictors of LT in PMF, and may help improve the prognostic stratification of these patients, particularly in countries with low resources where more sophisticated molecular testing is unavailable. PMID:25636356
Interpretation Techniques Development
NASA Technical Reports Server (NTRS)
Alford, W. L.
1973-01-01
The processes, algorithms, and procedures for extraction and interpretation of ERTS-1 data are discussed. Analysis of temporally acquired data is possible through geometric correction, correlation, and registration techniques. The powerful image enhancement techniques developed for the lunar and planetary programs are valuable for Earth Resources Survey programs. There is evidence that both optical and digital methods of spatial information extraction can provide valuable sources of information for the ERTS system. The techniques available, even for a limited number of bands and limited resolution, can be effectively used to extract much of the information required by resource managers.
NASA Technical Reports Server (NTRS)
Runcorn, S. K.
1985-01-01
The superposition of the first satellite geoid, determined by Izsak, upon Uotila's geoid, which was based on surface gravity determinations, showed good agreement except over the Pacific area of the globe. The poor agreement over the Pacific was interpreted as the result of inadequate observations there. Many geoids were subsequently determined from satellite observations, including Doppler measurements. It is found that the geoid is the result of density differences in the mantle maintained since the primeval Earth by its finite strength. Various models based on this assumption are developed.
NASA Astrophysics Data System (ADS)
Pospieszalski, M. W.
2010-10-01
The simple noise models of field effect and bipolar transistors reviewed in this article are quite useful in engineering practice, as illustrated by measured and modeled results. The exact and approximate expressions for the noise parameters of FETs and bipolar transistors reveal certain common noise properties and some general noise properties of both devices. The usefulness of these expressions in interpreting the dependence of measured noise parameters on frequency, bias, and temperature and, consequently, in checking of consistency of measured data has been demonstrated.
Interpretation of extragalactic jets
Norman, M.L.
1985-01-01
The nature of extragalactic radio jets is modeled. The basic hypothesis of these models is that extragalactic jets are outflows of matter which can be described within the framework of fluid dynamics and that the outflows are essentially continuous. The discussion is limited to the interpretation of large-scale (i.e., kiloparsec-scale) jets. The central problem is to infer the physical parameters of the jets from observed distributions of total and polarized intensity and angle of polarization as a function of frequency. 60 refs., 6 figs.
Structural interpretation of seismic data and inherent uncertainties
NASA Astrophysics Data System (ADS)
Bond, Clare
2013-04-01
Geoscience is perhaps unique in its reliance on incomplete datasets and in building knowledge from their interpretation. This interpretive basis is fundamental at all levels of the science, from the creation of a geological map to the interpretation of remotely sensed data. To teach and better understand the uncertainties in dealing with incomplete data, we need to understand the strategies that make individual practitioners effective interpreters. The nature of interpretation is such that the interpreter must use their cognitive ability in analyzing the data to propose a sensible solution that is consistent not only with the original data but also with other knowledge and understanding. In a series of experiments, Bond et al. (2007, 2008, 2011, 2012) investigated the strategies and pitfalls of expert and non-expert interpretation of seismic images. These studies used large numbers of participants to provide a statistically sound basis for analysis of the results. The experiments showed that a wide variety of conceptual models were applied to single seismic datasets, highlighting not only spatial variations in fault placement but also disagreement over whether faults existed at all, or over their sense of movement. Further, statistical analysis suggests that the strategies an interpreter employs are more important than expert knowledge per se in developing successful interpretations; experts are successful because of their application of these techniques. In a new set of experiments, a small number of experts are studied to determine how they use their cognitive and reasoning skills in the interpretation of 2D seismic profiles. Live video and practitioner commentary were used to track the evolving interpretation and to gain insight into their decision processes. The outputs of the study allow us to create an educational resource of expert interpretation through online video footage and commentary with
NASA Astrophysics Data System (ADS)
Schepen, Andrew; Wang, Q. J.
2015-03-01
The Australian Bureau of Meteorology produces statistical and dynamic seasonal streamflow forecasts. The statistical and dynamic forecasts are similarly reliable in ensemble spread; however, skill varies by catchment and season. Therefore, it may be possible to optimize forecasting skill by weighting and merging statistical and dynamic forecasts. Two model averaging methods are evaluated for merging forecasts for 12 locations. The first method, Bayesian model averaging (BMA), applies averaging to forecast probability densities (and thus cumulative probabilities) for a given forecast variable value. The second method, quantile model averaging (QMA), applies averaging to forecast variable values (quantiles) for a given cumulative probability (quantile fraction). BMA and QMA are found to perform similarly in terms of overall skill scores and reliability in ensemble spread. Both methods improve forecast skill across catchments and seasons. However, when both the statistical and dynamical forecasting approaches are skillful but produce, on special occasions, very different event forecasts, the BMA merged forecasts for these events can have unusually wide and bimodal distributions. In contrast, the distributions of the QMA merged forecasts for these events are narrower, unimodal and generally more smoothly shaped, and are potentially more easily communicated to and interpreted by the forecast users. Such special occasions are found to be rare. However, every forecast counts in an operational service, and therefore the occasional contrast in merged forecasts between the two methods may be more significant than the indifference shown by the overall skill and reliability performance.
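The contrast between the two averaging rules can be sketched with sample-based ensembles. In this hypothetical example (equal weights, illustrative numbers, not the Bureau's implementation), BMA's density averaging amounts to pooling the two ensembles, which is bimodal when the forecasts disagree, while QMA's quantile averaging yields a narrower, unimodal merged forecast:

```python
import numpy as np

rng = np.random.default_rng(42)

# Two hypothetical forecast ensembles for the same event that disagree:
statistical = rng.normal(loc=100.0, scale=10.0, size=5000)
dynamic = rng.normal(loc=200.0, scale=10.0, size=5000)

# BMA (equal weights): average the probability densities. For samples this
# amounts to pooling the two ensembles -> a bimodal mixture.
bma_merged = np.concatenate([statistical, dynamic])

# QMA (equal weights): average the quantile functions, i.e. average the
# sorted member values at each quantile fraction -> a unimodal ensemble
# centred between the two forecasts.
qma_merged = (np.sort(statistical) + np.sort(dynamic)) / 2.0
```

With these numbers the QMA median sits near 150, where the BMA mixture has almost no probability mass, reproducing the "unusually wide and bimodal" versus "narrower, unimodal" contrast the abstract describes.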
An intentional interpretive perspective
Neuman, Paul
2004-01-01
To the extent that the concept of intention has been addressed within behavior analysis, descriptions of intention have been general and have not specifically included important distinctions that differentiate a behavior-analytic approach from vernacular definitions of intention. A fundamental difference between a behavior-analytic approach and most other psychological approaches is that other approaches focus on the necessity of intentions to explain behavior, whereas a behavior-analytic approach is directed at understanding the interplay between behavior and environment. Behavior-analytic interpretations include the relations between the observer's behavior and the environment. From a behavior-analytic perspective, an analysis of the observer's interpretations of an individual's behavior is inherent in the subsequent attribution of intention. The present agenda is to provide a behavior-analytic account of attributing intention that identifies the establishing conditions for speaking of intention. Also addressed is the extent to which we speak of intentions when the observed individual's behavior is contingency shaped or under instructional control. PMID:22478417
Physical interpretation of antigravity
NASA Astrophysics Data System (ADS)
Bars, Itzhak; James, Albin
2016-02-01
Geodesic incompleteness is a problem in both general relativity and string theory. The Weyl-invariant Standard Model coupled to general relativity (SM +GR ), and a similar treatment of string theory, are improved theories that are geodesically complete. A notable prediction of this approach is that there must be antigravity regions of spacetime connected to gravity regions through gravitational singularities such as those that occur in black holes and cosmological bang/crunch. Antigravity regions introduce apparent problems of ghosts that raise several questions of physical interpretation. It was shown that unitarity is not violated, but there may be an instability associated with negative kinetic energies in the antigravity regions. In this paper we show that the apparent problems can be resolved with the interpretation of the theory from the perspective of observers strictly in the gravity region. Such observers cannot experience the negative kinetic energy in antigravity directly, but can only detect in and out signals that interact with the antigravity region. This is no different from a spacetime black box for which the information about its interior is encoded in scattering amplitudes for in/out states at its exterior. Through examples we show that negative kinetic energy in antigravity presents no problems of principles but is an interesting topic for physical investigations of fundamental significance.
Monitoring and interpreting bioremediation effectiveness
Bragg, J.R.; Prince, R.C.; Harner, J.; Atlas, R.M.
1993-12-31
Following the Exxon Valdez oil spill in 1989, extensive research was conducted by the US Environmental Protection Agency and Exxon to develop and implement bioremediation techniques for oil spill cleanup. A key challenge of this program was to develop effective methods for monitoring and interpreting bioremediation effectiveness on extremely heterogeneous intertidal shorelines. Fertilizers were applied to shorelines at concentrations known to be safe, and the effectiveness achieved in accelerating biodegradation of oil residues was measured using several techniques. This paper describes the most definitive method identified, which monitors biodegradation loss by measuring changes in the ratios of hydrocarbons to hopane, a cycloalkane present in the oil that showed no measurable degradation. Rates of loss measured by the hopane ratio method have high levels of statistical confidence, and show that fertilizer addition stimulated biodegradation rates as much as fivefold. Multiple regression analyses of the data show that the fertilizer-added nitrogen in interstitial pore water per unit of oil load was the most important parameter affecting biodegradation rate, and the results suggest that monitoring nitrogen concentrations in subsurface pore water is the preferred technique for determining fertilizer dosage and reapplication frequency.
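The hopane ratio method rests on a simple calculation: because hopane is conserved, normalizing an analyte's concentration to hopane cancels sampling heterogeneity, and fractional loss follows from the change in that ratio over time. A minimal sketch with hypothetical concentrations (not data from the study):

```python
def percent_loss_hopane(analyte_0, hopane_0, analyte_t, hopane_t):
    """Percent biodegradation loss of an analyte estimated from the change
    in its ratio to hopane, a conserved internal marker:
        loss = (1 - (A_t / H_t) / (A_0 / H_0)) * 100
    Units cancel, so absolute concentrations need not be comparable
    between the initial and later samples."""
    ratio_0 = analyte_0 / hopane_0
    ratio_t = analyte_t / hopane_t
    return (1.0 - ratio_t / ratio_0) * 100.0

# Hypothetical numbers: total alkanes fall from 50x to 15x hopane.
loss = percent_loss_hopane(analyte_0=500.0, hopane_0=10.0,
                           analyte_t=180.0, hopane_t=12.0)
# loss = (1 - 15/50) * 100 = 70.0
```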
Enhancing the Teaching of Statistics: Portfolio Theory, an Application of Statistics in Finance
ERIC Educational Resources Information Center
Christou, Nicolas
2008-01-01
In this paper we present an application of statistics using real stock market data. Most, if not all, students have some familiarity with the stock market (or at least they have heard about it) and therefore can understand the problem easily. It is the real data analysis that students find interesting. Here we explore the building of efficient…
Xu Chengjian; Schaaf, Arjen van der; Schilstra, Cornelis; Langendijk, Johannes A.; Veld, Aart A. van't
2012-03-15
Purpose: To study the impact of different statistical learning methods on the prediction performance of multivariate normal tissue complication probability (NTCP) models. Methods and Materials: In this study, three learning methods, stepwise selection, least absolute shrinkage and selection operator (LASSO), and Bayesian model averaging (BMA), were used to build NTCP models of xerostomia following radiotherapy treatment for head and neck cancer. Performance of each learning method was evaluated by a repeated cross-validation scheme in order to obtain a fair comparison among methods. Results: It was found that the LASSO and BMA methods produced models with significantly better predictive power than that of the stepwise selection method. Furthermore, the LASSO method yields an easily interpretable model as the stepwise method does, in contrast to the less intuitive BMA method. Conclusions: The commonly used stepwise selection method, which is simple to execute, may be insufficient for NTCP modeling. The LASSO method is recommended.
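The LASSO's interpretability comes from an L1 penalty that shrinks weak coefficients exactly to zero. As an illustration only (a generic coordinate-descent sketch for a linear model on hypothetical toy data, not the authors' NTCP implementation):

```python
import numpy as np

def soft_threshold(z, lam):
    """Soft-thresholding operator, the core step of LASSO coordinate descent."""
    return np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)

def lasso_cd(X, y, lam, n_iter=200):
    """Minimize (1/2n)||y - Xb||^2 + lam*||b||_1 by cyclic coordinate descent.
    Columns of X are assumed standardized (mean 0, unit variance)."""
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n
    for _ in range(n_iter):
        for j in range(p):
            r = y - X @ b + X[:, j] * b[j]      # partial residual excluding j
            rho = X[:, j] @ r / n
            b[j] = soft_threshold(rho, lam) / col_sq[j]
    return b

# Hypothetical toy data: only the first two of five predictors matter.
rng = np.random.default_rng(1)
X = rng.normal(size=(500, 5))
X = (X - X.mean(axis=0)) / X.std(axis=0)
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + rng.normal(scale=0.5, size=500)
coef = lasso_cd(X, y, lam=0.2)
# The penalty drives the three noise coefficients to (near) zero, yielding
# the kind of sparse, easily interpretable model the abstract favors.
```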
Reverse Causation and the Transactional Interpretation of Quantum Mechanics
NASA Astrophysics Data System (ADS)
Cramer, John G.
2006-10-01
In the first part of the paper we present the transactional interpretation of quantum mechanics, a method of viewing the formalism of quantum mechanics that provides a way of visualizing quantum events and experiments. In the second part, we present an EPR gedankenexperiment that appears to lead to observer-level reverse causation. A transactional analysis of the experiment is presented. It easily accounts for the reported observations but does not reveal any barriers to its modification for reverse causation.
On easily tunable wide-bandpass X-ray monochromators based on refraction in arrays of prisms.
Jark, Werner
2012-07-01
Refractive lenses focus X-rays chromatically owing to a significant variation of the refractive index of the lens material with photon energy. Then, in combination with an exit slit in the focal plane, such lenses can be used as monochromators. The spectral resolution obtainable with refractive lenses based on prism arrays was recently systematically investigated experimentally. This contribution will show that a wide-bandpass performance can be predicted with a rather simple analytical approach. Based on the good agreement with the experimental data, one can then more rapidly and systematically optimize the lens structure for a given application. This contribution will then discuss more flexible solutions for the monochromator operation. It will be shown that a new monochromator scheme could easily provide tuning in a fixed-exit slit. PMID:22713879
NASA Astrophysics Data System (ADS)
Biass, Sébastien; Frischknecht, Corine; Dell'Oro, Luca; Senegas, Olivier; Bonadonna, Costanza
2010-05-01
In order to meet the needs of contingency planning, we present a GIS-based method for risk assessment of tephra deposits that is flexible enough to work with datasets of variable precision and resolution, depending on data availability. Due to the constant increase of population density around volcanoes and the large dispersal of tephra from volcanic plumes, a wide range of threats such as roof collapses, destruction of crops, blockage of vital lifelines, and health problems concern even remote communities. In the field of disaster management, there is general agreement that a global, even if incomplete, method, subject to revision and improvement, is better than no information at all. In this framework, our method is able to provide fast, rough insights into possible eruptive scenarios and their potential consequences for surrounding populations with only a few available data, which can easily be refined later. The knowledge of both the expected hazard (frequency and magnitude) and the vulnerability of elements at risk is required by planners in order to produce efficient emergency planning prior to a crisis. The Cotopaxi volcano, one of Ecuador's most active volcanoes, was used to develop and test this method. Cotopaxi volcano is located 60 km south of Quito and threatens a highly populated valley. Based on field data, historical reports, and the Smithsonian catalogue, our hazard assessment was carried out using the numerical model TEPHRA2. We first applied a deterministic approach that evolved towards a fully probabilistic method in order to account for the most likely eruptive scenarios as well as the variability of atmospheric conditions. In parallel, we carried out a vulnerability assessment of the physical (crops and roofs), social (populations), and systemic elements at risk using mainly free and easily accessible data. Both hazard and vulnerability assessments were compiled with GIS tools to draw comprehensive and tangible thematic risk maps
The flow of interpretation. The collateral interpretation, force and flow.
Duncan, D
1989-01-01
This paper was presented to a conference on the theme 'The Formulation of Interpretations in Clinical Practice'. It suggests that, impressionistically in line with the identification of psychoanalysis with natural science, an unconscious metaphor which sees interpretation as something like a force exerted on a physical particle has been more influential conceptually than the unconscious metaphor naturally complementary to it, that of interpretation as something like a liquid in flow. The concept of 'the collateral interpretation' is introduced. Loosely speaking, this is what an analyst thinks he would interpret at any given moment. It is tentative, unformed, and changes kaleidoscopically. It accommodates psychoanalytic concepts. It is suggested that examination of the mode of operation of 'the collateral interpretation' is important in understanding the formulation of interpretations. A single session is used for clinical illustration. PMID:2606603
A History of Oral Interpretation.
ERIC Educational Resources Information Center
Bahn, Eugene; Bahn, Margaret L.
This historical account of the oral interpretation of literature establishes a chain of events comprehending 25 centuries of verbal tradition from the Homeric Age through 20th Century America. It deals in each era with the viewpoints and contributions of major historical figures to oral interpretation, as well as with oral interpretation's…
Predict! Teaching Statistics Using Informal Statistical Inference
ERIC Educational Resources Information Center
Makar, Katie
2013-01-01
Statistics is one of the most widely used topics for everyday life in the school mathematics curriculum. Unfortunately, the statistics taught in schools focuses on calculations and procedures before students have a chance to see it as a useful and powerful tool. Researchers have found that a dominant view of statistics is as an assortment of tools…
Statistics Poker: Reinforcing Basic Statistical Concepts
ERIC Educational Resources Information Center
Leech, Nancy L.
2008-01-01
Learning basic statistical concepts does not need to be tedious or dry; it can be fun and interesting through cooperative learning in the small-group activity of Statistics Poker. This article describes a teaching approach for reinforcing basic statistical concepts that can help students who have high anxiety and makes learning and reinforcing…
NASA Astrophysics Data System (ADS)
Dong, Qingqing; Wang, Zhaowei; Zhang, Kaicheng; Yu, Hao; Huang, Peng; Liu, Xiaodong; Zhou, Yi; Chen, Ning; Song, Bo
2016-03-01
For perovskite solar cells (Pero-SCs), one of the key issues with respect to the power conversion efficiency (PCE) is the morphology control of the perovskite thin-films. In this study, an easily-accessible additive polyethylenimine (PEI) is utilized to tune the morphology of CH3NH3PbI3-xClx. With addition of 1.00 wt% of PEI, the smoothness and crystallinity of the perovskite were greatly improved, which were characterized by scanning electron microscopy (SEM) and X-ray diffraction (XRD). A summit PCE of 14.07% was achieved for the p-i-n type Pero-SC, indicating a 26% increase compared to those of the devices without the additive. Both photoluminescence (PL) and alternating current impedance spectroscopy (ACIS) analyses confirm the efficiency results after the addition of PEI. This study provides a low-cost polymer additive candidate for tuning the morphology of perovskite thin-films, and might be a new clue for the mass production of Pero-SCs. Electronic supplementary information (ESI) available: J-V curves & characteristics
Appropriate use of medical interpreters.
Juckett, Gregory; Unger, Kendra
2014-10-01
More than 25 million Americans speak English "less than very well," according to the U.S. Census Bureau. This population is less able to access health care and is at higher risk of adverse outcomes such as drug complications and decreased patient satisfaction. Title VI of the Civil Rights Act mandates that interpreter services be provided for patients with limited English proficiency who need this service, despite the lack of reimbursement in most states. Professional interpreters are superior to the usual practice of using ad hoc interpreters (i.e., family, friends, or untrained staff). Untrained interpreters are more likely to make errors, violate confidentiality, and increase the risk of poor outcomes. Children should never be used as interpreters except in emergencies. When using an interpreter, the clinician should address the patient directly and seat the interpreter next to or slightly behind the patient. Statements should be short, and the discussion should be limited to three major points. In addition to acting as a conduit for the discussion, the interpreter may serve as a cultural liaison between the physician and patient. When a bilingual clinician or a professional interpreter is not available, phone interpretation services or trained bilingual staff members are reasonable alternatives. The use of professional interpreters (in person or via telephone) increases patient satisfaction, improves adherence and outcomes, and reduces adverse events, thus limiting malpractice risk. PMID:25369625
Lau, Clara B S; Cheng, Ling; Cheng, Bobby W H; Yue, Grace G L; Wong, Eric C W; Lau, Ching-Po; Leung, Ping-Chung; Fung, Kwok-Pui
2012-01-01
Hedyotis diffusa Willd. and Hedyotis corymbosa (L.) Lam. are closely related species of Rubiaceae family and they can be easily confused. Although previous reports have been found in which ultraviolet spectrum, convolution spectrometry or X-ray diffraction are reported to be used for distinguishing between the two species, these methods require specialised equipment. Hence, this study aims to develop a simple chromatographic method for the purpose. Our results illustrate the use of a thin-layer chromatographic (TLC) profile to differentiate between the two species, with a blue zone appearing at around an R(f) of 0.36 in H. corymbosa but not in H. diffusa. The compound corresponding to this blue zone was later found to be hedyotiscone A. LC-MS with multiple reaction monitoring was used as a tool to identify and quantify hedyotiscone A in the test samples. In conclusion, a quick and simple TLC assay was conducted to distinguish between the two species H. diffusa and H. corymbosa. PMID:21988612
Kundrapu, Sirisha; Jury, Lucy A.; Sitzlar, Brett; Sunkesula, Venkata C. K.; Sethi, Ajay K.
2013-01-01
Although rapid laboratory tests are available for diagnosis of Clostridium difficile infection (CDI), delays in completion of CDI testing are common in clinical practice. We conducted a cohort study of 242 inpatients tested for CDI to determine the timing of different steps involved in diagnostic testing and to identify modifiable factors contributing to delays in diagnosis. The average time from test order to test result was 1.8 days (range, 0.2 to 10.6), with time from order to stool collection accounting for most of the delay (mean, 1.0 day; range, 0 to 10). Several modifiable factors contributed to delays, including not providing stool collection supplies to patients in a timely fashion, rejection of specimens due to incorrect labeling or leaking from the container, and holding samples in the laboratory for batch processing. Delays in testing contributed to delays in initiation of treatment for patients diagnosed with CDI and to frequent prescription of empirical CDI therapy for patients with mild to moderate symptoms whose testing was ultimately negative. An intervention that addressed several easily modified factors contributing to delays resulted in a significant decrease in the time required to complete CDI testing. These findings suggest that health care facilities may benefit from a review of their processes for CDI testing to identify and address modifiable factors that contribute to delays in diagnosis and treatment of CDI. PMID:23678072
NASA Astrophysics Data System (ADS)
Mesci, Gunkut; Schwartz, Renee' S.
2016-02-01
The purpose of this study was to assess preservice teachers' views of Nature of Science (NOS), identify aspects that were challenging for conceptual change, and explore reasons why. This study particularly focused on why and how some concepts of NOS may be more easily altered than others. Fourteen preservice science teachers enrolled in a NOS and Science Inquiry course participated in this study. Data were collected by using a pre/post format with the Views of Nature of Science questionnaire (VNOS-270), the Views of Scientific Inquiry questionnaire (VOSI-270), follow-up interviews, and classroom artifacts. The results indicated that most students initially held naïve views about certain aspects of NOS like tentativeness and subjectivity. By the end of the semester, almost all students dramatically improved their understanding about almost all aspects of NOS. However, several students still struggled with certain aspects like the differences between scientific theory and law, tentativeness, and socio-cultural embeddedness. Results suggested that instructional, motivational, and socio-cultural factors may influence if and how students changed their views about targeted NOS aspects. Students thought that classroom activities, discussions, and readings were most helpful to improve their views about NOS. The findings from the research have the potential to translate as practical advice for teachers, science educators, and future researchers.
Rotstein, Benjamin H; Liang, Steven H; Placzek, Michael S; Hooker, Jacob M; Gee, Antony D; Dollé, Frédéric; Wilson, Alan A; Vasdev, Neil
2016-08-22
The positron-emitting radionuclide carbon-11 ((11)C, t1/2 = 20.3 min) possesses the unique potential for radiolabeling of any biological, naturally occurring, or synthetic organic molecule for in vivo positron emission tomography (PET) imaging. Carbon-11 is most often incorporated into small molecules by methylation of alcohol, thiol, amine or carboxylic acid precursors using [(11)C]methyl iodide or [(11)C]methyl triflate (generated from [(11)C]carbon dioxide or [(11)C]methane). Consequently, small molecules that lack an easily substituted (11)C-methyl group are often considered to have non-obvious strategies for radiolabeling and require a more customized approach. [(11)C]Carbon dioxide itself, [(11)C]carbon monoxide, [(11)C]cyanide, and [(11)C]phosgene represent alternative reactants to enable (11)C-carbonylation. Methodologies developed for preparation of (11)C-carbonyl groups have had a tremendous impact on the development of novel PET tracers and provided key tools for clinical research. (11)C-Carbonyl radiopharmaceuticals based on labeled carboxylic acids, amides, carbamates and ureas now account for a substantial number of important imaging agents that have seen translation to higher species and clinical research of previously inaccessible targets, which is a testament to the creativity, utility and practicality of the underlying radiochemistry. PMID:27276357
Ding, Yaobin; Tang, Hebin; Zhang, Shenghua; Wang, Songbo; Tang, Heqing
2016-11-01
Microscaled CuFeO2 particles (micro-CuFeO2) were rapidly prepared via a microwave-assisted hydrothermal method and characterized by scanning electron microscopy, X-ray powder diffraction and X-ray photoelectron spectroscopy. It was found that the micro-CuFeO2 was of pure phase and a rhombohedral structure, with sizes in the range of 2.8±0.6 μm. The micro-CuFeO2 efficiently catalyzed the activation of peroxymonosulfate (PMS) to generate sulfate radicals (SO4•−), causing fast degradation of carbamazepine (CBZ). The catalytic activity of micro-CuFeO2 was observed to be 6.9 and 25.3 times that of micro-Cu2O and micro-Fe2O3, respectively. The enhanced activity of micro-CuFeO2 for the activation of PMS was confirmed to be attributable to the synergistic effect of surface-bound Cu(I) and Fe(III). The sulfate radical was the primary radical species responsible for CBZ degradation. As a microscaled catalyst, micro-CuFeO2 can be easily recovered by gravity settlement and exhibited improved catalytic stability compared with micro-Cu2O during five successive degradation cycles. Oxidative degradation of CBZ by the PMS/CuFeO2 couple was effective in the actual aqueous environmental systems studied. PMID:27329789
Applications of Statistical Tests in Hand Surgery
Song, Jae W.; Haas, Ann; Chung, Kevin C.
2015-01-01
During the nineteenth century, with the emergence of public health as a goal to improve hygiene and conditions of the poor, statistics established itself as a distinct scientific field important for critically interpreting studies of public health concerns. During the twentieth century, statistics began to evolve mathematically and methodologically with hypothesis testing and experimental design. Today, the design of medical experiments centers around clinical trials and observational studies, and with the use of statistics, the collected data are summarized, weighed, and presented to direct both physicians and the public towards Evidence-Based Medicine. Having a basic understanding of statistics is mandatory in evaluating the validity of published literature and applying it to patient care. In this review, we aim to apply a practical approach in discussing basic statistical tests by providing a guide to choosing the correct statistical test along with examples relevant to hand surgery research. PMID:19969193
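As a concrete instance of matching a test to a research question, a minimal sketch follows (the grip-strength numbers are invented for illustration and are not from the paper): comparing mean outcomes between two treatment groups with Welch's two-sample t-test, which does not assume equal variances.

```python
import math
from statistics import mean, variance

def welch_t(a, b):
    """Welch's two-sample t statistic and approximate degrees of
    freedom (Welch-Satterthwaite); unequal variances assumed."""
    va, vb = variance(a), variance(b)
    na, nb = len(a), len(b)
    se2 = va / na + vb / nb
    t = (mean(a) - mean(b)) / math.sqrt(se2)
    df = se2 ** 2 / ((va / na) ** 2 / (na - 1) + (vb / nb) ** 2 / (nb - 1))
    return t, df

# Hypothetical post-operative grip strength (kg) in two groups
group_a = [32, 35, 30, 38, 33, 36, 31]
group_b = [28, 27, 30, 25, 29, 26, 31]
t, df = welch_t(group_a, group_b)
print(f"t = {t:.2f}, df = {df:.1f}")
```

The t statistic would then be referred to a t distribution with df degrees of freedom to obtain a p-value; for small or clearly non-normal samples, a rank-based test such as Mann-Whitney U would be the usual alternative.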
A programmed labeling approach to image interpretation
NASA Technical Reports Server (NTRS)
Pore, M. D.; Abotteen, R. A. (Principal Investigator)
1979-01-01
Manual labeling techniques require the analyst-interpreter to use not only production film converter products but also agricultural and meteorological data and spectral aids in an integrated, judgmental fashion. To control an anticipated high variance in these techniques, a semiautomatic labeling technology was developed. The product of this technology is label identification from statistical tabulation (LIST) which operates from a discriminant basis and has the ability to measure the reliability of the label and to introduce an arbitrary bias. The development of LIST and its properties are described. Numerical results of an application are included and the evaluation of LIST is discussed.
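The abstract's description of LIST, a discriminant-based labeler that also reports label reliability and accepts an arbitrary bias, can be sketched minimally as follows. This is not the original LIST implementation; the single spectral feature, the class models, and the numbers are hypothetical, used only to show how a discriminant score yields both a label and a reliability measure, with a prior acting as the adjustable bias:

```python
import math

# Per-class Gaussian models of one hypothetical spectral feature: (mean, std)
classes = {"wheat": (0.62, 0.05), "non-wheat": (0.45, 0.08)}

def label(x, prior=None):
    """Return (best_label, reliability) for feature value x.
    `prior` is an optional per-class weight that lets an analyst
    bias the decision toward a class."""
    prior = prior or {c: 1.0 for c in classes}
    scores = {}
    for c, (mu, sigma) in classes.items():
        likelihood = math.exp(-0.5 * ((x - mu) / sigma) ** 2) / sigma
        scores[c] = prior[c] * likelihood
    total = sum(scores.values())
    best = max(scores, key=scores.get)
    return best, scores[best] / total  # reliability = posterior probability

name, rel = label(0.60)
print(name, round(rel, 2))
```

The reliability here is simply the posterior probability of the winning class, so low values flag pixels whose labels deserve manual review.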
Neuroendocrine Tumor: Statistics
Wild, Birgit; Schnecker, Jörg; Alves, Ricardo J. Eloy; Barsukov, Pavel; Bárta, Jiří; Čapek, Petr; Gentsch, Norman; Gittel, Antje; Guggenberger, Georg; Lashchinskiy, Nikolay; Mikutta, Robert; Rusalimova, Olga; Šantrůčková, Hana; Shibistova, Olga; Urich, Tim; Watzka, Margarete; Zrazhevskaya, Galina; Richter, Andreas
2014-01-01
Rising temperatures in the Arctic can affect soil organic matter (SOM) decomposition directly and indirectly, by increasing plant primary production and thus the allocation of plant-derived organic compounds into the soil. Such compounds, for example root exudates or decaying fine roots, are easily available for microorganisms, and can alter the decomposition of older SOM (“priming effect”). We here report on a SOM priming experiment in the active layer of a permafrost soil from the central Siberian Arctic, comparing responses of organic topsoil, mineral subsoil, and cryoturbated subsoil material (i.e., poorly decomposed topsoil material subducted into the subsoil by freeze–thaw processes) to additions of 13C-labeled glucose, cellulose, a mixture of amino acids, and protein (added at levels corresponding to approximately 1% of soil organic carbon). SOM decomposition in the topsoil was barely affected by higher availability of organic compounds, whereas SOM decomposition in both subsoil horizons responded strongly. In the mineral subsoil, SOM decomposition increased by a factor of two to three after any substrate addition (glucose, cellulose, amino acids, protein), suggesting that the microbial decomposer community was limited in energy to break down more complex components of SOM. In the cryoturbated horizon, SOM decomposition increased by a factor of two after addition of amino acids or protein, but was not significantly affected by glucose or cellulose, indicating nitrogen rather than energy limitation. Since the stimulation of SOM decomposition in cryoturbated material was not connected to microbial growth or to a change in microbial community composition, the additional nitrogen was likely invested in the production of extracellular enzymes required for SOM decomposition. Our findings provide a first mechanistic understanding of priming in permafrost soils and suggest that an increase in the availability of organic carbon or nitrogen, e.g., by increased
Foolad, Mahsa; Ong, Say Leong; Hu, Jiangyong
2015-11-01
Pharmaceutical and personal care products (PPCPs) and artificial sweeteners (ASs) are emerging organic contaminants (EOCs) in the aquatic environment. The presence of PPCPs and ASs in water bodies poses a potential ecological risk and health concern. It is therefore necessary to detect pollution sources by understanding the transport behavior of sewage molecular markers in the subsurface. The aim of this study was to evaluate the transport of nine selected molecular markers through saturated soil column experiments. The selected sewage molecular markers were six PPCPs, including acetaminophen (ACT), carbamazepine (CBZ), caffeine (CF), crotamiton (CTMT), diethyltoluamide (DEET), and salicylic acid (SA), and three ASs, including acesulfame (ACF), cyclamate (CYC), and saccharin (SAC). Results confirmed that ACF, CBZ, CTMT, CYC and SAC were suitable for use as sewage molecular markers, since they were almost stable against sorption and biodegradation during the soil column experiments. In contrast, transport of ACT, CF and DEET was limited by both sorption and biodegradation, and 100% removal efficiency was achieved in the biotic column. Moreover, this study also examined the effect of different acetate concentrations (0-100 mg/L), acetate serving as an easily biodegradable primary substrate, on the removal of PPCPs and ASs. Results showed a negative correlation (r(2)>0.75) between the removal of several of the selected sewage chemical markers (ACF, CF, ACT, CYC, SAC) and acetate concentration. CTMT removal also decreased with the addition of acetate, but increasing the acetate concentration further did not affect its removal. CBZ and DEET removal did not depend on the presence of acetate. PMID:26210019
Fundamentals of interpretation in echocardiography
Harrigan, P.; Lee, R.M.
1985-01-01
This illustrated book provides familiarity with the many clinical, physical, and electronic factors that bear on echocardiographic interpretation. Physical and clinical principles are integrated with considerations of anatomy and physiology to address interpretive problems. This approach yields, for example, sections on the physics and electronics of M-mode, cross sectional, and Doppler systems which are informal, full of echocardiograms, virtually devoid of mathematics, and rigorously related to common issues faced by echocardiograph interpreters.
NASA Technical Reports Server (NTRS)
2004-01-01
[figure removed for brevity, see original site]
Released July 22, 2004. The atmosphere of Mars is a dynamic system. Water-ice clouds, fog, and hazes can make imaging the surface from space difficult. Dust storms can grow from local disturbances to global size, through which imaging is impossible. Seasonal temperature changes are the usual drivers in cloud and dust storm development and growth.
Eons of atmospheric dust storm activity have left their mark on the surface of Mars. Dust carried aloft by the wind has settled out on every available surface; sand dunes have been created and moved by centuries of wind; and the effect of continual sand-blasting has modified many regions of Mars, creating yardangs and other unusual surface forms.
It is often difficult to determine whether wind-eroded surfaces represent the youngest activity in a region. Wind-eroded landforms can be covered by later materials and then exhumed long after they were initially formed. This image illustrates how difficult it can be to interpret the surface of Mars.
Image information: VIS instrument. Latitude -6.7, Longitude 174.7 East (185.3 West). 19 meter/pixel resolution.
Note: this THEMIS visual image has not been radiometrically nor geometrically calibrated for this preliminary release. An empirical correction has been performed to remove instrumental effects. A linear shift has been applied in the cross-track and down-track direction to approximate spacecraft and planetary motion. Fully calibrated and geometrically projected images will be released through the Planetary Data System in accordance with Project policies at a later time.
NASA's Jet Propulsion Laboratory manages the 2001 Mars Odyssey mission for NASA's Office of Space Science, Washington, D.C. The Thermal Emission Imaging System (THEMIS) was developed by Arizona State University, Tempe, in collaboration with Raytheon Santa Barbara Remote Sensing. The THEMIS investigation is led by Dr. Philip Christensen at
EXPERIMENTAL DESIGN: STATISTICAL CONSIDERATIONS AND ANALYSIS
Technology Transfer Automated Retrieval System (TEKTRAN)
This book chapter presents information on how field experiments in invertebrate pathology are designed and how the resulting data are collected, analyzed, and interpreted. The practical and statistical issues that need to be considered and the rationale and assumptions behind different designs or procedures ...
Marano, Grazia; Gronewold, Claas; Frank, Martin; Merling, Anette; Kliem, Christian; Sauer, Sandra; Wiessler, Manfred; Frei, Eva
2012-01-01
Oligosaccharides aberrantly expressed on tumor cells influence processes such as cell adhesion and modulation of the cell’s microenvironment resulting in an increased malignancy. Schmidt’s imidate strategy offers an effective method to synthesize libraries of various oligosaccharide mimetics. With the aim to perturb interactions of tumor cells with extracellular matrix proteins and host cells, molecules with 3,4-bis(hydroxymethyl)furan as core structure were synthesized and screened in biological assays for their abilities to interfere in cell adhesion and other steps of the metastatic cascade, such as tumor-induced angiogenesis. The most active compound, (4-{[(β-D-galactopyranosyl)oxy]methyl}furan-3-yl)methyl hydrogen sulfate (GSF), inhibited the activation of matrix-metalloproteinase-2 (MMP-2) as well as migration of the human melanoma cells of the lines WM-115 and WM-266-4 in a two-dimensional migration assay. GSF inhibited completely the adhesion of WM-115 cells to the extracellular matrix (ECM) proteins, fibrinogen and fibronectin. In an in vitro angiogenesis assay with human endothelial cells, GSF very effectively inhibited endothelial tubule formation and sprouting of blood vessels, as well as the adhesion of endothelial cells to ECM proteins. GSF was not cytotoxic at biologically active concentrations; neither were 3,4-bis{[(β-D-galactopyranosyl)oxy]methyl}furan (BGF) nor methyl β-D-galactopyranoside nor 3,4-bis(hydroxymethyl)furan, which were used as controls, eliciting comparable biological activity. In silico modeling experiments, in which binding of GSF to the extracellular domain of the integrin αvβ3 was determined, revealed specific docking of GSF to the same binding site as the natural peptidic ligands of this integrin. The sulfate in the molecule coordinated with one manganese ion in the binding site. These studies show that this chemically easily accessible molecule GSF, synthesized in three steps from 3,4-bis
Statistical Reference Datasets
National Institute of Standards and Technology Data Gateway
Statistical Reference Datasets (Web, free access). The Statistical Reference Datasets project is also supported by the Standard Reference Data Program. The purpose of this project is to improve the accuracy of statistical software by providing reference datasets with certified computational results that enable the objective evaluation of statistical software.
Statistical templates for visual search.
Ackermann, John F; Landy, Michael S
2014-01-01
How do we find a target embedded in a scene? Within the framework of signal detection theory, this task is carried out by comparing each region of the scene with a "template," i.e., an internal representation of the search target. Here we ask what form this representation takes when the search target is a complex image with uncertain orientation. We examine three possible representations. The first is the matched filter. Such a representation cannot account for the ease with which humans can find a complex search target that is rotated relative to the template. A second representation attempts to deal with this by estimating the relative orientation of target and match and rotating the intensity-based template. No intensity-based template, however, can account for the ability to easily locate targets that are defined categorically and not in terms of a specific arrangement of pixels. Thus, we define a third template that represents the target in terms of image statistics rather than pixel intensities. Subjects performed a two-alternative, forced-choice search task in which they had to localize an image that matched a previously viewed target. Target images were texture patches. In one condition, match images were the same image as the target and distractors were a different image of the same textured material. In the second condition, the match image was of the same texture as the target (but different pixels) and the distractor was an image of a different texture. Match and distractor stimuli were randomly rotated relative to the target. We compared human performance to pixel-based, pixel-based with rotation, and statistic-based search models. The statistic-based search model was most successful at matching human performance. We conclude that humans use summary statistics to search for complex visual targets. PMID:24627458
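The contrast between a pixel-based and a statistic-based template can be sketched with a toy example (not the authors' stimuli or models; the "images" are tiny 2-D lists and the summary statistics are simply the intensity mean and population variance): rotating the target leaves its summary statistics unchanged but destroys pixel-wise correspondence, so a matched filter scores the rotated match poorly while a statistic-based template scores it perfectly.

```python
from statistics import mean, pvariance

def rotate90(img):
    """Rotate a 2-D list clockwise by 90 degrees."""
    return [list(row) for row in zip(*img[::-1])]

def pixel_distance(a, b):
    """Sum of squared pixel differences (matched-filter-style score)."""
    return sum((pa - pb) ** 2
               for ra, rb in zip(a, b) for pa, pb in zip(ra, rb))

def stat_distance(a, b):
    """Distance between summary statistics (mean and variance)."""
    fa = [p for row in a for p in row]
    fb = [p for row in b for p in row]
    return abs(mean(fa) - mean(fb)) + abs(pvariance(fa) - pvariance(fb))

target = [[1, 2], [3, 9]]
match = rotate90(target)  # same "texture", different orientation

print(pixel_distance(target, match))  # large: pixel template fails
print(stat_distance(target, match))   # zero: statistics are invariant
```

A richer statistic set (e.g., histogram or spectral statistics) plays the same role in the paper's statistic-based search model.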
Interpreting Recoil for Undergraduate Students
ERIC Educational Resources Information Center
Elsayed, Tarek A.
2012-01-01
The phenomenon of recoil is usually explained to students in the context of Newton's third law. Typically, when a projectile is fired, the recoil of the launch mechanism is interpreted as a reaction to the ejection of the smaller projectile. The same phenomenon is also interpreted in the context of the conservation of linear momentum, which is…
Interpretation Tasks for Grammar Teaching.
ERIC Educational Resources Information Center
Ellis, Rod
1995-01-01
The traditional approach to grammar teaching provides learners with opportunities to produce specific grammatical structures. This article explores an alternative approach, one based on interpreting input. The rationale for the approach is discussed, as are the principles for designing interpretation tasks for grammar teaching. (Contains 35…
Remote sensing and image interpretation
NASA Technical Reports Server (NTRS)
Lillesand, T. M.; Kiefer, R. W. (Principal Investigator)
1979-01-01
A textbook prepared primarily for use in introductory courses in remote sensing is presented. Topics covered include concepts and foundations of remote sensing; elements of photographic systems; introduction to airphoto interpretation; airphoto interpretation for terrain evaluation; photogrammetry; radiometric characteristics of aerial photographs; aerial thermography; multispectral scanning and spectral pattern recognition; microwave sensing; and remote sensing from space.
Educators' Interpretations of Ambiguous Accommodations
ERIC Educational Resources Information Center
Byrnes, MaryAnn
2008-01-01
This exploratory case study examined how general and special education teachers in one school district interpreted three frequently used accommodations. Although a majority of both groups agreed on interpretations of extended time, there was little agreement, considerable variation, and some contradiction in their understanding of the changes…
Oral Interpretation as Performing Performance.
ERIC Educational Resources Information Center
Peterson, Eric E.
A three-step process of description, reduction, and interpretation is employed in this paper to disentangle the complex of relationships involved in oral interpretation. In the description, contributions from various disciplines are synthesized; among the topics discussed are the communication process model usually employed in descriptions of…
Kutchinsky, B
1991-01-01
from extreme scarcity to relative abundance. If (violent) pornography causes rape, this exceptional development in the availability of (violent) pornography should definitely somehow influence the rape statistics. Since, however, the rape figures could not simply be expected to remain steady during the period in question (when it is well known that most other crimes increased considerably), the development of rape rates was compared with that of non-sexual violent offences and nonviolent sexual offences (in so far as available statistics permitted). The results showed that in none of the countries did rape increase more than nonsexual violent crimes. This finding in itself would seem sufficient to discard the hypothesis that pornography causes rape.(ABSTRACT TRUNCATED AT 400 WORDS) PMID:2032762
Analysis and Interpretation of Findings Using Multiple Regression Techniques
ERIC Educational Resources Information Center
Hoyt, William T.; Leierer, Stephen; Millington, Michael J.
2006-01-01
Multiple regression and correlation (MRC) methods form a flexible family of statistical techniques that can address a wide variety of different types of research questions of interest to rehabilitation professionals. In this article, we review basic concepts and terms, with an emphasis on interpretation of findings relevant to research questions…
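A minimal sketch of fitting and interpreting a multiple regression follows (synthetic data; variable names are invented for illustration and NumPy is assumed to be available): each estimated coefficient is read as the change in the outcome per unit change in its predictor, holding the other predictors constant.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
hours_of_therapy = rng.uniform(0, 10, n)
baseline_score = rng.uniform(20, 80, n)
# Synthetic outcome: 2 points per therapy hour + 0.5 * baseline + noise
outcome = 2.0 * hours_of_therapy + 0.5 * baseline_score + rng.normal(0, 1, n)

# Design matrix with an intercept column; ordinary least squares fit
X = np.column_stack([np.ones(n), hours_of_therapy, baseline_score])
coef, *_ = np.linalg.lstsq(X, outcome, rcond=None)
intercept, b_therapy, b_baseline = coef
print(f"therapy: {b_therapy:.2f}, baseline: {b_baseline:.2f}")
```

Because the data were generated with known coefficients, the fit recovers values near 2.0 and 0.5, which is exactly the "holding other variables constant" interpretation the article discusses.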
Philosophical perspectives on quantum chaos: Models and interpretations
NASA Astrophysics Data System (ADS)
Bokulich, Alisa Nicole
2001-09-01
The problem of quantum chaos is a special case of the larger problem of understanding how the classical world emerges from quantum mechanics. While we have learned that chaos is pervasive in classical systems, it appears to be almost entirely absent in quantum systems. The aim of this dissertation is to determine what implications the interpretation of quantum mechanics has for attempts to explain the emergence of classical chaos. There are three interpretations of quantum mechanics that have set out programs for solving the problem of quantum chaos: the standard interpretation, the statistical interpretation, and the deBroglie-Bohm causal interpretation. One of the main conclusions of this dissertation is that an interpretation alone is insufficient for solving the problem of quantum chaos and that the phenomenon of decoherence must be taken into account. Although a completely satisfactory solution of the problem of quantum chaos is still outstanding, I argue that the deBroglie-Bohm interpretation with the help of decoherence outlines the most promising research program to pursue. In addition to making a contribution to the debate in the philosophy of physics concerning the interpretation of quantum mechanics, this dissertation reveals two important methodological lessons for the philosophy of science. First, issues of reductionism and intertheoretic relations cannot be divorced from questions concerning the interpretation of the theories involved. Not only is the exploration of intertheoretic relations a central part of the articulation and interpretation of an individual theory, but the very terms used to discuss intertheoretic relations, such as `state' and `classical limit', are themselves defined by particular interpretations of the theory. The second lesson that emerges is that, when it comes to characterizing the relationship between classical chaos and quantum mechanics, the traditional approaches to intertheoretic relations, namely reductionism and
Dertinger, Stephen D.; Avlasevich, Svetlana L.; Bemis, Jeffrey C.; Chen, Yuhchyau; MacGregor, James T.
2015-01-01
This laboratory has previously described a method for scoring the incidence of rodent blood Pig-a mutant phenotype erythrocytes using immunomagnetic separation in conjunction with flow cytometric analysis (In Vivo MutaFlow®). The current work extends this approach to human blood. The frequencies of CD59- and CD55-negative reticulocytes (RETCD59−/CD55−) and erythrocytes (RBCCD59−/CD55−) serve as phenotypic reporters of PIG-A gene mutation. Immunomagnetic separation was found to provide an effective means of increasing the number of reticulocytes and erythrocytes evaluated. Technical replicates were utilized to provide a sufficient number of cells for precise scoring while at the same time controlling for procedural accuracy by allowing comparison of replicate values. Cold whole blood samples could be held for at least one week without affecting reticulocyte, RETCD59−/CD55− or RBCCD59−/CD55− frequencies. Specimens from a total of 52 nonsmoking, self-reported healthy adult subjects were evaluated. The mean frequencies of RETCD59−/CD55− and RBCCD59−/CD55− were 6.0 × 10−6 and 2.9 × 10−6, respectively. The difference is consistent with a modest selective pressure against mutant phenotype erythrocytes in the circulation, and suggests advantages of studying both populations of erythrocytes. Whereas intra-subject variability was low, inter-subject variability was relatively high, with RETCD59−/CD55− frequencies differing by more than 30-fold. There was an apparent correlation between age and mutant cell frequencies. Taken together, the results indicate that the frequency of human PIG-A mutant phenotype cells can be efficiently and reliably estimated using a labeling and analysis protocol that is well established for rodent-based studies. The applicability of the assay across species, its simplicity and statistical power, and the relatively non-invasive nature of the assay should benefit myriad research areas involving DNA damage
ABSTRACT: Total Petroleum hydrocarbons (TPH) as a lumped parameter can be easily and rapidly measured or monitored. Despite interpretational problems, it has become an accepted regulatory benchmark used widely to evaluate the extent of petroleum product contamination. Three cu...
ENVIRONMENTAL PHOTOGRAPHIC INTERPRETATION CENTER (EPIC)
The Environmental Sciences Division (ESD) in the National Exposure Research Laboratory (NERL) of the Office of Research and Development provides remote sensing technical support including aerial photograph acquisition and interpretation to the EPA Program Offices, ORD Laboratorie...
Interpreting Results from Multiscore Batteries.
ERIC Educational Resources Information Center
Anastasi, Anne
1985-01-01
Describes the role of information on score reliabilities, significance of score differences, intercorrelations of scores, and differential validity of score patterns on the interpretation of results from multiscore batteries. (Author)
Mathematical and statistical analysis
NASA Technical Reports Server (NTRS)
Houston, A. Glen
1988-01-01
The goal of the mathematical and statistical analysis component of RICIS is to research, develop, and evaluate mathematical and statistical techniques for aerospace technology applications. Specific research areas of interest include modeling, simulation, experiment design, reliability assessment, and numerical analysis.
Environmental statistics and optimal regulation
NASA Astrophysics Data System (ADS)
Sivak, David; Thomson, Matt
2015-03-01
The precision with which an organism can detect its environment, and the timescale for and statistics of environmental change, will affect the suitability of different strategies for regulating protein levels in response to environmental inputs. We propose a general framework--here applied to the enzymatic regulation of metabolism in response to changing nutrient concentrations--to predict the optimal regulatory strategy given the statistics of fluctuations in the environment and measurement apparatus, and the costs associated with enzyme production. We find: (i) relative convexity of enzyme expression cost and benefit influences the fitness of thresholding or graded responses; (ii) intermediate levels of measurement uncertainty call for a sophisticated Bayesian decision rule; and (iii) in dynamic contexts, intermediate levels of uncertainty call for retaining memory of the past. Statistical properties of the environment, such as variability and correlation times, set optimal biochemical parameters, such as thresholds and decay rates in signaling pathways. Our framework provides a theoretical basis for interpreting molecular signal processing algorithms and a classification scheme that organizes known regulatory strategies and may help conceptualize heretofore unknown ones.
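The Bayesian decision rule described in this record can be illustrated with a minimal sketch. Everything here is an invented toy model, not the authors' framework: two Gaussian nutrient states ("low"/"high"), a noisy scalar measurement, and illustrative parameter values; the rule expresses the enzyme only when the posterior expected benefit exceeds the production cost.

```python
import math

def posterior_high(signal, mu_low=1.0, mu_high=2.0, sigma=0.5, prior_high=0.5):
    """Posterior probability that the environment is in the 'high' nutrient
    state, given one noisy Gaussian measurement (illustrative parameters)."""
    def gauss(x, mu):
        # Unnormalized Gaussian likelihood; the constant cancels in the ratio.
        return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2))
    num = prior_high * gauss(signal, mu_high)
    den = num + (1 - prior_high) * gauss(signal, mu_low)
    return num / den

def express_enzyme(signal, benefit=1.0, cost=0.4):
    """Bayesian decision rule: express only if expected benefit exceeds cost."""
    return posterior_high(signal) * benefit > cost
```

A thresholding response corresponds to the limit of a very confident measurement (small sigma); at intermediate measurement uncertainty the posterior is graded, which is the regime where the abstract notes a sophisticated decision rule pays off.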
Statistical analysis of planetary surfaces
NASA Astrophysics Data System (ADS)
Schmidt, Frederic; Landais, Francois; Lovejoy, Shaun
2015-04-01
In recent decades, a huge amount of topographic data has been obtained by several techniques (laser and radar altimetry, DTM…) for different bodies in the solar system, including Earth, Mars, the Moon, etc. In each case, topographic fields exhibit an extremely high variability with details at each scale, from millimetres to thousands of kilometres. This complexity seems to prohibit global descriptions or global topography models. Nevertheless, this topographic complexity is well known to exhibit scaling laws that establish a similarity between scales and permit simpler descriptions and models. Indeed, efficient simulations can be made using the statistical properties of scaling fields (fractals). But realistic simulations of global topographic fields must exhibit multi- (not mono-) scaling behaviour, reflecting the extreme variability and intermittency observed in real fields that cannot be generated by simple scaling models. A multiscaling theory has been developed in order to model high variability and intermittency. This theory is a good statistical candidate to model the topography field with a limited number of parameters (called the multifractal parameters). In our study, we show that the statistical properties of the Martian topography are accurately reproduced by this model, leading to new interpretations of geomorphological processes.
Students' Interpretation of a Function Associated with a Real-Life Problem from Its Graph
ERIC Educational Resources Information Center
Mahir, Nevin
2010-01-01
The properties of a function such as limit, continuity, derivative, growth, or concavity can be determined more easily from its graph than by doing any algebraic operation. For this reason, it is important for students of mathematics to interpret some of the properties of a function from its graph. In this study, we investigated the competence of…
Interpreter services in emergency medicine.
Chan, Yu-Feng; Alagappan, Kumar; Rella, Joseph; Bentley, Suzanne; Soto-Greene, Marie; Martin, Marcus
2010-02-01
Emergency physicians are routinely confronted with problems associated with language barriers. It is important for emergency health care providers and the health system to strive for cultural competency when communicating with members of an increasingly diverse society. Possible solutions that can be implemented include appropriate staffing, use of new technology, and efforts to develop new kinds of ties to the community served. Linguistically specific solutions include professional interpretation, telephone interpretation, the use of multilingual staff members, the use of ad hoc interpreters, and, more recently, the use of mobile computer technology at the bedside. Each of these methods carries a specific set of advantages and disadvantages. Although professionally trained medical interpreters offer improved communication, improved patient satisfaction, and overall cost savings, they are often underutilized due to their perceived inefficiency and the inconclusive results of their effect on patient care outcomes. Ultimately, the best solution for each emergency department will vary depending on the population served and available resources. Access to the multiple interpretation options outlined above and solid support and commitment from hospital institutions are necessary to provide proper and culturally competent care for patients. Appropriate communications inclusive of interpreter services are essential for culturally and linguistically competent provider/health systems and overall improved patient care and satisfaction. PMID:18571358
Minnesota Health Statistics 1988.
ERIC Educational Resources Information Center
Minnesota State Dept. of Health, St. Paul.
This document comprises the 1988 annual statistical report of the Minnesota Center for Health Statistics. After introductory technical notes on changes in format, sources of data, and geographic allocation of vital events, an overview is provided of vital health statistics in all areas. Thereafter, separate sections of the report provide tables…
ERIC Educational Resources Information Center
Lenard, Christopher; McCarthy, Sally; Mills, Terence
2014-01-01
There are many different aspects of statistics. Statistics involves mathematics, computing, and applications to almost every field of endeavour. Each aspect provides an opportunity to spark someone's interest in the subject. In this paper we discuss some ethical aspects of statistics, and describe how an introduction to ethics has been…
ERIC Educational Resources Information Center
Strasser, Nora
2007-01-01
Avoiding statistical mistakes is important for educators at all levels. Basic concepts will help you to avoid making mistakes using statistics and to look at data with a critical eye. Statistical data is used at educational institutions for many purposes. It can be used to support budget requests, changes in educational philosophy, changes to…
Statistical quality management
NASA Astrophysics Data System (ADS)
Vanderlaan, Paul
1992-10-01
Some aspects of statistical quality management are discussed. Quality has to be defined as a concrete, measurable quantity. The concepts of Total Quality Management (TQM), Statistical Process Control (SPC), and inspection are explained. In most cases SPC is better than inspection. It can be concluded that statistics has great possibilities in the field of TQM.
What Is the next Trend in Usage Statistics in Libraries?
ERIC Educational Resources Information Center
King, Douglas
2009-01-01
In answering the question "What is the next trend in usage statistics in libraries?" an eclectic group of respondents has presented an assortment of possibilities, suggestions, complaints and, of course, questions of their own. Undoubtedly, usage statistics collection, interpretation, and application are areas of growth and increasing complexity…
Faculty Salary Equity Cases: Combining Statistics with the Law
ERIC Educational Resources Information Center
Luna, Andrew L.
2006-01-01
Researchers have used many statistical models to determine whether an institution's faculty pay structure is equitable, with varying degrees of success. Little attention, however, has been given to court interpretations of statistical significance or to what variables courts have acknowledged should be used in an equity model. This article…
Chi-Square Statistics, Tests of Hypothesis and Technology.
ERIC Educational Resources Information Center
Rochowicz, John A.
The use of technology such as computers and programmable calculators enables students to find p-values and conduct tests of hypotheses in many different ways. Comprehension and interpretation of a research problem become the focus for statistical analysis. This paper describes how to calculate chisquare statistics and p-values for statistical…
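The chi-square computation this record refers to is easy to reproduce without specialized software. The sketch below is illustrative only (the 2×2 table values are invented, and the closed-form p-value via `erfc` is valid only for one degree of freedom):

```python
import math

def chi_square_2x2(table):
    """Pearson chi-square statistic and p-value for a 2x2 contingency table."""
    (a, b), (c, d) = table
    n = a + b + c + d
    row1, row2 = a + b, c + d
    col1, col2 = a + c, b + d
    expected = [[row1 * col1 / n, row1 * col2 / n],
                [row2 * col1 / n, row2 * col2 / n]]
    chi2 = sum((obs - exp) ** 2 / exp
               for row_o, row_e in zip(table, expected)
               for obs, exp in zip(row_o, row_e))
    # For df = 1, the upper-tail p-value has a closed form: P(X > x) = erfc(sqrt(x/2)).
    p = math.erfc(math.sqrt(chi2 / 2))
    return chi2, p

chi2, p = chi_square_2x2([[30, 10], [20, 40]])
```

For larger tables one would compute the p-value from the chi-square survival function for the appropriate degrees of freedom (e.g. via a statistics package) rather than the df = 1 closed form used here.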
ALISE Library and Information Science Education Statistical Report, 1999.
ERIC Educational Resources Information Center
Daniel, Evelyn H., Ed.; Saye, Jerry D., Ed.
This volume is the twentieth annual statistical report on library and information science (LIS) education published by the Association for Library and Information Science Education (ALISE). Its purpose is to compile, analyze, interpret, and report statistical (and other descriptive) information about library/information science programs offered by…
The Effect Size Statistic: Overview of Various Choices.
ERIC Educational Resources Information Center
Mahadevan, Lakshmi
Over the years, methodologists have been recommending that researchers use magnitude of effect estimates in result interpretation to highlight the distinction between statistical and practical significance (cf. R. Kirk, 1996). A magnitude of effect statistic (i.e., effect size) tells to what degree the dependent variable can be controlled,…
Securing wide appreciation of health statistics
Pyrrait, A. M. DO Amaral; Aubenque, M. J.; Benjamin, B.; DE Groot, Meindert J. W.; Kohn, R.
1954-01-01
All the authors are agreed on the need for a certain publicizing of health statistics, but do Amaral Pyrrait points out that the medical profession prefers to convince itself rather than to be convinced. While there is great utility in articles and reviews in the professional press (especially for paramedical personnel) Aubenque, de Groot, and Kohn show how appreciation can effectively be secured by making statistics more easily understandable to the non-expert by, for instance, including readable commentaries in official publications, simplifying charts and tables, and preparing simple manuals on statistical methods. Aubenque and Kohn also stress the importance of linking health statistics to other economic and social information. Benjamin suggests that the principles of market research could to advantage be applied to health statistics to determine the precise needs of the “consumers”. At the same time, Aubenque points out that the value of the ultimate results must be clear to those who provide the data; for this, Kohn suggests that the enumerators must know exactly what is wanted and why. There is general agreement that some explanation of statistical methods and their uses should be given in the curricula of medical schools and that lectures and postgraduate courses should be arranged for practising physicians. PMID:13199668
Explorations in statistics: statistical facets of reproducibility.
Curran-Everett, Douglas
2016-06-01
Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This eleventh installment of Explorations in Statistics explores statistical facets of reproducibility. If we obtain an experimental result that is scientifically meaningful and statistically unusual, we would like to know that our result reflects a general biological phenomenon that another researcher could reproduce if (s)he repeated our experiment. But more often than not, we may learn this researcher cannot replicate our result. The National Institutes of Health and the Federation of American Societies for Experimental Biology have created training modules and outlined strategies to help improve the reproducibility of research. These particular approaches are necessary, but they are not sufficient. The principles of hypothesis testing and estimation are inherent to the notion of reproducibility in science. If we want to improve the reproducibility of our research, then we need to rethink how we apply fundamental concepts of statistics to our science. PMID:27231259
An Experimental Approach to Teaching and Learning Elementary Statistical Mechanics
ERIC Educational Resources Information Center
Ellis, Frank B.; Ellis, David C.
2008-01-01
Introductory statistical mechanics is studied for a simple two-state system using an inexpensive and easily built apparatus. A large variety of demonstrations, suitable for students in high school and introductory university chemistry courses, are possible. This article details demonstrations for exothermic and endothermic reactions, the dynamic…
Advice on statistical analysis for Circulation Research.
Kusuoka, Hideo; Hoffman, Julien I E
2002-10-18
Since the late 1970s when many journals published articles warning about the misuse of statistical methods in the analysis of data, researchers have become more careful about statistical analysis, but errors including low statistical power and inadequate analysis of repeated-measurement studies are still prevalent. In this review, several statistical methods are introduced that are not always familiar to basic and clinical cardiologists but may be useful for revealing the correct answer from the data. The aim of this review is not only to draw the attention of investigators to these tests but also to stress the conditions in which they are applicable. These methods are now generally available in statistical program packages. Researchers need not know how to calculate the statistics from the data but are required to select the correct method from the menu and interpret the statistical results accurately. With the choice of appropriate statistical programs, the issue is no longer how to do the test but when to do it. PMID:12386142
ERIC Educational Resources Information Center
Sotos, Ana Elisa Castro; Vanhoof, Stijn; Van den Noortgate, Wim; Onghena, Patrick
2007-01-01
A solid understanding of "inferential statistics" is of major importance for designing and interpreting empirical results in any scientific discipline. However, students are prone to many misconceptions regarding this topic. This article structurally summarizes and describes these misconceptions by presenting a systematic review of publications…
Intelligent Collection Environment for an Interpretation System
Maurer, W J
2001-07-19
An Intelligent Collection Environment for a data interpretation system is described. The environment accepts two inputs: a data model and a number between 0.0 and 1.0. The data model is as simple as a single word or as complex as a multi-level/multidimensional model. The number between 0.0 and 1.0 is a control knob to indicate the user's desire to allow loose matching of the data (things are ambiguous and unknown) versus strict matching of the data (things are precise and known). The environment produces a set of possible interpretations, a set of requirements to further strengthen or to differentiate a particular subset of the possible interpretations from the others, a set of inconsistencies, and a logic map that graphically shows the lines of reasoning used to derive the above output. The environment is comprised of a knowledge editor, model explorer, expertise server, and the World Wide Web. The Knowledge Editor is used by a subject matter expert to define Linguistic Types, Term Sets, detailed explanations, and dynamically created URIs, and to create rule bases using a straightforward hyper-matrix representation. The Model Explorer allows rapid construction and browsing of multi-level models. A multi-level model is a model whose elements may also be models themselves. The Expertise Server is an inference engine used to interpret the data submitted. It incorporates a semantic network knowledge representation, an assumption-based truth maintenance system, and a fuzzy logic calculus. It can be extended by employing any classifier (e.g. statistical/neural networks) of complex data types. The World Wide Web is an unstructured data space accessed by the URIs supplied as part of the output of the environment. By recognizing the input data model as a query, the environment serves as a deductive search engine. Applications include (but are not limited to) interpretation of geophysical phenomena, a navigation aid for very large web sites, monitoring of computer or
Interpretational Confounding or Confounded Interpretations of Causal Indicators?
Bainter, Sierra A.; Bollen, Kenneth A.
2014-01-01
In measurement theory causal indicators are controversial and little-understood. Methodological disagreement concerning causal indicators has centered on the question of whether causal indicators are inherently sensitive to interpretational confounding, which occurs when the empirical meaning of a latent construct departs from the meaning intended by a researcher. This article questions the validity of evidence used to claim that causal indicators are inherently susceptible to interpretational confounding. Further, a simulation study demonstrates that causal indicator coefficients are stable across correctly-specified models. Determining the suitability of causal indicators has implications for the way we conceptualize measurement and build and evaluate measurement models. PMID:25530730
Hébert-Dufresne, Laurent; Grochow, Joshua A.; Allard, Antoine
2016-01-01
We introduce a network statistic that measures structural properties at the micro-, meso-, and macroscopic scales, while still being easy to compute and interpretable at a glance. Our statistic, the onion spectrum, is based on the onion decomposition, which refines the k-core decomposition, a standard network fingerprinting method. The onion spectrum is exactly as easy to compute as the k-cores: It is based on the stages at which each vertex gets removed from a graph in the standard algorithm for computing the k-cores. Yet, the onion spectrum reveals much more information about a network, and at multiple scales; for example, it can be used to quantify node heterogeneity, degree correlations, centrality, and tree- or lattice-likeness. Furthermore, unlike the k-core decomposition, the combined degree-onion spectrum immediately gives a clear local picture of the network around each node which allows the detection of interesting subgraphs whose topological structure differs from the global network organization. This local description can also be leveraged to easily generate samples from the ensemble of networks with a given joint degree-onion distribution. We demonstrate the utility of the onion spectrum for understanding both static and dynamic properties on several standard graph models and on many real-world networks. PMID:27535466
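The onion decomposition described above refines the standard k-core peeling: nodes removed in the same pass of the k-core algorithm share an onion layer. A compact sketch follows; the adjacency-dict representation and function name are my own, not from the paper:

```python
def onion_decomposition(adj):
    """Peel a graph into k-cores and onion layers.

    adj: dict mapping node -> set of neighbours (undirected graph).
    Returns (coreness, layer) dicts. The layer records the peeling stage
    at which each node is removed, refining its k-core (coreness) value.
    """
    degree = {v: len(nbrs) for v, nbrs in adj.items()}
    coreness, layer = {}, {}
    k, stage = 0, 0
    remaining = set(adj)
    while remaining:
        k = max(k, min(degree[v] for v in remaining))
        # Repeatedly strip all nodes of degree <= k: each pass is one layer.
        while remaining and min(degree[v] for v in remaining) <= k:
            stage += 1
            shell = [v for v in remaining if degree[v] <= k]
            for v in shell:
                coreness[v], layer[v] = k, stage
                remaining.discard(v)
                for u in adj[v]:
                    if u in remaining:
                        degree[u] -= 1
    return coreness, layer
```

As the abstract notes, the layer assignments come for free from the stages of the usual k-core algorithm; the histogram of layer values per core is the onion spectrum.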
Facade Interpretation Using a Marked Point Process
NASA Astrophysics Data System (ADS)
Wenzel, Susanne; Förstner, Wolfgang
2016-06-01
Our objective is the interpretation of facade images in a top-down manner, using a Markov marked point process formulated as a Gibbs process. Given single rectified facade images, we aim at the accurate detection of relevant facade objects such as windows and entrances, using prior knowledge about their possible configurations within facade images. We represent facade objects by a simplified rectangular object model and present an energy model, which evaluates the agreement of a proposed configuration with the given image and the statistics about typical configurations, which we learned from training data. We show promising results on different datasets and provide a qualitative evaluation, which demonstrates the capability of complete and accurate detection of facade objects.
Water isotope systematics: Improving our palaeoclimate interpretations
NASA Astrophysics Data System (ADS)
Jones, M. D.; Dee, S.; Anderson, L.; Baker, A.; Bowen, G.; Noone, D. C.
2016-01-01
The stable isotopes of oxygen and hydrogen, measured in a variety of archives, are widely used proxies in Quaternary Science. Understanding the processes that control δ18O change has long been a focus of research (e.g. Shackleton and Opdyke, 1973; Talbot, 1990; Leng, 2006). Both the dynamics of water isotope cycling and the appropriate interpretation of geological water-isotope proxy time series remain subjects of active research and debate. It is clear that achieving a complete understanding of the isotope systematics for any given archive type, and ideally each individual archive, is vital if these palaeo-data are to be used to their full potential, including comparison with climate model experiments of the past. Combining information from modern monitoring and process studies, climate models, and proxy data is crucial for improving our statistical constraints on reconstructions of past climate variability.
Cerebral lateralization in simultaneous interpretation.
Fabbro, F; Gran, L; Basso, G; Bava, A
1990-07-01
Cerebral asymmetries for L1 (Italian), L2 (English), and L3 (French, German, Spanish, or Russian) were studied, by using a verbal-manual interference paradigm, in a group of Italian right-handed polyglot female students at the Scuola Superiore di Lingue Moderne per Interpreti e Traduttori (SSLM-School for Interpreters and Translators) of the University of Trieste and in a control group of right-handed monolingual female students at the Medical School of the University of Trieste. In an automatic speech production task no significant cerebral lateralization was found for the mother tongue (L1) either in the interpreting students or in the control group; the interpreting students were not significantly lateralized for the third language (L3), while weak left hemispheric lateralization was shown for L2. A significantly higher degree of verbal-manual interference was found for L1 than for L2 and L3. A significantly higher disruption rate occurred in the meaning-based mode of simultaneous interpretation (from L2 into L1 and vice versa) than in the word-for-word mode (from L2 into L1 and vice versa). No significant overall or hemispheric differences were found during simultaneous interpretation from L1 into L2 or from L2 into L1. PMID:2207622
A Local Interpretation of Quantum Mechanics
NASA Astrophysics Data System (ADS)
Lopez, Carlos
2016-04-01
A local interpretation of quantum mechanics is presented. Its main ingredients are: first, a label attached to one of the "virtual" paths in the path integral formalism, determining the output for measurement of position or momentum; second, a mathematical model for spin states, equivalent to the path integral formalism for point particles in spacetime, with the corresponding label. The mathematical machinery of orthodox quantum mechanics is maintained, in particular probability amplitudes and Born's rule; therefore, Bell-type inequality theorems do not apply. It is shown that statistical correlations for pairs of particles with entangled spins have a description completely equivalent to the two-slit experiment, that is, interference (wave-like behaviour) instead of non-locality accounts for the process. The interpretation is grounded in the experimental evidence of a point-like character of electrons, and in the hypothetical existence of a wave-like (de Broglie) companion system. A correspondence between the extended Hilbert spaces of hidden physical states and the orthodox quantum mechanical Hilbert space shows the mathematical equivalence of both theories. Paradoxical behaviour with respect to the action-reaction principle is analysed, and an experimental set-up, a modified two-slit experiment, is proposed to look for the companion system.
Learning Interpretable SVMs for Biological Sequence Classification
Rätsch, Gunnar; Sonnenburg, Sören; Schäfer, Christin
2006-01-01
Background Support Vector Machines (SVMs) – using a variety of string kernels – have been successfully applied to biological sequence classification problems. While SVMs achieve high classification accuracy they lack interpretability. In many applications, it does not suffice that an algorithm just detects a biological signal in the sequence, but it should also provide means to interpret its solution in order to gain biological insight. Results We propose novel and efficient algorithms for solving the so-called Support Vector Multiple Kernel Learning problem. The developed techniques can be used to understand the obtained support vector decision function in order to extract biologically relevant knowledge about the sequence analysis problem at hand. We apply the proposed methods to the task of acceptor splice site prediction and to the problem of recognizing alternatively spliced exons. Our algorithms compute sparse weightings of substring locations, highlighting which parts of the sequence are important for discrimination. Conclusion The proposed method is able to deal with thousands of examples while combining hundreds of kernels within reasonable time, and reliably identifies a few statistically significant positions. PMID:16723012
ERIC Educational Resources Information Center
Singamsetti, Rao
2007-01-01
In this paper an attempt is made to highlight some issues of interpretation of statistical concepts and interpretation of results as taught in undergraduate Business statistics courses. The use of modern technology in the class room is shown to have increased the efficiency and the ease of learning and teaching in statistics. The importance of…
Interpretational Confounding or Confounded Interpretations of Causal Indicators?
ERIC Educational Resources Information Center
Bainter, Sierra A.; Bollen, Kenneth A.
2014-01-01
In measurement theory, causal indicators are controversial and little understood. Methodological disagreement concerning causal indicators has centered on the question of whether causal indicators are inherently sensitive to interpretational confounding, which occurs when the empirical meaning of a latent construct departs from the meaning…
The Interpretive Approach to Religious Education: Challenging Thompson's Interpretation
ERIC Educational Resources Information Center
Jackson, Robert
2012-01-01
In a recent book chapter, Matthew Thompson makes some criticisms of my work, including the interpretive approach to religious education and the research and activity of Warwick Religions and Education Research Unit. Against the background of a discussion of religious education in the public sphere, my response challenges Thompson's account,…
Statistical Performances of Resistive Active Power Splitter
NASA Astrophysics Data System (ADS)
Lalléchère, Sébastien; Ravelo, Blaise; Thakur, Atul
2016-03-01
In this paper, the synthesis and sensitivity analysis of an active power splitter (PWS) is proposed. It is based on an active cell composed of a Field Effect Transistor in cascade with a shunted resistor at the input and the output (resistive amplifier topology). The PWS uncertainty versus resistance tolerances is assessed using a stochastic method. Furthermore, with the proposed topology, we can easily control the device gain by varying a resistance. This provides a useful tool for analysing the statistical sensitivity of the system in an uncertain environment.
Presenting your results-II: Inferential statistics.
Omair, Aamir
2012-11-01
The results are the most significant part of the article since they represent the original work of the authors. It is essential to apply the correct statistical tests. Data should be presented in a comprehensive but simple manner that is easily understandable by the general readership. Care should be taken to keep the tables and graphs simple and uncluttered. Also avoid duplication of the information in the text and tables/graphs. The most important thing to keep in consideration is that it is more relevant to present the main findings rather than all the findings of the study. PMID:23866426
Statistical mechanics of community detection
NASA Astrophysics Data System (ADS)
Reichardt, Jörg; Bornholdt, Stefan
2006-07-01
Starting from a general ansatz, we show how community detection can be interpreted as finding the ground state of an infinite range spin glass. Our approach applies to weighted and directed networks alike. It contains the ad hoc introduced quality function from [J. Reichardt and S. Bornholdt, Phys. Rev. Lett. 93, 218701 (2004)] and the modularity Q as defined by Newman and Girvan [Phys. Rev. E 69, 026113 (2004)] as special cases. The community structure of the network is interpreted as the spin configuration that minimizes the energy of the spin glass with the spin states being the community indices. We elucidate the properties of the ground state configuration to give a concise definition of communities as cohesive subgroups in networks that is adaptive to the specific class of network under study. Further, we show how hierarchies and overlap in the community structure can be detected. Computationally efficient local update rules for optimization procedures to find the ground state are given. We show how the ansatz may be used to discover the community around a given node without detecting all communities in the full network and we give benchmarks for the performance of this extension. Finally, we give expectation values for the modularity of random graphs, which can be used in the assessment of statistical significance of community structure.
PMID:16907154
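The mapping described in this abstract, community indices as spin states and the community structure as the energy-minimizing configuration, can be illustrated with a small sketch. The code below is an assumption-laden toy, not the authors' algorithm: it maximizes the Newman-Girvan modularity Q (the special case the abstract mentions) by greedy single-node moves, one simple kind of local update rule; the example graph and all names are mine.

```python
def modularity(adj, comm):
    """Newman-Girvan modularity Q for an undirected graph.
    adj: dict node -> set of neighbours; comm: dict node -> community label."""
    m2 = sum(len(nbrs) for nbrs in adj.values())  # 2m, twice the edge count
    q = 0.0
    for i in adj:
        for j in adj:
            if comm[i] == comm[j]:
                a = 1.0 if j in adj[i] else 0.0
                q += a - len(adj[i]) * len(adj[j]) / m2
    return q / m2

def greedy_communities(adj):
    """Greedy single-node moves: each node adopts the neighbouring community
    that raises Q, until no move improves Q (a local optimum, i.e. a
    metastable spin configuration)."""
    comm = {v: v for v in adj}  # start with every node in its own community
    improved = True
    while improved:
        improved = False
        for v in adj:
            best_q, best_c = modularity(adj, comm), comm[v]
            for c in {comm[u] for u in adj[v]}:
                old = comm[v]
                comm[v] = c
                if modularity(adj, comm) > best_q + 1e-12:
                    best_q, best_c = modularity(adj, comm), c
                comm[v] = old
            if best_c != comm[v]:
                comm[v] = best_c
                improved = True
    return comm

# Two triangles joined by a single edge: the "ground state" should
# assign one community per triangle.
edges = [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]
adj = {v: set() for v in range(6)}
for a, b in edges:
    adj[a].add(b)
    adj[b].add(a)
labels = greedy_communities(adj)
```

On this toy graph the local optimum coincides with the intuitive partition; on larger networks the spin-glass energy landscape has many metastable states, which is why the paper discusses update rules and significance assessment.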
Vicarious posttraumatic growth among interpreters.
Splevins, Katie A; Cohen, Keren; Joseph, Stephen; Murray, Craig; Bowley, Jake
2010-12-01
An emerging evidence base indicates that posttraumatic growth might be experienced vicariously by those working alongside trauma survivors. In this study we explored the vicarious experiences of eight interpreters working in a therapeutic setting with asylum seekers and refugees. We adopted a qualitative approach, using semistructured interviews and interpretative phenomenological analysis. Four interrelated themes emerged from the findings: feeling what your client feels, beyond belief, finding your own way to deal with it, and a different person. Although all participants experienced distress, they also perceived themselves to have grown in some way. The implications for a theory of vicarious posttraumatic growth are discussed, along with clinical applications. PMID:20663936
The standard map: From Boltzmann-Gibbs statistics to Tsallis statistics
NASA Astrophysics Data System (ADS)
Tirnakli, Ugur; Borges, Ernesto P.
2016-03-01
As is well known, Boltzmann-Gibbs statistics is the correct way of thermostatistically approaching ergodic systems. On the other hand, nontrivial ergodicity breakdown and strong correlations typically drag the system into out-of-equilibrium states where Boltzmann-Gibbs statistics fails. For a wide class of such systems, it has been shown in recent years that the correct approach is to use Tsallis statistics instead. Here we show how the dynamics of the paradigmatic conservative (area-preserving) standard map exhibits, in an exceptionally clear manner, the crossing from one statistics to the other. Our results unambiguously illustrate the domains of validity of both Boltzmann-Gibbs and Tsallis statistical distributions. Since various important physical systems, from particle confinement in magnetic traps to autoionization of molecular Rydberg states, through particle dynamics in accelerators and comet dynamics, can be reduced to the standard map, our results are expected to enlighten and enable an improved interpretation of diverse experimental and observational results.
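For concreteness, the (Chirikov) standard map referred to here is the area-preserving iteration sketched below; the parameter values in the usage line are illustrative only, not taken from the paper.

```python
import math

def standard_map(theta, p, K, n):
    """Iterate the Chirikov standard map n times:
        p_{i+1}     = p_i + K sin(theta_i)   (mod 2*pi)
        theta_{i+1} = theta_i + p_{i+1}      (mod 2*pi)
    K controls the transition from regular (small K, islands dominate,
    ergodicity breaks down) to strongly chaotic (large K) dynamics."""
    two_pi = 2.0 * math.pi
    for _ in range(n):
        p = (p + K * math.sin(theta)) % two_pi
        theta = (theta + p) % two_pi
    return theta, p

# Illustrative orbit in the strongly chaotic regime.
theta, p = standard_map(0.5, 0.2, 10.0, 1000)
```

Ensembles of such orbits, iterated in the two regimes, are the raw material from which Boltzmann-Gibbs versus Tsallis-type distributions can be compared.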
Evaluation of Computer Simulated Baseline Statistics for Use in Item Bias Studies. [Revised].
ERIC Educational Resources Information Center
Rogers, H. Jane; Hambleton, Ronald K.
Although item bias statistics are widely recommended for use in test development and test analysis work, problems arise in their interpretation. The purpose of the present research was to evaluate the validity of logistic test models and computer simulation methods for providing a frame of reference for item bias statistic interpretations.…
Calibrated Peer Review for Interpreting Linear Regression Parameters: Results from a Graduate Course
ERIC Educational Resources Information Center
Enders, Felicity B.; Jenkins, Sarah; Hoverman, Verna
2010-01-01
Biostatistics is traditionally a difficult subject for students to learn. While the mathematical aspects are challenging, it can also be demanding for students to learn the exact language to use to correctly interpret statistical results. In particular, correctly interpreting the parameters from linear regression is both a vital tool and a…
Quantum Statistical Mechanics
NASA Astrophysics Data System (ADS)
Schieve, William C.; Horwitz, Lawrence P.
2009-04-01
1. Foundations of quantum statistical mechanics; 2. Elementary examples; 3. Quantum statistical master equation; 4. Quantum kinetic equations; 5. Quantum irreversibility; 6. Entropy and dissipation: the microscopic theory; 7. Global equilibrium: thermostatics and the microcanonical ensemble; 8. Bose-Einstein ideal gas condensation; 9. Scaling, renormalization and the Ising model; 10. Relativistic covariant statistical mechanics of many particles; 11. Quantum optics and damping; 12. Entanglements; 13. Quantum measurement and irreversibility; 14. Quantum Langevin equation: quantum Brownian motion; 15. Linear response: fluctuation and dissipation theorems; 16. Time dependent quantum Green's functions; 17. Decay scattering; 18. Quantum statistical mechanics, extended; 19. Quantum transport with tunneling and reservoir ballistic transport; 20. Black hole thermodynamics; Appendix; Index.
Statistical distribution sampling
NASA Technical Reports Server (NTRS)
Johnson, E. S.
1975-01-01
Determining the distribution of statistics by sampling was investigated. Characteristic functions, the quadratic regression problem, and the differential equations for the characteristic functions are analyzed.
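The abstract is terse; as a generic illustration of determining the distribution of a statistic by sampling (a Monte Carlo sketch of my own, not the report's method), one can repeatedly draw samples and collect the statistic's values:

```python
import random
import statistics

def sampling_distribution(stat, draw, n, reps, seed=0):
    """Empirically build the distribution of a statistic by repeated sampling:
    draw(rng) yields one observation; stat maps a sample of size n to a number."""
    rng = random.Random(seed)
    return [stat([draw(rng) for _ in range(n)]) for _ in range(reps)]

# Sampling distribution of the mean of 30 Uniform(0,1) draws.
means = sampling_distribution(statistics.fmean, lambda r: r.random(), 30, 5000)
# Theory: centred at 1/2 with standard error sqrt(1/12)/sqrt(30) ~ 0.053,
# so the empirical spread can be checked against the characteristic function
# / moment calculations the report analyzes.
```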
Integrative Interpretation of Vocational Interest Inventory Results.
ERIC Educational Resources Information Center
Rubinstein, Malcolm R.
1978-01-01
Examined effectiveness of interpretation of vocational interest inventory results. Data provide limited support for the hypothesis that integrative interpretation is most effective. Significant interactions exist between counselors and interpretation procedures. Failure to find significant differences between traditional-individual and…
Conflicting Interpretations of Scientific Pedagogy
ERIC Educational Resources Information Center
Galamba, Arthur
2016-01-01
Not surprisingly historical studies have suggested that there is a distance between concepts of teaching methods, their interpretations and their actual use in the classroom. This issue, however, is not always pitched to the personal level in historical studies, which may provide an alternative insight on how teachers conceptualise and engage with…
ERIC Educational Resources Information Center
Melton, T. R.
A computer-assisted instruction system, called IT1 (Interpretive Tutor), is described which is intended to assist a student's efforts to learn the content of textual material and to evaluate his efforts toward that goal. The text is represented internally in the form of semantic networks with auxiliary structures which relate network nodes to…
Interpreter Training Program: Program Review.
ERIC Educational Resources Information Center
Massoud, LindaLee
This report describes in detail the deaf interpreter training program offered at Mott Community College (Flint, Michigan). The program features field-based learning experiences, internships, team teaching, a field practicum, the goal of having students meet certification standards, and proficiency examinations. The program has special…
Focus: Oral Interpretation and Drama.
ERIC Educational Resources Information Center
Mullican, James S., Ed.
1976-01-01
The 12 articles in this issue of "Indiana English Journal" are concerned with drama and oral interpretation in the classroom. Titles of articles are: "Up in the Tree, Down in the Cave, and Back to Reading: Creative Dramatics"; "Pantomime: The Stepping Stone to Drama"; "The Living Literature of Readers' Theatre"; "Do-It-Yourself Drama"; "Drama for…
Interpretive Reproduction in Children's Play
ERIC Educational Resources Information Center
Corsaro, William A.
2012-01-01
The author looks at children's play from the perspective of interpretive reproduction, emphasizing the way children create their own unique peer cultures, which he defines as a set of routines, artifacts, values, and concerns that children engage in with their playmates. The article focuses on two types of routines in the peer culture of preschool…
Interpretation of the Weyl tensor
NASA Astrophysics Data System (ADS)
Hofmann, Stefan; Niedermann, Florian; Schneider, Robert
2013-09-01
According to folklore in general relativity, the Weyl tensor can be decomposed into parts corresponding to Newton-like, incoming and outgoing wavelike field components. It is shown here that this one-to-one correspondence does not hold for space-time geometries with cylindrical isometries. This is done by investigating some well-known exact solutions of Einstein’s field equations with whole-cylindrical symmetry, for which the physical interpretation is very clear, but for which the standard Weyl interpretation would give contradictory results. For planar or spherical geometries, however, the standard interpretation works for both static and dynamical space-times. It is argued that one reason for the failure in the cylindrical case is that for waves spreading in two spatial dimensions there is no local criterion to distinguish incoming and outgoing waves already at the linear level. It turns out that Thorne’s local energy notion, subject to certain qualifications, provides an efficient diagnostic tool to extract the proper physical interpretation of the space-time geometry in the case of cylindrical configurations.
Art Lessons: Learning To Interpret.
ERIC Educational Resources Information Center
Carpenter, B. Stephen, II
1999-01-01
When required to interpret works of art, students arrive at a broad-based, well-grounded understanding of the nature, value, and meaning of art in their lives. Teachers should offer art works, like those of Amalia Mesa-Bains, Joseph Stella, and Beverly Buchanan, whose narratives are complex and challenging, but not conceptually dense or…
Design Document. EKG Interpretation Program.
ERIC Educational Resources Information Center
Webb, Sandra M.
This teaching plan is designed to assist nursing instructors assigned to advanced medical surgical nursing courses in acquainting students with the basic skills needed to perform electrocardiographic (ECG or EKG) interpretations. The first part of the teaching plan contains a statement of purpose; audience recommendations; a flow chart detailing…
EKG Interpretation Program. Trainers Manual.
ERIC Educational Resources Information Center
Webb, Sandra M.
This trainer's manual is designed to assist nursing instructors assigned to advanced medical surgical nursing courses in teaching students how to make basic interpretations of their patients' electrocardiographic (EKG) strips. Included in the manual are pre- and posttests and instructional units dealing with the following topics: EKG indicators,…
Interpreting Hymns for Deaf Worshippers.
ERIC Educational Resources Information Center
Maxwell, Madeline M.; Boster, Shirley
1982-01-01
Discusses the special problems of interpreting hymns written in archaic English and then matching words of a translation to music. Addresses the question of whether competence in ASL and knowledge of signs for religious terms are sufficient for hymns to be of value to deaf worshippers. (EKN)
Interpreting Data: The Hybrid Mind
ERIC Educational Resources Information Center
Heisterkamp, Kimberly; Talanquer, Vicente
2015-01-01
The central goal of this study was to characterize major patterns of reasoning exhibited by college chemistry students when analyzing and interpreting chemical data. Using a case study approach, we investigated how a representative student used chemical models to explain patterns in the data based on structure-property relationships. Our results…
Eleven Interpretations of Personal Suffering.
ERIC Educational Resources Information Center
Foley, Daniel P.
This document defines suffering as the affective aspect of the pain experience, while the cognitive aspect of the pain experience is the sensation of pain. It considers personal suffering, which means one's own suffering, not the suffering of other people. It notes that a particular interpretation of suffering may be formulated in any number…
Recent Trends in Oral Interpretation.
ERIC Educational Resources Information Center
Armstrong, Chloe
1974-01-01
The field of oral interpretation has been influenced by both the analytical approach to literature study, with significant emphasis on understanding the literary text, and the interpersonal approach. While oral reading may utilize various performance arts or media such as dance, music, or film, the most popular movement currently is Readers…
Martyna, Agnieszka; Michalska, Aleksandra; Zadora, Grzegorz
2015-05-01
The problem of interpreting common provenance of samples within an infrared spectra database of polypropylene samples from car body parts and plastic containers, as well as Raman spectra databases of blue solid and metallic automotive paints, was under investigation. The research involved statistical tools such as the likelihood ratio (LR) approach for expressing the evidential value of observed similarities and differences in the recorded spectra. Since LR models can be easily proposed for databases described by a few variables, the research focused on reducing the dimensionality of spectra characterised by more than a thousand variables. The objective of the studies was to combine chemometric tools that deal easily with multidimensionality with an LR approach. The final variables used for constructing the LR models were derived from the discrete wavelet transform (DWT) as a data dimensionality reduction technique, supported by methods for variance analysis, and corresponded with chemical information, i.e. typical absorption bands for polypropylene and peaks associated with pigments present in the car paints. Univariate and multivariate LR models were proposed, aiming at obtaining more information about the chemical structure of the samples. Their performance was controlled by estimating the levels of false positive and false negative answers and using the empirical cross entropy approach. The results for most of the LR models were satisfactory and enabled solving the stated comparison problems. The results prove that the variables generated from DWT preserve the signal characteristics, being a sparse representation of the original signal that keeps its shape and relevant chemical information. PMID:25757825
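To make the pipeline concrete, here is a minimal stdlib-only sketch of its two ingredients: one level of the Haar DWT as a dimensionality-reduction step, and a univariate likelihood ratio for a comparison score. The Haar filter, the normal score model, and all parameter values are my assumptions for illustration; the paper's actual wavelet family, variable selection, and LR models are more elaborate.

```python
import math

def haar_dwt(signal):
    """One level of the Haar discrete wavelet transform: returns
    (approximation, detail) coefficients, halving the dimensionality.
    Assumes an even-length signal."""
    s = 1.0 / math.sqrt(2.0)
    approx = [s * (signal[i] + signal[i + 1]) for i in range(0, len(signal), 2)]
    detail = [s * (signal[i] - signal[i + 1]) for i in range(0, len(signal), 2)]
    return approx, detail

def likelihood_ratio(x, mu_same, sd_same, mu_diff, sd_diff):
    """Univariate LR: density of the comparison score x under the
    'same source' hypothesis divided by its density under 'different
    sources' (both modelled here, as an assumption, as normal)."""
    def normal_pdf(x, mu, sd):
        z = (x - mu) / sd
        return math.exp(-0.5 * z * z) / (sd * math.sqrt(2.0 * math.pi))
    return normal_pdf(x, mu_same, sd_same) / normal_pdf(x, mu_diff, sd_diff)

# Hypothetical score distributions (in practice estimated from the database):
# a small between-spectra distance favours common provenance, so LR > 1.
lr = likelihood_ratio(0.2, mu_same=0.0, sd_same=0.5, mu_diff=2.0, sd_diff=1.0)
```

An LR above 1 supports the same-source hypothesis; its magnitude is the evidential weight that the empirical cross entropy approach then calibrates.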
Pointwise probability reinforcements for robust statistical inference.
Frénay, Benoît; Verleysen, Michel
2014-02-01
Statistical inference using machine learning techniques may be difficult with small datasets because of abnormally frequent data (AFDs). AFDs are observations that are much more frequent in the training sample than they should be, with respect to their theoretical probability, and include e.g. outliers. Estimates of parameters tend to be biased towards models which support such data. This paper proposes to introduce pointwise probability reinforcements (PPRs): the probability of each observation is reinforced by a PPR, and a regularisation allows controlling the amount of reinforcement which compensates for AFDs. The proposed solution is very generic, since it can be used to robustify any statistical inference method which can be formulated as a likelihood maximisation. Experiments show that PPRs can be easily used to tackle regression, classification and projection: models are freed from the influence of outliers. Moreover, outliers can be filtered manually since an abnormality degree is obtained for each observation. PMID:24300550
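As a rough illustration of the reinforcement idea, here is my own toy reduction to a Gaussian location estimate; the paper's formulation and regularisation are more general, and the update rule and constants below are assumptions. Each observation whose current density falls below a threshold 1/lam receives a reinforcement that caps its likelihood contribution, and the parameter is refit from the tempered weights.

```python
import math

def robust_mean(xs, lam=5.0, sigma=1.0, iters=50):
    """Toy 'reinforced likelihood' mean: observation i gets a reinforcement
    r_i = max(0, 1/lam - p_i), capping the influence of points the current
    model finds abnormally improbable; theta is refit as the weighted mean
    with weights w_i = p_i / (p_i + r_i)."""
    theta = sum(xs) / len(xs)
    for _ in range(iters):
        weights = []
        for x in xs:
            z = (x - theta) / sigma
            p = math.exp(-0.5 * z * z) / (sigma * math.sqrt(2.0 * math.pi))
            r = max(0.0, 1.0 / lam - p)
            weights.append(p / (p + r) if (p + r) > 0.0 else 0.0)
        theta = sum(w * x for w, x in zip(weights, xs)) / sum(weights)
    return theta

data = [0.1, -0.2, 0.3, 0.0, -0.1, 25.0]   # one gross outlier
```

The plain mean of `data` is dragged above 4 by the outlier, while the reinforced estimate stays near the inlier centre; the final weights double as the per-observation abnormality degrees the abstract mentions.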
ERIC Educational Resources Information Center
Stanly, Pat
2009-01-01
Rough patches occur at both ends of the education pipeline, as students enter community colleges and move on to work or enrollment in four-year institutions. Career pathways--sequences of coherent, articulated, and rigorous career and academic courses that lead to an industry-recognized certificate or a college degree--are a promising approach to…
Easily forgotten: elderly female prisoners.
Handtke, Violet; Bretschneider, Wiebke; Elger, Bernice; Wangmo, Tenzin
2015-01-01
Women form a growing minority within the worldwide prison population and have special needs and distinct characteristics. Within this group exists a smaller sub-group: elderly female prisoners (EFPs) who require tailored social and health interventions that address their unique needs. Data collected from two prisons in Switzerland housing women prisoners were studied. Overall 26 medical records were analyzed, 13 from EFPs (50+ years) and for comparison 13 from young female prisoners (YFPs, 49 years and younger). Additionally, five semi-structured interviews were conducted with EFPs. Using the layer model of vulnerability, three layers of vulnerability were identified: the "prisoner" layer; followed by the layer of "woman"; both of which are encompassed by the layer of "old age." The analysis of these layers resulted in three main areas where EFPs are particularly vulnerable: their status of "double-minority," health and health-care access, and their social relations. Prison administration and policy-makers need to be more sensitive to gender and age related issues in order to remedy these vulnerabilities. PMID:25661851
Some easily analyzable convolutional codes
NASA Technical Reports Server (NTRS)
Mceliece, R.; Dolinar, S.; Pollara, F.; Vantilborg, H.
1989-01-01
Convolutional codes have played and will play a key role in the downlink telemetry systems on many NASA deep-space probes, including Voyager, Magellan, and Galileo. One of the chief difficulties associated with the use of convolutional codes, however, is the notorious difficulty of analyzing them. Given a convolutional code as specified, say, by its generator polynomials, it is no easy matter to say how well that code will perform on a given noisy channel. The usual first step in such an analysis is to compute the code's free distance; this can be done with an algorithm whose complexity is exponential in the code's constraint length. The second step is often to calculate the transfer function in one, two, or three variables, or at least a few terms in its power series expansion. This step is quite hard, and for many codes of relatively short constraint lengths, it can be intractable. However, a large class of convolutional codes was discovered for which the free distance can be computed by inspection, and for which there is a closed-form expression for the three-variable transfer function. Although for large constraint lengths, these codes have relatively low rates, they are nevertheless interesting and potentially useful. Furthermore, the ideas developed here to analyze these specialized codes may well extend to a much larger class.
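The first analysis step mentioned here, computing the free distance, can be sketched as a shortest-path search on the code trellis. The sketch below uses the textbook rate-1/2, constraint-length-3 code with generators (7, 5) in octal as an illustration; it is not one of the specialized codes the paper introduces.

```python
import heapq

def conv_outputs(state, u, gens, k):
    """Output bits of a rate-1/n feedforward encoder whose shift register
    holds the current input u followed by the k-1 state bits; each generator
    polynomial selects taps whose XOR (parity) gives one output bit."""
    reg = (u << (k - 1)) | state
    return [bin(reg & g).count("1") % 2 for g in gens]

def free_distance(gens=(0b111, 0b101), k=3):
    """Minimum Hamming weight over all trellis paths that diverge from the
    all-zero state and later remerge with it (Dijkstra's algorithm).
    Complexity grows exponentially with the constraint length k, as the
    abstract notes."""
    start_state = 1 << (k - 2)               # state after the forced input u = 1
    start_w = sum(conv_outputs(0, 1, gens, k))
    heap = [(start_w, start_state)]
    best = {start_state: start_w}
    while heap:
        w, s = heapq.heappop(heap)
        if s == 0:
            return w                          # remerged: w is the free distance
        if w > best.get(s, float("inf")):
            continue                          # stale heap entry
        for u in (0, 1):
            bw = sum(conv_outputs(s, u, gens, k))
            ns = (s >> 1) | (u << (k - 2))
            if w + bw < best.get(ns, float("inf")):
                best[ns] = w + bw
                heapq.heappush(heap, (w + bw, ns))

# The classic (7, 5) code has free distance 5.
d_free = free_distance()
```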
Explorations in Statistics: Regression
ERIC Educational Resources Information Center
Curran-Everett, Douglas
2011-01-01
Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This seventh installment of "Explorations in Statistics" explores regression, a technique that estimates the nature of the relationship between two things for which we may only surmise a mechanistic or predictive connection.…
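A one-variable regression of the kind this installment explores reduces to closed-form normal equations; the sketch below is my own minimal illustration, not taken from the article, and recovers an exact linear relationship:

```python
def least_squares(xs, ys):
    """Ordinary least squares fit y ~ a + b*x via the closed form:
    b = cov(x, y) / var(x), a = mean(y) - b * mean(x)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    b = sxy / sxx
    return my - b * mx, b

# Exact linear data recovers intercept and slope exactly.
a, b = least_squares([0, 1, 2, 3], [1, 3, 5, 7])   # y = 1 + 2x
```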
Multidimensional Visual Statistical Learning
ERIC Educational Resources Information Center
Turk-Browne, Nicholas B.; Isola, Phillip J.; Scholl, Brian J.; Treat, Teresa A.
2008-01-01
Recent studies of visual statistical learning (VSL) have demonstrated that statistical regularities in sequences of visual stimuli can be automatically extracted, even without intent or awareness. Despite much work on this topic, however, several fundamental questions remain about the nature of VSL. In particular, previous experiments have not…
ERIC Educational Resources Information Center
Huberty, Carl J.
An approach to statistical testing, which combines Neyman-Pearson hypothesis testing and Fisher significance testing, is recommended. The use of P-values in this approach is discussed in some detail. The author also discusses some problems which are often found in introductory statistics textbooks. The problems involve the definitions of…
Deconstructing Statistical Analysis
ERIC Educational Resources Information Center
Snell, Joel
2014-01-01
Using a very complex statistical analysis and research method for the sake of enhancing the prestige of an article, or of making a new product or service appear legitimate, needs to be monitored and questioned for accuracy. 1) The more complicated the statistical analysis and research, the fewer learned readers can understand it. This adds a…
Explorations in Statistics: Power
ERIC Educational Resources Information Center
Curran-Everett, Douglas
2010-01-01
Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This fifth installment of "Explorations in Statistics" revisits power, a concept fundamental to the test of a null hypothesis. Power is the probability that we reject the null hypothesis when it is false. Four things affect…
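The four factors the abstract enumerates (truncated here) conventionally include the significance level, the effect size, the sample size, and the variability of the data. The normal-approximation sketch below is my own illustration, not the article's, with variability folded into the standardized effect size d:

```python
import math
from statistics import NormalDist

def power_two_sample(d, n, alpha=0.05):
    """Approximate power of a two-sided two-sample z-test:
    d = standardized effect size (mean difference / sd),
    n = observations per group, alpha = significance level."""
    nd = NormalDist()
    z_crit = nd.inv_cdf(1.0 - alpha / 2.0)
    return nd.cdf(d * math.sqrt(n / 2.0) - z_crit)

# Power rises with effect size, sample size, and alpha.
p = power_two_sample(0.5, 64)
```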
ERIC Educational Resources Information Center
Huizingh, Eelko K. R. E.
2007-01-01
Accessibly written and easy to use, "Applied Statistics Using SPSS" is an all-in-one self-study guide to SPSS and do-it-yourself guide to statistics. What is unique about Eelko Huizingh's approach is that this book is based around the needs of undergraduate students embarking on their own research project, and its self-help style is designed to…
Vijayaraj, Veeraraghavan; Cheriyadat, Anil M; Bhaduri, Budhendra L; Vatsavai, Raju; Bright, Eddie A
2008-01-01
Statistical properties of high-resolution overhead images representing different land use categories are analyzed using various local and global statistical image properties based on the shape of the power spectrum, image gradient distributions, edge co-occurrence, and inter-scale wavelet coefficient distributions. The analysis was performed on a database of high-resolution (1 meter) overhead images representing a multitude of different downtown, suburban, commercial, agricultural and wooded exemplars. Various statistical properties relating to these image categories and their relationship are discussed. The categorical variations in power spectrum contour shapes, the unique gradient distribution characteristics of wooded categories, the similarity in edge co-occurrence statistics for overhead and natural images, and the unique edge co-occurrence statistics of downtown categories are presented in this work. Though previous work on natural image statistics has shown some of the unique characteristics for different categories, the relationships for overhead images are not well understood. The statistical properties of natural images were used in previous studies to develop prior image models, to predict and index objects in a scene and to improve computer vision models. The results from our research findings can be used to augment and adapt computer vision algorithms that rely on prior image statistics to process overhead images, calibrate the performance of overhead image analysis algorithms, and derive features for better discrimination of overhead image categories.
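One of the statistics named above, the shape of the power spectrum, can be computed with a short sketch. The radial-averaging scheme and bin count below are my own choices for illustration; the paper's exact feature extraction is not specified here.

```python
import numpy as np

def radial_power_spectrum(img, nbins=32):
    """Radially averaged power spectrum |F(k)|^2 of a 2-D image; the shape
    of this curve is the kind of summary statistic whose contours vary
    across land-use categories."""
    f = np.fft.fftshift(np.fft.fft2(img - img.mean()))
    power = np.abs(f) ** 2
    h, w = img.shape
    y, x = np.indices((h, w))
    r = np.hypot(y - h // 2, x - w // 2)
    bins = np.linspace(0.0, r.max() + 1e-9, nbins + 1)
    which = np.digitize(r.ravel(), bins) - 1
    spectrum = np.bincount(which, weights=power.ravel(), minlength=nbins)
    counts = np.bincount(which, minlength=nbins)
    return spectrum / np.maximum(counts, 1)

# White noise has a roughly flat spectrum; structured overhead imagery
# concentrates power at low spatial frequencies.
rng = np.random.default_rng(0)
spec = radial_power_spectrum(rng.normal(size=(64, 64)))
```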
Understanding Undergraduate Statistical Anxiety
ERIC Educational Resources Information Center
McKim, Courtney
2014-01-01
The purpose of this study was to understand undergraduate students' views of statistics. Results reveal that students with less anxiety have a higher interest in statistics and also believe in their ability to perform well in the course. Also students who have a more positive attitude about the class tend to have a higher belief in their…
Croarkin, M. Carroll
2001-01-01
For more than 50 years, the Statistical Engineering Division (SED) has been instrumental in the success of a broad spectrum of metrology projects at NBS/NIST. This paper highlights fundamental contributions of NBS/NIST statisticians to statistics and to measurement science and technology. Published methods developed by SED staff, especially during the early years, endure as cornerstones of statistics not only in metrology and standards applications, but as data-analytic resources used across all disciplines. The history of statistics at NBS/NIST began with the formation of what is now the SED. Examples from the first five decades of the SED illustrate the critical role of the division in the successful resolution of a few of the highly visible, and sometimes controversial, statistical studies of national importance. A review of the history of major early publications of the division on statistical methods, design of experiments, and error analysis and uncertainty is followed by a survey of several thematic areas. The accompanying examples illustrate the importance of SED in the history of statistics, measurements and standards: calibration and measurement assurance, interlaboratory tests, development of measurement methods, Standard Reference Materials, statistical computing, and dissemination of measurement technology. A brief look forward sketches the expanding opportunity and demand for SED statisticians created by current trends in research and development at NIST.
ERIC Educational Resources Information Center
Hodgson, Ted; Andersen, Lyle; Robison-Cox, Jim; Jones, Clain
2004-01-01
Water quality experiments, especially the use of macroinvertebrates as indicators of water quality, offer an ideal context for connecting statistics and science. In the STAR program for secondary students and teachers, water quality experiments were also used as a context for teaching statistics. In this article, we trace one activity that uses…
ERIC Educational Resources Information Center
Council of Ontario Universities, Toronto.
Summary statistics on application and registration patterns of applicants wishing to pursue full-time study in first-year places in Ontario universities (for the fall of 1987) are given. Data on registrations were received indirectly from the universities as part of their annual submission of USIS/UAR enrollment data to Statistics Canada and MCU.…
Introduction to Statistical Physics
NASA Astrophysics Data System (ADS)
Casquilho, João Paulo; Ivo Cortez Teixeira, Paulo
2014-12-01
Preface; 1. Random walks; 2. Review of thermodynamics; 3. The postulates of statistical physics. Thermodynamic equilibrium; 4. Statistical thermodynamics – developments and applications; 5. The classical ideal gas; 6. The quantum ideal gas; 7. Magnetism; 8. The Ising model; 9. Liquid crystals; 10. Phase transitions and critical phenomena; 11. Irreversible processes; Appendixes; Index.
Reform in Statistical Education
ERIC Educational Resources Information Center
Huck, Schuyler W.
2007-01-01
Two questions are considered in this article: (a) What should professionals in school psychology do in an effort to stay current with developments in applied statistics? (b) What should they do with their existing knowledge to move from surface understanding of statistics to deep understanding? Written for school psychologists who have completed…
Statistical Mapping by Computer.
ERIC Educational Resources Information Center
Utano, Jack J.
The function of a statistical map is to provide readers with a visual impression of the data so that they may be able to identify any geographic characteristics of the displayed phenomena. The increasingly important role played by the computer in the production of statistical maps is manifested by the varied examples of computer maps in recent…
The purpose of the Disability Statistics Center is to produce and disseminate statistical information on disability and the status of people with disabilities in American society and to establish and monitor indicators of how conditions are changing over time to meet their health...
Statistical controversies in clinical research: statistical significance-too much of a good thing ….
Buyse, M; Hurvitz, S A; Andre, F; Jiang, Z; Burris, H A; Toi, M; Eiermann, W; Lindsay, M-A; Slamon, D
2016-05-01
The use and interpretation of P values is a matter of debate in applied research. We argue that P values are useful as a pragmatic guide to interpret the results of a clinical trial, not as a strict binary boundary that separates real treatment effects from lack thereof. We illustrate our point using the result of BOLERO-1, a randomized, double-blind trial evaluating the efficacy and safety of adding everolimus to trastuzumab and paclitaxel as first-line therapy for HER2+ advanced breast cancer. In this trial, the benefit of everolimus was seen only in the predefined subset of patients with hormone receptor-negative breast cancer at baseline (progression-free survival hazard ratio = 0.66, P = 0.0049). A strict interpretation of this finding, based on complex 'alpha splitting' rules to assess statistical significance, led to the conclusion that the benefit of everolimus was not statistically significant either overall or in the subset. We contend that this interpretation does not do justice to the data, and we argue that the benefit of everolimus in hormone receptor-negative breast cancer is both statistically compelling and clinically relevant. PMID:26861602
Interpretation of fluorescence correlation spectra of biopolymer solutions.
Phillies, George D J
2016-05-01
Fluorescence correlation spectroscopy (FCS) is regularly used to study diffusion in non-dilute "crowded" biopolymer solutions, including the interior of living cells. For fluorophores in dilute solution, the relationship between the FCS spectrum G(t) and the diffusion coefficient D is well-established. However, the dilute-solution relationship between G(t) and D has sometimes been used to interpret FCS spectra of fluorophores in non-dilute solutions. Unfortunately, the relationship used to interpret FCS spectra in dilute solutions relies on an assumption that is not always correct in non-dilute solutions. This paper obtains the correct form for interpreting FCS spectra of non-dilute solutions, writing G(t) in terms of the statistical properties of the fluorophore motions. Approaches for applying this form are discussed. PMID:26756528
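For context, the well-established dilute-solution form mentioned above, for three-dimensional diffusion through a Gaussian confocal volume, can be sketched as follows; the structure parameter value is illustrative, and this is the textbook dilute form, not the non-dilute result derived in the paper:

```python
import math

def g_dilute_3d(t, n_mean, tau_d, s=5.0):
    """Standard dilute-solution FCS autocorrelation for 3-D diffusion
    through a Gaussian confocal volume.

    t      : lag time
    n_mean : mean number of fluorophores in the observation volume
    tau_d  : characteristic diffusion time (w_xy**2 / 4D)
    s      : structure parameter w_z / w_xy (value here is illustrative)
    """
    return (1.0 / n_mean) / (1.0 + t / tau_d) \
           / math.sqrt(1.0 + t / (s * s * tau_d))

# The amplitude G(0) = 1/<N>, and G decays monotonically with lag time:
print(g_dilute_3d(0.0, n_mean=4.0, tau_d=1e-3))   # 0.25
```

A fitted `tau_d` then yields D via D = w_xy**2 / (4 * tau_d), which is exactly the inference step that fails when the dilute-solution assumption breaks down.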
Interpretation of large-scale deviations from the Hubble flow
NASA Astrophysics Data System (ADS)
Grinstein, B.; Politzer, H. David; Rey, S.-J.; Wise, Mark B.
1987-03-01
The theoretical expectation for large-scale streaming velocities relative to the Hubble flow is expressed in terms of statistical correlation functions. Only for objects that trace the mass would these velocities have a simple cosmological interpretation. If some biasing affects the objects' formation, then nonlinear gravitational evolution is essential to predicting the expected large-scale velocities, which also depend on the nature of the biasing.
College Students' Interpretation of Research Reports on Group Differences: The Tall-Tale Effect
ERIC Educational Resources Information Center
Hogan, Thomas P.; Zaboski, Brian A.; Perry, Tiffany R.
2015-01-01
How does the student untrained in advanced statistics interpret results of research that reports a group difference? In two studies, statistically untrained college students were presented with abstracts or professional associations' reports and asked for estimates of scores obtained by the original participants in the studies. These estimates…
Modelling Metamorphism by Abstract Interpretation
NASA Astrophysics Data System (ADS)
Dalla Preda, Mila; Giacobazzi, Roberto; Debray, Saumya; Coogan, Kevin; Townsend, Gregg M.
Metamorphic malware applies semantics-preserving transformations to its own code in order to foil detection systems based on signature matching. In this paper we consider the problem of automatically extracting metamorphic signatures from such malware. We introduce a semantics for self-modifying code, called phase semantics, and prove its correctness by showing that it is an abstract interpretation of the standard trace semantics. Phase semantics precisely models the behavior of metamorphic code by providing the set of traces of programs that correspond to the possible evolutions of the metamorphic code during execution. We show that metamorphic signatures can be automatically extracted by abstract interpretation of the phase semantics, and that regular metamorphism can be modelled as a finite-state-automata abstraction of the phase semantics.
Pediatric DXA: technique and interpretation
Henwood, Maria J.
2006-01-01
This article reviews dual X-ray absorptiometry (DXA) technique and interpretation with emphasis on the considerations unique to pediatrics. Specifically, the use of DXA in children requires the radiologist to be a “clinical pathologist” monitoring the technical aspects of the DXA acquisition, a “statistician” knowledgeable in the concepts of Z-scores and least significant changes, and a “bone specialist” providing the referring clinician a meaningful context for the numeric result generated by DXA. The patient factors that most significantly influence bone mineral density are discussed and are reviewed with respect to available normative databases. The effects the growing skeleton has on the DXA result are also presented. Most important, the need for the radiologist to be actively involved in the technical and interpretive aspects of DXA is stressed. Finally, the diagnosis of osteoporosis should not be made on DXA results alone but should take into account other patient factors. PMID:16715219
Clinical Interpretation of Genomic Variations.
Sayitoğlu, Müge
2016-09-01
Novel high-throughput sequencing technologies generate large-scale genomic data and are used extensively for disease mapping of monogenic and/or complex disorders, personalized treatment, and pharmacogenomics. Next-generation sequencing is rapidly becoming a routine tool for diagnosis and molecular monitoring of patients to evaluate therapeutic efficiency. The next-generation sequencing platforms generate huge amounts of genetic variation data, and it remains a challenge to interpret the variations that are identified. Such data interpretation requires close collaboration among bioinformaticians, clinicians, and geneticists. There are several problems that must be addressed, such as the generation of new algorithms for mapping and annotation, harmonization of the terminology, correct use of nomenclature, reference genomes for different populations, rare disease variant databases, and clinical reports. PMID:27507302
Paleomicrobiology Data: Authentification and Interpretation.
Drancourt, Michel
2016-06-01
The authenticity of some of the very first works in the field of paleopathology has been questioned, and standards have been progressively established for the experiments and the interpretation of data. Whereas most problems initially arose from the contamination of ancient specimens with modern human DNA, the situation is different in the field of paleomicrobiology, in which the risk for contamination is well-known and adequately managed by any laboratory team with expertise in the routine diagnosis of modern-day infections. Indeed, the exploration of ancient microbiota and pathogens is best done by such laboratory teams, with research directed toward the discovery and implementation of new techniques and the interpretation of data. PMID:27337456
Ector, Hugo
2010-12-01
I still remember my first book on statistics: "Elementary Statistics with Applications in Medicine and the Biological Sciences" by Frederick E. Croxton. For me, it was the start of a pursuit of understanding statistics in daily life and in medical practice. It was the first volume in a long row of books. In his introduction, Croxton claims that "nearly everyone involved in any aspect of medicine needs to have some knowledge of statistics". The reality is that for many clinicians, statistics are limited to "P < 0.05 = ok". I do not blame my colleagues who omit the paragraph on statistical methods. They have never had the opportunity to learn concise and clear descriptions of the key features. I have experienced how some authors can describe difficult methods in well understandable language. Others fail completely. As a teacher, I tell my students that life is impossible without a basic knowledge of statistics. This feeling has resulted in an annual seminar of 90 minutes. This tutorial is the summary of that seminar, a transcription of the best pages I have detected. PMID:21302664
Interpreting geological structure using kriging
Mao, N.
1985-07-01
We applied kriging (geostatistics) to interpret the structure of the basement rock in Yucca Flat, NTS, from borehole data. The estimation error for 118 data points is 81 m, comparable with errors based on both gravity and borehole data. Using digitized topographic data, we tested the kriging results and found that the model validation process (Thomas option) gave a fair representation of the overall uncertainty of the kriged values. 5 refs., 6 figs., 2 tabs.
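As a rough illustration of the kriging estimation described above, here is a minimal one-dimensional ordinary kriging sketch; the exponential variogram and all parameter values are assumptions for demonstration, not the model fitted in the Yucca Flat study:

```python
import math

def exp_variogram(h, sill=1.0, rng=3.0):
    """Exponential variogram model (sill and range values are illustrative)."""
    return sill * (1.0 - math.exp(-h / rng))

def solve(a, b):
    """Gaussian elimination with partial pivoting for a small dense system."""
    n = len(b)
    m = [row[:] + [b[i]] for i, row in enumerate(a)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(col + 1, n):
            f = m[r][col] / m[col][col]
            for c in range(col, n + 1):
                m[r][c] -= f * m[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (m[r][n] - sum(m[r][c] * x[c] for c in range(r + 1, n))) / m[r][r]
    return x

def ordinary_kriging(xs, zs, x0):
    """Ordinary kriging estimate at x0 from 1-D samples (xs, zs)."""
    n = len(xs)
    # Kriging system: variogram matrix bordered by the unbiasedness constraint.
    a = [[exp_variogram(abs(xs[i] - xs[j])) for j in range(n)] + [1.0]
         for i in range(n)]
    a.append([1.0] * n + [0.0])
    b = [exp_variogram(abs(xs[i] - x0)) for i in range(n)] + [1.0]
    w = solve(a, b)[:n]   # kriging weights (last unknown is the Lagrange multiplier)
    return sum(w[i] * zs[i] for i in range(n))

xs, zs = [0.0, 1.0, 3.0, 4.0], [1.0, 2.0, 1.5, 0.5]
print(round(ordinary_kriging(xs, zs, 1.0), 6))   # 2.0: kriging interpolates exactly at a sample point
```

The same bordered system, with a 2-D distance and a fitted variogram, is what a geostatistics package solves at every grid node to produce a kriged surface and its estimation variance.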
Consistent interpretations of quantum mechanics
NASA Astrophysics Data System (ADS)
Omnès, Roland
1992-04-01
Within the last decade, significant progress has been made towards a consistent and complete reformulation of the Copenhagen interpretation (an interpretation consisting in a formulation of the experimental aspects of physics in terms of the basic formalism; it is consistent if free from internal contradiction and complete if it provides precise predictions for all experiments). The main steps involved decoherence (the transition from linear superpositions of macroscopic states to a mixing), Griffiths histories describing the evolution of quantum properties, a convenient logical structure for dealing with histories, and also some progress in semiclassical physics, which was made possible by new methods. The main outcome is a theory of phenomena, viz., the classically meaningful properties of a macroscopic system. It shows in particular how and when determinism is valid. This theory can be used to give a deductive form to measurement theory, which now covers some cases that were initially devised as counterexamples against the Copenhagen interpretation. These theories are described, together with their applications to some key experiments and some of their consequences concerning epistemology.
Winters, Ryan; Winters, Andrew; Amedee, Ronald G.
2010-01-01
The Accreditation Council for Graduate Medical Education sets forth a number of required educational topics that must be addressed in residency and fellowship programs. We sought to provide a primer on some of the important basic statistical concepts to consider when examining the medical literature. It is not essential to understand the exact workings and methodology of every statistical test encountered, but it is necessary to understand selected concepts such as parametric and nonparametric tests, correlation, and numerical versus categorical data. This working knowledge will allow you to spot obvious irregularities in statistical analyses that you encounter. PMID:21603381
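One of the concepts singled out, the parametric versus nonparametric distinction, can be made concrete by comparing Pearson correlation (parametric, assumes a linear relation) with Spearman rank correlation (nonparametric); the data below are invented for illustration:

```python
def pearson(x, y):
    """Pearson product-moment correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

def ranks(v):
    """Ranks 1..n (no tie handling; the toy data below have no ties)."""
    order = sorted(range(len(v)), key=lambda i: v[i])
    r = [0.0] * len(v)
    for rank, i in enumerate(order, start=1):
        r[i] = float(rank)
    return r

def spearman(x, y):
    # Spearman's rho is the Pearson correlation of the ranks.
    return pearson(ranks(x), ranks(y))

# A monotonic but nonlinear relation: Spearman reports a perfect
# association, Pearson a weaker (merely linear) one.
x = [1, 2, 3, 4, 5]
y = [1, 8, 27, 64, 125]   # y = x**3
print(round(spearman(x, y), 3))        # 1.0
print(round(pearson(x, y), 3) < 1.0)   # True
```

This is the kind of "obvious irregularity" the abstract has in mind: a parametric statistic applied to data that violate its assumptions can understate (or overstate) an association that a rank-based test handles cleanly.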
Statistics of football dynamics
NASA Astrophysics Data System (ADS)
Mendes, R. S.; Malacarne, L. C.; Anteneodo, C.
2007-06-01
We investigate the dynamics of football matches. Our goal is to characterize statistically the temporal sequence of ball movements in this collective sport game, searching for traits of complex behavior. Data were collected over a variety of matches in South American, European and World championships throughout 2005 and 2006. We show that the statistics of ball touches presents power-law tails and can be described by q-gamma distributions. To explain such behavior we propose a model that provides information on the characteristics of football dynamics. Furthermore, we discuss the statistics of duration of out-of-play intervals, not directly related to the previous scenario.
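A brief sketch of why a q-gamma distribution has power-law tails, assuming the common Tsallis q-exponential form of the density; the parameter values are illustrative, not the values fitted to the football data:

```python
import math

def q_exp(x, q):
    """Tsallis q-exponential; reduces to exp(x) as q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return math.exp(x)
    base = 1.0 + (1.0 - q) * x
    return base ** (1.0 / (1.0 - q)) if base > 0 else 0.0

def q_gamma_pdf_unnorm(x, beta, theta, q):
    """Unnormalised q-gamma density ~ x**(beta-1) * e_q(-x/theta)."""
    return x ** (beta - 1.0) * q_exp(-x / theta, q)

# For q > 1 the tail decays as a power law x**(beta - 1 - 1/(q-1)):
q, beta, theta = 1.5, 2.0, 1.0
ratio = q_gamma_pdf_unnorm(2000.0, beta, theta, q) / q_gamma_pdf_unnorm(1000.0, beta, theta, q)
predicted = 2.0 ** (beta - 1.0 - 1.0 / (q - 1.0))
print(round(ratio / predicted, 2))   # 1.0: the asymptotic power law holds deep in the tail
```

This is why a single q-gamma family can interpolate between gamma-like behavior at small times and the power-law tails reported for the ball-touch statistics.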
Understanding AOP through the Study of Interpreters
NASA Technical Reports Server (NTRS)
Filman, Robert E.
2004-01-01
I return to the question of what distinguishes AOP languages by considering how the interpreters of AOP languages differ from conventional interpreters. Key elements for static transformation are seen to be redefinition of the set and lookup operators in the interpretation of the language. This analysis also yields a definition of crosscutting in terms of interlacing of interpreter actions.
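A toy illustration of the idea, making no claim about the paper's formalism: a tiny expression interpreter whose variable-lookup operator is factored out, so that redefining lookup weaves in crosscutting "advice" (here, tracing every variable access) without touching the base interpreter:

```python
# A "program" is nested tuples; all names below are illustrative.

def make_interp(lookup):
    """Build an evaluator parameterised by the lookup operator."""
    def ev(expr, env):
        if isinstance(expr, str):            # variable reference
            return lookup(env, expr)
        if isinstance(expr, (int, float)):   # literal
            return expr
        op, a, b = expr                      # binary operation
        x, y = ev(a, env), ev(b, env)
        return x + y if op == '+' else x * y
    return ev

def plain_lookup(env, name):
    return env[name]

trace = []
def traced_lookup(env, name):
    # "Advice" woven into the lookup operator: it crosscuts every
    # variable access, wherever it occurs in the program.
    trace.append(name)
    return env[name]

prog = ('+', 'a', ('*', 'b', 'a'))
env = {'a': 2, 'b': 3}
print(make_interp(plain_lookup)(prog, env))    # 8
print(make_interp(traced_lookup)(prog, env))   # 8, and trace == ['a', 'b', 'a']
```

The base semantics is unchanged; only the redefined operator differs, which mirrors the claim that AOP can be characterised by redefining set/lookup in the interpreter.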
What Does It Mean to Teach "Interpretively"?
ERIC Educational Resources Information Center
Dodge, Jennifer; Holtzman, Richard; van Hulst, Merlijn; Yanow, Dvora
2016-01-01
The "interpretive turn" has gained traction as a research approach in recent decades in the empirical social sciences. While the contributions of interpretive research and interpretive research methods are clear, we wonder: Does an interpretive perspective lend itself to--or even demand--a particular style of teaching? This question was…
Computer Interpretations of ECGs in Rural Hospitals
Thompson, James M.
1992-01-01
Computer-assisted interpretation of electrocardiograms offers theoretical benefits to rural physicians. This study compared computer-assisted interpretations by a rural physician certified to read ECGs with interpretations by the computer alone. The computer interpretation alone could have led to major errors in patient management, but was correct sufficiently often to warrant purchase by small rural hospitals. PMID:21221365
Playing at Statistical Mechanics
ERIC Educational Resources Information Center
Clark, Paul M.; And Others
1974-01-01
Discussed are the applications of counting techniques of a sorting game to distributions and concepts in statistical mechanics. Included are the following distributions: Fermi-Dirac, Bose-Einstein, and most probable. (RH)
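The distributions named can be written down directly; a small sketch of the mean occupation numbers (with the Boltzmann constant absorbed into the temperature parameter):

```python
import math

def fermi_dirac(e, mu, kt):
    """Mean occupation of a single-particle state at energy e (Fermi-Dirac)."""
    return 1.0 / (math.exp((e - mu) / kt) + 1.0)

def bose_einstein(e, mu, kt):
    """Mean occupation for bosons; requires e > mu."""
    return 1.0 / (math.exp((e - mu) / kt) - 1.0)

def maxwell_boltzmann(e, mu, kt):
    """Classical (most probable) limit; both quantum results approach it
    when exp((e - mu)/kT) >> 1."""
    return math.exp(-(e - mu) / kt)

# At e = mu a fermion state is exactly half filled:
print(fermi_dirac(1.0, 1.0, 0.1))   # 0.5
```

Counting-game activities like the one in the article amount to tabulating how many microstates put n particles in a level; the three formulas are the large-number limits of those counts.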
Cooperative Learning in Statistics.
ERIC Educational Resources Information Center
Keeler, Carolyn M.; And Others
1994-01-01
Formal use of cooperative learning techniques proved effective in improving student performance and retention in a freshman level statistics course. Lectures interspersed with group activities proved effective in increasing conceptual understanding and overall class performance. (11 references) (Author)
Understanding Solar Flare Statistics
NASA Astrophysics Data System (ADS)
Wheatland, M. S.
2005-12-01
A review is presented of work aimed at understanding solar flare statistics, with emphasis on the well known flare power-law size distribution. Although avalanche models are perhaps the favoured model to describe flare statistics, their physical basis is unclear, and they are divorced from developing ideas in large-scale reconnection theory. An alternative model, aimed at reconciling large-scale reconnection models with solar flare statistics, is revisited. The solar flare waiting-time distribution has also attracted recent attention. Observed waiting-time distributions are described, together with what they might tell us about the flare phenomenon. Finally, a practical application of flare statistics to flare prediction is described in detail, including the results of a year of automated (web-based) predictions from the method.
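As a side note on power-law size distributions, the standard continuous maximum-likelihood exponent estimator can be checked against synthetic samples; this generic recipe is not specific to the flare analysis reviewed:

```python
import math, random

def sample_power_law(alpha, xmin, n, seed=42):
    """Inverse-transform sampling from p(x) ~ x**(-alpha), x >= xmin."""
    rng = random.Random(seed)
    return [xmin * (1.0 - rng.random()) ** (-1.0 / (alpha - 1.0)) for _ in range(n)]

def mle_alpha(xs, xmin):
    """Continuous maximum-likelihood power-law exponent (Hill-type estimator)."""
    return 1.0 + len(xs) / sum(math.log(x / xmin) for x in xs)

xs = sample_power_law(alpha=2.0, xmin=1.0, n=20000)
print(round(mle_alpha(xs, 1.0), 1))   # close to 2.0, up to sampling error
```

Fitting the exponent by maximum likelihood rather than by a least-squares line through a log-log histogram avoids a well-known bias, which matters when comparing observed flare size distributions with avalanche-model predictions.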
Titanic: A Statistical Exploration.
ERIC Educational Resources Information Center
Takis, Sandra L.
1999-01-01
Uses the available data about the Titanic's passengers to interest students in exploring categorical data and the chi-square distribution. Describes activities incorporated into a statistics class and gives additional resources for collecting information about the Titanic. (ASK)
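The chi-square computation at the heart of such a classroom activity is short; the survival counts below are hypothetical placeholders, not the actual Titanic figures:

```python
def chi_square_2x2(table):
    """Chi-square statistic for a 2x2 contingency table [[a, b], [c, d]]."""
    row = [sum(r) for r in table]
    col = [sum(c) for c in zip(*table)]
    total = sum(row)
    stat = 0.0
    for i in range(2):
        for j in range(2):
            expected = row[i] * col[j] / total
            stat += (table[i][j] - expected) ** 2 / expected
    return stat

# Hypothetical survived/died counts by passenger class (for illustration only;
# a classroom exercise would substitute the real Titanic figures):
table = [[200, 120],   # first class:  survived, died
         [180, 500]]   # third class:  survived, died
print(round(chi_square_2x2(table), 1))   # 119.9
```

A statistic this large, compared against the chi-square distribution with one degree of freedom, indicates that survival and passenger class are very unlikely to be independent.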
Plague in the United States: Plague was first introduced ... per year in the United States: 1900-2012. Plague Worldwide: Plague epidemics have occurred in Africa, Asia, ...
NASA Astrophysics Data System (ADS)
Grégoire, G.
2016-05-01
This chapter is devoted to two objectives. The first is to answer the request, expressed by attendees of the first Astrostatistics School (Annecy, October 2013), to be provided with an elementary vademecum of statistics that would facilitate understanding of the courses given. In this spirit we recall very basic notions, that is, definitions and properties that we think sufficient to benefit from the courses given in the Astrostatistics School. Thus we briefly give definitions and elementary properties of random variables and vectors, distributions, estimation and tests, and maximum likelihood methodology. We intend to present basic ideas in a hopefully comprehensible way. We do not try to give a rigorous presentation and, owing to the space devoted to this chapter, can cover only a rather limited field of statistics. The second aim is to focus on some statistical tools that are useful in classification: a basic introduction to Bayesian statistics, maximum likelihood methodology, Gaussian vectors, and Gaussian mixture models.
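As a sketch of the basic Bayesian ideas mentioned, the conjugate beta-binomial update is the standard first example:

```python
def beta_binomial_update(alpha, beta, successes, failures):
    """Conjugate Bayesian update: a Beta(alpha, beta) prior combined with
    binomial data yields a Beta(alpha + successes, beta + failures) posterior."""
    return alpha + successes, beta + failures

def beta_mean(alpha, beta):
    """Mean of a Beta(alpha, beta) distribution."""
    return alpha / (alpha + beta)

# Uniform prior Beta(1, 1); observe 7 successes in 10 trials.
a, b = beta_binomial_update(1.0, 1.0, 7, 3)
print(beta_mean(a, b))   # posterior mean 8/12, about 0.667
```

Note how the posterior mean (about 0.667) sits between the maximum-likelihood estimate (0.7) and the prior mean (0.5), the shrinkage that distinguishes the Bayesian answer from the pure likelihood one.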
Tuberculosis Data and Statistics
... United States publication. PDF [6 MB]. Interactive TB Data Tool: Online Tuberculosis Information System (OTIS). OTIS is ...
NASA Astrophysics Data System (ADS)
Richfield, Jon; bookfeller
2016-07-01
In reply to Ralph Kenna and Pádraig Mac Carron's feature article “Maths meets myths” in which they describe how they are using techniques from statistical physics to characterize the societies depicted in ancient Icelandic sagas.
... facts and statistics here include brain and central nervous system tumors (including spinal cord, pituitary and pineal gland ... U.S. living with a primary brain and central nervous system tumor. This year, nearly 17,000 people will ...
Purposeful Statistical Investigations
ERIC Educational Resources Information Center
Day, Lorraine
2014-01-01
Lorraine Day provides us with a great range of statistical investigations using various resources such as maths300 and TinkerPlots. Each of the investigations links mathematics to students' lives and provides engaging and meaningful contexts for mathematical inquiry.
Oakland, J.S.
1986-01-01
Addressing the increasing importance for firms to have a thorough knowledge of statistically based quality control procedures, this book presents the fundamentals of statistical process control (SPC) in a non-mathematical, practical way. It provides real-life examples and data drawn from a wide variety of industries. The foundations of good quality management and process control, and control of conformance and consistency during production are given. Offers clear guidance to those who wish to understand and implement modern SPC techniques.
Transportation Statistics Annual Report 1997
Fenn, M.
1997-01-01
This document is the fourth Transportation Statistics Annual Report (TSAR) prepared by the Bureau of Transportation Statistics (BTS) for the President and Congress. As in previous years, it reports on the state of the U.S. transportation system at two levels. First, in Part I, it provides a statistical and interpretive survey of the system—its physical characteristics, its economic attributes, aspects of its use and performance, and the scale and severity of unintended consequences of transportation, such as fatalities and injuries, oil import dependency, and environmental impacts. Part I also explores the state of transportation statistics and the new needs of the rapidly changing world of transportation. Second, Part II of the report, as in prior years, explores in detail the performance of the U.S. transportation system from the perspective of desired social outcomes or strategic goals. This year, the performance aspect of transportation chosen for thematic treatment is “Mobility and Access,” which complements past TSAR theme sections on “The Economic Performance of Transportation” (1995) and “Transportation and the Environment” (1996). Mobility and access are at the heart of the transportation system’s performance from the user’s perspective. In what ways and to what extent does the geographic freedom provided by transportation enhance personal fulfillment of the nation’s residents and contribute to economic advancement of people and businesses? This broad question underlies many of the topics examined in Part II: What is the current level of personal mobility in the United States, and how does it vary by sex, age, income level, urban or rural location, and over time? What factors explain variations? Has transportation helped improve people’s access to work, shopping, recreational facilities, and medical services, and in what ways and in what locations? How have barriers, such as age, disabilities, or lack of an automobile, affected these
A new statistical tool for NOAA local climate studies
NASA Astrophysics Data System (ADS)
Timofeyeva, M. M.; Meyers, J. C.; Hollingshead, A.
2011-12-01
The National Weather Service (NWS) Local Climate Analysis Tool (LCAT) is evolving out of a need to support and enhance the National Oceanic and Atmospheric Administration (NOAA) NWS field offices' ability to efficiently access, manipulate, and interpret local climate data and to characterize climate variability and change impacts. LCAT will enable NOAA's staff to conduct regional and local climate studies using state-of-the-art station and reanalysis gridded data and various statistical techniques for climate analysis. The analysis results will be used for climate services to guide local decision makers in weather- and climate-sensitive actions and to deliver information to the general public. LCAT will augment current climate reference materials with information pertinent to the local and regional levels as it applies to diverse variables appropriate to each locality. The LCAT main emphasis is to enable studies of extreme meteorological and hydrological events such as tornadoes, floods, droughts, severe storms, etc. LCAT will close a very critical gap in NWS local climate services because it will allow addressing climate variables beyond average temperature and total precipitation. NWS external partners and government agencies will benefit from the LCAT outputs, which could be easily incorporated into their own analysis and/or delivery systems. Presently, we have identified five existing requirements for local climate: (1) Local impacts of climate change; (2) Local impacts of climate variability; (3) Drought studies; (4) Attribution of severe meteorological and hydrological events; and (5) Climate studies for water resources. The methodologies for the first three requirements will be included in the LCAT first-phase implementation. The local rate of climate change is defined as the slope of the mean trend estimated from an ensemble of three trend techniques: (1) hinge, (2) Optimal Climate Normals (running mean for optimal time periods), (3) exponentially
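Two of the trend ingredients named, a running-mean "normal" and a least-squares trend, can be sketched simply; the window choice that makes the running mean "optimal" is omitted, and the data are synthetic:

```python
def running_mean_normal(series, window):
    """A 'normal' computed as the mean of the most recent `window` values
    (the running-mean idea behind Optimal Climate Normals; the selection
    of the optimal window length is not shown here)."""
    recent = series[-window:]
    return sum(recent) / len(recent)

def linear_trend(series):
    """Least-squares slope per time step, one simple trend estimator."""
    n = len(series)
    xm, ym = (n - 1) / 2.0, sum(series) / n
    num = sum((i - xm) * (y - ym) for i, y in enumerate(series))
    den = sum((i - xm) ** 2 for i in range(n))
    return num / den

# Synthetic annual temperature anomalies warming at 0.02 per year:
temps = [0.02 * i for i in range(30)]
print(round(linear_trend(temps), 3))             # 0.02
print(round(running_mean_normal(temps, 10), 3))  # 0.49, mean of the last decade
```

A tool like the one described would compare several such estimators (hinge fits, running means of different lengths) and report an ensemble trend rather than trusting any single one.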
Statistical Physics of Particles
NASA Astrophysics Data System (ADS)
Kardar, Mehran
2006-06-01
Statistical physics has its origins in attempts to describe the thermal properties of matter in terms of its constituent particles, and has played a fundamental role in the development of quantum mechanics. Based on lectures for a course in statistical mechanics taught by Professor Kardar at Massachusetts Institute of Technology, this textbook introduces the central concepts and tools of statistical physics. It contains a chapter on probability and related issues such as the central limit theorem and information theory, and covers interacting particles, with an extensive description of the van der Waals equation and its derivation by mean field approximation. It also contains an integrated set of problems, with solutions to selected problems at the end of the book. It will be invaluable for graduate and advanced undergraduate courses in statistical physics. A complete set of solutions is available to lecturers on a password-protected website at www.cambridge.org/9780521873420. Key points: based on lecture notes from a course on statistical mechanics taught by the author at MIT; contains 89 exercises, with solutions to selected problems; includes chapters on probability and interacting particles; ideal for graduate courses in statistical mechanics.
Statistical Physics of Fields
NASA Astrophysics Data System (ADS)
Kardar, Mehran
2006-06-01
While many scientists are familiar with fractals, fewer are familiar with the concepts of scale-invariance and universality which underly the ubiquity of their shapes. These properties may emerge from the collective behaviour of simple fundamental constituents, and are studied using statistical field theories. Based on lectures for a course in statistical mechanics taught by Professor Kardar at Massachusetts Institute of Technology, this textbook demonstrates how such theories are formulated and studied. Perturbation theory, exact solutions, renormalization groups, and other tools are employed to demonstrate the emergence of scale invariance and universality, and the non-equilibrium dynamics of interfaces and directed paths in random media are discussed. Ideal for advanced graduate courses in statistical physics, it contains an integrated set of problems, with solutions to selected problems at the end of the book. A complete set of solutions is available to lecturers on a password-protected website at www.cambridge.org/9780521873413. Key points: based on lecture notes from a course on statistical mechanics taught by the author at MIT; contains 65 exercises, with solutions to selected problems; features a thorough introduction to the methods of statistical field theory; ideal for graduate courses in statistical physics.
R.A. Fisher's contributions to genetical statistics.
Thompson, E A
1990-12-01
R. A. Fisher (1890-1962) was a professor of genetics, and many of his statistical innovations found expression in the development of methodology in statistical genetics. However, whereas his contributions in mathematical statistics are easily identified, in population genetics he shares his preeminence with Sewall Wright (1889-1988) and J. B. S. Haldane (1892-1965). This paper traces some of Fisher's major contributions to the foundations of statistical genetics, and his interactions with Wright and with Haldane which contributed to the development of the subject. With modern technology, both statistical methodology and genetic data are changing. Nonetheless much of Fisher's work remains relevant, and may even serve as a foundation for future research in the statistical analysis of DNA data. For Fisher's work reflects his view of the role of statistics in scientific inference, expressed in 1949: There is no wide or urgent demand for people who will define methods of proof in set theory in the name of improving mathematical statistics. There is a widespread and urgent demand for mathematicians who understand that branch of mathematics known as theoretical statistics, but who are capable also of recognising situations in the real world to which such mathematics is applicable. In recognising features of the real world to which his models and analyses should be applicable, Fisher laid a lasting foundation for statistical inference in genetic analyses. PMID:2085639
Interpretations of cosmological spectral shifts
NASA Astrophysics Data System (ADS)
Østvang, Dag
2013-03-01
It is shown that for Robertson-Walker models with flat or closed space sections, all of the cosmological spectral shift can be attributed to the non-flat connection (and thus indirectly to space-time curvature). For Robertson-Walker models with hyperbolic space sections, it is shown that cosmological spectral shifts split up uniquely into "kinematic" and "gravitational" parts provided that distances are small. For large distances no such unique split-up exists in general. A number of common but incorrect assertions found in the literature regarding interpretations of cosmological spectral shifts are pointed out.
Statistical Data Analyses of Trace Chemical, Biochemical, and Physical Analytical Signatures
Udey, Ruth Norma
2013-01-01
Analytical and bioanalytical chemistry measurement results are most meaningful when interpreted using rigorous statistical treatments of the data. The same data set may provide many dimensions of information depending on the questions asked through the applied statistical methods. Three principal projects illustrated the wealth of information gained through the application of statistical data analyses to diverse problems.
ERIC Educational Resources Information Center
Nolan, Meaghan M.; Beran, Tanya; Hecker, Kent G.
2012-01-01
Students with positive attitudes toward statistics are likely to show strong academic performance in statistics courses. Multiple surveys measuring students' attitudes toward statistics exist; however, a comparison of the validity and reliability of interpretations based on their scores is needed. A systematic review of relevant electronic…
Genetics in geographically structured populations: defining, estimating and interpreting FST
Holsinger, Kent E.; Weir, Bruce S.
2015-01-01
Wright’s F-statistics, and especially FST, provide important insights into the evolutionary processes that influence the structure of genetic variation within and among populations, and they are among the most widely used descriptive statistics in population and evolutionary genetics. Estimates of FST can identify regions of the genome that have been the target of selection, and comparisons of FST from different parts of the genome can provide insights into the demographic history of populations. For these reasons and others, FST has a central role in population and evolutionary genetics and has wide applications in fields that range from disease association mapping to forensic science. This Review clarifies how FST is defined, how it should be estimated, how it is related to similar statistics and how estimates of FST should be interpreted. PMID:19687804
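For a single biallelic locus sampled from two equally sized populations, the classic (H_T - H_S)/H_T form can be computed directly; note this simple Nei-style quantity is an illustration only, not the Weir & Cockerham estimator the review recommends for real data:

```python
def fst_two_pops(p1, p2):
    """F_ST for one biallelic locus from allele frequencies p1, p2 in two
    equally sized populations, using the classic (H_T - H_S)/H_T form.
    This is the simple Nei-style quantity, not the Weir & Cockerham
    estimator appropriate for finite samples."""
    h1 = 2.0 * p1 * (1.0 - p1)          # expected heterozygosity, population 1
    h2 = 2.0 * p2 * (1.0 - p2)          # expected heterozygosity, population 2
    hs = (h1 + h2) / 2.0                # mean within-population heterozygosity
    p_bar = (p1 + p2) / 2.0
    ht = 2.0 * p_bar * (1.0 - p_bar)    # heterozygosity of the pooled population
    return (ht - hs) / ht

print(round(fst_two_pops(0.2, 0.8), 2))  # 0.36: strongly differentiated locus
print(fst_two_pops(0.5, 0.5))            # 0.0: identical allele frequencies
```

Scanning such per-locus values along a genome is the intuition behind using F_ST outliers to flag candidate targets of selection, as the review discusses.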
Ducci, Daniela; de Melo, M Teresa Condesso; Preziosi, Elisabetta; Sellerino, Mariangela; Parrone, Daniele; Ribeiro, Luis
2016-11-01
The natural background level (NBL) concept is revisited and combined with indicator kriging method to analyze the spatial distribution of groundwater quality within a groundwater body (GWB). The aim is to provide a methodology to easily identify areas with the same probability of exceeding a given threshold (which may be a groundwater quality criteria, standards, or recommended limits for selected properties and constituents). Three case studies with different hydrogeological settings and located in two countries (Portugal and Italy) are used to derive NBL using the preselection method and validate the proposed methodology illustrating its main advantages over conventional statistical water quality analysis. Indicator kriging analysis was used to create probability maps of the three potential groundwater contaminants. The results clearly indicate the areas within a groundwater body that are potentially contaminated because the concentrations exceed the drinking water standards or even the local NBL, and cannot be justified by geogenic origin. The combined methodology developed facilitates the management of groundwater quality because it allows for the spatial interpretation of NBL values. PMID:27371772
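The first step of indicator kriging, the indicator transform against a threshold, is easy to sketch; here an inverse-distance-weighted mean stands in for the kriging of the indicators (a deliberate simplification of the paper's method), and the concentrations are hypothetical:

```python
def indicator(values, threshold):
    """Indicator transform used in indicator kriging: 1 where the value
    exceeds the threshold (e.g. a drinking-water standard), else 0."""
    return [1.0 if v > threshold else 0.0 for v in values]

def idw_probability(xs, ind, x0, power=2.0):
    """Probability of exceedance at x0 as an inverse-distance-weighted mean
    of the indicators. A simplified stand-in: the paper kriges the
    indicators instead of distance-weighting them."""
    weights = []
    for i, x in enumerate(xs):
        d = abs(x - x0)
        if d == 0.0:
            return ind[i]
        weights.append(d ** -power)
    total = sum(weights)
    return sum(w * v for w, v in zip(weights, ind)) / total

# Hypothetical nitrate concentrations (mg/L) along a transect, threshold 50:
xs = [0.0, 1.0, 2.0, 3.0]
conc = [60.0, 55.0, 20.0, 10.0]
ind = indicator(conc, 50.0)              # [1.0, 1.0, 0.0, 0.0]
print(round(idw_probability(xs, ind, 0.5), 2))   # 0.93: high exceedance probability
```

Mapping such probabilities over a groundwater body, with thresholds set to NBLs or drinking-water standards, gives exactly the kind of exceedance-probability maps the abstract describes.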
Statistical Physics of Fracture
Alava, Mikko; Nukala, Phani K; Zapperi, Stefano
2006-05-01
Disorder and long-range interactions are two of the key components that make material failure an interesting playfield for the application of statistical mechanics. The cornerstone in this respect has been lattice models of fracture, in which a network of elastic beams, bonds, or electrical fuses with random failure thresholds is subjected to an increasing external load. These models describe on a qualitative level the failure processes of real, brittle, or quasi-brittle materials. This has been particularly important in solving the classical engineering problems of material strength: the size dependence of maximum stress and its sample-to-sample statistical fluctuations. At the same time, lattice models pose many new fundamental questions in statistical physics, such as the relation between fracture and phase transitions. Experimental results point to the existence of an intriguing crackling noise in the acoustic emission and of self-affine fractals in the crack surface morphology. Recent advances in computer power have enabled considerable progress in the understanding of such models. Among these partly still controversial issues are the scaling and size effects in material strength and accumulated damage, the statistics of avalanches or bursts of microfailures, and the morphology of the crack surface. Here we present an overview of the results obtained with lattice models for fracture, highlighting the relations with statistical physics theories and more conventional fracture mechanics approaches.
Statistical Downscaling: Lessons Learned
NASA Astrophysics Data System (ADS)
Walton, D.; Hall, A. D.; Sun, F.
2013-12-01
In this study, we examine ways to improve statistical downscaling of general circulation model (GCM) output. Why do we downscale GCM output? GCMs have low resolution, so they cannot represent local dynamics and topographic effects that cause spatial heterogeneity in the regional climate change signal. Statistical downscaling recovers fine-scale information by utilizing relationships between the large-scale and fine-scale signals to bridge this gap. In theory, the downscaled climate change signal is more credible and accurate than its GCM counterpart, but in practice, there may be little improvement. Here, we tackle the practical problems that arise in statistical downscaling, using temperature change over the Los Angeles region as a test case. This region is an ideal place to apply downscaling since its complex topography and shoreline are poorly simulated by GCMs. By comparing two popular statistical downscaling methods and one dynamical downscaling method, we identify issues with statistically downscaled climate change signals and develop ways to fix them. We focus on scale mismatch, domain of influence, and other problems - many of which users may be unaware of - and discuss practical solutions.
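A minimal sketch of regression-based statistical downscaling with invented training data; operational methods use many predictors, bias correction, and careful validation:

```python
def fit_linear(x, y):
    """Ordinary least squares for y = a + b*x."""
    n = len(x)
    xm, ym = sum(x) / n, sum(y) / n
    b = sum((xi - xm) * (yi - ym) for xi, yi in zip(x, y)) \
        / sum((xi - xm) ** 2 for xi in x)
    return ym - b * xm, b

# Toy training data: coarse GCM grid-cell temperature versus a hypothetical
# coastal station that warms more slowly than the grid-cell average:
gcm     = [14.0, 15.0, 16.0, 17.0, 18.0]
station = [13.0, 13.5, 14.0, 14.5, 15.0]
a, b = fit_linear(gcm, station)

# Downscale a projected grid-cell value to the station scale:
print(a + b * 20.0)   # 16.0
```

The pitfalls the abstract lists, such as scale mismatch and domain of influence, are precisely the ways a fitted relation like this one can be applied outside the conditions under which it was trained.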
Conflicting Interpretations of Scientific Pedagogy
NASA Astrophysics Data System (ADS)
Galamba, Arthur
2016-05-01
Not surprisingly, historical studies have suggested that there is a distance between concepts of teaching methods, their interpretations, and their actual use in the classroom. This issue, however, is not always pitched at the personal level in historical studies, which may provide an alternative insight into how teachers conceptualise and engage with concepts of teaching methods. This article provides a case study at this level of conceptualisation by telling the story of Rómulo de Carvalho, an educator from mid-twentieth-century Portugal who engaged with the heuristic and Socratic methods for over 40 years. The overall argument is that concepts of teaching methods are open to different interpretations and are conceptualised within the melting pot of external social pressures and personal teaching preferences. Carvalho's practice and thoughts about teaching methods are scrutinised to unveil his conflicting stances: Carvalho was a man able to question the tenets of heurism, but who publicly praised the heurism-like "discovery learning" method years later. The first part of the article contextualises the arrival of heurism in Portugal and how Carvalho attacked its philosophical tenets. The second part dwells on his conflicting positions in relation to pupil-centred approaches. The article concludes with an appreciation of the embedded conflicting nature of the appropriation of concepts of teaching methods, and of Carvalho's contribution to the development of the philosophy of practical work in school science.
ERIC Educational Resources Information Center
van Dijk, Rick; Boers, Eveline; Christoffels, Ingrid; Hermans, Daan
2011-01-01
The quality of interpretations produced by sign language interpreters was investigated. Twenty-five experienced interpreters were instructed to interpret narratives from (a) spoken Dutch to Sign Language of the Netherlands (SLN), (b) spoken Dutch to Sign Supported Dutch (SSD), and (c) SLN to spoken Dutch. The quality of the interpreted narratives…
Pocock, Stuart J; McMurray, John J V; Collier, Tim J
2015-12-15
This paper tackles several statistical controversies that are commonly faced when reporting a major clinical trial. Topics covered include: multiplicity of data, interpreting secondary endpoints and composite endpoints, the value of covariate adjustment, the traumas of subgroup analysis, assessing individual benefits and risks, alternatives to analysis by intention to treat, interpreting surprise findings (good and bad), and the overall quality of clinical trial reports. All is put in the context of topical cardiology trial examples and is geared to help trialists steer a wise course in their statistical reporting, thereby giving readers a balanced account of trial findings. PMID:26670066
At 11 months, prosody still outranks statistics.
Johnson, Elizabeth K; Seidl, Amanda H
2009-01-01
English-learning 7.5-month-olds are heavily biased to perceive stressed syllables as word onsets. By 11 months, however, infants begin segmenting non-initially stressed words from speech. Using the same artificial language methodology as Johnson and Jusczyk (2001), we explored the possibility that the emergence of this ability is linked to a decreased reliance on prosodic cues to word boundaries accompanied by an increased reliance on syllable distribution cues. In a baseline study, where only statistical cues to word boundaries were present, infants exhibited a familiarity preference for statistical words. When conflicting stress cues were added to the speech stream, infants exhibited a familiarity preference for stress as opposed to statistical words. This was interpreted as evidence that 11-month-olds weight stress cues to word boundaries more heavily than statistical cues. Experiment 2 further investigated these results with a language containing convergent cues to word boundaries. The results of Experiment 2 were not conclusive. A third experiment using new stimuli and a different experimental design supported the conclusion that 11-month-olds rely more heavily on prosodic than statistical cues to word boundaries. We conclude that the emergence of the ability to segment non-initially stressed words from speech is not likely to be tied to an increased reliance on syllable distribution cues relative to stress cues, but instead may emerge due to an increased reliance on and integration of a broad array of segmentation cues. PMID:19120421
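The syllable-distribution cue discussed here is usually operationalized as the transitional probability between adjacent syllables, TP(a→b) = count(ab) / count(a): high within words, low across word boundaries. The sketch below is a generic illustration of that computation; the toy "words" and stream are invented for the example and are not the stimuli of these experiments.

```python
import random
from collections import Counter

def transitional_probs(syllables):
    """TP(a -> b) = count(ab) / count(a), the syllable-distribution
    (statistical) cue to word boundaries."""
    pair_counts = Counter(zip(syllables, syllables[1:]))
    first_counts = Counter(syllables[:-1])
    return {(a, b): c / first_counts[a] for (a, b), c in pair_counts.items()}

# Toy stream: two invented "words" concatenated in random order, so
# within-word transitions are perfectly predictable while transitions
# across word boundaries are not.
rng = random.Random(3)
words = [["do", "bi", "ta"], ["gu", "la", "ro"]]
stream = [syll for _ in range(200) for syll in rng.choice(words)]
tp = transitional_probs(stream)

print(tp[("do", "bi")])              # within-word TP: 1.0
print(0.3 < tp[("ta", "gu")] < 0.7)  # across-boundary TP near 0.5: True
```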
Suite versus composite statistics
Balsillie, J.H.; Tanner, W.F.
1999-01-01
Suite and composite methodologies, two statistically valid approaches for producing descriptive statistical measures, are investigated for sample groups representing a probability distribution in which, in addition, each sample is itself a probability distribution. Suite and composite means (first-moment measures) are always equivalent. Composite standard deviations (second-moment measures) are always larger than suite standard deviations. Suite and composite values for higher-moment measures have more complex relationships. Very seldom, however, are they equivalent, and they normally yield statistically significant but different results. Multiple samples are preferable to single samples (including composites) because they permit the investigator to examine sample-to-sample variability. These and other relationships for suite and composite probability distribution analyses are investigated and reported using granulometric data.
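The suite/composite distinction can be made concrete with a small simulation. Assuming the common definitions (suite: compute the moment for each sample, then average across samples; composite: pool all observations, then compute the moment once), the sketch below reproduces two of the stated relationships for equal-sized samples; the sample means and sizes are arbitrary choices for the example.

```python
import random
import statistics

random.seed(42)
# Four equal-sized "samples", each itself a distribution of measurements
# (e.g. grain sizes), with different sample means.
samples = [[random.gauss(mu, 1.0) for _ in range(500)]
           for mu in (1.0, 2.0, 3.5, 5.0)]

# Suite statistics: compute the moment per sample, then average.
suite_mean = statistics.mean(statistics.mean(s) for s in samples)
suite_std = statistics.mean(statistics.pstdev(s) for s in samples)

# Composite statistics: pool every observation, then compute the moment once.
pooled = [x for s in samples for x in s]
composite_mean = statistics.mean(pooled)
composite_std = statistics.pstdev(pooled)

print(abs(suite_mean - composite_mean) < 1e-9)  # means agree: True
print(composite_std > suite_std)  # pooling adds between-sample spread: True
```

The composite standard deviation exceeds the suite value because the pooled variance includes the spread of the sample means on top of the average within-sample variance.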
Candidate Assembly Statistical Evaluation
1998-07-15
The Savannah River Site (SRS) receives aluminum clad spent Material Test Reactor (MTR) fuel from all over the world for storage and eventual reprocessing. There are hundreds of different kinds of MTR fuels and these fuels will continue to be received at SRS for approximately ten more years. SRS's current criticality evaluation methodology requires the modeling of all MTR fuels utilizing Monte Carlo codes, which is extremely time consuming and resource intensive. Now that a significant number of MTR calculations have been conducted it is feasible to consider building statistical models that will provide reasonable estimations of MTR behavior. These statistical models can be incorporated into a standardized model homogenization spreadsheet package to provide analysts with a means of performing routine MTR fuel analyses with a minimal commitment of time and resources. This became the purpose for development of the Candidate Assembly Statistical Evaluation (CASE) program at SRS.
Biostratinomic utility of Archimedes in environmental interpretation
Wulff, J.I. )
1990-04-01
Biostratinomic information from the bryozoan Archimedes can be used to infer paleocurrent senses when other, more traditional sedimentary structures are lacking. As with other elongate particles, Archimedes zoaria become oriented in the current and, upon settling, preserve a sense of the flow direction. Orientations and lengths were measured on over 200 individuals from bedding plane exposures in the Upper Mississippian Union Limestone (Greenbrier Group) of West Virginia. These were separated into long and short populations and plotted on rose diagrams. The results show that long and short segments become preferentially oriented in the current and that the bimodally distributed long segments can be used to infer the current sense. The current sense is defined by the line that bisects the obtuse angle created by the two maxima in the rose diagram for long segments. Statistical evaluation of the long and short populations indicates that they are significant at the 99.9 percent level. Elongate fossils such as Archimedes can be used in paleocurrent evaluations and can add more detail to the interpretation of paleodepositional conditions.
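The obtuse-angle-bisector rule can be stated compactly for axial (0-180 degree) orientation data. The helper below is a hypothetical illustration of that geometric rule, not code from the study; the input orientations are invented.

```python
def obtuse_bisector(a, b):
    """Trend (degrees, in [0, 180)) bisecting the obtuse angle between two
    axial orientations a and b, each given in degrees in [0, 180)."""
    m = ((a + b) / 2.0) % 180.0  # bisects the angle of size |a - b|
    d = abs(a - b)
    # The two bisectors of a pair of lines are 90 degrees apart; pick the
    # one that bisects the obtuse (> 90 degree) angle between the axes.
    return m if d > 90.0 else (m + 90.0) % 180.0

print(obtuse_bisector(20.0, 120.0))  # -> 70.0
print(obtuse_bisector(30.0, 60.0))   # -> 135.0
```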
Hadziabdic, Emina
2016-01-01
The aim of this pilot study was to investigate Ukrainian-speaking migrants' attitudes to the use of interpreters in healthcare services, in order to test a newly developed questionnaire and recruitment strategy. A descriptive survey design was used: a 51-item structured self-administered questionnaire was completed by 12 Ukrainian-speaking migrants and analyzed by descriptive statistics. The main finding was that respondents wanted an interpreter to act as an objective communication and practical aid, with personal qualities such as a good knowledge of languages and translation ability. In contrast, the clothes worn by the interpreter and the interpreter's religion were not viewed as important aspects. The findings support the developed questionnaire and recruitment strategy, which in turn can be used in a larger planned investigation of the same topic, in order to arrange a good interpretation situation in accordance with each person's wishes, irrespective of countries' different healthcare policy rules regarding interpretation. PMID:27014391
A statistical natural language processor for medical reports.
Taira, R. K.; Soderland, S. G.
1999-01-01
Statistical natural language processors have been the focus of much research during the past decade. The main advantage of such an approach over grammatical rule-based approaches is its scalability to new domains. We present a statistical NLP for the domain of radiology and report on methods of knowledge acquisition, parsing, semantic interpretation, and evaluation. Preliminary performance data are given. A discussion of the perceived benefit, limitations and future work is presented. PMID:10566505
Statistical model with a standard Γ distribution
NASA Astrophysics Data System (ADS)
Patriarca, Marco; Chakraborti, Anirban; Kaski, Kimmo
2004-07-01
We study a statistical model consisting of N basic units which interact with each other by exchanging a physical entity, according to a given microscopic random law, depending on a parameter λ . We focus on the equilibrium or stationary distribution of the entity exchanged and verify through numerical fitting of the simulation data that the final form of the equilibrium distribution is that of a standard Gamma distribution. The model can be interpreted as a simple closed economy in which economic agents trade money and a saving criterion is fixed by the saving propensity λ . Alternatively, from the nature of the equilibrium distribution, we show that the model can also be interpreted as a perfect gas at an effective temperature T(λ) , where particles exchange energy in a space with an effective dimension D(λ) .
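A model of this kind can be sketched in a few lines. The simulation below is an illustrative kinetic money-exchange with a fixed saving propensity λ, in the spirit of the model described; the agent count, step count, seed, and initial conditions are assumptions chosen for the example.

```python
import random

def simulate(n_agents=1000, n_steps=200000, lam=0.5, seed=1):
    """Kinetic money-exchange model with saving propensity lam: two random
    agents pool the non-saved fraction of their money and split it at a
    random ratio. Total money is conserved at every exchange."""
    rng = random.Random(seed)
    money = [1.0] * n_agents  # everyone starts with one unit
    for _ in range(n_steps):
        i, j = rng.randrange(n_agents), rng.randrange(n_agents)
        if i == j:
            continue
        eps = rng.random()
        pool = (1.0 - lam) * (money[i] + money[j])
        money[i] = lam * money[i] + eps * pool
        money[j] = lam * money[j] + (1.0 - eps) * pool
    return money

money = simulate()
print(abs(sum(money) - 1000.0) < 1e-6)  # conservation: True
print(min(money) >= 0.0)                # no agent goes negative: True
```

Fitting a histogram of `money` against a Gamma density is then the kind of numerical check the abstract describes.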
Application of Statistics in Establishing Diagnostic Certainty
Denegar, Craig R.; Cordova, Mitchell L.
2012-01-01
The examination and assessment of injured and ill patients leads to the establishment of a diagnosis. However, the tests and procedures used in health care, including procedures performed by certified athletic trainers, are individually and collectively imperfect in confirming or ruling out a condition of concern. Thus, research into the utility of diagnostic tests is needed to identify the procedures that are most helpful and to indicate the confidence one should place in the results of the test. The purpose of this report is to provide an overview of selected statistical procedures and the interpretation of data appropriate for assessing the utility of diagnostic tests with dichotomous (positive or negative) outcomes, with particular attention to the interpretation of sensitivity and specificity estimates and the reporting of confidence intervals around likelihood ratio estimates. PMID:22488292
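For a dichotomous test summarized in a 2x2 table, the quantities discussed here reduce to a few formulas. The sketch below computes sensitivity, specificity, and the positive likelihood ratio with a confidence interval from the standard large-sample log-scale method; the 2x2 counts are hypothetical and chosen only for illustration.

```python
import math

def diagnostic_stats(tp, fp, fn, tn):
    """Sensitivity, specificity, positive likelihood ratio, and a 95% CI
    for LR+ computed on the log scale (large-sample method)."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    lr_pos = sens / (1.0 - spec)
    # Standard error of ln(LR+) for the log-method interval.
    se_ln = math.sqrt(1/tp - 1/(tp + fn) + 1/fp - 1/(fp + tn))
    lo = math.exp(math.log(lr_pos) - 1.96 * se_ln)
    hi = math.exp(math.log(lr_pos) + 1.96 * se_ln)
    return sens, spec, lr_pos, (lo, hi)

# Hypothetical 2x2 table: 90 TP, 10 FN, 20 FP, 180 TN.
sens, spec, lr, ci = diagnostic_stats(tp=90, fp=20, fn=10, tn=180)
print(round(sens, 2), round(spec, 2), round(lr, 1))  # 0.9 0.9 9.0
print(ci[0] < lr < ci[1])  # point estimate lies inside its CI: True
```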
FIR statistics of paired galaxies
NASA Technical Reports Server (NTRS)
Sulentic, Jack W.
1990-01-01
Much progress has been made in understanding the effects of interaction on galaxies (see reviews in this volume by Heckman and Kennicutt). Evidence for enhanced emission from galaxies in pairs first emerged in the radio (Sulentic 1976) and optical (Larson and Tinsley 1978) domains. Results in the far infrared (FIR) lagged behind until the advent of the Infrared Astronomy Satellite (IRAS). The last five years have seen numerous FIR studies of optical and IR selected samples of interacting galaxies (e.g., Cutri and McAlary 1985; Joseph and Wright 1985; Kennicutt et al. 1987; Haynes and Herter 1988). Despite all of this work, there are still contradictory ideas about the level and, even, the reality of an FIR enhancement in interacting galaxies. Much of the confusion originates in differences between the galaxy samples that were studied (i.e., optical morphology and redshift coverage). Here, the authors report on a study of the FIR detection properties for a large sample of interacting galaxies and a matching control sample. They focus on the distance independent detection fraction (DF) statistics of the sample. The results prove useful in interpreting the previously published work. A clarification of the phenomenology provides valuable clues about the physics of the FIR enhancement in galaxies.
Verbal framing of statistical evidence drives children's preference inferences.
Garvin, Laura E; Woodward, Amanda L
2015-05-01
Although research has shown that statistical information can support children's inferences about specific psychological causes of others' behavior, previous work leaves open the question of how children interpret statistical information in more ambiguous situations. The current studies investigated the effect of specific verbal framing information on children's ability to infer mental states from statistical regularities in behavior. We found that preschool children inferred others' preferences from their statistically non-random choices only when they were provided with verbal information placing the person's behavior in a specifically preference-related context, not when the behavior was presented in a non-mentalistic action context or an intentional choice context. Furthermore, verbal framing information showed some evidence of supporting children's mental state inferences even from more ambiguous statistical data. These results highlight the role that specific, relevant framing information can play in supporting children's ability to derive novel insights from statistical information. PMID:25704581
Weighted order statistic classifiers with large rank-order margin.
Porter, R. B.; Hush, D. R.; Theiler, J. P.; Gokhale, M.
2003-01-01
We describe how Stack Filter and Weighted Order Statistic function classes can be used for classification problems. This leads to a new design criterion for linear classifiers when inputs are binary-valued and weights are positive. We present a rank-based measure of margin that can be directly optimized as a standard linear program and investigate its effect on generalization error experimentally. Our approach can robustly combine large numbers of base hypotheses and easily implement known priors through regularization.
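A weighted order statistic on binary inputs can be written directly: replicate each input according to its (positive, integer) weight, sort, and read off a given rank. For 0/1 inputs this is equivalent to a positive-weight linear threshold test, which is the connection to linear classifiers noted above. The function and the numbers below are illustrative assumptions, not the authors' implementation.

```python
def wos_classify(x, weights, rank):
    """Weighted order statistic on binary inputs: replicate each input
    weights[i] times, sort descending, and return the rank-th largest.
    For 0/1 inputs this equals 1 iff sum(w_i * x_i) >= rank, i.e. a
    positive-weight linear threshold test."""
    expanded = []
    for xi, wi in zip(x, weights):
        expanded.extend([xi] * wi)
    expanded.sort(reverse=True)
    return expanded[rank - 1]

x, w = [1, 0, 1, 1], [2, 3, 1, 1]
print(wos_classify(x, w, rank=4))  # weighted count of ones is 4 >= 4 -> 1
print(wos_classify(x, w, rank=5))  # 4 < 5 -> 0
```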
Structural and Geological Interpretation of Posidonius Crater on the Moon
NASA Astrophysics Data System (ADS)
Ishihara, Y.; Chiba, T.; Haruyama, J.; Otake, H.; Ohtake, M.
2015-12-01
Posidonius crater is located on the northeastern rim of the Serenitatis basin and is a typical floor-fractured crater. Because Posidonius lies on the central lunar nearside and is easily observed by ground-based telescopes, the complex texture of its floor attracted the attention of planetary scientists even before the era of lunar exploration. However, the origin and formation history of the floor fractures are not yet fully resolved. In this study, we estimate the geologic history of Posidonius crater based on topographic data and multiband image data obtained by the Terrain Camera (TC) and Multiband Imager (MI) onboard Kaguya. Part of the crater floor of Posidonius is flooded by mare basalt. Previous studies interpreted the source of this mare basalt as located somewhere in Mare Serenitatis, with the basalt flooding into Posidonius crater; the sinuous rille (Rimae Posidonius) would then be the resulting structure of the flooded basalt flow. However, the TC topographic data indicate a flow direction opposite to these previous interpretations. Based on the TC topographic data, we interpret the topographic features as follows: Rimae Posidonius flows from a volcanic vent located at the northern edge of the Posidonius crater floor and flows out into Mare Serenitatis at the western rim; the central part of the crater floor is slightly tilted to the west and broken in several regions. From the band depths of the MI data, the eastern part of the crater floor consists mostly of highland materials, and the complex rilles basically do not show a basaltic signature. Combining both analysis results, we interpret the cause of the complex structure of Posidonius crater as follows: after crater formation, a large sill intruded below the crater floor, and the uppermost layer of the floor was delaminated from the basement and now floats on the basaltic intrusion like an "otoshibuta" (a Japanese-style drop lid used in stewing). The complex fractures were probably formed by mechanical stress during this delamination and flotation stage.
Gehrmann, Thies; Reinders, Marcel J.T.
2015-01-01
Background: With more and more genomes being sequenced, detecting synteny between genomes becomes more and more important. However, for microorganisms the genomic divergence quickly becomes large, resulting in different codon usage and shuffling of gene order and gene elements such as exons. Results: We present Proteny, a methodology to detect synteny between diverged genomes. It operates on the amino acid sequence level to be insensitive to codon usage adaptations and clusters groups of exons disregarding order to handle diversity in genomic ordering between genomes. Furthermore, Proteny assigns significance levels to the syntenic clusters such that they can be selected on statistical grounds. Finally, Proteny provides novel ways to visualize results at different scales, facilitating the exploration and interpretation of syntenic regions. We test the performance of Proteny on a standard ground truth dataset, and we illustrate the use of Proteny on two closely related genomes (two different strains of Aspergillus niger) and on two distant genomes (two species of Basidiomycota). In comparison to other tools, we find that Proteny finds clusters with more true homologies in fewer clusters that contain more genes, i.e. Proteny is able to identify a more consistent synteny. Further, we show how genome rearrangements, assembly errors, gene duplications and the conservation of specific genes can be easily studied with Proteny. Availability and implementation: Proteny is freely available at the Delft Bioinformatics Lab website http://bioinformatics.tudelft.nl/dbl/software. Contact: t.gehrmann@tudelft.nl Supplementary information: Supplementary data are available at Bioinformatics online. PMID:26116928
The wetland continuum: a conceptual framework for interpreting biological studies
Euliss, N.H., Jr.; LaBaugh, J.W.; Fredrickson, L.H.; Mushet, D.M.; Swanson, G.A.; Winter, T.C.; Rosenberry, D.O.; Nelson, R.D.
2004-01-01
We describe a conceptual model, the wetland continuum, which allows wetland managers, scientists, and ecologists to consider simultaneously the influence of climate and hydrologic setting on wetland biological communities. Although multidimensional, the wetland continuum is most easily represented as a two-dimensional gradient, with ground water and atmospheric water constituting the horizontal and vertical axis, respectively. By locating the position of a wetland on both axes of the continuum, the potential biological expression of the wetland can be predicted at any point in time. The model provides a framework useful in the organization and interpretation of biological data from wetlands by incorporating the dynamic changes these systems undergo as a result of normal climatic variation rather than placing them into static categories common to many wetland classification systems. While we developed this model from the literature available for depressional wetlands in the prairie pothole region of North America, we believe the concept has application to wetlands in many other geographic locations.
Glaciation of northwestern Wyoming interpreted from ERTS-1
NASA Technical Reports Server (NTRS)
Breckenridge, R. M.
1973-01-01
Analysis of ERTS imagery has shown that a number of alpine glacial features can be recognized and mapped successfully. Although the Wyoming mountains are generally regarded as the type locality for Rocky Mountain glaciation, some areas have not been studied from a glacial standpoint because of inaccessibility or lack of topographic control. ERTS imagery provides an excellent base for this type of regional geomorphic study. A map of the maximum extent of Wisconsin ice, flow directions, and major glacial features was compiled from interpretation of the ERTS imagery. Features which can be mapped include large moraines, outwash fans, and terraces. Present-day glaciers and snowfields are easily discriminated and mapped. Glaciers and glacial deposits, which serve as aquifers, play a significant role in the hydrologic cycle and are important because of the increasing demand placed on our water resources. ERTS provides a quick and effective method for change detection and inventory of these vital resources.
Interpreting in the Spirit of ADA!
ERIC Educational Resources Information Center
Hughes, Amanda
1995-01-01
Discusses work on a pilot project to develop guidelines for accessible interpretation that integrates disabled and mainstream visitors. Discusses program and facility designs and suggests ways to make existing interpretive programs more accessible to visitors with disabilities. (LZ)
CONTEMPORARY ENVIRONMENTAL APPLICATIONS OF PHOTOGRAPHIC INTERPRETATION
Aerial photographic interpretation is a time-tested technique for extracting landscape-level information from aerial photographs and other types of remotely sensed images. The U.S. Environmental Protection Agency's Environmental Photographic Interpretation Center (EPIC) has a 2...
Interpreting neurodynamics: concepts and facts
Rotter, Stefan
2008-01-01
The dynamics of neuronal systems, briefly neurodynamics, has developed into an attractive and influential research branch within neuroscience. In this paper, we discuss a number of conceptual issues in neurodynamics that are important for an appropriate interpretation and evaluation of its results. We demonstrate their relevance for selected topics of theoretical and empirical work. In particular, we refer to the notions of determinacy and stochasticity in neurodynamics across levels of microscopic, mesoscopic and macroscopic descriptions. The issue of correlations between neural, mental and behavioral states is also addressed in some detail. We propose an informed discussion of conceptual foundations with respect to neurobiological results as a viable step to a fruitful future philosophy of neuroscience. PMID:19003452
Interpretation of rapidly rotating pulsars
Weber, F. (Inst. fuer Theoretische Physik); Glendenning, N.K.
1992-08-05
The minimum possible rotational period of pulsars, which are interpreted as rotating neutron stars, is determined by applying a representative collection of realistic nuclear equations of state. It is found that none of the selected equations of state allows for neutron star rotation at periods below 0.8--0.9 ms. Thus, this work strongly supports the suggestion that if pulsars with shorter rotational periods were found, these are likely to be strange-quark-matter stars. The conclusion that the confined hadronic phase of nucleons and nuclei is only metastable would then be almost inescapable, and the plausible ground-state in that event is the deconfined phase of (3-flavor) strange-quark-matter.
The interpretation of biological surveys.
Bell, Graham
2003-01-01
Biological surveys provide the raw material for assembling ecological patterns. These include the properties of parameters such as range, abundance, dispersion, evenness and diversity; the relationships between these parameters; the relationship between geographical distributions and landscape structure; and the co-occurrence of species. These patterns have often been used in the past to evaluate the role of ecological processes in structuring natural communities. In this paper, I investigate the patterns produced by simple neutral community models (NCMs) and compare them with the output of systematic biological surveys. The NCM generates qualitatively, and in some cases quantitatively, the same patterns as the survey data. It therefore provides a satisfactory general theory of diversity and distribution, although what patterns can be used to distinguish neutral from adaptationist interpretations of communities, or even whether such patterns exist, remains unclear. PMID:14728774
Proverb interpretation changes in aging.
Uekermann, Jennifer; Thoma, Patrizia; Daum, Irene
2008-06-01
Recent investigations have emphasized the involvement of fronto-subcortical networks to proverb comprehension. Although the prefrontal cortex is thought to be affected by normal aging, relatively little work has been carried out to investigate potential effects of aging on proverb comprehension. In the present investigation participants in three age groups were assessed on a proverb comprehension task and a range of executive function tasks. The older group showed impairment in selecting correct interpretations from alternatives. They also showed executive function deficits, as reflected by reduced working memory and deficient set shifting and inhibition abilities. The findings of the present investigation showed proverb comprehension deficits in normal aging which appeared to be related to reduced executive skills. PMID:18164527
QUALITATIVE INTERPRETATION OF GALAXY SPECTRA
Sanchez Almeida, J.; Morales-Luis, A. B.; Terlevich, R.; Terlevich, E.; Cid Fernandes, R.
2012-09-10
We describe a simple step-by-step guide to qualitative interpretation of galaxy spectra. Rather than an alternative to existing automated tools, it is put forward as an instrument for quick-look analysis and for gaining physical insight when interpreting the outputs provided by automated tools. Though the recipe is for general application, it was developed for understanding the nature of the Automatic Spectroscopic K-means-based (ASK) template spectra. They resulted from the classification of all the galaxy spectra in the Sloan Digital Sky Survey data release 7, thus being a comprehensive representation of the galaxy spectra in the local universe. Using the recipe, we give a description of the properties of the gas and the stars that characterize the ASK classes, from those corresponding to passively evolving galaxies, to H II galaxies undergoing a galaxy-wide starburst. The qualitative analysis is found to be in excellent agreement with quantitative analyses of the same spectra. We compare the mean ages of the stellar populations with those inferred using the code STARLIGHT. We also examine the estimated gas-phase metallicity with the metallicities obtained using electron-temperature-based methods. A number of byproducts follow from the analysis. There is a tight correlation between the age of the stellar population and the metallicity of the gas, which is stronger than the correlations between galaxy mass and stellar age, and galaxy mass and gas metallicity. The galaxy spectra are known to follow a one-dimensional sequence, and we identify the luminosity-weighted mean stellar age as the affine parameter that describes the sequence. All ASK classes happen to have a significant fraction of old stars, although spectrum-wise they are outshined by the youngest populations. Old stars are metal-rich or metal-poor depending on whether they reside in passive galaxies or in star-forming galaxies.
Analogies for Understanding Statistics
ERIC Educational Resources Information Center
Hocquette, Jean-Francois
2004-01-01
This article describes a simple way to explain the limitations of statistics to scientists and students to avoid the publication of misleading conclusions. Biologists examine their results extremely critically and carefully choose the appropriate analytic methods depending on their scientific objectives. However, no such close attention is usually…
Statistical methods in microbiology.
Ilstrup, D M
1990-01-01
Statistical methodology is viewed by the average laboratory scientist, or physician, sometimes with fear and trepidation, occasionally with loathing, and seldom with fondness. Statistics may never be loved by the medical community, but it does not have to be hated by them. It is true that statistical science is sometimes highly mathematical, always philosophical, and occasionally obtuse, but for the majority of medical studies it can be made palatable. The goal of this article has been to outline a finite set of methods of analysis that investigators should choose based on the nature of the variable being studied and the design of the experiment. The reader is encouraged to seek the advice of a professional statistician when there is any doubt about the appropriate method of analysis. A statistician can also help the investigator with problems that have nothing to do with statistical tests, such as quality control, choice of response variable and comparison groups, randomization, and blinding of assessment of response variables. PMID:2200604
Statistical Energy Analysis Program
NASA Technical Reports Server (NTRS)
Ferebee, R. C.; Trudell, R. W.; Yano, L. I.; Nygaard, S. I.
1985-01-01
Statistical Energy Analysis (SEA) is a powerful tool for estimating high-frequency vibration spectra of complex structural systems and has been incorporated into a computer program. The basic SEA analysis procedure is divided into three steps: idealization, parameter generation, and problem solution. The SEA computer program is written in FORTRAN V for batch execution.
Statistical Significance Testing.
ERIC Educational Resources Information Center
McLean, James E., Ed.; Kaufman, Alan S., Ed.
1998-01-01
The controversy about the use or misuse of statistical significance testing has become the major methodological issue in educational research. This special issue contains three articles that explore the controversy, three commentaries on these articles, an overall response, and three rejoinders by the first three authors. They are: (1)…
Education Statistics Quarterly, 2003.
ERIC Educational Resources Information Center
Marenus, Barbara; Burns, Shelley; Fowler, William; Greene, Wilma; Knepper, Paula; Kolstad, Andrew; McMillen Seastrom, Marilyn; Scott, Leslie
2003-01-01
This publication provides a comprehensive overview of work done across all parts of the National Center for Education Statistics (NCES). Each issue contains short publications, summaries, and descriptions that cover all NCES publications and data products released in a 3-month period. Each issue also contains a message from the NCES on a timely…
Spitball Scatterplots in Statistics
ERIC Educational Resources Information Center
Wagaman, John C.
2012-01-01
This paper describes an active learning idea that I have used in my applied statistics class as a first lesson in correlation and regression. Students propel spitballs from various standing distances from the target and use the recorded data to determine if the spitball accuracy is associated with standing distance and review the algebra of lines…
Lack of Statistical Significance
ERIC Educational Resources Information Center
Kehle, Thomas J.; Bray, Melissa A.; Chafouleas, Sandra M.; Kawano, Takuji
2007-01-01
Criticism has been leveled against the use of statistical significance testing (SST) in many disciplines. However, the field of school psychology has been largely devoid of critiques of SST. Inspection of the primary journals in school psychology indicated numerous examples of SST with nonrandom samples and/or samples of convenience. In this…
Juvenile Court Statistics - 1972.
ERIC Educational Resources Information Center
Office of Youth Development (DHEW), Washington, DC.
This report is a statistical study of juvenile court cases in 1972. The data demonstrates how the court is frequently utilized in dealing with juvenile delinquency by the police as well as by other community agencies and parents. Excluded from this report are the ordinary traffic cases handled by juvenile court. The data indicate that: (1) in…
Library Research and Statistics.
ERIC Educational Resources Information Center
Lynch, Mary Jo; St. Lifer, Evan; Halstead, Kent; Fox, Bette-Lee; Miller, Marilyn L.; Shontz, Marilyn L.
2001-01-01
These nine articles discuss research and statistics on libraries and librarianship, including libraries in the United States, Canada, and Mexico; acquisition expenditures in public, academic, special, and government libraries; price indexes; state rankings of public library data; library buildings; expenditures in school library media centers; and…
Foundations of Statistical Seismology
NASA Astrophysics Data System (ADS)
Vere-Jones, David
2010-06-01
A brief account is given of the principles of stochastic modelling in seismology, with special regard to the role and development of stochastic models for seismicity. Stochastic models are seen as arising in a hierarchy of roles in seismology, as in other scientific disciplines. At their simplest, they provide a convenient descriptive tool for summarizing data patterns; in engineering and other applications, they provide a practical way of bridging the gap between the detailed modelling of a complex system, and the need to fit models to limited data; at the most fundamental level they arise as a basic component in the modelling of earthquake phenomena, analogous to that of stochastic models in statistical mechanics or turbulence theory. As an emerging subdiscipline, statistical seismology includes elements of all of these. The scope for the development of stochastic models depends crucially on the quantity and quality of the available data. The availability of extensive, high-quality catalogues and other relevant data lies behind the recent explosion of interest in statistical seismology. At just such a stage, it seems important to review the underlying principles on which statistical modelling is based, and that is the main purpose of the present paper.
Graduate Statistics: Student Attitudes
ERIC Educational Resources Information Center
Kennedy, Robert L.; Broadston, Pamela M.
2004-01-01
This study investigated the attitudes toward statistics of graduate students who used a computer program as part of the instruction, which allowed for an individualized, self-paced, student-centered, activity-based course. The twelve sections involved in this study were offered in the spring and fall 2001, spring and fall 2002, spring and fall…
Geopositional Statistical Methods
NASA Technical Reports Server (NTRS)
Ross, Kenton
2006-01-01
RMSE-based methods distort circular error estimates (up to 50% overestimation). The empirical approach is the only statistically unbiased estimator offered. The Ager modification to the Shultz approach is nearly unbiased, but cumbersome. All methods hover around 20% uncertainty (at 95% confidence) for low geopositional bias error estimates. This requires careful consideration in the assessment of higher-accuracy products.
Statistical Reasoning over Lunch
ERIC Educational Resources Information Center
Selmer, Sarah J.; Bolyard, Johnna J.; Rye, James A.
2011-01-01
Students in the 21st century are exposed daily to a staggering amount of numerically infused media. In this era of abundant numeric data, students must be able to engage in sound statistical reasoning when making life decisions after exposure to varied information. The context of nutrition can be used to engage upper elementary and middle school…
Fractional statistics and confinement
NASA Astrophysics Data System (ADS)
Gaete, P.; Wotzasek, C.
2005-02-01
It is shown that a pointlike composite having charge and magnetic moment displays a confining potential for the static interaction while simultaneously obeying fractional statistics in a pure gauge theory in three dimensions, without a Chern-Simons term. This result is distinct from the Maxwell-Chern-Simons theory that shows a screening nature for the potential.
Statistics for Learning Genetics
ERIC Educational Resources Information Center
Charles, Abigail Sheena
2012-01-01
This study investigated the knowledge and skills that biology students may need to help them understand statistics/mathematics as it applies to genetics. The data are based on analyses of current representative genetics texts, practicing genetics professors' perspectives, and more directly, students' perceptions of, and performance in,…
ERIC Educational Resources Information Center
Akram, Muhammad; Siddiqui, Asim Jamal; Yasmeen, Farah
2004-01-01
In order to learn the concept of statistical techniques one needs to run real experiments that generate reliable data. In practice, the data from some well-defined process or system is very costly and time consuming. It is difficult to run real experiments during the teaching period in the university. To overcome these difficulties, statisticians…
Digital Image Quality And Interpretability: Database And Hardcopy Studies
NASA Astrophysics Data System (ADS)
Snyder, H. L.; Maddox, M. E.; Shedivy, D. I.; Turpin, J. A.; Burke, J. J.; Strickland, R. N.
1982-02-01
Two hundred fifty transparencies, displaying a new digital database consisting of 25 degraded versions (5 blur levels x 5 noise levels) of each of 10 digitized, first-generation positive transparencies, were used in two experiments involving 15 trained military photointerpreters. Each image is 86 mm square and represents 4096 x 4096 8-bit pixels. In the "interpretation" experiment, each photointerpreter (judge) spent approximately two days extracting essential elements of information (EEIs) from one degraded version of each scene at a constant Gaussian blur level (FWHM = 40, 84, or 322 μm). In the scaling experiment, each judge assigned a numerical value to each of the 250 images, according to its perceived position on a 10-point NATO-standardized scale (0 = useless through 9 = nearly perfect), to the nearest 0.1 unit. Eighty-eight of the 100 possible values were used by the judges, indicating that 62 categories, based on the Shannon-Wiener measure of information, are needed to scale these hardcopy images. The overall correlation between the scaling and interpretation results was 0.9. Though the main effect of blur was not statistically significant in the interpretation experiment, that of noise was significant, and all main factors (blur, noise, scene, order of battle) and most interactions were statistically significant in the scaling experiment.
Statistics for Learning Genetics
NASA Astrophysics Data System (ADS)
Charles, Abigail Sheena
This study investigated the knowledge and skills that biology students may need to help them understand statistics/mathematics as it applies to genetics. The data are based on analyses of current representative genetics texts, practicing genetics professors' perspectives, and more directly, students' perceptions of, and performance in, doing statistically-based genetics problems. This issue is at the emerging edge of modern college-level genetics instruction, and this study attempts to identify key theoretical components for creating a specialized biological statistics curriculum. The goal of this curriculum will be to prepare biology students with the skills for assimilating quantitatively-based genetic processes, increasingly at the forefront of modern genetics. To fulfill this, two college level classes at two universities were surveyed. One university was located in the northeastern US and the other in the West Indies. There was a sample size of 42 students and a supplementary interview was administered to a select 9 students. Interviews were also administered to professors in the field in order to gain insight into the teaching of statistics in genetics. Key findings indicated that students had very little to no background in statistics (55%). Although students did perform well on exams with 60% of the population receiving an A or B grade, 77% of them did not offer good explanations on a probability question associated with the normal distribution provided in the survey. The scope and presentation of the applicable statistics/mathematics in some of the most used textbooks in genetics teaching, as well as genetics syllabi used by instructors do not help the issue. It was found that the text books, often times, either did not give effective explanations for students, or completely left out certain topics. The omission of certain statistical/mathematical oriented topics was seen to be also true with the genetics syllabi reviewed for this study. Nonetheless…
Statistical Physics of Hard Optimization Problems
NASA Astrophysics Data System (ADS)
Zdeborová, Lenka
2008-06-01
Optimization is fundamental in many areas of science, from computer science and information theory to engineering and statistical physics, as well as to biology or social sciences. It typically involves a large number of variables and a cost function depending on these variables. Optimization problems in the NP-complete class are particularly difficult; it is believed that the number of operations required to minimize the cost function is, in the most difficult cases, exponential in the system size. However, even in an NP-complete problem the practically arising instances might, in fact, be easy to solve. The principal question we address in this thesis is: How to recognize if an NP-complete constraint satisfaction problem is typically hard and what are the main reasons for this? We adopt approaches from the statistical physics of disordered systems, in particular the cavity method developed originally to describe glassy systems. We describe new properties of the space of solutions in two of the most studied constraint satisfaction problems - random satisfiability and random graph coloring. We suggest a relation between the existence of the so-called frozen variables and the algorithmic hardness of a problem. Based on these insights, we introduce a new class of problems which we named "locked" constraint satisfaction, where the statistical description is easily solvable, but from the algorithmic point of view they are even more challenging than the canonical satisfiability.
Statistical physics of hard optimization problems
NASA Astrophysics Data System (ADS)
Zdeborová, Lenka
2009-06-01
Optimization is fundamental in many areas of science, from computer science and information theory to engineering and statistical physics, as well as to biology or social sciences. It typically involves a large number of variables and a cost function depending on these variables. Optimization problems in the non-deterministic polynomial (NP)-complete class are particularly difficult; it is believed that the number of operations required to minimize the cost function is, in the most difficult cases, exponential in the system size. However, even in an NP-complete problem the practically arising instances might, in fact, be easy to solve. The principal question we address in this article is: How to recognize if an NP-complete constraint satisfaction problem is typically hard and what are the main reasons for this? We adopt approaches from the statistical physics of disordered systems, in particular the cavity method developed originally to describe glassy systems. We describe new properties of the space of solutions in two of the most studied constraint satisfaction problems - random satisfiability and random graph coloring. We suggest a relation between the existence of the so-called frozen variables and the algorithmic hardness of a problem. Based on these insights, we introduce a new class of problems which we named "locked" constraint satisfaction, where the statistical description is easily solvable, but from the algorithmic point of view they are even more challenging than the canonical satisfiability.
10 CFR 1016.7 - Interpretations.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 10 Energy 4 2010-01-01 2010-01-01 false Interpretations. 1016.7 Section 1016.7 Energy DEPARTMENT OF ENERGY (GENERAL PROVISIONS) SAFEGUARDING OF RESTRICTED DATA General Provisions § 1016.7 Interpretations. Except as specifically authorized by the Secretary of Energy in writing, no interpretation of...
Using Playing Cards to Differentiate Probability Interpretations
ERIC Educational Resources Information Center
López Puga, Jorge
2014-01-01
The aprioristic (classical, naïve and symmetric) and frequentist interpretations of probability are commonly known. Bayesian or subjective interpretation of probability is receiving increasing attention. This paper describes an activity to help students differentiate between the three types of probability interpretations.
10 CFR 39.5 - Interpretations.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 10 Energy 1 2012-01-01 2012-01-01 false Interpretations. 39.5 Section 39.5 Energy NUCLEAR REGULATORY COMMISSION LICENSES AND RADIATION SAFETY REQUIREMENTS FOR WELL LOGGING General Provisions § 39.5 Interpretations. Except as specifically authorized by the Commission in writing, no interpretation of the...
10 CFR 39.5 - Interpretations.
Code of Federal Regulations, 2013 CFR
2013-01-01
... 10 Energy 1 2013-01-01 2013-01-01 false Interpretations. 39.5 Section 39.5 Energy NUCLEAR REGULATORY COMMISSION LICENSES AND RADIATION SAFETY REQUIREMENTS FOR WELL LOGGING General Provisions § 39.5 Interpretations. Except as specifically authorized by the Commission in writing, no interpretation of the...
10 CFR 39.5 - Interpretations.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 10 Energy 1 2011-01-01 2011-01-01 false Interpretations. 39.5 Section 39.5 Energy NUCLEAR REGULATORY COMMISSION LICENSES AND RADIATION SAFETY REQUIREMENTS FOR WELL LOGGING General Provisions § 39.5 Interpretations. Except as specifically authorized by the Commission in writing, no interpretation of the...
10 CFR 39.5 - Interpretations.
Code of Federal Regulations, 2014 CFR
2014-01-01
... 10 Energy 1 2014-01-01 2014-01-01 false Interpretations. 39.5 Section 39.5 Energy NUCLEAR REGULATORY COMMISSION LICENSES AND RADIATION SAFETY REQUIREMENTS FOR WELL LOGGING General Provisions § 39.5 Interpretations. Except as specifically authorized by the Commission in writing, no interpretation of the...
10 CFR 39.5 - Interpretations.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 10 Energy 1 2010-01-01 2010-01-01 false Interpretations. 39.5 Section 39.5 Energy NUCLEAR REGULATORY COMMISSION LICENSES AND RADIATION SAFETY REQUIREMENTS FOR WELL LOGGING General Provisions § 39.5 Interpretations. Except as specifically authorized by the Commission in writing, no interpretation of the...
10 CFR 20.1006 - Interpretations.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 10 Energy 1 2010-01-01 2010-01-01 false Interpretations. 20.1006 Section 20.1006 Energy NUCLEAR REGULATORY COMMISSION STANDARDS FOR PROTECTION AGAINST RADIATION General Provisions § 20.1006 Interpretations. Except as specifically authorized by the Commission in writing, no interpretation of the meaning of...
10 CFR 73.3 - Interpretations.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 10 Energy 2 2010-01-01 2010-01-01 false Interpretations. 73.3 Section 73.3 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) PHYSICAL PROTECTION OF PLANTS AND MATERIALS General Provisions § 73.3 Interpretations. Except as specifically authorized by the Commission in writing, no interpretations of the...
10 CFR 73.3 - Interpretations.
Code of Federal Regulations, 2014 CFR
2014-01-01
... 10 Energy 2 2014-01-01 2014-01-01 false Interpretations. 73.3 Section 73.3 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) PHYSICAL PROTECTION OF PLANTS AND MATERIALS General Provisions § 73.3 Interpretations. Except as specifically authorized by the Commission in writing, no interpretations of the...
10 CFR 73.3 - Interpretations.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 10 Energy 2 2011-01-01 2011-01-01 false Interpretations. 73.3 Section 73.3 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) PHYSICAL PROTECTION OF PLANTS AND MATERIALS General Provisions § 73.3 Interpretations. Except as specifically authorized by the Commission in writing, no interpretations of the...
10 CFR 73.3 - Interpretations.
Code of Federal Regulations, 2013 CFR
2013-01-01
... 10 Energy 2 2013-01-01 2013-01-01 false Interpretations. 73.3 Section 73.3 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) PHYSICAL PROTECTION OF PLANTS AND MATERIALS General Provisions § 73.3 Interpretations. Except as specifically authorized by the Commission in writing, no interpretations of the...
10 CFR 73.3 - Interpretations.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 10 Energy 2 2012-01-01 2012-01-01 false Interpretations. 73.3 Section 73.3 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) PHYSICAL PROTECTION OF PLANTS AND MATERIALS General Provisions § 73.3 Interpretations. Except as specifically authorized by the Commission in writing, no interpretations of the meaning of the regulations in this part by...
10 CFR 26.7 - Interpretations.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 10 Energy 1 2012-01-01 2012-01-01 false Interpretations. 26.7 Section 26.7 Energy NUCLEAR REGULATORY COMMISSION FITNESS FOR DUTY PROGRAMS Administrative Provisions § 26.7 Interpretations. Except as specifically authorized by the Commission in writing, no interpretation of the meaning of the regulations...
10 CFR 26.7 - Interpretations.
Code of Federal Regulations, 2014 CFR
2014-01-01
... 10 Energy 1 2014-01-01 2014-01-01 false Interpretations. 26.7 Section 26.7 Energy NUCLEAR REGULATORY COMMISSION FITNESS FOR DUTY PROGRAMS Administrative Provisions § 26.7 Interpretations. Except as specifically authorized by the Commission in writing, no interpretation of the meaning of the regulations...
10 CFR 26.7 - Interpretations.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 10 Energy 1 2010-01-01 2010-01-01 false Interpretations. 26.7 Section 26.7 Energy NUCLEAR REGULATORY COMMISSION FITNESS FOR DUTY PROGRAMS Administrative Provisions § 26.7 Interpretations. Except as specifically authorized by the Commission in writing, no interpretation of the meaning of the regulations...
10 CFR 26.7 - Interpretations.
Code of Federal Regulations, 2013 CFR
2013-01-01
... 10 Energy 1 2013-01-01 2013-01-01 false Interpretations. 26.7 Section 26.7 Energy NUCLEAR REGULATORY COMMISSION FITNESS FOR DUTY PROGRAMS Administrative Provisions § 26.7 Interpretations. Except as specifically authorized by the Commission in writing, no interpretation of the meaning of the regulations...
10 CFR 26.7 - Interpretations.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 10 Energy 1 2011-01-01 2011-01-01 false Interpretations. 26.7 Section 26.7 Energy NUCLEAR REGULATORY COMMISSION FITNESS FOR DUTY PROGRAMS Administrative Provisions § 26.7 Interpretations. Except as specifically authorized by the Commission in writing, no interpretation of the meaning of the regulations...
The transactional interpretation of quantum mechanics
NASA Astrophysics Data System (ADS)
Cramer, John G.
1986-07-01
The interpretational problems of quantum mechanics are considered. The way in which the standard Copenhagen interpretation of quantum mechanics deals with these problems is reviewed. A new interpretation of the formalism of quantum mechanics, the transactional interpretation, is presented. The basic element of this interpretation is the transaction describing a quantum event as an exchange of advanced and retarded waves, as implied by the work of Wheeler and Feynman, Dirac, and others. The transactional interpretation is explicitly nonlocal and thereby consistent with recent tests of the Bell inequality, yet is relativistically invariant and fully causal. A detailed comparison of the transactional and Copenhagen interpretations is made in the context of well-known quantum-mechanical Gedankenexperimente and "paradoxes." The transactional interpretation permits quantum-mechanical wave functions to be interpreted as real waves physically present in space rather than as "mathematical representations of knowledge" as in the Copenhagen interpretation. The transactional interpretation is shown to provide insight into the complex character of the quantum-mechanical state vector and the mechanism associated with its "collapse." It also leads in a natural way to justification of the Heisenberg uncertainty principle and the Born probability law (P=ψψ*), basic elements of the Copenhagen interpretation.
The Role of Interpreters in Inclusive Classrooms.
ERIC Educational Resources Information Center
Antia, Shirin D.; Kreimeyer, Kathryn H.
2001-01-01
A qualitative 3-year case study followed three deaf interpreters in an inclusive school. Results of interviews indicated that, in addition to sign interpreting, the interpreters clarified teacher directions, facilitated peer interaction, tutored the deaf children, and kept teachers and special educators informed of the deaf children's progress.…
Comprehension and Error Monitoring in Simultaneous Interpreters
ERIC Educational Resources Information Center
Yudes, Carolina; Macizo, Pedro; Morales, Luis; Bajo, M. Teresa
2013-01-01
In the current study we explored lexical, syntactic, and semantic processes during text comprehension in English monolinguals and Spanish/English (first language/second language) bilinguals with different experience in interpreting (nontrained bilinguals, interpreting students and professional interpreters). The participants performed an…
10 CFR 76.6 - Interpretations.
Code of Federal Regulations, 2013 CFR
2013-01-01
... 10 Energy 2 2013-01-01 2013-01-01 false Interpretations. 76.6 Section 76.6 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) CERTIFICATION OF GASEOUS DIFFUSION PLANTS General Provisions § 76.6 Interpretations. Except as specifically authorized by the Commission in writing, no interpretation of the...
10 CFR 76.6 - Interpretations.
Code of Federal Regulations, 2014 CFR
2014-01-01
... 10 Energy 2 2014-01-01 2014-01-01 false Interpretations. 76.6 Section 76.6 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) CERTIFICATION OF GASEOUS DIFFUSION PLANTS General Provisions § 76.6 Interpretations. Except as specifically authorized by the Commission in writing, no interpretation of the...
10 CFR 76.6 - Interpretations.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 10 Energy 2 2012-01-01 2012-01-01 false Interpretations. 76.6 Section 76.6 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) CERTIFICATION OF GASEOUS DIFFUSION PLANTS General Provisions § 76.6 Interpretations. Except as specifically authorized by the Commission in writing, no interpretation of the...
10 CFR 76.6 - Interpretations.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 10 Energy 2 2010-01-01 2010-01-01 false Interpretations. 76.6 Section 76.6 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) CERTIFICATION OF GASEOUS DIFFUSION PLANTS General Provisions § 76.6 Interpretations. Except as specifically authorized by the Commission in writing, no interpretation of the...
10 CFR 76.6 - Interpretations.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 10 Energy 2 2011-01-01 2011-01-01 false Interpretations. 76.6 Section 76.6 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) CERTIFICATION OF GASEOUS DIFFUSION PLANTS General Provisions § 76.6 Interpretations. Except as specifically authorized by the Commission in writing, no interpretation of the...
Children's Theory of Mind: An Experiential Interpretation.
ERIC Educational Resources Information Center
Nelson, Katherine; Plesa, Daniela; Henseler, Sarah
1998-01-01
Reconsiders interpretive and theory versions of children's theory of mind. Shows that many college students provide interpretive explanations on theory of mind tasks and that young children rely on background experientially-based knowledge to interpret such tasks. Argues that a logical-causal theory of human action based on mental states is a…
An Online Synchronous Test for Professional Interpreters
ERIC Educational Resources Information Center
Chen, Nian-Shing; Ko, Leong
2010-01-01
This article is based on an experiment designed to conduct an interpreting test for multiple candidates online, using web-based synchronous cyber classrooms. The test model was based on the accreditation test for Professional Interpreters produced by the National Accreditation Authority of Translators and Interpreters (NAATI) in Australia.…
Canonical ensemble in non-extensive statistical mechanics, q > 1
NASA Astrophysics Data System (ADS)
Ruseckas, Julius
2016-09-01
The non-extensive statistical mechanics has been used to describe a variety of complex systems. The maximization of entropy, often used to introduce the non-extensive statistical mechanics, is a formal procedure and does not easily lead to physical insight. In this article we investigate the canonical ensemble in the non-extensive statistical mechanics by considering a small system interacting with a large reservoir via short-range forces and assuming equal probabilities for all available microstates. We concentrate on the situation when the reservoir is characterized by generalized entropy with non-extensivity parameter q > 1. We also investigate the problem of divergence in the non-extensive statistical mechanics occurring when q > 1 and show that there is a limit on the growth of the number of microstates of the system that is given by the same expression for all values of q.
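The "generalized entropy with non-extensivity parameter q" referred to in this abstract is, in the standard Tsallis form (a well-known definition, not quoted from the abstract itself):

```latex
S_q = k \, \frac{1 - \sum_i p_i^{\,q}}{q - 1},
\qquad
\lim_{q \to 1} S_q = -k \sum_i p_i \ln p_i ,
```

where the q -> 1 limit recovers the ordinary Boltzmann-Gibbs entropy, so q > 1 measures the departure from the extensive case.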
Data analysis using the Gnu R system for statistical computation
Simone, James; /Fermilab
2011-07-01
R is a language system for statistical computation. It is widely used in statistics, bioinformatics, machine learning, data mining, quantitative finance, and the analysis of clinical drug trials. Among the advantages of R: it has become the standard language for developing statistical techniques; it is actively developed by a large and growing global user community; it is open-source software; it is highly portable (Linux, OS X, and Windows); it has a built-in documentation system; it produces high-quality graphics; and it is easily extensible, with over four thousand extension packages covering statistics and applications. This report gives a very brief introduction to R with some examples using lattice QCD simulation results. It then discusses the development of R packages designed for chi-square minimization fits for lattice n-pt correlation functions.
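The kind of fit the report describes, a chi-square minimization of a lattice n-pt correlation function, can be sketched in a few lines. This is an illustrative example only, not code from the report (which is in R): it fits a single-exponential "correlator" C(t) = A exp(-m t) by linearizing and solving a least-squares problem; the names A, m, and the synthetic data are all hypothetical.

```python
import numpy as np

def fit_exponential(ts, cs):
    """Fit C(t) = A * exp(-m * t) by least squares on log C(t) = log A - m t."""
    slope, intercept = np.polyfit(ts, np.log(cs), 1)
    return np.exp(intercept), -slope  # (amplitude A, mass m)

ts = np.arange(1, 11, dtype=float)
cs = 2.5 * np.exp(-0.8 * ts)   # synthetic noiseless "correlator" data
A, m = fit_exponential(ts, cs)
print(A, m)                    # recovers A ~ 2.5, m ~ 0.8
```

A realistic lattice fit would additionally weight each point by its statistical error (the chi-square proper) and propagate correlations between timeslices, which is what dedicated packages automate.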
Random graph coloring: statistical physics approach.
van Mourik, J; Saad, D
2002-11-01
The problem of vertex coloring in random graphs is studied using methods of statistical physics and probability. Our analytical results are compared to those obtained by exact enumeration and Monte Carlo simulations. We critically discuss the merits and shortcomings of the various methods, and interpret the results obtained. We present an exact analytical expression for the two-coloring problem as well as general replica symmetric approximated solutions for the thermodynamics of the graph coloring problem with p colors and K-body edges. PMID:12513569
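The exact enumeration the abstract mentions as a check on the analytical results can be illustrated with a brute-force count of proper colorings. This is a sketch of the general technique, not code from the paper; the 4-cycle graph used here is a hypothetical example.

```python
from itertools import product

def count_colorings(n, edges, q):
    """Count assignments of q colors to n vertices with no monochromatic edge."""
    return sum(
        all(c[u] != c[v] for u, v in edges)
        for c in product(range(q), repeat=n)
    )

# 4-cycle C4; its chromatic polynomial is P(C4, q) = (q-1)^4 + (q-1)
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
print(count_colorings(4, edges, 2))  # → 2
print(count_colorings(4, edges, 3))  # → 18
```

Brute force scales as q^n, so it is only feasible for the small instances used to validate statistical-physics predictions such as the replica symmetric solutions described above.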