Interpreting Accident Statistics
Ferreira, Joseph Jr.
Accident statistics have often been used to support the argument that an abnormally small proportion of drivers account for a large proportion of the accidents. This paper compares statistics developed from six-year data ...
DNA Mixture Interpretation & Statistical Analysis
University funded ~150 state & local lab analysts to attend. Presenters: Catherine Grgicak (Boston University), Mike Coble (NIST). References: National recommendations of the technical UK DNA working group on mixture interpretation for the NDNAD and for court going purposes. FSI Genetics 2(1): 76-82. · Schneider, P.M., et al. (2009) The German Stain
Statistical mechanics and the ontological interpretation
NASA Astrophysics Data System (ADS)
Bohm, D.; Hiley, B. J.
1996-06-01
To complete our ontological interpretation of quantum theory we have to include a treatment of quantum statistical mechanics. The basic concepts in the ontological approach are the particle and the wave function. The density matrix cannot play a fundamental role here. Therefore quantum statistical mechanics will require a further statistical distribution over wave functions in addition to the distribution of particles that have a specified wave function. Ultimately the wave function of the universe will be required, but we show that if the universe is not in thermodynamic equilibrium then it can be treated in terms of weakly interacting large scale constituents that are very nearly independent of each other. In this way we obtain the same results as those of the usual approach within the framework of the ontological interpretation.
Statistical weld process monitoring with expert interpretation
Cook, G.E.; Barnett, R.J.; Strauss, A.M.; Thompson, F.M. Jr.
1996-12-31
A statistical weld process monitoring system is described. Using data of voltage, current, wire feed speed, gas flow rate, travel speed, and elapsed arc time collected while welding, the welding statistical process control (SPC) tool provides weld process quality control by implementing techniques of data trending analysis, tolerance analysis, and sequential analysis. For purposes of quality control, the control limits required for acceptance are specified in the weld procedure acceptance specifications. The control charts then provide quality assurance documentation for each weld. The statistical data trending analysis performed by the SPC program is not only valuable as a quality assurance monitoring and documentation system, it is also valuable in providing diagnostic assistance in troubleshooting equipment and material problems. Possible equipment/process problems are identified and matched with features of the SPC control charts. To aid in interpreting the voluminous statistical output generated by the SPC system, a large number of If-Then rules have been devised for providing computer-based expert advice for pinpointing problems based on out-of-limit variations of the control charts. The paper describes the SPC monitoring tool and the rule-based expert interpreter that has been developed for relating control chart trends to equipment/process problems.
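The control-limit check at the heart of such an SPC tool can be sketched as follows. This is an illustrative reconstruction, not the authors' system: the baseline run, parameter names, and 3-sigma limits are assumptions standing in for the weld-procedure acceptance specifications.

```python
import statistics

# Hedged sketch of an SPC acceptance check (not the authors' code):
# estimate 3-sigma control limits from an in-control baseline run,
# then flag welding-parameter samples that fall outside them.
def control_limits(baseline):
    mean = statistics.fmean(baseline)
    sigma = statistics.stdev(baseline)
    return mean - 3 * sigma, mean + 3 * sigma

def out_of_limit(samples, lcl, ucl):
    # indices of samples violating the control limits
    return [i for i, x in enumerate(samples) if not lcl <= x <= ucl]

# hypothetical arc-voltage readings from an accepted baseline weld
baseline_volts = [23.9, 24.1, 24.0, 23.8, 24.2, 24.0, 23.9, 24.1]
lcl, ucl = control_limits(baseline_volts)
print(out_of_limit([24.0, 24.1, 26.5, 23.9], lcl, ucl))  # -> [2]: the 26.5 V reading is flagged
```

In the full system each monitored quantity (current, wire feed speed, gas flow rate, travel speed) would get its own chart and limits, with the flagged indices feeding the rule-based interpreter.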
DNA Mixture Interpretation: History, Challenges, Statistical
Guidelines: I cannot speak for or on behalf of the Scientific Working Group on DNA Analysis Methods. Points: extraction; validation establishes variation and limits in the processes involved; potential allele overlap between biologically related individuals. However, there are nuances and limitations to the interpretation
The Statistical Interpretation of Entropy: An Activity
ERIC Educational Resources Information Center
Timberlake, Todd
2010-01-01
The second law of thermodynamics, which states that the entropy of an isolated macroscopic system can increase but will not decrease, is a cornerstone of modern physics. Ludwig Boltzmann argued that the second law arises from the motion of the atoms that compose the system. Boltzmann's statistical mechanics provides deep insight into the…
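Boltzmann's argument can be illustrated with a toy microstate count. This sketch is my own illustration, not part of the cited activity: N particles split between two halves of a box, with S = k ln W and k set to 1 for convenience.

```python
from math import comb, log

# Toy model in the spirit of Boltzmann's statistical entropy (illustrative):
# W(n) = C(N, n) counts the microstates with n of N particles in the left
# half of a box; the statistical entropy is S = k ln W (here k = 1).
def entropy(N, n, k=1.0):
    return k * log(comb(N, n))

N = 100
print(entropy(N, 50), entropy(N, 10))  # the even split has far higher entropy
```

The even split maximizes W, so an isolated system overwhelmingly tends toward it, which is the statistical content of the second law.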
Interpreting Educational Research Using Statistical Software.
ERIC Educational Resources Information Center
Evans, Elizabeth A.
A live demonstration of how a typical set of educational data can be examined using quantitative statistical software was conducted. The topic of tutorial support was chosen. Setting up a hypothetical research scenario, the researcher created 300 cases from random data generation adjusted to correct obvious error. Each case represented a student…
The Statistical Interpretation of Classical Thermodynamic Heating and Expansion Processes
ERIC Educational Resources Information Center
Cartier, Stephen F.
2011-01-01
A statistical model has been developed and applied to interpret thermodynamic processes typically presented from the macroscopic, classical perspective. Through this model, students learn and apply the concepts of statistical mechanics, quantum mechanics, and classical thermodynamics in the analysis of the (i) constant volume heating, (ii)…
Statistical Analysis and Interpretation of Discrete Compositional Data
Washington at Seattle, University of
Statistical Analysis and Interpretation of Discrete Compositional Data. Dean Billheimer, Peter Guttorp (University of Washington, Seattle); William F. Fagan (Arizona State University, Tempe, AZ) ... compositions. Unfortunately, as Aitchison (1986) and others (e.g., Pawlowsky and Burger, 1992) describe
Bose-Einstein condensation and independent emission: statistical physics interpretation
A. Bialas; K. Zalewski
1998-07-15
Recent results on effects of Bose-Einstein symmetrization in a system of independently produced particles are interpreted in terms of statistical physics. For a large class of distributions, the effective sizes of the system in momentum and in configuration space are shown to shrink when quantum interference is taken into account.
Interpreting Assessment Data: Statistical Techniques You Can Use
NSDL National Science Digital Library
Edwin P. Christmann
2008-11-01
*Available late fall 2008* Are you properly evaluating the results of the tests you give to students? Can you explain the difference between classroom assessment and standardized assessment? Are you on solid ground with your grading system? Demystify--and even use--statistics to answer these important questions and more in this clear, easy-to-use text for preservice and classroom science teachers and methods professors. The text's practical approach helps teachers understand how to interpret student assessments statistically and how to measure and explain the validity and reliability of those assessments. Included are a global history of testing to its present state and valuable instructions for using graphing calculators for easy computing. This nonthreatening framework for measuring and interpreting assessment results is a must-have for your professional development library.
Pang, Siu-Kwong
2013-09-01
A statistically significant and interpretable relationship between electrophilicity, as a redox reactivity indicator, and LD50, as a lethality indicator of drugs, was discovered, and this relationship could be interpreted through the action of cytochrome P450. The drugs chosen for this study were topoisomerase II inhibitor anticancer drugs, and their electrophilicity was obtained by quantum chemical calculation. Since the P450 detoxification mechanism is the catalytic oxidation of drug molecules, it may be inferred that drug molecules that are easily oxidized (low electrophilicity) will generally be weak in lethality. In addition, this relationship revealed two structural scaffolds for the anthracycline-based topoisomerase II inhibitors, whose lethality mechanisms are not entirely the same. Such a relationship can assist in designing new drugs, in that candidates possessing low electrophilicity are recommended to lower lethality, and moieties providing a large inductive effect can reduce the electrophilicity of the anthracycline-based topoisomerase II inhibitors. PMID:23826857
Quantum statistics as geometry: Conflict, Mechanism, Interpretation, and Implication
Daniel C. Galehouse
2015-01-29
The conflict between the determinism of geometry in general relativity and the essential statistics of quantum mechanics blocks the development of a unified theory. Electromagnetic radiation is essential to both fields and supplies a common meeting ground. It is proposed that a suitable mechanism to resolve these differences can be based on the use of a time-symmetric treatment for the radiation. Advanced fields of the absorber can be interpreted to supply the random character of spontaneous emission. This allows the statistics of the Born rule to come from the spontaneous emission that occurs during a physical measurement. When the absorber is included, quantum mechanics is completely deterministic. It is suggested that the peculiar properties of kaons may be induced by the advanced effects of the neutrino field. Schrödinger's cat loses its enigmatic personality and the identification of mental processes as an essential component of a measurement is no longer needed.
Interpreting health statistics for policymaking: the story behind the headlines.
Walker, Neff; Bryce, Jennifer; Black, Robert E
2007-03-17
Politicians, policymakers, and public-health professionals make complex decisions on the basis of estimates of disease burden from different sources, many of which are "marketed" by skilled advocates. To help people who rely on such statistics make more informed decisions, we explain how health estimates are developed, and offer basic guidance on how to assess and interpret them. We describe the different levels of estimates used to quantify disease burden and its correlates; understanding how closely linked a type of statistic is to disease and death rates is crucial in designing health policies and programmes. We also suggest questions that people using such statistics should ask and offer tips to help separate advocacy from evidence-based positions. Global health agencies have a key role in communicating robust estimates of disease, as do policymakers at national and subnational levels where key public-health decisions are made. A common framework and standardised methods, building on the work of Child Health Epidemiology Reference Group (CHERG) and others, are urgently needed. PMID:17368157
Workplace statistical literacy for teachers: interpreting box plots
NASA Astrophysics Data System (ADS)
Pierce, Robyn; Chick, Helen
2013-06-01
As a consequence of the increased use of data in workplace environments, there is a need to understand the demands that are placed on users to make sense of such data. In education, teachers are being increasingly expected to interpret and apply complex data about student and school performance, and yet it is not clear that they always have the appropriate knowledge and experience to interpret the graphs, tables and other data that they receive. This study examined the statistical literacy demands placed on teachers, with a particular focus on box plot representations. Although box plots summarise the data in a way that makes visual comparisons possible across sets of data, this study showed that teachers do not always have the necessary fluency with the representation to describe correctly how the data are distributed in the representation. In particular, a significant number perceived the size of the regions of the box plot to be depicting frequencies rather than density, and there were misconceptions associated with outlying data that were not displayed on the plot. Teachers' perceptions of box plots were also found to relate to three themes: attitudes, perceived value and misconceptions.
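The frequency-versus-density misconception noted above can be checked numerically. This sketch is my own illustration, not the study's materials: each region of a box plot spans roughly a quarter of the data points, so a wide region signals low density, not high frequency.

```python
import statistics

# Each of the four box-plot regions (min-Q1, Q1-median, median-Q3, Q3-max)
# contains about a quarter of the observations, regardless of its width.
data = sorted([2, 3, 3, 4, 5, 5, 6, 7, 9, 14, 20, 31])
q1, q2, q3 = statistics.quantiles(data, n=4)  # default "exclusive" method
regions = [
    [x for x in data if x <= q1],
    [x for x in data if q1 < x <= q2],
    [x for x in data if q2 < x <= q3],
    [x for x in data if x > q3],
]
print([len(r) for r in regions])  # -> [3, 3, 3, 3]: equal counts per region
```

The upper whisker region here is far wider than the others yet holds the same number of points, which is exactly the case teachers misread as "more data".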
Merhav, Neri
An Identity of Chernoff Bounds with an Interpretation in Statistical Physics and Applications. An identity of Chernoff bounds is derived and given an interpretation in statistical physics, namely, as an isothermal equilibrium of a composite system that consists of multiple subsystems. This results in several relationships between information theory and statistical physics, which
Statistics 586: Interpretation of Data I Spring 2014
Jornsten, Rebecka
Analysis - Univariate Statistics. Summary statistics in graphical display: stem-and-leaf displays; shape ... existing evidence on which to build hypotheses (and know why you are testing). B. Methods - Explain variables
Statistical learning for decision making : interpretability, uncertainty, and inference
Letham, Benjamin
2015-01-01
Data and predictive modeling are an increasingly important part of decision making. Here we present advances in several areas of statistical learning that are important for gaining insight from large amounts of data, and ...
Workplace Statistical Literacy for Teachers: Interpreting Box Plots
ERIC Educational Resources Information Center
Pierce, Robyn; Chick, Helen
2013-01-01
As a consequence of the increased use of data in workplace environments, there is a need to understand the demands that are placed on users to make sense of such data. In education, teachers are being increasingly expected to interpret and apply complex data about student and school performance, and, yet it is not clear that they always have the…
Mixture Interpretation Invited Lecture for Towson University Forensic Statistics Course
Quality Assurance Standards (QAS) · QAS Standard 5.3.2: a casework CODIS administrator shall be or have ... define quality assurance parameters and interpretation guidelines, including, as applicable, guidelines ... Angie Dolph, Joanne B. Sgueglia, Tim Kalafut, Washington State Police Crime Lab, Marshall University (NIST Summer
A novel statistical analysis and interpretation of flow cytometry data
Banks, H.T.; Kapraun, D.F.; Thompson, W. Clayton; Peligero, Cristina; Argilaguet, Jordi; Meyerhans, Andreas
2013-01-01
A recently developed class of models incorporating the cyton model of population generation structure into a conservation-based model of intracellular label dynamics is reviewed. Statistical aspects of the data collection process are quantified and incorporated into a parameter estimation scheme. This scheme is then applied to experimental data for PHA-stimulated CD4+ T and CD8+ T cells collected from two healthy donors. This novel mathematical and statistical framework is shown to form the basis for accurate, meaningful analysis of cellular behaviour for a population of cells labelled with the dye carboxyfluorescein succinimidyl ester and stimulated to divide. PMID:23826744
The Coefficients of a Maximum Contrast as Interpretable Statistics.
ERIC Educational Resources Information Center
Hollingsworth, Holly
A fundamental fact of the analysis of variance statistical procedure is that if the omnibus F test of an effect is significant, then there exists at least one contrast of that effect that will be significantly different from zero according to the S-method of Scheffe. The caveat to this rule is that the significant contrast(s) may not be of any…
Microarray experiments: New statistical tools facilitate biological interpretation
Breitling, Rainer; Amtmann, Anna; Herzyk, Pawel (Sir Henry Wellcome Functional Genomics Facility, Plant Sciences Group)
... tools that provide fast, easy, and statistically rigorous assistance during that process. 1. Rank Products (RP) ... the regulated genes (darker shading indicates stronger regulation); white boxes show the substrates
Statistical Interpretation of Natural and Technological Hazards in China
2010-01-01
China is prone to catastrophic natural hazards from floods, droughts, earthquakes, storms, cyclones, landslides, epidemics, extreme temperatures, forest fires, avalanches, and even tsunami. This paper will list statistics related to the six worst natural disasters in China over the past 100 or so years, ranked according to number of fatalities. The corresponding data for the six worst natural disasters in
Statistical characteristics of MST radar echoes and its interpretation
NASA Technical Reports Server (NTRS)
Woodman, Ronald F.
1989-01-01
Two concepts of fundamental importance are reviewed: the autocorrelation function and the frequency power spectrum. In addition, some turbulence concepts, the relationship between radar signals and atmospheric medium statistics, partial reflection, and the characteristics of noise and clutter interference are discussed.
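The two quantities the review centres on can be computed directly. This is a generic illustration, not the paper's code: a sample autocorrelation function and a DFT power spectrum, related by the Wiener-Khinchin theorem.

```python
import cmath, math

# Sample autocorrelation at a given lag (normalized so lag 0 gives 1).
def autocorr(x, lag):
    n = len(x)
    mean = sum(x) / n
    c0 = sum((v - mean) ** 2 for v in x)
    return sum((x[i] - mean) * (x[i + lag] - mean) for i in range(n - lag)) / c0

# Power spectrum via a direct discrete Fourier transform (O(n^2), for clarity).
def power_spectrum(x):
    n = len(x)
    return [abs(sum(x[t] * cmath.exp(-2j * math.pi * f * t / n)
                    for t in range(n))) ** 2 / n for f in range(n)]

# A pure sinusoid: the spectrum peaks at its frequency bin.
n = 64
sig = [math.sin(2 * math.pi * 5 * t / n) for t in range(n)]
spec = power_spectrum(sig)
print(max(range(1, n // 2), key=lambda f: spec[f]))  # -> 5
```

For radar echoes the spectral peak gives the Doppler shift and the spectral width relates to turbulence, which is why these two estimators are fundamental.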
Need for Caution in Interpreting Extreme Weather Statistics
NASA Astrophysics Data System (ADS)
Sardeshmukh, P. D.; Compo, G. P.; Penland, M. C.
2011-12-01
Given the substantial anthropogenic contribution to 20th century global warming, it is tempting to seek an anthropogenic component in any unusual recent weather event, or more generally in any observed change in the statistics of extreme weather. This study cautions that such detection and attribution efforts may, however, very likely lead to wrong conclusions if the non-Gaussian aspects of the probability distributions of observed daily atmospheric variations, especially their skewness and heavy tails, are not explicitly taken into account. Departures of three or more standard deviations from the mean, although rare, are far more common in such a non-Gaussian world than they are in a Gaussian world. This exacerbates the already difficult problem of establishing the significance of changes in extreme value probabilities from historical climate records of limited length, using either raw histograms or Generalized Extreme Value (GEV) distributions fitted to the sample extreme values. A possible solution is suggested by the fact that the non-Gaussian aspects of the observed distributions are well captured by a general class of "Stochastically Generated Skewed distributions" (SGS distributions) recently introduced in the meteorological literature by Sardeshmukh and Sura (J. Climate 2009). These distributions arise from simple modifications to a red noise process and reduce to Gaussian distributions under appropriate limits. As such, they represent perhaps the simplest physically based non-Gaussian prototypes of the distributions of daily atmospheric variations. Fitting such SGS distributions to all (not just the extreme) values in 25, 50, or 100-yr daily records also yields corresponding extreme value distributions that are much less prone to sampling uncertainty than GEV distributions. 
For both of the above reasons, SGS distributions provide an attractive alternative for assessing the significance of changes in extreme weather statistics (including changes in the statistics of extreme precipitation events) over the 20th century, and of the changes projected over the 21st century.
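The paper's central caution is easy to reproduce in simulation. This is my sketch; the SGS family itself is not implemented here. A Laplace distribution stands in as a simple heavy-tailed alternative to the Gaussian.

```python
import random, statistics

# In a heavy-tailed world, departures of three or more standard deviations
# are far more common than the Gaussian two-sided rate of about 0.27%.
random.seed(1)
n = 200_000
gauss = [random.gauss(0, 1) for _ in range(n)]
# Laplace draws as a difference of exponentials (a simple heavy-tailed stand-in):
laplace = [random.expovariate(1) - random.expovariate(1) for _ in range(n)]

def rate_beyond_3sd(x):
    m, s = statistics.fmean(x), statistics.pstdev(x)
    return sum(abs(v - m) > 3 * s for v in x) / len(x)

print(rate_beyond_3sd(gauss), rate_beyond_3sd(laplace))
```

The heavy-tailed sample exceeds its own 3-sigma bounds several times more often, so attributing every such excursion to a changed climate would be unwarranted.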
Interpreting the flock algorithm from a statistical perspective.
Anderson, Eric C; Barry, Patrick D
2015-09-01
We show that the algorithm in the program flock (Duchesne & Turgeon 2009) can be interpreted as an estimation procedure based on a model essentially identical to the structure (Pritchard et al. 2000) model with no admixture and without correlated allele frequency priors. Rather than using MCMC, the flock algorithm searches for the maximum a posteriori estimate of this structure model via a simulated annealing algorithm with a rapid cooling schedule (namely, the exponent on the objective function ??). We demonstrate the similarities between the two programs in a two-step approach. First, to enable rapid batch processing of many simulated data sets, we modified the source code of structure to use the flock algorithm, producing the program flockture. With simulated data, we confirmed that results obtained with flock and flockture are very similar (though flockture is some 200 times faster). Second, we simulated multiple large data sets under varying levels of population differentiation for both microsatellite and SNP genotypes. We analysed them with flockture and structure and assessed each program on its ability to cluster individuals to their correct subpopulation. We show that flockture yields results similar to structure albeit with greater variability from run to run. flockture did perform better than structure when genotypes were composed of SNPs and differentiation was moderate (FST= 0.022-0.032). When differentiation was low, structure outperformed flockture for both marker types. On large data sets like those we simulated, it appears that flock's reliance on inference rules regarding its 'plateau record' is not helpful. Interpreting flock's algorithm as a special case of the model in structure should aid in understanding the program's output and behaviour. PMID:25913195
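The estimation procedure described above can be sketched schematically. This is a toy stand-in, not flock's actual genotype likelihood: simulated annealing with a rapid geometric cooling schedule searching for a maximum a posteriori assignment, here for 1-D points under a within-cluster squared-error objective.

```python
import math, random

# Within-cluster sum of squared deviations (lower is better).
def objective(points, labels, k):
    cost = 0.0
    for j in range(k):
        members = [p for p, l in zip(points, labels) if l == j]
        if members:
            mu = sum(members) / len(members)
            cost += sum((p - mu) ** 2 for p in members)
    return cost

def anneal(points, k, steps=5000, t0=5.0, cooling=0.999):
    random.seed(0)
    labels = [random.randrange(k) for _ in points]
    cost, t = objective(points, labels, k), t0
    for _ in range(steps):
        i, new = random.randrange(len(points)), random.randrange(k)
        old = labels[i]
        labels[i] = new
        c = objective(points, labels, k)
        # accept downhill moves always, uphill moves with shrinking probability
        if c < cost or random.random() < math.exp((cost - c) / t):
            cost = c
        else:
            labels[i] = old
        t *= cooling  # rapid geometric cooling, as in flock's search
    return labels

pts = [0.1, 0.2, 0.3, 5.1, 5.2, 5.3]
lab = anneal(pts, 2)
print(lab)  # the two well-separated triples end up in different clusters
```

In flock the objective is the posterior of the no-admixture structure model rather than squared error, but the search idea is the same.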
A statistical model for interpreting computerized dynamic posturography data
NASA Technical Reports Server (NTRS)
Feiveson, Alan H.; Metter, E. Jeffrey; Paloski, William H.
2002-01-01
Computerized dynamic posturography (CDP) is widely used for assessment of altered balance control. CDP trials are quantified using the equilibrium score (ES), which ranges from zero to 100, as a decreasing function of peak sway angle. The problem of how best to model and analyze ESs from a controlled study is considered. The ES often exhibits a skewed distribution in repeated trials, which can lead to incorrect inference when applying standard regression or analysis of variance models. Furthermore, CDP trials are terminated when a patient loses balance. In these situations, the ES is not observable, but is assigned the lowest possible score--zero. As a result, the response variable has a mixed discrete-continuous distribution, further compromising inference obtained by standard statistical methods. Here, we develop alternative methodology for analyzing ESs under a stochastic model extending the ES to a continuous latent random variable that always exists, but is unobserved in the event of a fall. Loss of balance occurs conditionally, with probability depending on the realized latent ES. After fitting the model by a form of quasi-maximum-likelihood, one may perform statistical inference to assess the effects of explanatory variables. An example is provided, using data from the NIH/NIA Baltimore Longitudinal Study on Aging.
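The modelling idea can be sketched as a censored (tobit-type) likelihood. This is a rough illustration, not the authors' quasi-maximum-likelihood code, and it uses a plain normal latent score where the paper's model is more elaborate.

```python
import math

# Treat the equilibrium score (ES) as a latent normal variable observed only
# when no fall occurs; a fall contributes the probability mass of the
# unobserved low region, as in a censored-regression likelihood.
def norm_pdf(z):
    return math.exp(-z * z / 2) / math.sqrt(2 * math.pi)

def norm_cdf(z):
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

def log_likelihood(scores, fell, mu, sigma):
    ll = 0.0
    for es, fall in zip(scores, fell):
        if fall:   # ES unobserved: we only know the latent score was too low
            ll += math.log(norm_cdf((0 - mu) / sigma))
        else:      # ES observed: usual normal density term
            ll += math.log(norm_pdf((es - mu) / sigma) / sigma)
    return ll

# hypothetical CDP trials: one trial ended in a fall (ES recorded as zero)
scores = [72, 65, 80, 0, 58]
fell = [False, False, False, True, False]
print(log_likelihood(scores, fell, mu=65, sigma=15))
```

Maximizing this likelihood over mu and sigma (and over covariate effects on mu) gives inference that respects the mixed discrete-continuous nature of the response.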
Interpreting Statistical Significance Test Results: A Proposed New "What If" Method.
ERIC Educational Resources Information Center
Kieffer, Kevin M.; Thompson, Bruce
As the 1994 publication manual of the American Psychological Association emphasized, "p" values are affected by sample size. As a result, it can be helpful to interpret the results of statistical significant tests in a sample size context by conducting so-called "what if" analyses. However, these methods can be inaccurate unless "corrected" effect…
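A minimal "what if" analysis in the spirit described above can be written in a few lines. This is my sketch, not the authors' corrected-effect-size procedure: hold the observed effect size fixed and recompute the test statistic at hypothetical sample sizes.

```python
import math

# One-tailed survival function of the standard normal.
def norm_sf(z):
    return 0.5 * math.erfc(z / math.sqrt(2))

# Two-sample z approximation: for equal groups of size n, z = d * sqrt(n/2),
# so the same effect size d yields very different p-values as n changes.
def what_if_p(cohens_d, n_per_group):
    z = cohens_d * math.sqrt(n_per_group / 2)
    return 2 * norm_sf(abs(z))

for n in (10, 50, 200):
    print(n, round(what_if_p(0.4, n), 4))
```

A fixed d = 0.4 is "nonsignificant" at n = 10 per group but highly significant at n = 200, which is precisely why p-values must be read in a sample-size context.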
CS 803 Notes, J. Kosecka, GMU, Spring 2005 EM k-means algorithm (statistical interpretation)
Kosecka, Jana
We model clusters as follows: the index of a cluster is a discrete random variable z = j, such that the probability of each cluster is p(z = j) = π_j for j = 1, ..., n, subject to π_1 + ... + π_n = 1, which is the standard
Material Phase Causality or a Dynamics-Statistical Interpretation of Quantum Mechanics
Koprinkov, I. G. [Department of Applied Physics, Technical University of Sofia, 1756 Sofia (Bulgaria)
2010-11-25
The internal phase dynamics of a quantum system interacting with an electromagnetic field is revealed in detail. Theoretical and experimental evidence of a causal relation of the phase of the wave function to the dynamics of the quantum system is presented systematically for the first time. A dynamics-statistical interpretation of quantum mechanics is introduced.
Dotto, G L; Pinto, L A A; Hachicha, M A; Knani, S
2015-03-15
In this work, a statistical physics treatment was employed to study the adsorption of food dyes onto chitosan films, in order to obtain new physicochemical interpretations at the molecular level. Experimental equilibrium curves were obtained for the adsorption of four dyes (FD&C red 2, FD&C yellow 5, FD&C blue 2, Acid Red 51) at different temperatures (298, 313 and 328 K). A statistical physics formula was used to interpret these curves, and parameters such as the number of adsorbed dye molecules per site (n), anchorage number (n'), receptor site density (NM), adsorbed quantity at saturation (Nasat), steric hindrance, concentration at half saturation (c1/2) and molar adsorption energy (ΔEa) were estimated. The relation of the above-mentioned parameters to the chemical structure of the dyes and to temperature was evaluated and interpreted. PMID:25308634
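The kind of isotherm used in such statistical physics treatments can be sketched and fitted numerically. This is a hedged reconstruction (the exact formula in the paper may differ): a Hill-type isotherm Q(c) = n·NM / (1 + (c_half/c)^n), fitted here by a coarse grid search rather than the authors' procedure, on synthetic data.

```python
# Hill-type statistical-physics adsorption isotherm (illustrative form).
def isotherm(c, n, NM, c_half):
    return n * NM / (1 + (c_half / c) ** n)

# Synthetic equilibrium data generated from known parameters (not real dye data).
true = dict(n=2.0, NM=50.0, c_half=10.0)
cs = [2, 5, 10, 20, 40, 80]
qs = [isotherm(c, **true) for c in cs]

# Coarse grid search for the least-squares parameter estimates.
best, best_err = None, float("inf")
for n10 in range(10, 31):          # n in 1.0 .. 3.0
    for NM in range(30, 71, 5):    # NM in 30 .. 70
        for ch in range(5, 16):    # c_half in 5 .. 15
            err = sum((isotherm(c, n10 / 10, NM, ch) - q) ** 2
                      for c, q in zip(cs, qs))
            if err < best_err:
                best, best_err = (n10 / 10, NM, ch), err
print(best)  # -> (2.0, 50, 10): the true parameters are recovered
```

In practice a proper nonlinear least-squares routine would replace the grid, but the point stands: each fitted parameter (n, NM, c_half) carries a direct molecular-level interpretation.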
Two Easily Made Astronomical Telescopes.
ERIC Educational Resources Information Center
Hill, M.; Jacobs, D. J.
1991-01-01
The directions and diagrams for making a reflecting telescope and a refracting telescope are presented. These telescopes can be made by students out of plumbing parts and easily obtainable, inexpensive, optical components. (KR)
Statistical issues in the design, analysis and interpretation of animal carcinogenicity studies.
Haseman, J K
1984-01-01
Statistical issues in the design, analysis and interpretation of animal carcinogenicity studies are discussed. In the area of experimental design, issues that must be considered include randomization of animals, sample size considerations, dose selection and allocation of animals to experimental groups, and control of potentially confounding factors. In the analysis of tumor incidence data, survival differences among groups should be taken into account. It is important to try to distinguish between tumors that contribute to the death of the animal and "incidental" tumors discovered at autopsy in an animal dying of an unrelated cause. Life table analyses (appropriate for lethal tumors) and incidental tumor tests (appropriate for nonfatal tumors) are described, and the utilization of these procedures by the National Toxicology Program is discussed. Despite the fact that past interpretations of carcinogenicity data have tended to focus on pairwise comparisons in general and high-dose effects in particular, the importance of trend tests should not be overlooked, since these procedures are more sensitive than pairwise comparisons to the detection of carcinogenic effects. No rigid statistical "decision rule" should be employed in the interpretation of carcinogenicity data. Although the statistical significance of an observed tumor increase is perhaps the single most important piece of evidence used in the evaluation process, a number of biological factors must also be taken into account. The use of historical control data, the false-positive issue and the interpretation of negative trends are also discussed. PMID:6525993
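A dose-response trend statistic of the kind recommended above can be sketched briefly. This is a generic Cochran-Armitage statistic, not the NTP's survival-adjusted procedures, with illustrative tumor counts.

```python
import math

# Cochran-Armitage trend z-statistic: pools evidence for a monotone
# dose-response across groups, which is why trend tests are more sensitive
# than pairwise control-vs-dose comparisons.
def cochran_armitage_z(tumors, animals, doses):
    N = sum(animals)
    p = sum(tumors) / N                      # pooled tumor rate
    num = sum(d * (t - a * p) for d, t, a in zip(doses, tumors, animals))
    dbar = sum(d * a for d, a in zip(doses, animals)) / N
    var = p * (1 - p) * sum(a * (d - dbar) ** 2 for d, a in zip(doses, animals))
    return num / math.sqrt(var)

# control, low, mid, high dose groups of 50 animals each (illustrative counts)
z = cochran_armitage_z(tumors=[2, 4, 7, 11], animals=[50, 50, 50, 50],
                       doses=[0, 1, 2, 3])
print(round(z, 2))  # a positive z indicates tumor incidence rising with dose
```

Here no single pairwise comparison may reach significance, yet the trend statistic does, illustrating the paper's point about sensitivity.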
NASA Astrophysics Data System (ADS)
Bellac, Michel Le
2014-11-01
Although nobody can question the practical efficiency of quantum mechanics, there remains the serious question of its interpretation. As Valerio Scarani puts it, "We do not feel at ease with the indistinguishability principle (that is, the superposition principle) and some of its consequences." Indeed, this principle which pervades the quantum world is in stark contradiction with our everyday experience. From the very beginning of quantum mechanics, a number of physicists--but not the majority of them!--have asked the question of its "interpretation". One may simply deny that there is a problem: according to proponents of the minimalist interpretation, quantum mechanics is self-sufficient and needs no interpretation. The point of view held by a majority of physicists, that of the Copenhagen interpretation, will be examined in Section 10.1. The crux of the problem lies in the status of the state vector introduced in the preceding chapter to describe a quantum system, which is no more than a symbolic representation for the Copenhagen school of thought. Conversely, one may try to attribute some "external reality" to this state vector, that is, a correspondence between the mathematical description and the physical reality. In this latter case, it is the measurement problem which is brought to the fore. In 1932, von Neumann was first to propose a global approach, in an attempt to build a purely quantum theory of measurement examined in Section 10.2. This theory still underlies modern approaches, among them those grounded on decoherence theory, or on the macroscopic character of the measuring apparatus: see Section 10.3. Finally, there are non-standard interpretations such as Everett's many worlds theory or the hidden variables theory of de Broglie and Bohm (Section 10.4). Note, however, that this variety of interpretations has no bearing whatsoever on the practical use of quantum mechanics. There is no controversy on the way we should use quantum mechanics!
Impact of equity models and statistical measures on interpretations of educational reform
NASA Astrophysics Data System (ADS)
Rodriguez, Idaykis; Brewe, Eric; Sawtelle, Vashti; Kramer, Laird H.
2012-12-01
We present three models of equity and show how these, along with the statistical measures used to evaluate results, impact interpretation of equity in education reform. Equity can be defined and interpreted in many ways. Most equity education reform research strives to achieve equity by closing achievement gaps between groups. An example is given by the study by Lorenzo et al. that shows that interactive engagement methods lead to increased gender equity. In this paper, we reexamine the results of Lorenzo et al. through three models of equity. We find that interpretation of the results strongly depends on the model of equity chosen. Further, we argue that researchers must explicitly state their model of equity as well as use effect size measurements to promote clarity in education reform.
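The paper's call for effect size measurements can be illustrated concretely. This sketch uses invented scores, not the study's data: reporting a standardized gap (Cohen's d) alongside raw means lets "the gap closed" claims be compared across equity models.

```python
import math

# Cohen's d with a pooled standard deviation: a standardized achievement gap.
def cohens_d(group_a, group_b):
    na, nb = len(group_a), len(group_b)
    ma, mb = sum(group_a) / na, sum(group_b) / nb
    va = sum((x - ma) ** 2 for x in group_a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in group_b) / (nb - 1)
    pooled = math.sqrt(((na - 1) * va + (nb - 1) * vb) / (na + nb - 2))
    return (ma - mb) / pooled

# hypothetical pre- and post-reform scores for two groups
pre_gap = cohens_d([72, 75, 70, 78, 74], [65, 68, 63, 70, 66])
post_gap = cohens_d([80, 83, 79, 85, 82], [78, 80, 77, 83, 79])
print(round(pre_gap, 2), round(post_gap, 2))  # the standardized gap shrinks
```

Whether this counts as "increased equity" still depends on the chosen equity model, which is exactly the paper's argument for stating the model explicitly.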
Eroglu, Sertac
2013-01-01
The distribution behavior dictated by the Menzerath-Altmann (MA) law is frequently encountered in linguistic and natural organizations at various structural levels. The mathematical form of this empirical law comprises three fitting parameters whose values tend to be elusive, especially in inter-organizational studies. To allow interpretation of these parameters and better understand such distribution behavior, we present a statistical mechanical approach based on an analogy between the classical particles of a statistical mechanical organization and the number of distinct words in a textual organization. With this derivation, we achieve a transformed (generalized) form of the MA model, termed the statistical mechanical Menzerath-Altmann (SMMA) model. This novel transformed model consists of four parameters, one of which is a structure-dependent input parameter, and three of which are free-fitting parameters. Using distinct word data sets from two text corpora, we verified that the SMMA model describes the sa...
Menzerath-Altmann Law: Statistical Mechanical Interpretation as Applied to a Linguistic Organization
NASA Astrophysics Data System (ADS)
Eroglu, Sertac
2014-10-01
The distribution behavior described by the empirical Menzerath-Altmann law is frequently encountered during the self-organization of linguistic and non-linguistic natural organizations at various structural levels. This study presents a statistical mechanical derivation of the law based on the analogy between the classical particles of a statistical mechanical organization and the distinct words of a textual organization. The derived model, a transformed (generalized) form of the Menzerath-Altmann model, was termed the statistical mechanical Menzerath-Altmann model. The derived model allows the model parameters to be interpreted in terms of physical concepts. We also propose that many organizations exhibiting Menzerath-Altmann law behavior, whether linguistic or not, can be methodically examined via the transformed distribution model through the properly defined structure-dependent parameter and the energy-associated states.
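For reference, the classical three-parameter form that the SMMA model transforms can be written down directly (the transformed model itself is not reproduced here; parameter values below are illustrative, not corpus data).

```python
import math

# Classical Menzerath-Altmann law: y(x) = a * x^b * exp(-c * x),
# where y is the mean constituent size at construct size x.
def ma_law(x, a, b, c):
    return a * x ** b * math.exp(-c * x)

a, b, c = 4.0, -0.3, 0.05   # illustrative fitting parameters
xs = range(1, 8)
ys = [ma_law(x, a, b, c) for x in xs]

# In log form the law is linear in its parameters,
#   ln y = ln a + b ln x - c x,
# which is why it is usually fitted by (log-)linear regression.
print([round(y, 2) for y in ys])
```

The elusiveness of a, b, and c across organizations, noted in the abstract above, is what the physically interpretable SMMA parameters are meant to address.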
NASA Technical Reports Server (NTRS)
Shewhart, Mark
1991-01-01
Statistical Process Control (SPC) charts are one of several tools used in quality control. Other tools include flow charts, histograms, cause and effect diagrams, check sheets, Pareto diagrams, graphs, and scatter diagrams. A control chart is simply a graph which indicates process variation over time. The purpose of drawing a control chart is to detect any changes in the process signalled by abnormal points or patterns on the graph. The Artificial Intelligence Support Center (AISC) of the Acquisition Logistics Division has developed a hybrid machine learning expert system prototype which automates the process of constructing and interpreting control charts.
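The hybrid idea described above can be sketched as if-then pattern rules over recent chart points. This is a hedged illustration, not the AISC prototype; the rules shown are classic run/trend tests, and the thresholds are conventional choices.

```python
# Encode control-chart pattern rules as if-then checks over recent points,
# so abnormal patterns, not just single out-of-limit points, raise diagnoses.
def diagnose(points, mean, sigma):
    findings = []
    if any(abs(p - mean) > 3 * sigma for p in points):
        findings.append("point beyond 3-sigma limits")
    if len(points) >= 8 and (all(p > mean for p in points[-8:])
                             or all(p < mean for p in points[-8:])):
        findings.append("run of 8 on one side of centre line (shift)")
    if len(points) >= 6:
        tail = points[-6:]
        if all(a < b for a, b in zip(tail, tail[1:])):
            findings.append("6 points steadily rising (trend)")
    return findings

# hypothetical drifting process measurements
drifting = [10.0, 10.2, 10.3, 10.5, 10.8, 11.1, 11.6]
print(diagnose(drifting, mean=10.0, sigma=0.5))
```

A full expert interpreter would then map each finding to candidate equipment or material causes, which is the role of the prototype's rule base.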
Misuse of statistics in the interpretation of data on low-level radiation
Hamilton, L.D.
1982-01-01
Four misuses of statistics in the interpretation of data of low-level radiation are reviewed: (1) post-hoc analysis and aggregation of data leading to faulty conclusions in the reanalysis of genetic effects of the atomic bomb, and premature conclusions on the Portsmouth Naval Shipyard data; (2) inappropriate adjustment for age and ignoring differences between urban and rural areas leading to potentially spurious increase in incidence of cancer at Rocky Flats; (3) hazard of summary statistics based on ill-conditioned individual rates leading to spurious association between childhood leukemia and fallout in Utah; and (4) the danger of prematurely published preliminary work with inadequate consideration of epidemiological problems - censored data - leading to inappropriate conclusions, needless alarm at the Portsmouth Naval Shipyard, and diversion of scarce research funds.
Soil VisNIR chemometric performance statistics should be interpreted as random variables
NASA Astrophysics Data System (ADS)
Brown, David J.; Gasch, Caley K.; Poggio, Matteo; Morgan, Cristine L. S.
2015-04-01
Chemometric models are normally evaluated using performance statistics such as the Standard Error of Prediction (SEP) or the Root Mean Squared Error of Prediction (RMSEP). These statistics are used to evaluate the quality of chemometric models relative to other published work on a specific soil property or to compare the results from different processing and modeling techniques (e.g. Partial Least Squares Regression or PLSR and random forest algorithms). Claims are commonly made about the overall success of an application or the relative performance of different modeling approaches assuming that these performance statistics are fixed population parameters. While most researchers would acknowledge that small differences in performance statistics are not important, rarely are performance statistics treated as random variables. Given that we are usually comparing modeling approaches for general application, and given that the intent of VisNIR soil spectroscopy is to apply chemometric calibrations to larger populations than are included in our soil-spectral datasets, it is more appropriate to think of performance statistics as random variables with variation introduced through the selection of samples for inclusion in a given study and through the division of samples into calibration and validation sets (including spiking approaches). Here we look at the variation in VisNIR performance statistics for the following soil-spectra datasets: (1) a diverse US Soil Survey soil-spectral library with 3768 samples from all 50 states and 36 different countries; (2) 389 surface and subsoil samples taken from US Geological Survey continental transects; (3) the Texas Soil Spectral Library (TSSL) with 3000 samples; (4) intact soil core scans of Texas soils with 700 samples; (5) approximately 400 in situ scans from the Pacific Northwest region; and (6) miscellaneous local datasets. We find the variation in performance statistics to be surprisingly large. 
This has important implications for the interpretation of soil VisNIR model results. Particularly for smaller datasets, the relative success of a given application or modeling approach may well be due in part to chance.
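One way to see a performance statistic as a random variable is simply to recompute it over many random calibration/validation splits and inspect the spread. The sketch below uses an invented one-variable linear calibration as a stand-in for a chemometric model; only the idea of split-to-split variation in RMSEP comes from the text.

```python
import math, random

random.seed(1)

def rmsep_one_split(data, frac_cal=0.7):
    # Random calibration/validation split, simple least-squares line,
    # RMSEP computed on the held-out validation set
    random.shuffle(data)
    n_cal = int(len(data) * frac_cal)
    cal, val = data[:n_cal], data[n_cal:]
    mx = sum(x for x, _ in cal) / len(cal)
    my = sum(y for _, y in cal) / len(cal)
    b = sum((x - mx) * (y - my) for x, y in cal) / sum((x - mx) ** 2 for x, _ in cal)
    a = my - b * mx
    return math.sqrt(sum((y - (a + b * x)) ** 2 for x, y in val) / len(val))

# Toy "soil property vs. spectral index" data with noise
data = [(x, 2.0 * x + random.gauss(0, 1.0)) for x in range(60)]
stats = sorted(rmsep_one_split(list(data)) for _ in range(200))
print(f"RMSEP range over 200 splits: {stats[0]:.2f} .. {stats[-1]:.2f}")
```

Even in this well-behaved toy setting the extremes of the RMSEP distribution differ noticeably, which is the caution the authors raise for small soil-spectral datasets.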
NASA Astrophysics Data System (ADS)
Karuppiah, R.; Faldi, A.; Laurenzi, I.; Usadi, A.; Venkatesh, A.
2014-12-01
An increasing number of studies are focused on assessing the environmental footprint of different products and processes, especially using life cycle assessment (LCA). This work shows how combining statistical methods and Geographic Information Systems (GIS) with environmental analyses can help improve the quality of results and their interpretation. Most environmental assessments in literature yield single numbers that characterize the environmental impact of a process/product - typically global or country averages, often unchanging in time. In this work, we show how statistical analysis and GIS can help address these limitations. For example, we demonstrate a method to separately quantify uncertainty and variability in the result of LCA models using a power generation case study. This is important for rigorous comparisons between the impacts of different processes. Another challenge is lack of data that can affect the rigor of LCAs. We have developed an approach to estimate environmental impacts of incompletely characterized processes using predictive statistical models. This method is applied to estimate unreported coal power plant emissions in several world regions. There is also a general lack of spatio-temporal characterization of the results in environmental analyses. For instance, studies that focus on water usage do not put in context where and when water is withdrawn. Through the use of hydrological modeling combined with GIS, we quantify water stress on a regional and seasonal basis to understand water supply and demand risks for multiple users. Another example where it is important to consider regional dependency of impacts is when characterizing how agricultural land occupation affects biodiversity in a region. We developed a data-driven methodology used in conjunction with GIS to determine if there is a statistically significant difference between the impacts of growing different crops on different species in various biomes of the world.
Shafieloo, Arman, E-mail: arman@ewha.ac.kr [Institute for the Early Universe, Ewha Womans University, Seoul, 120-750 (Korea, Republic of)
2012-05-01
By introducing Crossing functions and hyper-parameters, I show that the Bayesian interpretation of the Crossing Statistics [1] can be used straightforwardly for model selection among cosmological models. In this approach, falsifying a cosmological model requires no comparison with other models and no assumed parametrization of cosmological quantities such as the luminosity distance, Hubble parameter, or equation of state of dark energy. Instead, the hyper-parameters of the Crossing functions act as discriminators between correct and wrong models. Using this approach one can falsify any assumed cosmological model without placing priors on the underlying actual model of the universe and its parameters; hence the issue of dark energy parametrization is resolved. It is also shown that the sensitivity of the method to the intrinsic dispersion of the data is small, which is another important characteristic of the method when testing cosmological models against data with high uncertainties.
Gilbert, Peter B; Berger, James O; Stablein, Donald; Becker, Stephen; Essex, Max; Hammer, Scott M; Kim, Jerome H; Degruttola, Victor G
2011-04-01
Recently, the RV144 randomized, double-blind, efficacy trial in Thailand reported that a prime-boost human immunodeficiency virus (HIV) vaccine regimen conferred ~30% protection against HIV acquisition. However, different analyses seemed to give conflicting results, and a heated debate ensued as scientists and the broader public struggled with their interpretation. The lack of accounting for statistical principles helped inflame the debate, and we leverage these principles to provide a more scientific interpretation. We first address interpretation of frequentist results, including interpretation of P values, synthesis of results from multiple analyses (i.e., intention-to-treat versus per-protocol/fully immunized), and accounting for external efficacy trials. Second, we address how Bayesian statistics, which provide clearly interpretable statements about probabilities that the vaccine efficacy takes certain values, provide more information for weighing the evidence about efficacy than do frequentist statistics alone. Third, we evaluate RV144 for completeness of end point ascertainment and integrity of blinding, necessary tasks for establishing robustly interpretable results. PMID:21402548
Woźnicka, U; Jarzyna, J; Krynicka, E
2005-05-01
Measurements of various physical quantities in a borehole by geophysical well logging tools are designed to determine these quantities for underground geological formations. Then, the raw data (logs) are combined in a comprehensive interpretation to obtain values of geological parameters. Estimating the uncertainty of calculated geological parameters, interpreted in such a way, is difficult, often impossible, when classical statistical methods are used. The method presented here permits an estimate of the uncertainty of a quantity to be obtained. The discussion of the dependence between the uncertainty of nuclear and acoustic tool responses, and the estimated uncertainty of the interpreted geological parameters (among others: porosity, water saturation, clay content) is presented. PMID:15763490
Szabolcsi, Zoltán; Farkas, Zsuzsa; Borbély, Andrea; Bárány, Gusztáv; Varga, Dániel; Heinrich, Attila; Völgyi, Antónia; Pamjav, Horolma
2015-11-01
When the DNA profile from a crime-scene matches that of a suspect, the weight of DNA evidence depends on the unbiased estimation of the match probability of the profiles. For this reason, it is required to establish and expand the databases that reflect the actual allele frequencies in the population concerned. 21,473 complete DNA profiles from Databank samples were used to establish the allele frequency database to represent the population of Hungarian suspects. We used fifteen STR loci (PowerPlex ESI16) including five new ESS loci. The aim was to calculate the statistical, forensic efficiency parameters for the Databank samples and compare the newly detected data to the earlier report. The population substructure caused by relatedness may influence the frequency of profiles estimated. As our Databank profiles were considered non-random samples, possible relationships between the suspects can be assumed. Therefore, the population inbreeding effect was estimated using the FIS calculation. The overall inbreeding parameter was found to be 0.0106. Furthermore, we tested the impact of the two allele frequency datasets on 101 randomly chosen STR profiles, including full and partial profiles. The 95% confidence interval estimates for the profile frequencies (pM) resulted in a tighter range when we used the new dataset compared to the previously published ones. We found that the FIS had less effect on frequency values in the 21,473 samples than the application of minimum allele frequency. No genetic substructure was detected by STRUCTURE analysis. Due to the low level of inbreeding effect and the high number of samples, the new dataset provides unbiased and precise estimates of LR for statistical interpretation of forensic casework and allows us to use lower allele frequencies. PMID:26036185
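The abstract does not give its formulas, but a standard way a coancestry/inbreeding parameter enters profile-frequency estimates is the Balding-Nichols genotype-probability correction (NRC II recommendation 4.2). In the sketch below the allele frequencies are invented; only the θ value 0.0106 is taken from the study's reported FIS.

```python
def genotype_prob(p_i, p_j, theta):
    # Balding-Nichols (NRC II recommendation 4.2) single-locus genotype
    # probability with coancestry/inbreeding correction theta
    denom = (1 + theta) * (1 + 2 * theta)
    if p_i == p_j:  # homozygote
        return (2 * theta + (1 - theta) * p_i) * (3 * theta + (1 - theta) * p_i) / denom
    return 2 * (theta + (1 - theta) * p_i) * (theta + (1 - theta) * p_j) / denom

# A profile's match probability is the product over independent loci
# (these allele frequencies are invented placeholders)
loci = [(0.12, 0.08), (0.21, 0.21), (0.05, 0.30)]
pm_uncorrected = 1.0
pm_corrected = 1.0
for p_i, p_j in loci:
    pm_uncorrected *= genotype_prob(p_i, p_j, 0.0)
    pm_corrected *= genotype_prob(p_i, p_j, 0.0106)  # F_IS from the study
print(pm_corrected > pm_uncorrected)  # a positive theta is conservative
```

With theta = 0 the formulas reduce to the familiar Hardy-Weinberg p^2 and 2*p_i*p_j, so a positive theta always yields a larger (more defendant-favorable) match probability.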
Cho, Kyung Hwa; Park, Yongeun; Kang, Joo-Hyon; Ki, Seo Jin; Cha, Sungmin; Lee, Seung Won; Kim, Joon Ha
2009-01-01
The Yeongsan (YS) Reservoir is an estuarine reservoir which provides surrounding areas with public goods, such as water supply for agricultural and industrial areas and flood control. Beneficial uses of the YS Reservoir, however, are recently threatened by enriched non-point and point source inputs. A series of multivariate statistical approaches including principal component analysis (PCA) were applied to extract significant characteristics contained in a large suite of water quality data (18 variables monthly recorded for 5 years); thereby to provide the important phenomenal information for establishing effective water resource management plans for the YS Reservoir. The PCA results identified the most important five principal components (PCs), explaining 71% of total variance of the original data set. The five PCs were interpreted as hydro-meteorological effect, nitrogen loading, phosphorus loading, primary production of phytoplankton, and fecal indicator bacteria (FIB) loading. Furthermore, hydro-meteorological effect and nitrogen loading could be characterized by a yearly periodicity whereas FIB loading showed an increasing trend with respect to time. The study results presented here might be useful to establish preliminary strategies for abating water quality degradation in the YS Reservoir. PMID:19494462
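As an illustration of the variance-partitioning idea behind PCA, the two-variable case can be worked in closed form, since the eigenvalues of a 2x2 covariance matrix are available analytically. The data below are invented; the actual study decomposed 18 monthly water-quality variables.

```python
import math

def pc1_share(data):
    # Closed-form PCA for two variables: the eigenvalues of the 2x2
    # covariance matrix give each principal component's variance share
    n = len(data)
    mx = sum(x for x, _ in data) / n
    my = sum(y for _, y in data) / n
    sxx = sum((x - mx) ** 2 for x, _ in data) / (n - 1)
    syy = sum((y - my) ** 2 for _, y in data) / (n - 1)
    sxy = sum((x - mx) * (y - my) for x, y in data) / (n - 1)
    # Eigenvalues of [[sxx, sxy], [sxy, syy]]
    tr, det = sxx + syy, sxx * syy - sxy ** 2
    l1 = tr / 2 + math.sqrt(tr ** 2 / 4 - det)
    l2 = tr / 2 - math.sqrt(tr ** 2 / 4 - det)
    return l1 / (l1 + l2)  # fraction of total variance on PC1

# Two strongly correlated (invented) variables -> PC1 dominates
data = [(t, 0.9 * t + 0.1 * ((-1) ** i)) for i, t in enumerate(range(20))]
share = pc1_share(data)
print(f"PC1 explains {share:.0%} of total variance")
```

The same logic, extended to an 18x18 covariance matrix, is what lets the study compress 71% of the dataset's variance into five interpretable components.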
Jin Chen; Robert E Roth; Adam T Naito; Eugene J Lengerich; Alan M MacEachren
2008-01-01
BACKGROUND: Kulldorff's spatial scan statistic and its software implementation – SaTScan – are widely used for detecting and evaluating geographic clusters. However, two issues make using the method and interpreting its results non-trivial: (1) the method lacks cartographic support for understanding the clusters in geographic context and (2) results from the method are sensitive to parameter choices related to cluster
ERIC Educational Resources Information Center
Boysen, Guy A.
2015-01-01
Student evaluations of teaching are among the most accepted and important indicators of college teachers' performance. However, faculty and administrators can overinterpret small variations in mean teaching evaluations. The current research examined the effect of including statistical information on the interpretation of teaching evaluations.…
ERIC Educational Resources Information Center
McArthur, David; Chou, Chih-Ping
Diagnostic testing confronts several challenges at once, among which are issues of test interpretation and immediate modification of the test itself in response to the interpretation. Several methods are available for administering and evaluating a test in real-time, towards optimizing the examiner's chances of isolating a persistent pattern of…
Yu, Victoria; Kishan, Amar U.; Cao, Minsong; Low, Daniel; Lee, Percy; Ruan, Dan
2014-03-15
Purpose: To demonstrate a new method of evaluating dose response of treatment-induced lung radiographic injury post-SBRT (stereotactic body radiotherapy) treatment and the discovery of bimodal dose behavior within clinically identified injury volumes. Methods: Follow-up CT scans at 3, 6, and 12 months were acquired from 24 patients treated with SBRT for stage-1 primary lung cancers or oligometastatic lesions. Injury regions in these scans were propagated to the planning CT coordinates by performing deformable registration of the follow-ups to the planning CTs. A bimodal behavior was repeatedly observed from the probability distribution for dose values within the deformed injury regions. Based on a mixture-Gaussian assumption, an Expectation-Maximization (EM) algorithm was used to obtain characteristic parameters for such distribution. Geometric analysis was performed to interpret such parameters and infer the critical dose level that potentially induces post-SBRT lung injury. Results: The Gaussian mixture obtained from the EM algorithm closely approximates the empirical dose histogram within the injury volume with good consistency. The average Kullback-Leibler divergence values between the empirical differential dose volume histogram and the EM-obtained Gaussian mixture distribution were calculated to be 0.069, 0.063, and 0.092 for the 3, 6, and 12 month follow-up groups, respectively. The lower Gaussian component was located at approximately 70% prescription dose (35 Gy) for all three follow-up time points. The higher Gaussian component, contributed by the dose received by planning target volume, was located at around 107% of the prescription dose. Geometrical analysis suggests the mean of the lower Gaussian component, located at 35 Gy, as a possible indicator for a critical dose that induces lung injury after SBRT.
Conclusions: An innovative and improved method for analyzing the correspondence between lung radiographic injury and SBRT treatment dose has been demonstrated. Bimodal behavior was observed in the dose distribution of lung injury after SBRT. Novel statistical and geometrical analysis has shown that the systematically quantified low-dose peak at approximately 35 Gy, or 70% prescription dose, is a good indication of a critical dose for injury. The determined critical dose of 35 Gy resembles the critical dose volume limit of 30 Gy for ipsilateral bronchus in RTOG 0618 and results from previous studies. The authors seek to further extend this improved analysis method to a larger cohort to better understand the interpatient variation in radiographic lung injury dose response post-SBRT.
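The EM procedure described above can be sketched for the one-dimensional, two-component case. The dose sample below is synthetic, with modes placed at the reported 35 Gy (70% prescription) and roughly 107% prescription (about 53.5 Gy) levels; the cluster sizes and spreads are invented.

```python
import math, random

random.seed(0)

def em_two_gaussians(xs, iters=60):
    # EM for a two-component 1D Gaussian mixture, as used to separate the
    # low-dose and high-dose modes inside the injury volume
    mu1, mu2 = min(xs), max(xs)
    s1 = s2 = (max(xs) - min(xs)) / 4 or 1.0
    w = 0.5
    for _ in range(iters):
        # E-step: responsibility of component 1 for each sample
        r = []
        for x in xs:
            p1 = w * math.exp(-(x - mu1) ** 2 / (2 * s1 ** 2)) / s1
            p2 = (1 - w) * math.exp(-(x - mu2) ** 2 / (2 * s2 ** 2)) / s2
            r.append(p1 / (p1 + p2))
        # M-step: update weight, means, and standard deviations
        n1 = sum(r); n2 = len(xs) - n1
        w = n1 / len(xs)
        mu1 = sum(ri * x for ri, x in zip(r, xs)) / n1
        mu2 = sum((1 - ri) * x for ri, x in zip(r, xs)) / n2
        s1 = math.sqrt(sum(ri * (x - mu1) ** 2 for ri, x in zip(r, xs)) / n1)
        s2 = math.sqrt(sum((1 - ri) * (x - mu2) ** 2 for ri, x in zip(r, xs)) / n2)
    return sorted([mu1, mu2])

# Synthetic bimodal "dose" sample with modes near 35 Gy and 53.5 Gy
xs = [random.gauss(35, 3) for _ in range(300)] + [random.gauss(53.5, 2) for _ in range(300)]
lo, hi = em_two_gaussians(xs)
print(round(lo, 1), round(hi, 1))  # means recovered near 35 and 53.5
```

The study fit such mixtures per follow-up group and then used geometric analysis of the fitted components, which this sketch does not attempt.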
NASA Technical Reports Server (NTRS)
Bhatia, A. K.; Underhill, A. B.
1986-01-01
The interpretation of the intensities of the hydrogen and helium emission lines in O and Wolf-Rayet spectra in terms of the abundance of hydrogen relative to helium requires information regarding the distribution of hydrogen and helium atoms and ions over their several energy states. In addition, some estimate is needed regarding the transmission of the radiation through the stellar mantle. The present paper provides new information concerning the population of the energy levels of hydrogen and helium when statistical equilibrium occurs in the presence of a radiation field. The results are applied to an interpretation of the spectra of four Wolf-Rayet stars, taking into account the implications for interpreting the spectra of O stars, OB supergiants, and Be stars.
Phoenix, S.L.; Wu, E.M.
1983-03-01
This paper presents some new data on the strength and stress-rupture of Kevlar-49 fibers, fiber/epoxy strands and pressure vessels, and consolidated data obtained at LLNL over the past 10 years. This data are interpreted by using recent theoretical results from a micromechanical model of the statistical failure process, thereby gaining understanding of the roles of the epoxy matrix and ultraviolet radiation on long term lifetime.
"What If" Analyses: Ways to Interpret Statistical Significance Test Results Using EXCEL or "R"
ERIC Educational Resources Information Center
Ozturk, Elif
2012-01-01
The present paper aims to review two motivations to conduct "what if" analyses using Excel and "R" to understand the statistical significance tests through the sample size context. "What if" analyses can be used to teach students what statistical significance tests really do and in applied research either prospectively to estimate what sample size…
Boyle temperature as a point of ideal gas in gentile statistics and its economic interpretation
NASA Astrophysics Data System (ADS)
Maslov, V. P.; Maslova, T. V.
2014-07-01
Boyle temperature is interpreted as the temperature at which the formation of dimers becomes impossible. To Irving Fisher's correspondence principle we assign two more quantities: the number of degrees of freedom, and credit. We determine the danger level of the mass of money M when the mutual trust between economic agents begins to fall.
NASA Astrophysics Data System (ADS)
De'Michieli Vitturi, M.; Clarke, A. B.; La Spina, G.; Neri, A.
2012-12-01
In order to use real-time volcano monitoring data to interpret subsurface dynamics and predict future eruptive activity, the sensitivity of forward model outcomes to uncertain or variable input parameters and processes must be assessed and the corresponding uncertainty quantified systematically. In response to this issue, we have developed a framework that couples a new numerical model of magma ascent with statistical analysis tools. The model solves multiphase compressible equations governing magma movement through a subsurface pathway (from chamber to surface, for example), and represents a significant advance in terms of its quantitative description of the magma system in that it: 1) is capable of treating both dilute and dense flow regimes; 2) describes flow above and below the fragmentation level; 3) quantifies the interaction between two phases with two pressures and two velocities; 4) accounts for disequilibrium crystallization and degassing; and 5) allows for open-system degassing. We have chosen to consider these complexities because of their potential significance in controlling eruption rate and transitions in eruption regime and style. The code can be run on nearly any cluster, desktop or laptop in 1D and 2D/3D and for both transient and steady problems. Furthermore, the code is highly modular so that interested future users can easily adapt it to other multiphase fluid systems such as mud volcanoes, geysers, and petroleum industry systems. The code has been interfaced with the freeware DAKOTA system analysis toolkit, allowing effective and realistic comparisons between outcomes of models and volcano monitoring data, resulting in a probabilistic interpretation of the data in terms of subsurface dynamics and future volcanic activity.
We present preliminary uncertainty quantification and sensitivity analysis results obtained for several well-documented periods of the ongoing eruption of the Soufrière Hills volcano in order to demonstrate the power of the approach.
Statistics Translated: A Step-by-Step Guide to Analyzing and Interpreting Data
ERIC Educational Resources Information Center
Terrell, Steven R.
2012-01-01
Written in a humorous and encouraging style, this text shows how the most common statistical tools can be used to answer interesting real-world questions, presented as mysteries to be solved. Engaging research examples lead the reader through a series of six steps, from identifying a researchable problem to stating a hypothesis, identifying…
Interpreting Assessment Data: Statistical Techniques You Can Use (e-book)
NSDL National Science Digital Library
Edwin P. Christmann
2009-06-13
Are you properly evaluating the results of the tests you give to students? Can you explain the difference between classroom assessment and standardized assessment? Are you on solid ground with your grading system? Demystify--and even use--statistics to answer…
Xiaomei Song; Brian W. Pogue; Tor D. Tosteson; Troy O. McBride; Shudong Jiang; Keith D. Paulsen
2002-01-01
For pt. I see ibid., vol. 21, no. 7, p. 755-63 (2002). Image error analysis of a diffuse near-infrared tomography (NIR) system has been carried out on simulated data using a statistical approach described in pt. I of this paper (Pogue et al., 2002). The methodology is used here with experimental data acquired on phantoms with a prototype imaging system
ERIC Educational Resources Information Center
Gierl, Mark J.; Rogers, W. Todd; Klinger, Don A.
1999-01-01
Evaluates the equivalence of translated achievement tests administered to 4,400 English- and French-speaking sixth-graders. Items displaying differential item functioning were flagged using three statistical methods; results were relatively consistent across methods, but not identical. Substantive review of French items via back-translation to…
Patton, Charles J.; Gilroy, Edward J.
1999-01-01
Data on which this report is based, including nutrient concentrations in synthetic reference samples determined concurrently with those in real samples, are extensive (greater than 20,000 determinations) and have been published separately. In addition to confirming the well-documented instability of nitrite in acidified samples, this study also demonstrates that when biota are removed from samples at collection sites by 0.45-micrometer membrane filtration, subsequent preservation with sulfuric acid or mercury (II) provides no statistically significant improvement in nutrient concentration stability during storage at 4 degrees Celsius for 30 days. Biocide preservation had no statistically significant effect on the 30-day stability of phosphorus concentrations in whole-water splits from any of the 15 stations, but did stabilize Kjeldahl nitrogen concentrations in whole-water splits from three data-collection stations where ammonium accounted for at least half of the measured Kjeldahl nitrogen.
Interpreting our drug mortality statistics. Holes in the data on illegal drugs.
Sullivan, L G
1994-11-01
Estimates of mortality attributable to legal and illegal drugs are often used in the debate on legalisation as an indication of the comparative harmfulness of the drugs concerned. Yet there are few data on the health impact of illegal drugs and mortality figures are not adjusted for prevalence of drug use. The estimates therefore indicate only currently statistically assessable harm; they do not reliably express either the comparative incidence of drug-caused mortality, or their innate harmfulness. PMID:7968763
NASA Astrophysics Data System (ADS)
Mani, Peter; Heuer, Markus; Hofmann, Beda A.; Milliken, Kitty L.; West, Julia M.
This paper evaluates a mathematical model of bio-signature search processes on Mars samples returned to Earth and studied inside a Mars Sample Return Facility (MSRF). A simple porosity model for a returned Mars sample, based on initial observations of Mars meteorites, was stochastically simulated and the data analysed in a computer study. The resulting false positive, true negative and false negative values - a typical output of the simulations - were statistically analysed. The results were used in Bayes' statistics to correct the a priori probability of the presence of a bio-signature, and the resulting a posteriori probability was used in turn to improve the initial assumption about the presence of extra-terrestrial life forms in Mars material. Such an iterative algorithm can lead to a better estimate of the positive predictive value for life on Mars and therefore, together with Poisson statistics for a null result, it should be possible to bound the probability of the presence of extra-terrestrial bio-signatures at an upper level.
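The iterative Bayesian update described above can be sketched as repeated application of Bayes' rule to successive positive detections. The sensitivity and false-positive rate used here are invented placeholders, not values from the study.

```python
def posterior(prior, sensitivity, false_positive_rate):
    # Bayes' rule: probability that a bio-signature is genuinely present
    # given one positive detection in the sample-return facility
    p_pos = sensitivity * prior + false_positive_rate * (1 - prior)
    return sensitivity * prior / p_pos

# Iteratively fold in three independent positive results (assumed rates)
p = 0.01  # a priori probability of a genuine bio-signature
for _ in range(3):
    p = posterior(p, sensitivity=0.80, false_positive_rate=0.05)
print(round(p, 3))  # → 0.976
```

The sketch shows why the false-positive rate dominates early updates: starting from a 1% prior, a single positive only lifts the posterior to about 14%, and confidence builds only as independent positives accumulate.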
NASA Astrophysics Data System (ADS)
Samfira, Ionel; Boldea, Marius; Popescu, Cosmin
2012-09-01
Significant parameters of permanent grasslands are represented by the pastoral value and Shannon and Simpson biodiversity indices. The dynamics of these parameters has been studied in several plant associations in Banat Plain, Romania. From the point of view of their typology, these permanent grasslands belong to the steppe area, series Festuca pseudovina, type Festuca pseudovina-Achillea millefolium, subtype Lolium perenne. The methods used for the purpose of this research included plant cover analysis (double meter method, calculation of Shannon and Simpson indices), and statistical methods of regression and correlation. The results show that, in the permanent grasslands in the plain region, when the pastoral value is average to low, the level of interspecific biodiversity is on the increase.
Rapp, J.B.
1991-01-01
Q-mode factor analysis was used to quantitate the distribution of the major aliphatic hydrocarbon (n-alkanes, pristane, phytane) systems in sediments from a variety of marine environments. The compositions of the pure end members of the systems were obtained from factor scores and the distribution of the systems within each sample was obtained from factor loadings. All the data, from the diverse environments sampled (estuarine (San Francisco Bay), fresh-water (San Francisco Peninsula), polar-marine (Antarctica) and geothermal-marine (Gorda Ridge) sediments), were reduced to three major systems: a terrestrial system (mostly high molecular weight aliphatics with odd-numbered-carbon predominance), a mature system (mostly low molecular weight aliphatics without predominance) and a system containing mostly high molecular weight aliphatics with even-numbered-carbon predominance. With this statistical approach, it is possible to assign the percentage contribution from various sources to the observed distribution of aliphatic hydrocarbons in each sediment sample. © 1991.
Statistical interpretation of transient current power-law decay in colloidal quantum dot arrays
NASA Astrophysics Data System (ADS)
Sibatov, R. T.
2011-08-01
A new statistical model of the charge transport in colloidal quantum dot arrays is proposed. It takes into account Coulomb blockade forbidding multiple occupancy of nanocrystals and the influence of energetic disorder of interdot space. The model explains power-law current transients and the presence of the memory effect. The fractional differential analogue of the Ohm law is found phenomenologically for nanocrystal arrays. The model combines ideas that were considered as conflicting by other authors: the Scher-Montroll idea about the power-law distribution of waiting times in localized states for disordered semiconductors is applied taking into account Coulomb blockade; Novikov's condition about the asymptotic power-law distribution of time intervals between successful current pulses in conduction channels is fulfilled; and the carrier injection blocking predicted by Ginger and Greenham (2000 J. Appl. Phys. 87 1361) takes place.
A Comprehensive Statistically-Based Method to Interpret Real-Time Flowing Measurements
Pinan Dawkrajai; Analis A. Romero; Keita Yoshioka; Ding Zhu; A.D. Hill; Larry W. Lake
2004-10-01
In this project, we are developing new methods for interpreting measurements in complex wells (horizontal, multilateral and multi-branching wells) to determine the profiles of oil, gas, and water entry. These methods are needed to take full advantage of "smart" well instrumentation, a technology that is rapidly evolving to provide the ability to continuously and permanently monitor downhole temperature, pressure, volumetric flow rate, and perhaps other fluid flow properties at many locations along a wellbore; and hence, to control and optimize well performance. In this first year, we have made considerable progress in the development of the forward model of temperature and pressure behavior in complex wells. In this period, we have progressed on three major parts of the forward problem of predicting the temperature and pressure behavior in complex wells. These three parts are the temperature and pressure behaviors in the reservoir near the wellbore, in the wellbore or laterals in the producing intervals, and in the build sections connecting the laterals, respectively. Many models exist to predict pressure behavior in reservoirs and wells, but these are almost always isothermal models. To predict temperature behavior we derived general mass, momentum, and energy balance equations for these parts of the complex well system. Analytical solutions for the reservoir and wellbore parts for certain special conditions show the magnitude of thermal effects that could occur. Our preliminary sensitivity analyses show that thermal effects caused by near-wellbore reservoir flow can cause temperature changes that are measurable with smart well technology. This is encouraging for the further development of the inverse model.
A Comprehensive Statistically-Based Method to Interpret Real-Time Flowing Measurements
Keita Yoshioka; Pinan Dawkrajai; Analis A. Romero; Ding Zhu; A. D. Hill; Larry W. Lake
2007-01-15
With the recent development of temperature measurement systems, continuous temperature profiles can be obtained with high precision. Small temperature changes can be detected by modern temperature measuring instruments such as fiber optic distributed temperature sensors (DTS) in intelligent completions and will potentially aid the diagnosis of downhole flow conditions. In vertical wells, since elevational geothermal changes make the wellbore temperature sensitive to the amount and the type of fluids produced, temperature logs can be used successfully to diagnose the downhole flow conditions. However, because geothermal temperature changes along the wellbore are small in horizontal wells, interpretation of a temperature log becomes difficult. The primary temperature differences for each phase (oil, water, and gas) are caused by frictional effects. Therefore, in developing a thermal model for a horizontal wellbore, subtle temperature changes must be accounted for. In this project, we have rigorously derived governing equations for a producing horizontal wellbore and developed a prediction model of the temperature and pressure by coupling the wellbore and reservoir equations. Also, we applied Ramey's model (1962) to the build section and used an energy balance to infer the temperature profile at the junction. The multilateral wellbore temperature model was applied to a wide range of cases at varying fluid thermal properties, absolute values of temperature and pressure, geothermal gradients, flow rates from each lateral, and the trajectories of each build section. With the prediction models developed, we present inversion studies of synthetic and field examples. These results are essential to identify water or gas entry, to guide flow control devices in intelligent completions, and to decide if reservoir stimulation is needed in particular horizontal sections. This study will complete and validate these inversion studies.
A Comprehensive Statistically-Based Method to Interpret Real-Time Flowing Measurements
Pinan Dawkrajai; Keita Yoshioka; Analis A. Romero; Ding Zhu; A.D. Hill; Larry W. Lake
2005-10-01
This project is motivated by the increasing use of distributed temperature sensors for real-time monitoring of complex wells (horizontal, multilateral and multi-branching wells) to infer the profiles of oil, gas, and water entry. Measured information can be used to interpret flow profiles along the wellbore, including the junction and build section. In this second project year, we have completed a forward model to predict temperature and pressure profiles in complex wells. As a comprehensive temperature model, we have developed an analytical reservoir flow model that takes into account Joule-Thomson effects in the near-well vicinity, a multiphase non-isothermal producing-wellbore model, and a coupling of those models that accounts for mass and heat transfer between them. For further inferences such as water coning or gas evaporation, we will need a numerical non-isothermal reservoir simulator; unlike existing (thermal recovery, geothermal) simulators, it should capture the subtle temperature changes occurring during normal production. We will show the results from the analytical coupled model (analytical reservoir solution coupled with a numerical multi-segment well model) used to infer anomalous temperature or pressure profiles under various conditions, and the preliminary results from the numerical coupled reservoir model, which solves the full matrix including wellbore grids. We applied Ramey's model to the build section and used an enthalpy balance to infer the temperature profile at the junction. The multilateral wellbore temperature model was applied to a wide range of cases with varying fluid thermal properties, absolute values of temperature and pressure, geothermal gradients, flow rates from each lateral, and trajectories of each build section.
Lundell, Helen C; Niederdeppe, Jeff; Clarke, Christopher E
2013-01-01
This article explores public responses to narratives and statistical images, predominantly graphs and maps, designed to raise awareness of social determinants of health and health disparities. We focus particular attention on respondents' interpretation of the complexity of health causality and the typicality of the situations described. We conducted 24 focus groups with liberal and conservative adults (n = 180 participants) living in a large U.S. northeastern state. Although some narratives showed potential for communicating the complex causality connecting social determinants of health (SDH) to health outcomes, contextual details sometimes disrupted generalization to a broader thematic message. Statistical images often prompted useful speculation about how the factors portrayed might be related, but tended to be regarded with suspicion and criticized for oversimplifying what were perceived to be extremely complex issues. These findings lend theoretical insight to narrative and visual persuasion in the context of social issues with complex causation. We discuss practical implications for those seeking to communicate about the social determinants of health. PMID:22823526
ACECARD. Acquire Commodities Easily Card
Soler, E.E.
1996-09-01
Acquire Commodities Easily Card (AceCard) provides an automated end-user method to distribute company credit card charges to internal charge numbers. AceCard will allow cardholders to record card purchases in an on-line order log, enter multiple account distributions per order that can be posted to the General Ledger, track orders and receipt information, and provide a variety of cardholder and administrative reports. Please note: Customers must contact Ed Soler (423)-576-6151, Lockheed Martin Energy Systems, for help with the installation of the package. The fee for this installation help will be coordinated by the customer and Lockheed Martin and is in addition to the cost of the package from ESTSC. Customers should contact Sandy Presley (423)-576-4708 for user help.
Easily Missed Fractures in the Lower Extremity.
Yu, Joseph S
2015-07-01
As long as radiography remains cheap and provides value in patient care, it will continue to be widely used as a front-line imaging technique. There are limitations to what a radiograph can depict, however. It is imperative to understand the limitations of radiography to avoid pitfalls owing to the overlap of numerous osseous structures. This article reminds the reader of the association between certain radiographic abnormalities and the anatomic relevance in the patient. Although interpretive errors occur in fast-paced, high-volume emergency settings, meticulous attention to changes in the cortex and medullary bone may help to keep errors to a minimum. PMID:26046508
Mixture Interpretation & Statistics Workshop
[Slide excerpt; recoverable points: a disclaimer that mention of certain commercial products does not imply endorsement by the National Institute of Standards and Technology or the U.S. Department of Commerce, and a comparison of mixture-interpretation approaches that do not model stochastic effects and treat observed peaks as discrete with continuous approaches that do model stochastic effects.]
Data Interpretation & Statistical Analysis
[Slide excerpt: Quality Assurance Standards for Forensic DNA Testing Laboratories (effective September 1, 2011); Standard 5.1.3.2 requires that the laboratory shall have a program ... analysis. http://www.fbi.gov/about-us/lab/codis/qas-standards-for-forensic-dna-testing-laboratories-effective-9]
ERIC Educational Resources Information Center
Barner, David; Snedeker, Jesse
2008-01-01
Four experiments investigated 4-year-olds' understanding of adjective-noun compositionality and their sensitivity to statistics when interpreting scalar adjectives. In Experiments 1 and 2, children selected "tall" and "short" items from 9 novel objects called "pimwits" (1-9 in. in height) or from this array plus 4 taller or shorter distractor…
Elko, P P; Rowlandson, I
1992-01-01
Computerized interpretation of the electrocardiogram (ECG) for detection of acute myocardial infarction (AMI) has been an area of active investigation for the past few years. Advances in the development of criteria for increased accuracy have resulted through the use of clinically correlated databases. Previously, using such databases, the sensitivity for interpretation of AMI in the Marquette 12SL ECG analysis program increased from 21% to 65% with specificity remaining unchanged (99%). This study attempted to find measurements of the QRS and ST-segment from 7 of the 12 standard ECG leads to increase the sensitivity of detection of anterior AMI to the level of a trained physician while maintaining the current level of specificity. Regression analyses were performed on the measurements to see which ones could improve sensitivity and what effect they had on specificity. There was no clear separation of the individual measurements among the normal database and the true positive and true negative anterior AMI databases that would maintain high specificity. In a parallel study of the same data, deterministic criteria combining both ST and T wave information increased the sensitivity of the 12SL analysis program for detection of anterior AMI to 71% on a clinically correlated anterior AMI database and 75% on a physician-interpreted anterior AMI database while maintaining the specificity at 99%. PMID:1297676
The use of easily debondable orthodontic adhesives with ceramic brackets.
Ryu, Chiyako; Namura, Yasuhiro; Tsuruoka, Takashi; Hama, Tomohiko; Kaji, Kaori; Shimizu, Noriyoshi
2011-01-01
We experimentally produced an easily debondable orthodontic adhesive (EDA) containing heat-expandable microcapsules. The purpose of this in vitro study was to evaluate the best debonding condition when EDA was used for ceramic brackets. Shear bond strengths were measured before and after heating and were compared statistically. Temperatures of the bracket base and pulp wall were also examined during heating. Bond strengths of EDA containing 30 wt% and 40 wt% heat-expandable microcapsules were 13.4 and 12.9 MPa, respectively, and decreased significantly to 3.8 and 3.7 MPa, respectively, after heating. The temperature of the pulp wall increased by 1.8-3.6°C during heating, less than the increase required to induce pulp damage. Based on the results, we conclude that heating for 8 s during debonding of ceramic brackets bonded using EDA containing 40 wt% heat-expandable microcapsules is the most effective and safest method for the enamel and pulp. PMID:21946484
Quantum of area ΔA = 8πl_P² and a statistical interpretation of black hole entropy
Ropotenko, Kostiantyn [State Administration of Communications, Ministry of Transport and Communications of Ukraine, 22, Khreschatyk, 01001, Kyiv (Ukraine)
2010-08-15
In contrast to alternative values, the quantum of area ΔA = 8πl_P² does not follow from the usual statistical interpretation of black hole entropy; on the contrary, a statistical interpretation follows from it. This interpretation is based on two concepts: nonadditivity of black hole entropy and Landau quantization. Using nonadditivity, a microcanonical distribution for a black hole is found, and it is shown that the statistical weight of a black hole should be proportional to its area. By analogy with conventional Landau quantization, it is shown that quantization of a black hole is nothing but Landau quantization. The Landau levels of a black hole and their degeneracy are found. The degree of degeneracy is equal to the number of ways to distribute a patch of area 8πl_P² over the horizon. Taking these results into account, it is argued that the black hole entropy should be of the form S_bh = 2π·ΔΓ, where the number of microstates is ΔΓ = A/8πl_P². The nature of the degrees of freedom responsible for black hole entropy is elucidated. The applications of the new interpretation are presented. The effect of noncommuting coordinates is discussed.
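As a quick consistency check (ours, not the author's), substituting the stated number of microstates into the proposed entropy formula recovers the standard Bekenstein-Hawking entropy in Planck units:

```latex
S_{\mathrm{bh}} = 2\pi\,\Delta\Gamma
                = 2\pi \cdot \frac{A}{8\pi l_P^{2}}
                = \frac{A}{4 l_P^{2}}
```

which is the familiar result that the entropy equals one quarter of the horizon area in units of the Planck length squared.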
ERIC Educational Resources Information Center
Cruce, Ty M.
2009-01-01
This methodological note illustrates how a commonly used calculation of the Delta-p statistic is inappropriate for categorical independent variables, and this note provides users of logistic regression with a revised calculation of the Delta-p statistic that is more meaningful when studying the differences in the predicted probability of an…
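For a dummy (0/1) predictor, the revised Delta-p the note argues for is commonly computed as a discrete difference in predicted probabilities rather than a derivative-based approximation. A minimal sketch under that assumption, with hypothetical coefficients (not from the article):

```python
import math

# Sketch of a Delta-p calculation for a dummy predictor in logistic
# regression. The coefficients and covariate mean below are invented for
# illustration; the exact form Cruce recommends is assumed to be this
# discrete-difference version.
def logistic(x):
    return 1.0 / (1.0 + math.exp(-x))

b0, b_dummy, b_cont = -1.2, 0.8, 0.05   # hypothetical fitted coefficients
xbar_cont = 20.0                         # mean of the continuous covariate

p1 = logistic(b0 + b_dummy * 1 + b_cont * xbar_cont)  # dummy = 1
p0 = logistic(b0 + b_dummy * 0 + b_cont * xbar_cont)  # dummy = 0
delta_p = p1 - p0   # change in predicted probability attributable to the dummy
```

Differencing the two predicted probabilities avoids the error of treating a categorical variable as if it could change marginally from its mean.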
Kawabe, Takahiro
2013-01-01
Humans can acquire the statistical features of the external world and employ them to control behaviors. Some external events occur in harmony with an agent's action, and thus humans should also be able to acquire the statistical features relating an action to its external outcome. We report that the acquired action-outcome statistical features alter the visual appearance of the action outcome. Pressing either of two assigned keys triggered visual motion whose direction was statistically biased either upward or downward, and observers judged the stimulus motion direction. Points of subjective equality (PSE) for judging motion direction were shifted repulsively from the mean of the distribution associated with each key. Our Bayesian model accounted for the PSE shifts, indicating the optimal acquisition of the action-effect statistical relation. The PSE shifts were moderately attenuated when the action-outcome contingency was reduced. The Bayesian model again accounted for the attenuated PSE shifts. On the other hand, when the action-outcome contiguity was reduced, the PSE shifts were greatly attenuated; however, the Bayesian model could not account for these shifts. The results indicate that visual appearance can be modified by prediction based on the optimal acquisition of the action-effect causal relation. PMID:24093017
The primary objectives of Phase I of the National Surface Water Survey were to determine the number of acidic or potentially acidic lakes and streams, their location, and their physical and chemical characteristics. To meet these objectives, a statistically designed survey was im...
NASA Astrophysics Data System (ADS)
Kim, Kyoung-Ho; Yun, Seong-Taek; Choi, Byoung-Young; Chae, Gi-Tak; Joo, Yongsung; Kim, Kangjoo; Kim, Hyoung-Soo
2009-07-01
Hydrochemical and multivariate statistical interpretations of 16 physicochemical parameters of 45 groundwater samples from a riverside alluvial aquifer underneath an agricultural area in Osong, central Korea, were performed in this study to understand the spatial controls of nitrate concentrations in terms of biogeochemical processes occurring near oxbow lakes within a fluvial plain. Nitrate concentrations in groundwater showed a large variability from 0.1 to 190.6 mg/L (mean = 35.0 mg/L) with significantly lower values near oxbow lakes. The evaluation of hydrochemical data indicated that the groundwater chemistry (especially, the degree of nitrate contamination) is mainly controlled by two competing processes: 1) agricultural contamination and 2) redox processes. In addition, results of factorial kriging, consisting of two steps (i.e., co-regionalization and factor analysis), reliably showed a spatial control of the concentrations of nitrate and other redox-sensitive species; in particular, significant denitrification was observed only near oxbow lakes. The results of this study indicate that sub-oxic conditions in an alluvial groundwater system are developed geologically and geochemically in and near oxbow lakes, which can effectively enhance the natural attenuation of nitrate before the groundwater discharges to nearby streams. This study also demonstrates the usefulness of multivariate statistical analysis in groundwater studies as a supplementary tool for interpretation of complex hydrochemical data sets.
Julià, Olga; Vidal-Mas, Jaume; Panikov, Nicolai S.; Vives-Rego, Josep
2010-01-01
We report a skew-Laplace statistical analysis of both flow cytometry scatters and cell size from microbial strains primarily grown in batch cultures, others in chemostat cultures and bacterial aquatic populations. Cytometry scatters best fit the skew-Laplace distribution while cell size as assessed by an electronic particle analyzer exhibited a moderate fitting. Unlike the cultures, the aquatic bacterial communities clearly do not fit to a skew-Laplace distribution. Due to its versatile nature, the skew-Laplace distribution approach offers an easy, efficient, and powerful tool for distribution of frequency analysis in tandem with the flow cytometric cell sorting. PMID:20592754
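Fitting a skew-Laplace distribution to scatter data, as the abstract describes for flow cytometry signals, can be sketched with SciPy's asymmetric Laplace distribution standing in for the authors' skew-Laplace form. The parameters below are synthetic, chosen only to demonstrate parameter recovery:

```python
import numpy as np
from scipy.stats import laplace_asymmetric

# Generate a synthetic "cytometry scatter" sample from an asymmetric
# (skew) Laplace distribution, then recover the parameters by MLE.
rng = np.random.default_rng(7)
true_kappa, true_loc, true_scale = 2.0, 10.0, 1.5
sample = laplace_asymmetric.rvs(true_kappa, loc=true_loc, scale=true_scale,
                                size=5000, random_state=rng)

# Maximum-likelihood fit; returns (kappa, loc, scale)
kappa_hat, loc_hat, scale_hat = laplace_asymmetric.fit(sample)
```

The asymmetry parameter kappa captures the skew that distinguishes this family from the symmetric Laplace distribution.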
So Hee Yoon; Yeon Ji Chung; Myung Soo Kim
2008-01-01
Time-evolution of product ion signals in ultraviolet photodissociation (UV-PD) of singly protonated peptides with an arginine at the N-terminus was investigated by using a tandem time-of-flight mass spectrometer equipped with a cell floated at high voltage. Observation of different time-evolution patterns for different product ion types—an apparently nonstatistical behavior—could be explained within the statistical framework by invoking consecutive formation of
NASA Astrophysics Data System (ADS)
Kim, Ji-Soo; Han, Soo-Hyung; Ryang, Woo-Hun
2001-12-01
Electrical resistivity mapping was conducted to delineate the boundaries and architecture of the Cretaceous Eumsung Basin. Basin boundaries are effectively clarified in electrical dipole-dipole resistivity sections as high-resistivity contrast bands. High resistivities most likely originate from the basement of Jurassic granite and Precambrian gneiss, contrasting with the lower resistivities from infilled sedimentary rocks. The electrical properties of basin-margin boundaries are compatible with the results of vertical electrical soundings and very-low-frequency electromagnetic surveys. A statistical analysis of the resistivity sections in terms of standard deviation is found to be an effective scheme for the subsurface reconstruction of basin architecture as well as the surface demarcation of basin-margin faults and brittle fracture zones, which are characterized by much higher standard deviations. Pseudo three-dimensional architecture of the basin is delineated by integrating the composite resistivity structure information from two cross-basin E-W magnetotelluric lines and dipole-dipole resistivity lines. Based on the statistical analysis, the maximum depth of the basin varies from about 1 km in the northern part to 3 km or more in the middle part. This strong variation supports the view that the basin experienced pull-apart opening with rapid subsidence of the central blocks and asymmetric cross-basinal extension.
Kissling, Grace E; Haseman, Joseph K; Zeiger, Errol
2015-09-01
A recent article by Gaus (2014) demonstrates a serious misunderstanding of the NTP's statistical analysis and interpretation of rodent carcinogenicity data as reported in Technical Report 578 (Ginkgo biloba) (NTP, 2013), as well as a failure to acknowledge the abundant literature on false positive rates in rodent carcinogenicity studies. The NTP reported Ginkgo biloba extract to be carcinogenic in mice and rats. Gaus claims that, in this study, 4800 statistical comparisons were possible, and that 209 of them were statistically significant (p<0.05) compared with 240 (4800×0.05) expected by chance alone; thus, the carcinogenicity of Ginkgo biloba extract cannot be definitively established. However, his assumptions and calculations are flawed since he incorrectly assumes that the NTP uses no correction for multiple comparisons, and that significance tests for discrete data operate at exactly the nominal level. He also misrepresents the NTP's decision making process, overstates the number of statistical comparisons made, and ignores the fact that the mouse liver tumor effects were so striking (e.g., p<0.0000000000001) that it is virtually impossible that they could be false positive outcomes. Gaus' conclusion that such obvious responses merely "generate a hypothesis" rather than demonstrate a real carcinogenic effect has no scientific credibility. Moreover, his claims regarding the high frequency of false positive outcomes in carcinogenicity studies are misleading because of his methodological misconceptions and errors. PMID:25261588
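The multiple-comparisons arithmetic at issue can be made concrete. The sketch below reproduces Gaus' naive expectation and contrasts it with a conservative effective per-test level; the 0.01 figure is purely an illustrative assumption about how far below nominal discrete (exact) tests can operate, not a value from the NTP report:

```python
from scipy.stats import binom

# Naive expectation under Gaus' assumptions: every test operates at
# exactly the nominal 0.05 level with no multiplicity correction.
n_tests, nominal_alpha = 4800, 0.05
expected_naive = n_tests * nominal_alpha          # Gaus' figure: 240

# Discrete significance tests typically run below nominal level; 0.01 is
# an illustrative effective per-test alpha, not the NTP's actual value.
effective_alpha = 0.01
expected_conservative = n_tests * effective_alpha # 48

# Probability of 209 or more "significant" results arising by chance
# alone under the conservative effective level.
p_tail = binom.sf(208, n_tests, effective_alpha)
```

Under the conservative level, 209 significant results would be an extreme excess over the ~48 expected by chance, which is the direction of the abstract's argument.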
Quintana, C; Bonnet, N
1994-01-01
For biological X-ray microanalysis, cryoembedding (CE) combined with cryofixation (CF) and cryodehydration (CD) was already proposed as an alternative method to freeze-dried cryosections in 1984 by Wróblewski and Wroblewski. CD by freeze-drying (FD) is usually recommended because it provides better retention of diffusible elements. CD by freeze-substitution (FS) has the advantage of being simpler, giving more reproducible preservation of ultrastructure and causing fewer problems for resin infiltration. We have increased the retention of diffusible elements by using home-made devices for FS and CE in the new Lowicryl K11M and HM23 resins. These resins allow samples to be kept at maximum temperatures of 213 K and 193 K, respectively. Application of multivariate statistical analysis (MSA) to X-ray data (spectra and maps) allows the study of correlations between the analyzed elements in different nuclear areas and in the cytoplasm. The "factorial" images obtained with MSA display the compartments of strong correlation between P and K (nucleic acids) and the compartments of strong correlation between S and K (proteins). We suggest that future application of MSA methods will provide increased knowledge of the physio-pathological compartmentation of diffusible elements at the subcellular level. PMID:7638503
NASA Astrophysics Data System (ADS)
Alpert, P. A.; Knopf, D. A.
2014-12-01
Ice nucleation is the initial step in forming mixed-phase and cirrus clouds, and is well established as an important influence on global climate. Laboratory studies investigate at which cloud-relevant conditions of temperature (T) and relative humidity (RH) ice nucleation occurs, and as a result numerous fundamentally different ice nucleation descriptions have been proposed for implementation in cloud and climate models. We introduce a new immersion freezing model based on first principles of statistics to simulate individual droplet freezing, requiring only three experimental parameters: the total number of droplets, the uncertainty of the applied surface area per droplet, and the heterogeneous ice nucleation rate coefficient, Jhet, as a function of T and water activity (aw), where in equilibrium RH = aw. Previous studies reporting frozen fractions (f) or Jhet for a droplet population are described by our model for mineral, inorganic, organic, and biological ice nuclei and different techniques including cold stage, oil-immersion, continuous flow diffusion chamber, flow tube, cloud chamber, acoustic levitation and wind levitation experiments. Taking advantage of the physically based parameterization of Jhet by Knopf and Alpert (Faraday Discuss., 165, 513-534, 2013), our model can predict immersion freezing for the entire atmospherically relevant range of T, RH, particle surface area, and time scales, even for conditions unattainable in a laboratory setting. Lastly, we present a rigorous experimental uncertainty analysis of laboratory-derived Jhet and f using a Monte Carlo method. These results imply that classical nucleation theory is universal for immersion freezing. In combination with an aw-based description of Jhet, this approach allows for a physically based and computationally inexpensive implementation in climate and cloud models.
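The droplet-population statistics described above can be sketched with a Poisson freezing model: each droplet of surface area A freezes within time t with probability 1 - exp(-Jhet·A·t). All numerical values below (Jhet, mean area, spread) are illustrative assumptions, not the paper's data:

```python
import numpy as np

# Simulate individual droplet freezing with a Poisson nucleation model.
# Surface area per droplet is lognormally distributed to represent the
# stated uncertainty in applied surface area.
rng = np.random.default_rng(0)
n_drops = 100_000
jhet = 1e2        # heterogeneous nucleation rate coefficient, cm^-2 s^-1 (illustrative)
t = 10.0          # observation time, s
mean_area = 1e-5  # cm^2 per droplet (illustrative)

areas = mean_area * rng.lognormal(mean=0.0, sigma=0.3, size=n_drops)
p_freeze = 1.0 - np.exp(-jhet * areas * t)   # per-droplet freezing probability
frozen = rng.random(n_drops) < p_freeze      # stochastic freezing outcomes

f_sim = frozen.mean()          # simulated frozen fraction
f_analytic = p_freeze.mean()   # expected frozen fraction for this population
```

With many droplets the simulated frozen fraction converges to the population-averaged analytic value, which is the quantity laboratory studies report.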
Dziurkowska, Ewelina; Wesolowski, Marek
2015-01-01
Multivariate statistical analysis is widely used in medical studies as a profitable tool facilitating diagnosis of some diseases, for instance, cancer, allergy, pneumonia, or Alzheimer's and psychiatric diseases. Taking this into consideration, the aim of this study was to use two multivariate techniques, hierarchical cluster analysis (HCA) and principal component analysis (PCA), to disclose the relationship between the drugs used in the therapy of major depressive disorder and the salivary cortisol level and the period of hospitalization. The cortisol contents in saliva of depressed women were quantified by HPLC with UV detection day-to-day during the whole period of hospitalization. A data set with 16 variables (e.g., the patients' age, multiplicity and period of hospitalization, initial and final cortisol level, highest and lowest hormone level, mean contents, and medians) characterizing 97 subjects was used for HCA and PCA calculations. Multivariate statistical analysis reveals that the various groups of antidepressants affect the salivary cortisol level to varying degrees. The SSRIs, SNRIs, and polypragmasy most effectively reduce hormone secretion. Thus, both unsupervised pattern recognition methods, HCA and PCA, can be used as complementary tools for interpretation of the results obtained by laboratory diagnostic methods. PMID:26380376
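The HCA/PCA workflow described (a 97 × 16 autoscaled data matrix) can be sketched as follows; the data here are synthetic random numbers standing in for the clinical measurements:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Synthetic stand-in for the study's 97 subjects x 16 variables matrix.
rng = np.random.default_rng(42)
X = rng.normal(size=(97, 16))
Xz = (X - X.mean(axis=0)) / X.std(axis=0)   # autoscaling (standardization)

# PCA via SVD of the standardized matrix
U, s, Vt = np.linalg.svd(Xz, full_matrices=False)
explained = s**2 / np.sum(s**2)             # variance explained per component
scores = U * s                              # subject scores on the PCs

# HCA with Ward linkage on the 16 variables (columns), cut into 4 groups
Z = linkage(Xz.T, method="ward")
groups = fcluster(Z, t=4, criterion="maxclust")
```

PCA summarizes the subjects in a few orthogonal components while HCA groups correlated variables, which is why the two unsupervised methods complement each other.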
Easily disassembled electrical connector for high voltage, high frequency connections
Milner, J.R.
1994-05-10
An easily accessible electrical connector capable of rapid assembly and disassembly is described wherein a wide metal conductor sheet may be evenly contacted over the entire width of the conductor sheet by opposing surfaces on the connector which provide an even clamping pressure against opposite surfaces of the metal conductor sheet using a single threaded actuating screw. 13 figures.
Novel Cyclic Sugar Imines: Carbohydrate Mimics and Easily
Davis, Ben G.
[Garbled excerpt; recoverable: aza-sugar imines (e.g., DNJ) are potential carbohydrate-processing enzyme inhibitors; the often potent inhibitory activity of many of these compounds toward carbohydrate ...]
Easily disassembled electrical connector for high voltage, high frequency connections
Milner, Joseph R. (Livermore, CA)
1994-01-01
An easily accessible electrical connector capable of rapid assembly and disassembly wherein a wide metal conductor sheet may be evenly contacted over the entire width of the conductor sheet by opposing surfaces on the connector which provide an even clamping pressure against opposite surfaces of the metal conductor sheet using a single threaded actuating screw.
Method to more easily identify oil tax base
Blue, B.E.
1986-01-20
A method has been developed which can more easily identify the appropriate crude oil tax base by determining a "threshold price" of oil, i.e., a price above which the net income limitation (NIL) applies and below which the windfall profit tax applies. This price can serve as a convenient measure of the likelihood of benefiting from the NIL and can be easily recomputed as any of its variables change to provide a better view of tax effects. In 1980, the Windfall Profit Tax (WPT) Act was enacted, providing for a federal excise tax on the production of most domestic crude oil. This tax is based on the lower of two amounts: the "windfall profit" or the "net income limitation." These two amounts differ greatly in their computation, yet each must be considered in order to pay the minimum possible tax.
Guyonvarch, Estelle; Ramin, Elham; Kulahci, Murat; Plósz, Benedek Gy
2015-10-15
The present study aims at using statistically designed computational fluid dynamics (CFD) simulations as numerical experiments for the identification of one-dimensional (1-D) advection-dispersion models - computationally light tools, used e.g., as sub-models in systems analysis. The objective is to develop a new 1-D framework, referred to as interpreted CFD (iCFD) models, in which statistical meta-models are used to calculate the pseudo-dispersion coefficient (D) as a function of design and flow boundary conditions. The method - presented in a straightforward and transparent way - is illustrated using the example of a circular secondary settling tank (SST). First, the significant design and flow factors are screened out by applying the statistical method of two-level fractional factorial design of experiments. Second, based on the number of significant factors identified through the factor screening study and system understanding, 50 different sets of design and flow conditions are selected using Latin Hypercube Sampling (LHS). The boundary condition sets are imposed on a 2-D axi-symmetrical CFD simulation model of the SST. In the framework, to degenerate the 2-D model structure, CFD model outputs are approximated by the 1-D model through the calibration of three different model structures for D. Correlation equations for the D parameter then are identified as a function of the selected design and flow boundary conditions (meta-models), and their accuracy is evaluated against D values estimated in each numerical experiment. The evaluation and validation of the iCFD model structure is carried out using scenario simulation results obtained with parameters sampled from the corners of the LHS experimental region. 
For the studied SST, additional iCFD model development was carried out in terms of (i) assessing different density current sub-models; (ii) implementation of a combined flocculation, hindered, transient and compression settling velocity function; and (iii) assessment of modelling the onset of transient and compression settling. Furthermore, the optimal level of model discretization in both 2-D and 1-D was determined. Results suggest that the iCFD model developed for the SST through the proposed methodology is able to predict solid distribution with high accuracy - taking a reasonable computational effort - when compared to multi-dimensional numerical experiments, under a wide range of flow and design conditions. iCFD tools could play a crucial role in reliably predicting systems' performance under normal and shock events. PMID:26248321
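The iCFD workflow (LHS sampling of boundary conditions, numerical experiments, then a meta-model for D) can be sketched as below. The CFD step is replaced by a mock linear response, and all factor names, ranges, and coefficients are invented for illustration:

```python
import numpy as np
from scipy.stats import qmc

# Step 1: Latin Hypercube Sampling of 50 design/flow boundary-condition
# sets over 3 hypothetical factors (e.g., inflow rate, a sludge property,
# tank depth). Ranges are illustrative.
sampler = qmc.LatinHypercube(d=3, seed=1)
unit = sampler.random(n=50)
lo = np.array([100.0, 1.0, 2.0])
hi = np.array([400.0, 3.0, 5.0])
X = qmc.scale(unit, lo, hi)

# Step 2: run a "numerical experiment" per condition set. Here the 2-D CFD
# model is mocked by a noisy linear response returning a pseudo-D value.
rng = np.random.default_rng(1)
def mock_cfd_dispersion(x):
    q, prop, depth = x
    return 0.01 * q + 0.5 * prop - 0.2 * depth + rng.normal(0, 0.05)

D = np.array([mock_cfd_dispersion(x) for x in X])

# Step 3: fit a linear meta-model D = f(boundary conditions) by least squares.
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, D, rcond=None)
```

The fitted coefficients are the meta-model: a computationally light surrogate that returns D for any boundary-condition set without rerunning the CFD model.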
Easily installable behavioral monitoring system with electric field sensor.
Tsukamoto, Sosuke; Machida, Yuichiro; Kameda, Noriyuki; Hoshino, Hiroshi; Tamura, Toshiyo
2007-01-01
This paper describes a wireless behavioral monitoring system equipped with an electric field sensor. The sensor unit was designed to obtain information regarding the usage of home electric appliances such as the television, microwave oven, coffee maker, etc. by measuring the electric field surrounding them. It is assumed that these usage statistics could provide information regarding the indoor behavior of a subject. Since the sensor can be used by simply attaching it to an appliance and does not require any wiring for its installation, this system can be temporarily installed in any ordinary house. A simple interface for selecting the threshold value of appliances' power on/off states was introduced. The experimental results reveal that the proposed system can be installed by individuals in their residences in a short time and the usage statistics of home appliances can be gathered. PMID:18002891
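The on/off detection the system relies on reduces to comparing field-magnitude samples against the user-selected threshold and accumulating usage statistics; a minimal sketch with synthetic readings:

```python
# Synthetic electric-field magnitude samples from a sensor attached to an
# appliance; the threshold is the value a user would pick via the paper's
# simple selection interface. All numbers are invented for illustration.
readings = [0.1, 0.2, 3.5, 3.6, 3.4, 0.2, 3.8, 3.7, 0.1]
threshold = 1.0

states = [r > threshold for r in readings]   # True = appliance judged "on"
on_samples = sum(states)
duty_cycle = on_samples / len(readings)      # fraction of samples in the "on" state
```

Aggregating such duty cycles per appliance over days yields the usage statistics from which indoor behavior is inferred.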
Serotonin syndrome: a complex but easily avoidable condition.
Dvir, Yael; Smallwood, Patrick
2008-01-01
Serotonin syndrome is a potentially life-threatening adverse drug reaction caused by excessive serotonergic agonism in central and peripheral nervous system serotonergic receptors (Boyer EW, Shannon M. The serotonin syndrome. N Engl J Med 2005;352:1112-1120). Symptoms are characterized by a triad of neuron-excitatory features, which include (a) neuromuscular hyperactivity -- tremor, clonus, myoclonus, hyperreflexia and, in advanced stages, pyramidal rigidity; (b) autonomic hyperactivity -- diaphoresis, fever, tachycardia and tachypnea; and (c) altered mental status -- agitation, excitement and, in advanced stages, confusion (Gillman PK. Monoamine oxidase inhibitors, opioid analgesics and serotonin toxicity. Br J Anaesth 2005;95:434-441). It arises when pharmacological agents increase serotonin neurotransmission at postsynaptic 5-hydroxytryptamine 1A and 5-hydroxytryptamine 2A receptors through increased serotonin synthesis, decreased serotonin metabolism, increased serotonin release, inhibition of serotonin reuptake or direct agonism of the serotonin receptors (Houlihan D. Serotonin syndrome resulting from coadministration of tramadol, venlafaxine, and mirtazapine. Ann Pharmacother 2004;38:411-413). The etiology is often the result of therapeutic drug use, intentional overdosing of serotonergic agents or complex interactions between drugs that directly or indirectly modulate the serotonin system (Boyer EW, Shannon M. The serotonin syndrome. N Engl J Med 2005;352:1112-1120). Due to the increasing availability of agents with serotonergic activity, physicians need to be more aware of serotonin syndrome. The following case highlights the complex manner in which serotonin syndrome can arise, as well as the proper recognition and treatment of a potentially life-threatening yet easily avoidable condition. PMID:18433663
The advent of new higher-throughput analytical instrumentation has put a strain on interpreting and explaining the results from complex studies. Contemporary human, environmental, and biomonitoring data sets comprise tens or hundreds of analytes, multiple repeat measures...
Neuroendocrine Tumor: Statistics
Approved by the Cancer.Net Editorial Board, 04/ ... nodes or distant parts of the body. Survival statistics should be interpreted with caution. These estimates are ...
Making large amounts of meteorological plots easily accessible to users
NASA Astrophysics Data System (ADS)
Lamy-Thepaut, Sylvie; Siemen, Stephan; Sahin, Cihan; Raoult, Baudouin
2015-04-01
The European Centre for Medium-Range Weather Forecasts (ECMWF) is an international organisation providing its member organisations with forecasts in the medium time range of 3 to 15 days, and some longer-range forecasts for up to a year ahead, with varying degrees of detail. As part of its mission, ECMWF generates an increasing number of forecast data products for its users. To support the work of forecasters and researchers and to let them make best use of ECMWF forecasts, the Centre also provides tools and interfaces to visualise their products. This allows users to make use of and explore forecasts without having to transfer large amounts of raw data. This is especially true for products based on ECMWF's 50-member ensemble forecast, where some specific processing and visualisation are applied to extract information. Every day, thousands of raw data fields are pushed to ECMWF's interactive web charts application, ecCharts, and thousands of products are processed and pushed to ECMWF's institutional web site. ecCharts provides a highly interactive application to display and manipulate recent numerical forecasts for forecasters in national weather services and ECMWF's commercial customers. With ecCharts, forecasters are able to explore ECMWF's medium-range forecasts in far greater detail than has previously been possible on the web, as soon as the forecast becomes available. All ecCharts products are also available through a machine-to-machine web map service based on the OGC Web Map Service (WMS) standard. The ECMWF institutional web site provides access to a large number of graphical products. It was entirely redesigned last year. It now shares the same infrastructure as ecCharts and can benefit from some ecCharts functionalities, for example the dashboard. The dashboard, initially developed for ecCharts, allows users to organise their own collection of products depending on their workflow, and is being further developed.
In its first implementation, it presents the user's products in a single interface with fast access to the original product and possibilities of synchronous animations between them. But its functionalities are being extended to give users the freedom to collect not only ecCharts 2D maps and graphs, but also other ECMWF web products such as monthly and seasonal products, scores, and observation monitoring. The dashboard will play a key role in helping the user interpret the large amount of information that ECMWF is providing. This talk will present examples of how the new user interface can organise complex meteorological maps and graphs and show the new possibilities users have gained by using the web as a medium.
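The machine-to-machine access mentioned above follows the OGC WMS standard. Below is a minimal sketch of composing a WMS 1.3.0 GetMap request; the endpoint and layer name are placeholders, not ECMWF's actual service.

```python
# Sketch of building an OGC WMS 1.3.0 GetMap request URL.
# Base URL and layer name are hypothetical placeholders.
from urllib.parse import urlencode

def getmap_url(base, layer, bbox, width=800, height=600):
    params = {
        "SERVICE": "WMS", "VERSION": "1.3.0", "REQUEST": "GetMap",
        "LAYERS": layer, "CRS": "EPSG:4326",
        "BBOX": ",".join(str(v) for v in bbox),  # minlat,minlon,maxlat,maxlon for EPSG:4326
        "WIDTH": width, "HEIGHT": height, "FORMAT": "image/png",
    }
    return base + "?" + urlencode(params)

url = getmap_url("https://example.org/wms", "2m_temperature",
                 (-90, -180, 90, 180))
print(url)
```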
Localized Smart-Interpretation
NASA Astrophysics Data System (ADS)
Lundh Gulbrandsen, Mats; Mejer Hansen, Thomas; Bach, Torben; Pallesen, Tom
2014-05-01
The complex task of setting up a geological model consists not only of combining available geological information into a conceptually plausible model, but also requires consistency with available data, e.g. geophysical data. However, in many cases the direct geological information, e.g. borehole samples, is very sparse, so in order to create a geological model, the geologist needs to rely on the geophysical data. The problem, however, is that the amount of geophysical data is in many cases so vast that it is practically impossible to integrate all of it in the manual interpretation process. This means that much of the information available from the geophysical surveys is unexploited, which is a problem, because the resulting geological model does not fulfill its full potential and hence is less trustworthy. We suggest an approach to geological modeling that (1) allows all geophysical data to be considered when building the geological model, (2) is fast, and (3) allows quantification of the geological modeling. The method is constructed to build a statistical model, f(d,m), describing the relation between what the geologist interprets, d, and what the geologist knows, m. The parameter m reflects any available information that can be quantified, such as geophysical data, the result of a geophysical inversion, elevation maps, etc. The parameter d reflects an actual interpretation, such as, for example, the depth to the base of a ground water reservoir. First we infer a statistical model f(d,m) by examining sets of actual interpretations made by a geological expert, [d1, d2, ...], and the information used to perform the interpretation, [m1, m2, ...]. This makes it possible to quantify how the geological expert performs interpolation through f(d,m). As the geological expert proceeds with the interpretation, the number of interpreted data points from which the statistical model is inferred increases, and therefore the accuracy of the statistical model increases.
When a model f(d,m) has successfully been inferred, we are able to simulate how the geological expert would perform an interpretation given some external information m, through f(d|m). We will demonstrate this method applied to geological interpretation and densely sampled airborne electromagnetic data. In short, our goal is to build a statistical model describing how a geological expert performs geological interpretation given some geophysical data. We then wish to use this statistical model to perform semi-automatic interpretation, everywhere such geophysical data exist, in a manner consistent with the choices made by a geological expert. Benefits of such a statistical model are that (1) it provides a quantification of how a geological expert performs interpretation based on available diverse data, (2) all available geophysical information can be used, and (3) it allows much faster interpretation of large data sets.
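As a toy illustration of the workflow, the sketch below "learns" from past expert picks (m_i, d_i) with a one-dimensional least-squares fit and then predicts d for new information m; the actual method builds a much richer statistical model f(d,m) over diverse data types, and all numbers here are synthetic.

```python
# Toy version of inferring f(d,m) from expert interpretations:
# fit d as a linear function of a single quantified feature m.

def fit(ms, ds):
    """Least-squares slope and intercept for d = slope*m + intercept."""
    n = len(ms)
    mbar = sum(ms) / n
    dbar = sum(ds) / n
    slope = (sum((m - mbar) * (d - dbar) for m, d in zip(ms, ds))
             / sum((m - mbar) ** 2 for m in ms))
    return slope, dbar - slope * mbar

# e.g. m = a feature from geophysical inversion, d = interpreted depth
# to the base of a reservoir (synthetic numbers)
ms = [1.0, 2.0, 3.0, 4.0]
ds = [10.0, 12.0, 14.0, 16.0]
slope, intercept = fit(ms, ds)
print(slope * 5.0 + intercept)  # predicted d at new m = 5.0 -> 18.0
```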
Example (Berger): A drug company is deciding whether or not to market a new pain reliever. Two important factors: (1) the proportion of people θ1 for whom the drug will be effective; (2) the proportion of people θ2 who would buy the drug. Assume X ~ Binom(n, θ2). Then f(θ2 | x) ∝ (n choose x) θ2^x (1 − θ2)^(n−x) ... of statistical knowledge.
CAinterprTools: An R package to help interpreting Correspondence Analysis' results
NASA Astrophysics Data System (ADS)
Alberti, Gianmarco
2015-09-01
Correspondence Analysis (CA) is a statistical exploratory technique frequently used in many research fields to graphically visualize the structure of contingency tables. Many programs, both commercial and free, perform CA, but none of them as yet provides a visual aid to the interpretation of the results. The 'CAinterprTools' package, designed to be used in the free R statistical environment, aims at filling that gap. Novice-to-intermediate R users are the target audience. Fifteen commands enable the user to easily obtain charts that help in (and are relevant to) the interpretation of the CA results, freeing the user from the need to inspect and scrutinize tabular CA outputs and to look up values and statistics on which further calculations are necessary. The package also implements tests to assess the significance of the input table's total inertia and individual dimensions.
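One quantity the package tests for significance, the total inertia of the input table, is the chi-square statistic divided by the grand total. CAinterprTools itself is an R package; the pure-Python sketch below only illustrates the quantity, with a made-up 2x2 table.

```python
# Total inertia of a contingency table: chi-square / grand total.

def total_inertia(table):
    grand = sum(sum(row) for row in table)
    row_tot = [sum(row) for row in table]
    col_tot = [sum(col) for col in zip(*table)]
    chi2 = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            exp = row_tot[i] * col_tot[j] / grand  # expected count under independence
            chi2 += (obs - exp) ** 2 / exp
    return chi2 / grand

table = [[10, 20], [20, 10]]  # synthetic contingency table
print(round(total_inertia(table), 4))
```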
Nurzyńska, Katarzyna; Booth, Jonathan; Roberts, Clive J; McCabe, James; Dryden, Ian; Fischer, Peter M
2015-09-01
The purpose of this study was to develop a predictive model of the amorphous stability of drugs with particular relevance for poorly water-soluble compounds. Twenty-five representative neutral poorly soluble compounds with a diverse range of physicochemical properties and chemical structures were systematically selected from an extensive library of marketed drug products. The physical stability of the amorphous form, measured over a 6 month period by the onset of crystallization of amorphous films prepared by melting and quench-cooling, was assessed using polarized light microscopy. The data were used as a response variable in a statistical model with calculated/predicted or measured molecular, thermodynamic, and kinetic parameters as explanatory variables. Several multiple linear regression models were derived, with varying balance between calculated/predicted and measured parameters. It was shown that inclusion of measured parameters significantly improves the predictive ability of the model. The best model demonstrated a prediction accuracy of 82% and included the following as parameters: melting and glass transition temperatures, enthalpy of fusion, configurational free energy, relaxation time, number of hydrogen bond donors, lipophilicity, and the ratio of carbon to heteroatoms. Good predictions were also obtained with a simpler model, which was comprised of easily acquired quantities: molecular weight and enthalpy of fusion. Statistical models are proposed to predict long-term amorphous drug stability. The models include readily accessible parameters, which are potentially the key factors influencing amorphous stability. The derived models can support faster decision making in drug formulation development. PMID:26236939
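An illustrative sketch of the simpler model's form: a linear score in the two easily acquired quantities, molecular weight and enthalpy of fusion. The coefficients below are invented for the example; the abstract does not report the fitted values.

```python
# Hypothetical linear stability score in the two "easily acquired"
# predictors named in the abstract. Coefficients are invented.

def stability_score(mol_weight, dh_fusion_kj_mol,
                    b0=1.0, b_mw=0.004, b_dh=-0.02):
    # higher score ~ predicted to remain amorphous longer (illustrative)
    return b0 + b_mw * mol_weight + b_dh * dh_fusion_kj_mol

print(round(stability_score(350.0, 30.0), 3))  # 1.0 + 1.4 - 0.6 = 1.8
```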
2013-01-01
Background Spin in the reporting of randomized controlled trials, where authors report research in a way that potentially misrepresents results and misleads readers, has been demonstrated in the broader medical literature. We investigated spin in wound care trials with (a) no statistically significant result for the primary outcome and (b) no clearly specified primary outcome. Methods We searched the Cochrane Wounds Group Specialised Register of Trials for randomized controlled trials (RCTs). Eligible studies were: parallel-group RCTs of interventions for foot, leg or pressure ulcers published in 2004 to 2009 (inclusive) with either a clearly identified primary outcome for which there was a statistically non-significant result (Cohort A) or no clear primary outcome (Cohort B). We extracted general study details. For both Cohorts A and B we then assessed for the presence of spin. For Cohort A we used a pre-defined process to assess reports for spin. For Cohort B we aimed to assess spin by recording the number of positive treatment effect claims made. We also compared the number of statistically significant and non-significant results reported in the main text and the abstract, looking specifically for spin in the form of selective outcome reporting. Results Of the 71 eligible studies, 28 were eligible for Cohort A; of these, 71% (20/28) contained spin. Cohort B contained 43 studies; of these, 86% (37/43) had abstracts that made a favorable treatment claim. Whilst 74% (32/43) of main text results in Cohort B included at least one statistically non-significant result, this was not reflected in the abstract, where only 28% (12/43) contained at least one statistically non-significant result. Conclusions Spin is a frequent phenomenon in reports of RCTs of wound treatments. Studies without statistically significant results for the primary outcome used spin in 71% of cases.
Furthermore, 33% (43/132) of reports of wound RCTs did not specify a primary outcome and there was evidence of spin and selective outcome reporting in the abstracts of these. Readers should be wary of only reading the abstracts of reports of RCTs of wound treatments since they are frequently misleading regarding treatment effects. PMID:24195770
ERIC Educational Resources Information Center
DeHaan, Frank, Ed.
1977-01-01
Describes an interpretative experiment involving the application of symmetry and temperature-dependent proton and fluorine nmr spectroscopy to the solution of structural and kinetic problems in coordination chemistry. (MLH)
SLAR image interpretation keys for geographic analysis
NASA Technical Reports Server (NTRS)
Coiner, J. C.
1972-01-01
A means for side-looking airborne radar (SLAR) imagery to become a more widely used data source in geoscience and agriculture is suggested by providing interpretation keys as an easily implemented interpretation model. Interpretation problems faced by the researcher wishing to employ SLAR are specifically described, and the use of various types of image interpretation keys to overcome these problems is suggested. With examples drawn from agriculture and vegetation mapping, direct and associate dichotomous image interpretation keys are discussed and methods of constructing keys are outlined. Initial testing of the keys, key-based automated decision rules, and the role of the keys in an information system for agriculture are developed.
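A dichotomous key reduces interpretation to a sequence of binary questions about image properties such as tone and texture. The classes and questions in this sketch are invented to show the structure, not taken from the paper.

```python
# Sketch of a dichotomous image-interpretation key as nested binary
# decisions. Classes and criteria are hypothetical examples.

def classify(bright_return, coarse_texture, linear_pattern):
    """Assign a land-cover class from three binary image properties."""
    if bright_return:
        return "urban" if linear_pattern else "forest"
    return "cropland" if coarse_texture else "water"

print(classify(False, False, False))  # smooth, dark return -> "water"
```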
Interpretations of Entanglement
NASA Astrophysics Data System (ADS)
Jones, Martin
2002-04-01
The peculiar statistical correlations between spatially separated systems which arise in quantum mechanics, and which the Einstein-Podolsky-Rosen paper of 1935 thrust into the limelight, have been the focus of much interpretive speculation and disagreement in the years since then. Amongst the questions raised along the way have been questions about the possibility of superluminal causation, the limits of quantum mechanics and its relation to relativity theory, the nature of and need for causal explanation, realism, determinism, and the presence of holism in quantum mechanics. This talk will provide an historically structured overview of these debates including discussion of the Bohm theory, the many worlds interpretation, and more recent developments and will suggest a way of dividing many of the interpretations of entanglement into clusters of like-minded views.
Cousot, Patrick
The Calculational Design of a Generic Abstract Interpreter: Corrigendum, February 7, 1999. Corrects formulas in the calculational design of the abstract equality (Section 10.3, page 34).
NASA Astrophysics Data System (ADS)
Tema, E.; Zanella, E.; Pavón-Carrasco, F. J.; Kondopoulou, D.; Pavlides, S.
2015-10-01
We present the results of palaeomagnetic analysis on Late Bronze Age pottery from Santorini, carried out in order to estimate the thermal effect of the Minoan eruption on the pre-Minoan habitation level. A total of 170 specimens from 108 ceramic fragments have been studied. The ceramics were collected from the surface of the pre-Minoan palaeosol at six different sites, including samples from the Akrotiri archaeological site. The deposition temperatures of the first pyroclastic products have been estimated from the maximum overlap of the re-heating temperature intervals given by the individual fragments at site level. A new statistical elaboration of the temperature data has also been proposed, calculating the re-heating temperatures at each site at the 95 per cent probability level. The obtained results show that the precursor tephra layer and the first pumice fall of the eruption were hot enough to re-heat the underlying ceramics to temperatures of 160-230 °C at the non-inhabited sites, while the temperatures recorded inside the Akrotiri village are slightly lower, varying from 130 to 200 °C. The lower temperatures registered in the human settlements suggest that there was some interaction between the buildings and the pumice fallout deposits, while the layer of building debris produced by the preceding and syn-eruption earthquakes probably also contributed to the decrease of the recorded re-heating temperatures.
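The site-level estimate described above, the maximum overlap of per-fragment re-heating intervals, amounts to taking the maximum of the lower bounds and the minimum of the upper bounds; the interval values below are synthetic.

```python
# Overlap of per-fragment re-heating temperature intervals at a site:
# the interval common to all fragments, or None if they are disjoint.

def overlap(intervals):
    lo = max(t[0] for t in intervals)
    hi = min(t[1] for t in intervals)
    return (lo, hi) if lo <= hi else None

site = [(150, 250), (160, 240), (140, 230)]  # synthetic °C intervals
print(overlap(site))  # (160, 230)
```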
NASA Technical Reports Server (NTRS)
Feiveson, Alan H.; Foy, Millennia; Ploutz-Snyder, Robert; Fiedler, James
2014-01-01
Do you have elevated p-values? Is the data analysis process getting you down? Do you experience anxiety when you need to respond to criticism of statistical methods in your manuscript? You may be suffering from Insufficient Statistical Support Syndrome (ISSS). For symptomatic relief of ISSS, come for a free consultation with JSC biostatisticians at our help desk during the poster sessions at the HRP Investigators Workshop. Get answers to common questions about sample size, missing data, multiple testing, when to trust the results of your analyses and more. Side effects may include sudden loss of statistics anxiety, improved interpretation of your data, and increased confidence in your results.
Linda Stetzenbach; Lauren Nemnich; Davor Novosel
2009-08-31
Three independent tasks had been performed (Stetzenbach 2008, Stetzenbach 2008b, Stetzenbach 2009) to measure a variety of parameters in normative buildings across the United States. For each of these tasks, 10 buildings were selected as normative indoor environments. Task 1 focused on office buildings, Task 13 focused on public schools, and Task 0606 focused on high performance buildings. To perform this task it was necessary to restructure the database for the Indoor Environmental Quality (IEQ) data and the sound measurements, as several issues were identified and resolved prior to and during the transfer of these data sets into SPSS. During overview discussions with the statistician utilized in this task, it was determined that, because indoor zones (1-6) were selected independently within each task, zones were not related by location across tasks. Therefore, no comparison would be valid across zones for the 30 buildings, so the by-location (zone) data were limited to three analysis sets of the buildings within each task. In addition, different collection procedures for lighting were used in Task 0606 as compared to Tasks 01 & 13 to improve sample collection. Therefore, these data sets could not be merged and compared, so by-day effects data were run separately for Task 0606 and only Task 01 & 13 data were merged. Results of the statistical analysis of the IEQ parameters show that statistically significant differences were found among days and zones for all tasks, although no differences were found by day for Draft Rate data from Task 0606 (p>0.05). Thursday measurements of IEQ parameters were significantly different from Tuesday, and most Wednesday measures, for all variables of Tasks 1 & 13. Data for all three days appeared to vary for Operative Temperature, whereas only Tuesday and Thursday differed for Draft Rate 1m.
Although no Draft Rate measures within Task 0606 were found to significantly differ by-day, Temperature measurements for Tuesday and Thursday showed variation. Moreover, Wednesday measurements of Relative Humidity within Task 0606 varied significantly from either Tuesday or Thursday. The majority of differences in IEQ measurements by-zone were highly significant (p<0.001), with the exception of Relative Humidity in some buildings. When all task data were combined (30 buildings) neither the airborne culturable fungi nor the airborne non-culturable spore data differed in the concentrations found at any indoor location in terms of day of collection. However, the concentrations of surface-associated fungi varied among the day of collection. Specifically, there was a lower concentration of mold on Tuesday than on Wednesday, for all tasks combined. As expected, variation was found in the concentrations of both airborne culturable fungi and airborne non-culturable fungal spores between indoor zones (1-6) and the outdoor zone (zone 0). No variation was found among the indoor zones of office buildings for Task 1 in the concentrations of airborne culturable fungi. However, airborne non-culturable spores did vary among zones in one building in Task 1 and variation was noted between zones in surface-associated fungi. Due to the lack of multiple lighting measurements for Tasks 13 and 0606, by-day comparisons were only performed for Task 1. No statistical differences were observed in lighting with respect to the day of collection. There was a wide range of variability by-zone among seven of the office buildings. Although few differences were found for the brightest illumination of the worksurface (IllumWkSfcBrtst) and the darkest illumination of the worksurface (IllumWkSfcDrkst) in Task 1, there was considerable variation for these variables in Task 13 and Task 0606 (p < 0.001). Other variables that differed by-zone in Task 13 include CombCCT and AmbCCT1 for S03, S07, and S08. 
Additionally, AmbChromX1, CombChromY, and CombChromX varied by-zone for school buildings S02, S04, and S05, respectively. Although all tasks demonstrated significant differences in sound measurements by zone, some of the buil
A. R. P. Rau
2006-06-03
Difficulties and discomfort with the interpretation of quantum mechanics are due to differences in language between it and classical physics. Analogies to the Special Theory of Relativity, which also required changes in the basic worldview and language of non-relativistic classical mechanics, may help in absorbing the changes called for by quantum physics. There is no need to invoke extravagances such as the many worlds interpretation or to specify a central role for consciousness or neural microstructures. The simple, but basic, acceptance that what is meant by the state of a physical system is different in quantum physics from what it is in classical physics goes a long way in explaining its seeming peculiarities.
Reeve, Joanne
2010-01-01
Patient-centredness is a core value of general practice; it is defined as the interpersonal processes that support the holistic care of individuals. To date, efforts to demonstrate their relationship to patient outcomes have been disappointing, whilst some studies suggest values may be more rhetoric than reality. Contextual issues influence the quality of patient-centred consultations, impacting on outcomes. The legitimate use of knowledge, or evidence, is a defining aspect of modern practice, and has implications for patient-centredness. Based on a critical review of the literature, on my own empirical research, and on reflections from my clinical practice, I critique current models of the use of knowledge in supporting individualised care. Evidence-Based Medicine (EBM), and its implementation within health policy as Scientific Bureaucratic Medicine (SBM), define best evidence in terms of an epistemological emphasis on scientific knowledge over clinical experience. It provides objective knowledge of disease, including quantitative estimates of the certainty of that knowledge. Whilst arguably appropriate for secondary care, involving episodic care of selected populations referred in for specialist diagnosis and treatment of disease, application to general practice can be questioned given the complex, dynamic and uncertain nature of much of the illness that is treated. I propose that general practice is better described by a model of Interpretive Medicine (IM): the critical, thoughtful, professional use of an appropriate range of knowledges in the dynamic, shared exploration and interpretation of individual illness experience, in order to support the creative capacity of individuals in maintaining their daily lives. Whilst the generation of interpreted knowledge is an essential part of daily general practice, the profession does not have an adequate framework by which this activity can be externally judged to have been done well. 
Drawing on theory related to the recognition of quality in interpretation and knowledge generation within the qualitative research field, I propose a framework by which to evaluate the quality of knowledge generated within generalist, interpretive clinical practice. I describe three priorities for research in developing this model further, which will strengthen and preserve core elements of the discipline of general practice, and thus promote and support the health needs of the public. PMID:21805819
NASA Astrophysics Data System (ADS)
Xu, C.; Shyu, J. B. H.; Xu, X.
2014-07-01
The 12 January 2010 Port-au-Prince, Haiti, earthquake (Mw= 7.0) triggered tens of thousands of landslides. The purpose of this study is to investigate the correlations of the occurrence of landslides and the thicknesses of their erosion with topographic, geologic, and seismic parameters. A total of 30 828 landslides triggered by the earthquake covered a total area of 15.736 km2, distributed in an area more than 3000 km2, and the volume of landslide accumulation materials is estimated to be about 29 700 000 m3. These landslides are of various types, mostly belonging to shallow disrupted landslides and rock falls, but also include coherent deep-seated landslides and rock slides. These landslides were delineated using pre- and post-earthquake high-resolution satellite images. Spatial distribution maps and contour maps of landslide number density, landslide area percentage, and landslide erosion thickness were constructed in order to analyze the spatial distribution patterns of co-seismic landslides. Statistics of size distribution and morphometric parameters of co-seismic landslides were carried out and were compared with other earthquake events in the world. Four proxies of co-seismic landslide abundance, including landslides centroid number density (LCND), landslide top number density (LTND), landslide area percentage (LAP), and landslide erosion thickness (LET) were used to correlate co-seismic landslides with various environmental parameters. These parameters include elevation, slope angle, slope aspect, slope curvature, topographic position, distance from drainages, lithology, distance from the epicenter, distance from the Enriquillo-Plantain Garden fault, distance along the fault, and peak ground acceleration (PGA). A comparison of these impact parameters on co-seismic landslides shows that slope angle is the strongest impact parameter on co-seismic landslide occurrence. 
Our co-seismic landslide inventory is much more detailed than other inventories in several previous publications. Therefore, we carried out comparisons of inventories of landslides triggered by the Haiti earthquake with other published results and proposed possible reasons for any differences. We suggest that the empirical functions between earthquake magnitude and co-seismic landslides need to be updated on the basis of the abundant and more complete co-seismic landslide inventories recently available.
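Two of the abundance proxies named above, landslide area percentage (LAP) and landslide centroid number density (LCND), reduce to simple per-cell ratios; the sketch below uses synthetic numbers, and the cell size is an assumption for illustration.

```python
# Per-cell abundance proxies for co-seismic landslide mapping.
# Cell area and counts are synthetic illustrative values.

def lap(landslide_area_km2, cell_area_km2):
    """Landslide area percentage: landslide area / cell area * 100."""
    return 100.0 * landslide_area_km2 / cell_area_km2

def lcnd(n_centroids_in_cell, cell_area_km2):
    """Landslide centroid number density: centroids per km^2."""
    return n_centroids_in_cell / cell_area_km2

print(round(lap(0.52, 4.0), 1), lcnd(26, 4.0))  # 13.0 6.5
```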
NASA Astrophysics Data System (ADS)
Xu, C.; Shyu, J. B. H.; Xu, X.-W.
2014-02-01
The 12 January 2010 Port-au-Prince, Haiti, earthquake (Mw 7.0) triggered tens of thousands of landslides. The purpose of this study is to investigate the correlations of the occurrence of landslides and their erosion thicknesses with topographic factors, seismic parameters, and their distance from roads. A total of 30 828 landslides triggered by the earthquake covered a total area of 15.736 km2, distributed in an area more than 3000 km2, and the volume of landslide accumulation materials is estimated to be about 29 700 000 m3. These landslides are of various types, mostly belonging to shallow disrupted landslides and rock falls, but also include coherent deep-seated landslides and rock slides. These landslides were delineated using pre- and post-earthquake high-resolution satellite images. Spatial distribution maps and contour maps of landslide number density, landslide area percentage, and landslide erosion thickness were constructed in order to analyze the spatial distribution patterns of co-seismic landslides. Statistics of size distribution and morphometric parameters of co-seismic landslides were carried out and were compared with other earthquake events in the world. Four proxies of co-seismic landslide abundance, including landslides centroid number density (LCND), landslide top number density (LTND), landslide area percentage (LAP), and landslide erosion thickness (LET) were used to correlate co-seismic landslides with various landslide controlling parameters. These controlling parameters include elevation, slope angle, slope aspect, slope curvature, topographic position, distance from drainages, lithology, distance from the epicenter, distance from the Enriquillo-Plantain Garden fault, distance along the fault, and peak ground acceleration (PGA). A comparison of these impact parameters on co-seismic landslides shows that slope angle is the strongest impact parameter on co-seismic landslide occurrence.
Our co-seismic landslide inventory is much more detailed than other inventories in several previous publications. Therefore, we carried out comparisons of inventories of landslides triggered by the Haiti earthquake with other published results and proposed possible reasons for any differences. We suggest that the empirical functions between earthquake magnitude and co-seismic landslides need to be updated on the basis of the abundant and more complete co-seismic landslide inventories recently available.
Statistical Interpretation of LMC Microlensing Candidates
Sohrab Rahvar
2003-11-01
After a decade of gravitational microlensing experiments, a dozen microlensing candidates in the direction of the stars of the Large Magellanic Cloud (LMC) have been detected by the EROS and MACHO groups. Recently it was shown that the distribution of the duration of the observed LMC microlensing events is significantly narrower than what is expected from the standard halo model. In this article we make the same comparison using non-standard halo models and considering the contribution of non-halo components of the Milky Way, such as the disc, spheroid and LMC itself, to the microlensing events. Comparing the theoretical and experimental widths of the distributions of event durations shows that neither standard nor non-standard halo models are compatible with the microlensing data, at least at the 95 per cent confidence level. These results may be explained if (i) the MACHOs in the Galactic halo reside in clumpy structures or (ii) the durations of events have been underestimated due to the blending effect.
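The comparison of distribution widths can be sketched by contrasting the standard deviation of observed event durations with that of durations drawn from a halo model; all numbers below are synthetic, and the real analysis compares full distributions, not just standard deviations.

```python
# Sketch: is the observed duration distribution narrower than the
# model-predicted one? Durations (days) are synthetic.
from statistics import stdev

observed_days = [34, 40, 44, 50, 57, 60, 70, 90]
model_days = [15, 25, 40, 60, 90, 130, 200, 300]  # broader spread

print(stdev(observed_days) < stdev(model_days))  # True: observed is narrower
```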
(Draft Copy) On the Statistical Interpretation of
California at Los Angeles, University of
... the economist's understanding of household and producer behavior. In a reply to Goldberger, Nanny Wermuth [23] ... u1 = demand shock, u2 = supply shock. The first equation states that household demand depends on price and household income (which are observable) and an unobserved factor. The systematic part of this equation, a1 ...
INTERPRETING INDICATORS OF RANGELAND HEALTH, VERSION 4
Technology Transfer Automated Retrieval System (TEKTRAN)
Land managers are in need of an assessment tool that provides a preliminary evaluation of rangeland health. Interpreting Indicators of Rangeland Health, Version 4 is the second published version of a protocol that uses 17 easily observed indicators summarized as three rangeland health attributes (s...
ERIC Educational Resources Information Center
Markus, Keith A.
2008-01-01
One can distinguish statistical models used in causal modeling from the causal interpretations that align them with substantive hypotheses. Causal modeling typically assumes an efficient causal interpretation of the statistical model. Causal modeling can also make use of mereological causal interpretations in which the state of the parts…
Nicolotti, Orazio; Gillet, Valerie J; Fleming, Peter J; Green, Darren V S
2002-11-01
Deriving quantitative structure-activity relationship (QSAR) models that are accurate, reliable, and easily interpretable is a difficult task. In this study, two new methods have been developed that aim to find useful QSAR models that represent an appropriate balance between model accuracy and complexity. Both methods are based on genetic programming (GP). The first method, referred to as genetic QSAR (or GPQSAR), uses a penalty function to control model complexity. GPQSAR is designed to derive a single linear model that represents an appropriate balance between the variance and the number of descriptors selected for the model. The second method, referred to as multiobjective genetic QSAR (MoQSAR), is based on multiobjective GP and represents a new way of thinking of QSAR. Specifically, QSAR is considered as a multiobjective optimization problem that comprises a number of competitive objectives. Typical objectives include model fitting, the total number of terms, and the occurrence of nonlinear terms. MoQSAR results in a family of equivalent QSAR models where each QSAR represents a different tradeoff in the objectives. A practical consideration often overlooked in QSAR studies is the need for the model to promote an understanding of the biochemical response under investigation. To accomplish this, chemically intuitive descriptors are needed but do not always give rise to statistically robust models. This problem is addressed by the addition of a further objective, called chemical desirability, that aims to reward models that consist of descriptors that are easily interpretable by chemists. GPQSAR and MoQSAR have been tested on various data sets including the Selwood data set and two different solubility data sets. 
The study demonstrates that the MoQSAR method is able to find models that are at least as good as models derived using standard statistical approaches and also yields models that allow a medicinal chemist to trade statistical robustness for chemical interpretability. PMID:12408718
Summary and interpretive synthesis
1995-05-01
This chapter summarizes the major advances made through our integrated geological studies of the Lisburne Group in northern Alaska. The depositional history of the Lisburne Group is discussed in a framework of depositional sequence stratigraphy. Although individual parasequences (small-scale carbonate cycles) of the Wahoo Limestone cannot be correlated with certainty, parasequence sets can be interpreted as different systems tracts within the large-scale depositional sequences, providing insights on the paleoenvironments, paleogeography and platform geometry. Conodont biostratigraphy precisely established the position of the Mississippian-Pennsylvanian boundary within an important reference section, where established foraminiferal biostratigraphy is inconsistent with respect to conodont-based time-rock boundaries. However, existing Carboniferous conodont zonations are not readily applicable because most zonal indicators are absent, so a local zonation scheme was developed. Diagenetic studies of the Lisburne Group recognized nineteen subaerial exposure surfaces and developed a cement stratigraphy that includes: early cements associated with subaerial exposure surfaces in the Lisburne Group; cements associated with the sub-Permian unconformity; and later burial cements. Subaerial exposure surfaces in the Alapah Limestone are easily explained, being associated with peritidal environments at the boundaries of Sequence A. The Lisburne exposed in ANWR is generally tightly cemented and supermature, but could still be a good reservoir target in the adjacent subsurface of ANWR given the appropriate diagenetic, deformational and thermal history. Our ongoing research on the Lisburne Group will hopefully provide additional insights in future publications.
Tanner, David B.
Electrochromic Polymers for Easily Processed Devices
John R. Reynolds, Avni A. Argun, Irina …
…for electrochromic applications. These polymers exhibit ease of processability and useful mechanical properties (e.g. flexibility). However, the major strength of these organic-based materials is that their electrochromic…
ERIC Educational Resources Information Center
Adams, Wendy K.; Alhadlaq, Hisham; Malley, Christopher V.; Perkins, Katherine K.; Olson, Jonathan; Alshaya, Fahad; Alabdulkareem, Saleh; Wieman, Carl E.
2012-01-01
The PhET Interactive Simulations Project partnered with the Excellence Research Center of Science and Mathematics Education at King Saud University with the joint goal of making simulations useable worldwide. One of the main challenges of this partnership is to make PhET simulations and the website easily translatable into any language. The PhET…
Inexpensive and Easily Customized Tactile Array (under review, © IEEE 2012)
The array can be easily encapsulated with soft polymers to provide robust and compliant grasping surfaces. The goal is to make sensitive, robust, and inexpensive tactile sensing available for a wide range of robotics applications; knowledge of the contacts on the robot hand and of the contact pressure distribution is believed to be essential for effective…
Design to read: designing for people who do not read easily
Caroline Jarrett; Helen Petrie; Kathryn Summers
2010-01-01
Many people do not read easily. They may have an impairment such as a visual problem. They may be reading in stressful conditions or poor light, or perhaps they are reading in a second language. Is it possible to provide one consistent set of guidelines or approaches that will allow designers of electronic materials to meet all the apparently diverse…
Common pitfalls in statistical analysis: Clinical versus statistical significance.
Ranganathan, Priya; Pramesh, C S; Buyse, Marc
2015-01-01
In clinical research, study results that are statistically significant are often interpreted as being clinically important. While statistical significance indicates the reliability of the study results, clinical significance reflects their impact on clinical practice. The third article in this series exploring pitfalls in statistical analysis clarifies the importance of differentiating between statistical significance and clinical significance. PMID:26229754
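The distinction the abstract draws can be illustrated with a short sketch (all numbers hypothetical): the same clinically trivial 0.5 mmHg difference in mean blood pressure is non-significant in a small trial but highly significant in a very large one.

```python
import math

def two_sample_z_pvalue(mean1, mean2, sd, n):
    """Two-sided p-value of a two-sample z-test, equal group sizes, known SD."""
    se = sd * math.sqrt(2.0 / n)          # standard error of the difference
    z = (mean1 - mean2) / se
    return math.erfc(abs(z) / math.sqrt(2.0))

# Hypothetical trial: a 0.5 mmHg drop in mean blood pressure (far below any
# minimal clinically important difference), with SD = 10 mmHg per group.
p_small = two_sample_z_pvalue(140.0, 139.5, 10.0, n=100)     # p > 0.05
p_large = two_sample_z_pvalue(140.0, 139.5, 10.0, n=10000)   # p < 0.001

print(f"n=100:   p = {p_small:.3f}")
print(f"n=10000: p = {p_large:.6f}")
```

With enough patients, any nonzero difference becomes statistically significant; clinical significance has to be judged against a pre-specified minimal clinically important difference.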
A Graphical Interpretation of Probit Coefficients.
ERIC Educational Resources Information Center
Becker, William E.; Waldman, Donald M.
1989-01-01
Contends that, when discrete choice models are taught, particularly the probit model, it is the method rather than the interpretation of the results that is emphasized. This article provides a graphical technique for interpretation of an estimated probit coefficient that will be useful in statistics and econometrics courses. (GG)
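The graphical idea, reading a probit coefficient through the slope of the normal CDF at a given point, can be sketched with hypothetical coefficients (b0 and b1 below are invented for illustration, not from the article):

```python
import math

def std_normal_pdf(x):
    return math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)

def std_normal_cdf(x):
    return 0.5 * math.erfc(-x / math.sqrt(2.0))

# Hypothetical probit fit: P(y = 1 | x) = Phi(b0 + b1 * x)
b0, b1 = -1.0, 0.8

def prob(x):
    return std_normal_cdf(b0 + b1 * x)

def marginal_effect(x):
    # dP/dx = phi(b0 + b1*x) * b1: the slope of the CDF curve at x,
    # which is what the graphical interpretation visualizes.
    return std_normal_pdf(b0 + b1 * x) * b1

for x in (0.0, 1.25, 2.5):
    print(f"x={x:4.2f}  P={prob(x):.3f}  dP/dx={marginal_effect(x):.3f}")
```

Unlike a linear-probability coefficient, b1 is not itself a change in probability; the marginal effect varies with x and peaks where the index b0 + b1*x crosses zero.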
Interpreters, Interpreting, and the Study of Bilingualism.
ERIC Educational Resources Information Center
Valdes, Guadalupe; Angelelli, Claudia
2003-01-01
Discusses research on interpreting focused specifically on issues raised by this literature about the nature of bilingualism. Suggests research carried out on interpreting--while primarily produced with a professional audience in mind and concerned with improving the practice of interpreting--provides valuable insights about complex aspects of…
Targeting Lexicon in Interpreting.
ERIC Educational Resources Information Center
Farghal, Mohammed; Shakir, Abdullah
1994-01-01
Studies student interpreters in the Master's Translation Program at Yarmouk University in Jordan. Analyzes the difficulties of these students, particularly regarding lexical competence, when interpreting from Arabic to English, emphasizing the need to teach lexicon all through interpreting programs. (HB)
Strangeness and statistical QCD
Rafelski, Johann; Rafelski, Johann; Letessier, Jean
2002-01-01
We discuss properties of statistical QCD relevant in Fermi phase-space model analysis of experimental data on strange hadron production. We argue that the analysis results, interpreted using established statistical QCD properties, demonstrate the formation of the color-deconfined state of matter in relativistic heavy-ion collisions at the highest CERN-SPS energies and at BNL-RHIC, comprising deconfined matter composed of nearly massless quarks and gluons in statistical equilibrium.
Revisiting the statistical analysis of pyroclast density and porosity data
NASA Astrophysics Data System (ADS)
Bernard, B.; Kueppers, U.; Ortiz, H.
2015-07-01
Explosive volcanic eruptions are commonly characterized based on a thorough analysis of the generated deposits. Amongst other characteristics in physical volcanology, density and porosity of juvenile clasts are some of the most frequently used to constrain eruptive dynamics. In this study, we evaluate the sensitivity of density and porosity data to statistical methods and introduce a weighting parameter to correct issues raised by the use of frequency analysis. Results of textural investigation can be biased by clast selection. Using statistical tools as presented here, the meaningfulness of a conclusion can be checked for any data set easily. This is necessary to define whether or not a sample has met the requirements for statistical relevance, i.e. whether a data set is large enough to allow for reproducible results. Graphical statistics are used to describe density and porosity distributions, similar to those used for grain-size analysis. This approach helps with the interpretation of volcanic deposits. To illustrate this methodology, we chose two large data sets: (1) directed blast deposits of the 3640-3510 BC eruption of Chachimbiro volcano (Ecuador) and (2) block-and-ash-flow deposits of the 1990-1995 eruption of Unzen volcano (Japan). We propose the incorporation of this analysis into future investigations to check the objectivity of results achieved by different working groups and guarantee the meaningfulness of the interpretation.
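The weighting idea can be sketched as follows. The exact weighting parameter used by the authors is not given in the abstract, so this assumes clasts are weighted by their measured volume rather than simply counted, and uses percentile-based graphical statistics analogous to grain-size sorting (all data invented):

```python
def weighted_percentile(values, weights, q):
    """Percentile (q in [0, 100]) of `values` under `weights`,
    interpolating between the cumulative-weight midpoints of each value."""
    pairs = sorted(zip(values, weights))
    total = sum(w for _, w in pairs)
    target = q / 100.0 * total
    cum, pts = 0.0, []
    for v, w in pairs:
        pts.append((cum + w / 2.0, v))   # centre of this value's weight mass
        cum += w
    if target <= pts[0][0]:
        return pts[0][1]
    if target >= pts[-1][0]:
        return pts[-1][1]
    for (c0, v0), (c1, v1) in zip(pts, pts[1:]):
        if c0 <= target <= c1:
            return v0 + (target - c0) / (c1 - c0) * (v1 - v0)

# Invented clast densities (kg/m^3), weighted by clast volume (cm^3)
density = [900, 1100, 1300, 1500, 1700, 1900, 2100]
volume  = [5, 8, 20, 40, 20, 8, 5]

median = weighted_percentile(density, volume, 50)
# Spread analogous to graphical sorting in grain-size analysis: (P84 - P16) / 2
spread = (weighted_percentile(density, volume, 84)
          - weighted_percentile(density, volume, 16)) / 2.0
print(median, round(spread, 1))
```

Weighting by volume prevents a handful of small, easily sampled clasts from dominating a frequency-based histogram, which is the bias the abstract attributes to clast selection.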
LED champing: statistically blessed?
Wang, Zhuo
2015-06-10
LED champing (smart mixing of individual LEDs to match the desired color and lumens) and color mixing strategies have been widely used to maintain the color consistency of light engines. Light engines with champed LEDs can easily achieve the color consistency of a couple MacAdam steps with widely distributed LEDs to begin with. From a statistical point of view, the distributions for the color coordinates and the flux after champing are studied. The related statistical parameters are derived, which facilitate process improvements such as Six Sigma and are instrumental to statistical quality control for mass productions. PMID:26192863
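A minimal sketch of the statistics behind champing (illustrative numbers, not from the paper): combining k independent LEDs from a bin multiplies the mean flux by k but the standard deviation only by sqrt(k), so the relative spread of the assembled light engine shrinks by a factor of sqrt(k).

```python
import math

# Hypothetical flux bin: individual LED flux ~ Normal(mu, sigma), in lumens
mu, sigma = 100.0, 8.0

def module_flux_stats(k):
    """Mean and standard deviation of total flux for k independent LEDs."""
    return k * mu, math.sqrt(k) * sigma

for k in (1, 4, 16):
    m, s = module_flux_stats(k)
    print(f"k={k:2d}  mean={m:6.0f} lm  sd={s:5.1f} lm  relative spread={s/m:.3f}")
```

This narrowing of the output distribution is what makes champed engines easier to hold inside a tight MacAdam-step color/flux specification, and it is exactly the kind of derived distribution parameter that feeds Six Sigma style process control.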
NASA Astrophysics Data System (ADS)
Grimes, Holly; McMenemy, Karen R.; Ferguson, R. S.
2008-02-01
This paper details how simple PC software, a small network of consumer level PCs, some do-it-yourself hardware and four low cost video projectors can be combined to form an easily configurable and transportable projection display with applications in virtual reality training. This paper provides some observations on the practical difficulties of using such a system, its effectiveness in delivering a VE for training and what benefit may be offered through the deployment of a large number of these low cost environments.
An easily removable stereo-dictating group for enantioselective synthesis of propargylic amines.
Fan, Wu; Ma, Shengming
2013-10-01
We report herein a CuBr-catalyzed three-component coupling of 2-methylbut-3-yn-2-ol, aldehydes and pyrrolidine or 1,2,3,4-tetrahydroisoquinoline leading to the corresponding chiral propargylamines in excellent enantiomeric excess (91 to >99% ee) and high yields (79-95% yield). The dimethylcarbinol unit in 2-methylbut-3-yn-2-ol, which may be easily removed at the later stage to regenerate a terminal alkyne unit for further elaboration, plays a very important role in ensuring high enantioselectivity. This protocol provides easy and very general access to different terminal and non-terminal tertiary propargylic amines. PMID:24051867
NASA Astrophysics Data System (ADS)
Adams, Wendy K.; Alhadlaq, Hisham; Malley, Christopher V.; Perkins, Katherine K.; Olson, Jonathan; Alshaya, Fahad; Alabdulkareem, Saleh; Wieman, Carl E.
2012-02-01
The PhET Interactive Simulations Project partnered with the Excellence Research Center of Science and Mathematics Education at King Saud University with the joint goal of making simulations useable worldwide. One of the main challenges of this partnership is to make PhET simulations and the website easily translatable into any language. The PhET project team overcame this challenge by creating the Translation Utility. This tool allows a person fluent in both English and another language to easily translate any of the PhET simulations and requires minimal computer expertise. In this paper we discuss the technical issues involved in this software solution, as well as the issues involved in obtaining accurate translations. We share our solutions to many of the unexpected problems we encountered that would apply generally to making on-line scientific course materials available in many different languages, including working with: languages written right-to-left, different character sets, and different conventions for expressing equations, variables, units and scientific notation.
The Copenhagen Interpretation Born Again
Timothy J. Hollowood
2015-01-05
An approach to quantum mechanics is developed which makes explicit the Heisenberg cut between the deterministic microscopic quantum world and the partly deterministic, partly stochastic macroscopic world. The microscopic system evolves according to the Schrödinger equation, with stochastic behaviour arising when the system is probed by a set of coarse-grained macroscopic observables whose resolution scale defines the Heisenberg cut. The resulting stochastic process can account for the different facets of the classical limit: Newton's laws (ergodicity broken); the statistical mechanics of thermal ensembles (ergodic); and a resolution of the measurement problem (partial ergodicity breaking). In particular, the usual rules of the Copenhagen interpretation, like the Born rule, emerge, along with completely local descriptions of EPR-type experiments. The formalism also re-introduces a dynamical picture of equilibration and thermalization in quantum statistical mechanics and provides insight into how classical statistical mechanics can arise in the classical limit in a way that alleviates various conceptual problems.
Statistical Applets: Statistical Significance
NSDL National Science Digital Library
Duckworth, William
Created by authors Duckworth, McCabe, Moore and Sclove for W.H. Freeman and Co., this applet is designed to help students visualize the rejection region of a statistical test by allowing them to set null and alternative hypotheses, population parameters, sample statistics, and significance level. It accompanies "The Practice of Business Statistics," but can be used without this text. Even though brief, this is a nice interactive resource for an introductory statistics course.
Enhancing Table Interpretation Skills via Training in Table Creation
ERIC Educational Resources Information Center
Karazsia, Bryan T.
2013-01-01
Quantitative and statistical literacy are core domains in the undergraduate psychology curriculum. An important component of such literacy includes interpretation of visual aids, such as tables containing results from statistical analyses. This article presents a new technique for enhancing student interpretation of American Psychological…
Statistical laws in linguistics
Altmann, Eduardo G
2015-01-01
Zipf's law is just one out of many universal laws proposed to describe statistical regularities in language. Here we review and critically discuss how these laws can be statistically interpreted, fitted, and tested (falsified). The modern availability of large databases of written text allows for tests with an unprecedented statistical accuracy and also a characterization of the fluctuations around the typical behavior. We find that fluctuations are usually much larger than expected based on simplifying statistical assumptions (e.g., independence and lack of correlations between observations). These simplifications appear also in usual statistical tests, so that the large fluctuations can be erroneously interpreted as a falsification of the law. Instead, here we argue that linguistic laws are only meaningful (falsifiable) if accompanied by a model for which the fluctuations can be computed (e.g., a generative model of the text). The large fluctuations we report show that the constraints imposed by linguistic laws…
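As a toy illustration of fitting a linguistic law (a plain least-squares fit on noise-free synthetic counts; the abstract's point is that a real test additionally needs a generative model to compute the expected fluctuations):

```python
import math

def zipf_exponent(freqs):
    """Least-squares slope of log(frequency) vs log(rank), with freqs sorted
    in decreasing order. A crude estimator, used here only to show the
    mechanics of fitting; it says nothing about fluctuations."""
    xs = [math.log(r) for r in range(1, len(freqs) + 1)]
    ys = [math.log(f) for f in freqs]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    return -sxy / sxx  # Zipf's law predicts a value near 1

# Noise-free Zipfian counts with exponent 1: f(r) = C / r
freqs = [1000.0 / r for r in range(1, 201)]
print(round(zipf_exponent(freqs), 3))  # -> 1.0 for noise-free data
```

On real corpora the scatter around the fitted line is far larger than independent-sampling models predict, which is why a straight-line fit alone cannot confirm or falsify the law.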
GoCxx: a tool to easily leverage C++ legacy code for multicore-friendly Go libraries and frameworks
NASA Astrophysics Data System (ADS)
Binet, Sébastien
2012-12-01
Current HENP libraries and frameworks were written before multicore systems became widely deployed and used. From this environment, a ‘single-thread’ processing model naturally emerged, but the implicit assumptions it encouraged are greatly impairing our ability to scale in a multicore/manycore world. Writing scalable code in C++ for multicore architectures, while doable, is no panacea. Sure, C++11 will improve on the current situation (by standardizing on std::thread, introducing lambda functions and defining a memory model) but it will do so at the price of further complicating an already quite sophisticated language. This level of sophistication has probably already strongly motivated analysis groups to migrate to CPython, hoping either for its current limitations with respect to multicore scalability to be lifted (Global Interpreter Lock removal) or for the advent of a new Python VM better tailored for this kind of environment (PyPy, Jython, …). Could HENP migrate to a language with none of the deficiencies of C++ (build time, deployment, low-level tools for concurrency) and with the fast turn-around time, simplicity and ease of coding of Python? This paper will try to make the case that Go, a young open source language with built-in facilities to easily express and expose concurrency, is such a language. We introduce GoCxx, a tool leveraging gcc-xml's output to automate the tedious work of creating Go wrappers for foreign languages, a critical task for any language wishing to leverage legacy and field-tested code. We will conclude with the first results of applying GoCxx to real C++ code.
Frontal sinus mucoceles presenting in the upper eyelid: an easily missed diagnosis
Ch'ng, Soon Wai; Pillai, Manju Bhaskaran; Morton, Claire
2012-01-01
Frontal sinus mucoceles are epithelium-lined mucus-containing sacs that are capable of bony expansion causing a spectrum of ophthalmological symptoms. If left untreated, they can erode the thin sinus wall causing life-threatening complications such as meningitis. We would like to alert the clinicians to this diagnosis that can be easily misdiagnosed. The first patient appeared to have an allergic blepharo-conjunctivitis that was not responding to topical and systemic medications. The second patient presented with recurrent preseptal cellulitis unresponsive to oral antibiotics. CT imaging of both patients revealed frontal sinus mucoceles. Both patients recovered well with sinus surgery. Management of these patients needed a close liaison with our ENT and radiology colleagues to warrant a good outcome. PMID:22675146
A new side stream process for easily degradable industrial waste waters to avoid sludge bulking.
Wandl, G; Matsché, N; Bayer, H
2004-01-01
A new treatment scheme for easily biodegradable industrial waste waters has been developed. The side-stream treatment of dairy waste water with the excess sludge from the domestic treatment line of the regional treatment plant Bad Vöslau has been operated successfully for a period of three years, during which the industrial load stemming from the dairy increased from 800 kg COD/d to 2,500 kg COD/d, with peak loads up to 5,000 kg/d. Despite the increased load to the treatment plant, the total aeration tank volume has not been increased. This treatment is performed in an existing aeration tank of the WWTP (V = 1,800 m3) which is now used as a contact tank for the combined aeration of dairy waste water and excess sludge from the domestic treatment line (aeration tank volume = 15,000 m3). In this tank the easily degradable substrate from the industrial waste is mainly adsorbed to the biological sludge and, after mechanical dewatering, transferred to the anaerobic digester, where it yields increased gas production. The filtrate of the dewatering process is completely free of biodegradable material and can be fed to the aeration tank of the domestic treatment line without danger of bulking. The new process has proven to be extremely flexible: daily peak loads exceeding the design load by more than 60% have already been treated in the plant without any problems. Compared to other alternatives for dairy waste water treatment investigated during this study, the new side-stream process is very advantageous. No other pre-treatment process for industrial waste water could have been operated under comparable loading conditions without severe operating problems. PMID:15553480
The study on development of easily chewable and swallowable foods for elderly
Kim, Soojeong
2015-01-01
BACKGROUND/OBJECTIVES: When the functions involved in the ingestion of food fail, not only is the enjoyment of eating lost, but protein-energy malnutrition may follow. Dysmasesis (difficulty chewing) and difficulty swallowing occur in various diseases, but aging is a major cause, and the number of elderly people with these difficulties is expected to increase rapidly in the aging society. SUBJECTS/METHODS: In this study, we carried out a survey targeting nutritionists who work in elderly care facilities, and examined the characteristics of foods offered to the elderly and the degree of demand for easily chewable and swallowable foods for elderly people who can crush foods with their own tongues but sometimes have difficulty in drinking water and tea. RESULTS: Elderly care facilities were found to provide finely chopped food, or food ground with water in a blender, for the elderly with dysmasesis. Overall satisfaction with the provided foods was low. When the applicability of foods for the elderly and the willingness to reflect them in menus were investigated, the gelification method from molecular gastronomy received the highest response rate. Among the foods frequently offered to the elderly (representative menus of beef, pork, white fish, anchovies and spinach), Korean barbecue beef, hot-pepper-paste stir-fried pork, pan-fried white fish, stir-fried anchovy and seasoned spinach had the highest offer frequency. CONCLUSIONS: This study provides the fundamentals for the development of easily chewable and swallowable foods, via gelification, for the elderly. It also suggests that food that has undergone gelification will reduce the risk of swallowing down the wrong pipe and improve overall food preference in the elderly. PMID:26244082
NASA Astrophysics Data System (ADS)
Julián-López, Beatriz; Gonell, Francisco; Lima, Patricia P.; Freitas, Vânia T.; André, Paulo S.; Carlos, Luis D.; Ferreira, Rute A. S.
2015-10-01
This manuscript reports the synthesis and characterization of the first organic–inorganic hybrid material exhibiting efficient multimodal spectral converting properties. The nanocomposite, made of Er3+, Yb3+ codoped zirconia nanoparticles (NPs) entrapped in a di-ureasil d–U(600) hybrid matrix, is prepared by an easy two-step sol-gel synthesis leading to homogeneous and transparent materials that can be very easily processed as monolith or film. Extensive structural characterization reveals that zirconia nanocrystals of 10–20 nm in size are efficiently dispersed into the hybrid matrix and that the local structure of the di-ureasil is not affected by the presence of the NPs. A significant enhancement in the refractive index of the di-ureasil matrix with the incorporation of the ZrO2 nanocrystals is observed. The optical study demonstrates that luminescent properties of both constituents are perfectly preserved in the final hybrid. Thus, the material displays a white-light photoluminescence from the di-ureasil component upon excitation at UV/visible radiation and also intense green and red emissions from the Er3+- and Yb3+-doped NPs after NIR excitation. The dynamics of the optical processes were also studied as a function of the lanthanide content and the thickness of the films. Our results indicate that these luminescent hybrids represent a low-cost, environmentally friendly, size-controlled, easily processed and chemically stable alternative material to be used in light harvesting devices such as luminescent solar concentrators, optical fibres and sensors. Furthermore, this synthetic approach can be extended to a wide variety of luminescent NPs entrapped in hybrid matrices, thus leading to multifunctional and versatile materials for efficient tuneable nonlinear optical nanodevices.
BUSINESS STATISTICS (GB 311-115) SYLLABUS, FALL 2014
Diestel, Geoff
…and apply knowledge while working within a mastery-based pedagogical approach (Hawkes Learning Systems)… of statistics, by creating and interpreting basic statistical graphs and charts, calculating and interpreting…
Department of Statistics STATISTICS COLLOQUIUM
JEROME FRIEDMAN, Department of Statistics, Stanford. Ensemble methods have emerged as being among the most powerful statistical learning techniques. It is shown…
Department of Statistics STATISTICS COLLOQUIUM
GOURAB MUKHERJEE, Department of Statistics, Stanford. …directions in statistical probability forecasting. Building on these parallels we present a frequentist…
Department of Statistics STATISTICS COLLOQUIUM
ADRIAN RAFTERY, Department of Statistics, University… Will describe a Bayesian statistical method for probabilistic population projections for all countries…
The emergent Copenhagen interpretation of quantum mechanics
NASA Astrophysics Data System (ADS)
Hollowood, Timothy J.
2014-05-01
We introduce a new and conceptually simple interpretation of quantum mechanics based on reduced density matrices of sub-systems from which the standard Copenhagen interpretation emerges as an effective description of macroscopically large systems. This interpretation describes a world in which definite measurement results are obtained with probabilities that reproduce the Born rule. Wave function collapse is seen to be a useful but fundamentally unnecessary piece of prudent bookkeeping which is only valid for macro-systems. The new interpretation lies in a class of modal interpretations in that it applies to quantum systems that interact with a much larger environment. However, we show that it does not suffer from the problems that have plagued similar modal interpretations, such as macroscopic superpositions and rapid flipping between macroscopically distinct states. We describe how the interpretation fits neatly together with fully quantum formulations of statistical mechanics and how a measurement process can be viewed as a process of ergodicity breaking analogous to a phase transition. The key feature of the new interpretation is that joint probabilities for the ergodic subsets of states of disjoint macro-systems only arise as emergent quantities. Finally we give an account of the EPR-Bohm thought experiment and show that the interpretation implies the violation of the Bell inequality characteristic of quantum mechanics, but in a rather novel way. The final conclusion is that the Copenhagen interpretation gives a completely satisfactory phenomenology of macro-systems interacting with micro-systems.
Comprehensive Interpretive Planning.
ERIC Educational Resources Information Center
Kohen, Richard; Sikoryak, Kim
1999-01-01
Discusses interpretive planning and provides information on how to maximize a sense of ownership shared by managers, staff, and other organizational shareholders. Presents practical and effective plans for providing interpretive services. (CCM)
Data Acquisition Interpretation
Oldenburg, Douglas W.
Virgin River DCIP Report, Justin Granek. Outline: 1. Data Acquisition: Location, Survey Specifications. 2. Inversion: Data Errors, DCIP2D, DCIP3D. 3. Interpretation: Correlations, Snowbird Tectonic…
Translation and Interpretation.
ERIC Educational Resources Information Center
Nicholson, Nancy Schweda
1995-01-01
Examines recent trends in the fields of translation and interpretation, focusing on translation and interpretation theory and practice, language-specific challenges, computer-assisted translation, machine translation, subtitling, and translator and interpreter training. An annotated bibliography discusses seven important works in the field. (112…
Revisiting the statistical analysis of pyroclast density and porosity data
NASA Astrophysics Data System (ADS)
Bernard, B.; Kueppers, U.; Ortiz, H.
2015-03-01
Explosive volcanic eruptions are commonly characterized based on a thorough analysis of the generated deposits. Amongst other characteristics in physical volcanology, density and porosity of juvenile clasts are some of the most frequently used characteristics to constrain eruptive dynamics. In this study, we evaluate the sensitivity of density and porosity data to statistical methods and introduce a weighting parameter to correct issues raised by the use of frequency analysis. Results of textural investigation can be biased by clast selection. Using statistical tools as presented here, the meaningfulness of a conclusion can be checked for any dataset easily. This is necessary to define whether or not a sample has met the requirements for statistical relevance, i.e. whether a dataset is large enough to allow for reproducible results. Graphical statistics are used to describe density and porosity distributions, similar to those used for grain-size analysis. This approach helps with the interpretation of volcanic deposits. To illustrate this methodology, we chose two large datasets: (1) directed blast deposits of the 3640-3510 BC eruption of Chachimbiro volcano (Ecuador) and (2) block-and-ash-flow deposits of the 1990-1995 eruption of Unzen volcano (Japan). We propose adding this analysis to future investigations to check the objectivity of results achieved by different working groups and guarantee the meaningfulness of the interpretation.
Automatic interpretation of biological tests.
Boufriche-Boufaïda, Z
1998-03-01
In this article, an approach to the Automatic Interpretation of Biological Tests (AIBT) is described. The developed system is much needed in Preventive Medicine Centers (PMCs). It is designed as a self-sufficient system that can be easily used by trained nurses during the routine visit. The results that the system provides are not only useful for giving the PMC physicians a preliminary diagnosis, but also allow them more time to focus on the serious cases, making the clinical visit more qualitative. On the other hand, because the use of such a system has been planned for many years, its possibilities for future extension must be seriously considered. The methodology adopted can be interpreted as a combination of the advantages of two main approaches adopted in current diagnostic systems: the production-system approach and the object-oriented approach. From the production rules, the ability to capture the deductive processes of the expert, in domains where causal mechanisms are often well understood, is retained. The object-oriented approach guides the elicitation and engineering of knowledge in such a way that abstractions, categorizations and classifications are encouraged, whilst individual instances of objects of any type are recognized as separate, independent entities. PMID:9684093
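The production-rule-plus-objects combination described above can be sketched roughly as follows (test names, reference ranges, rules and messages are all hypothetical, not taken from the AIBT system):

```python
from dataclasses import dataclass

@dataclass
class TestResult:
    """One biological test value with its reference interval (hypothetical)."""
    name: str
    value: float
    low: float
    high: float

    @property
    def status(self):
        if self.value < self.low:
            return "low"
        if self.value > self.high:
            return "high"
        return "normal"

# If-then rules mapping patterns of abnormal statuses to preliminary findings
# (rule contents invented for illustration).
RULES = [
    ({"glucose": "high"}, "possible hyperglycemia: refer to physician"),
    ({"hemoglobin": "low"}, "possible anemia: refer to physician"),
]

def interpret(results):
    statuses = {r.name: r.status for r in results}
    findings = [msg for pattern, msg in RULES
                if all(statuses.get(k) == v for k, v in pattern.items())]
    return findings or ["all tested values within reference ranges"]

panel = [TestResult("glucose", 7.9, 3.9, 6.1),        # mmol/L
         TestResult("hemoglobin", 14.0, 12.0, 17.5)]  # g/dL
print(interpret(panel))
```

The objects carry the domain knowledge (what a test is, what "abnormal" means), while the rules carry the expert's deductive steps, mirroring the hybrid design the abstract describes.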
Bar-Eyal, Leeat; Eisenberg, Ido; Faust, Adam; Raanan, Hagai; Nevo, Reinat; Rappaport, Fabrice; Krieger-Liszkay, Anja; Sétif, Pierre; Thurotte, Adrien; Reich, Ziv; Kaplan, Aaron; Ohad, Itzhak; Paltiel, Yossi; Keren, Nir
2015-10-01
Biological desert sand crusts are the foundation of desert ecosystems, stabilizing the sands and allowing colonization by higher order organisms. The first colonizers of the desert sands are cyanobacteria. Facing the harsh conditions of the desert, these organisms must withstand frequent desiccation-hydration cycles, combined with high light intensities. Here, we characterize structural and functional modifications to the photosynthetic apparatus that enable a cyanobacterium, Leptolyngbya sp., to thrive under these conditions. Using multiple in vivo spectroscopic and imaging techniques, we identified two complementary mechanisms for dissipating absorbed energy in the desiccated state. The first mechanism involves the reorganization of the phycobilisome antenna system, increasing excitonic coupling between antenna components. This favors energy dissipation within the antenna over directed exciton transfer to the reaction center. The second mechanism is driven by constriction of the thylakoid lumen, which limits diffusion of plastocyanin to P700. The accumulation of P700(+) not only prevents light-induced charge separation but also efficiently quenches excitation energy. These protection mechanisms employ existing components of the photosynthetic apparatus, forming two distinct functional modes. Small changes in the structure of the thylakoid membranes are sufficient for quenching of all absorbed energy in the desiccated state, protecting the photosynthetic apparatus from photoinhibitory damage. These changes can be easily reversed upon rehydration, returning the system to its high photosynthetic quantum efficiency. PMID:26188375
Easily regenerable solid adsorbents based on polyamines for carbon dioxide capture from the air.
Goeppert, Alain; Zhang, Hang; Czaun, Miklos; May, Robert B; Prakash, G K Surya; Olah, George A; Narayanan, S R
2014-05-01
Adsorbents prepared easily by impregnation of fumed silica with polyethylenimine (PEI) are promising candidates for the capture of CO2 directly from the air. These inexpensive adsorbents have high CO2 adsorption capacity at ambient temperature and can be regenerated in repeated cycles under mild conditions. Despite the very low CO2 concentration, they are able to efficiently scrub all CO2 out of the air in the initial hours of the experiments. The influence of parameters such as PEI loading, adsorption and desorption temperature, particle size, and PEI molecular weight on the adsorption behavior was investigated. The mild regeneration temperatures required could allow the use of waste heat available in many industrial processes as well as solar heat. CO2 adsorption from the air has a number of applications. Removal of CO2 from a closed environment, such as submarines or space vehicles, is essential for life support. The supply of CO2-free air is also critical for alkaline fuel cells and batteries. Direct air capture of CO2 could also help mitigate the rising concerns about atmospheric CO2 concentration and associated climatic changes, while, at the same time, providing the first step for an anthropogenic carbon cycle. PMID:24644023
Open Window: When Easily Identifiable Genomes and Traits Are in the Public Domain
Angrist, Misha
2014-01-01
“One can't be of an enquiring and experimental nature, and still be very sensible.” - Charles Fort [1] As the costs of personal genetic testing and “self-quantification” fall, publicly accessible databases housing people's genotypic and phenotypic information are gradually increasing in number and scope. The latest entrant is openSNP, which allows participants to upload their personal genetic/genomic and self-reported phenotypic data. I believe the emergence of such open repositories of human biological data is a natural reflection of inquisitive and digitally literate people's desire to make genomic and phenotypic information more easily available to a community beyond the research establishment. Such unfettered databases hold the promise of contributing mightily to science, science education and medicine. That said, in an age of increasingly widespread governmental and corporate surveillance, we would do well to be mindful that genomic DNA is uniquely identifying. Participants in open biological databases are engaged in a real-time experiment whose outcome is unknown. PMID:24647311
Cholesteryl ester storage disease: an easily missed diagnosis in oligosymptomatic children.
Freudenberg, F; Bufler, P; Ensenauer, R; Lohse, P; Koletzko, S
2013-10-01
Cholesteryl ester storage disease (CESD) is a rare, autosomal recessively inherited disorder resulting from deficient activity of lysosomal acid lipase (LAL). LAL is the key enzyme hydrolyzing cholesteryl esters and triglycerides stored in lysosomes after LDL receptor-mediated endocytosis. Mutations within the LIPA gene locus on chromosome 10q23.2-q23.3 may result either in the invariably fatal Wolman disease, in which no LAL activity is found, or in the more benign disorder CESD, in which a reduced enzymatic activity leads to massive accumulation of cholesteryl esters and triglycerides in many body tissues. CESD mostly affects the liver, with a spectrum ranging from isolated hepatomegaly to liver cirrhosis. Chronic diarrhea has been reported in some pediatric cases, while calcifications of the adrenal glands, the hallmark of Wolman disease, are rarely observed. Hypercholesterolemia and premature atherosclerosis are other typical disease manifestations. Hepatomegaly as a key finding has been reported in all 71 pediatric patients and in 134 of 135 adult cases in the literature. We present a 13-year-old boy with mildly elevated liver enzymes in the absence of hepatomegaly, finally diagnosed with CESD. Under pravastatin treatment, the patient has had normal laboratory findings and has been clinically unremarkable over 5 years of follow-up. To our knowledge, this is the first pediatric case of genetically and biopsy-confirmed CESD without hepatomegaly, suggesting that this diagnosis can be easily missed. It further raises the question of the natural course of, and the therapy required for, this oligosymptomatic form. PMID:24122380
Easily aligned deformable-helix ferroelectric liquid crystal mixture and its use in devices
NASA Astrophysics Data System (ADS)
Wand, Michael D.; Vohra, Rohini T.; O'Callaghan, Michael J.; Roberts, Beth; Escher, Claus
1992-06-01
Ferroelectric smectic C* liquid crystals have been shown to enable high-speed, multistate electro-optic and display devices, particularly when incorporated into the surface stabilized ferroelectric liquid crystal (SSFLC) light valve. The SSFLC geometry results in two distinct stable states. Unfortunately, the lack of intermediate electrically addressed states precludes a natural gray-scale effect. The recently discovered Deformed Helix Ferroelectric liquid crystal (DHFLC) effect opens the door to linear gray scale or linear phase modulation in a ferroelectric liquid crystal device on a microsecond time-scale. One drawback of currently available DHFLC materials is that their alignment quality is limited by the lack of a nematic phase above their smectic A phase. While alignment can be improved by the use of shear techniques, this represents an undesirable option for a manufacturing process. We show that DHFLC mixtures can possess a nematic phase with a long N* pitch and a tight C* pitch in the C* phase. These new easily aligned DHFLC mixtures are discussed, as well as their use in beam-steering devices that can benefit from analog optical response.
Department of Statistics STATISTICS COLLOQUIUM
DAVID BLEI, Department of Statistics and Computer Science: ... posterior inference algorithms have revolutionized Bayesian statistics, revealing its potential as a usable and general-purpose language for data analysis. Bayesian statistics, however, has not yet reached ...
Interpreting Abstract Interpretations in Membership Equational Logic
NASA Technical Reports Server (NTRS)
Fischer, Bernd; Rosu, Grigore
2001-01-01
We present a logical framework in which abstract interpretations can be naturally specified and then verified. Our approach is based on membership equational logic, which extends equational logics by membership axioms, asserting that a term has a certain sort. We represent an abstract interpretation as a membership equational logic specification, usually as an overloaded order-sorted signature with membership axioms. It turns out that, for any term, its least sort over this specification corresponds to its most concrete abstract value. Maude implements membership equational logic and provides mechanisms to calculate the least sort of a term efficiently. We first show how Maude can be used to get prototyping of abstract interpretations "for free." Building on the meta-logic facilities of Maude, we further develop a tool that automatically checks an abstract interpretation against a set of user-defined properties. This can be used to select an appropriate abstract interpretation, to characterize the specified loss of information during abstraction, and to compare different abstractions with each other.
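The paper's framework is specific to membership equational logic and Maude; as a generic illustration of abstract interpretation itself (a textbook sign-domain sketch, not the paper's encoding), where each concrete value maps to a "most precise" abstract value, analogous to the least sort above:

```python
# Sketch: a tiny abstract interpretation over the sign domain
# {neg, zero, pos, top}. The abstraction alpha maps a concrete integer
# to its most precise abstract value, analogous to the least sort in
# the membership-equational encoding described above. Generic textbook
# example, not the paper's Maude-based tool.

NEG, ZERO, POS, TOP = "neg", "zero", "pos", "top"

def alpha(n):
    """Abstraction: map a concrete integer to its most precise sign."""
    if n < 0:
        return NEG
    if n == 0:
        return ZERO
    return POS

def abs_mul(a, b):
    """Abstract multiplication on signs (exact for this domain)."""
    if ZERO in (a, b):
        return ZERO
    if TOP in (a, b):
        return TOP
    return POS if a == b else NEG

def abs_add(a, b):
    """Abstract addition on signs; neg + pos loses precision (top)."""
    if a == ZERO:
        return b
    if b == ZERO:
        return a
    if a == b and a != TOP:
        return a
    return TOP

# Soundness spot-check: the abstract result covers the concrete one
x, y = -3, 4
assert abs_mul(alpha(x), alpha(y)) == alpha(x * y)  # neg * pos = neg
assert abs_add(alpha(x), alpha(y)) == TOP           # sign of -3 + 4 unknown
```

Checking such soundness properties for every operator is exactly the kind of user-defined verification the paper's Maude tool automates.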
OntologyWidget – a reusable, embeddable widget for easily locating ontology terms
Beauheim, Catherine C; Wymore, Farrell; Nitzberg, Michael; Zachariah, Zachariah K; Jin, Heng; Skene, JH Pate; Ball, Catherine A; Sherlock, Gavin
2007-01-01
Background Biomedical ontologies are being widely used to annotate biological data in a computer-accessible, consistent and well-defined manner. However, due to their size and complexity, annotating data with appropriate terms from an ontology is often challenging for experts and non-experts alike, because there exist few tools that allow one to quickly find relevant ontology terms to easily populate a web form. Results We have produced a tool, OntologyWidget, which allows users to rapidly search for and browse ontology terms. OntologyWidget can easily be embedded in other web-based applications. OntologyWidget is written using AJAX (Asynchronous JavaScript and XML) and has two related elements. The first is a dynamic auto-complete ontology search feature. As a user enters characters into the search box, the appropriate ontology is queried remotely for terms that match the typed-in text, and the query results populate a drop-down list with all potential matches. Upon selection of a term from the list, the user can locate this term within a generic and dynamic ontology browser, which comprises the second element of the tool. The ontology browser shows the paths from a selected term to the root as well as parent/child tree hierarchies. We have implemented web services at the Stanford Microarray Database (SMD), which provide the OntologyWidget with access to over 40 ontologies from the Open Biological Ontology (OBO) website [1]. Each ontology is updated weekly. Adopters of the OntologyWidget can either use SMD's web services, or elect to rely on their own. Deploying the OntologyWidget can be accomplished in three simple steps: (1) install Apache Tomcat [2] on one's web server, (2) download and install the OntologyWidget servlet stub that provides access to the SMD ontology web services, and (3) create an html (HyperText Markup Language) file that refers to the OntologyWidget using a simple, well-defined format. 
Conclusion We have developed OntologyWidget, an easy-to-use ontology search and display tool that can be used on any web page by creating a simple html description. OntologyWidget provides a rapid auto-complete search function paired with an interactive tree display. We have developed a web service layer that communicates between the web page interface and a database of ontology terms. We currently store 40 of the ontologies from the OBO website [1], as well as several others. These ontologies are automatically updated on a weekly basis. OntologyWidget can be used in any web-based application to take advantage of the ontologies we provide via web services, or any other ontology that is provided elsewhere in the correct format. The full source code for the JavaScript and description of the OntologyWidget is available from . PMID:17854506
Rangel, Thomaz C; Michels, Alexandre F; Horowitz, Flávio; Weibel, Daniel E
2015-03-24
Textures that resemble typical fern or bracken plant species (dendrite structures) were fabricated for liquid repellency by dipping copper substrates in a single-step process in solutions containing AgNO3 or by a simple spray liquid application. Superhydrophobic surfaces were produced using a solution containing AgNO3 and trimethoxypropylsilane (TMPSi), and superomniphobic surfaces were produced by a two-step procedure, immersing the copper substrate in a AgNO3 solution and, after that, in a solution containing 1H,1H,2H,2H-perfluorodecyltriethoxysilane (PFDTES). The simple functionalization processes can also be repeated when the superomniphobic surfaces are destroyed by mechanical stress. By immersion of the damaged surfaces in the above solutions, or by the spray method and soft heating, the copper substrates could easily be repaired, regenerating the surfaces' superrepellency to liquids. The micro- and nanoroughness structures generated on copper surfaces by the deposition of silver dendrites functionalized with TMPSi presented apparent contact angles greater than 150° with a contact angle hysteresis lower than 10° when water was used as the test liquid. To avoid total wettability with very low surface tension liquids, such as rapeseed oil and hexadecane, a thin perfluorinated coating of poly(tetrafluoroethylene) (PTFE), produced by physical vapor deposition, was used. A more efficient perfluorinated coating was obtained when PFDTES was used. The superomniphobic surfaces produced apparent contact angles above 150° with all of the tested liquids, including hexadecane, although the contact angle hysteresis with this liquid was above 10°. The coupling of dendritic structures with TMPSi/PTFE or directly with PFDTES coatings was responsible for the superrepellency of the as-prepared surfaces.
These simple, fast, and reliable procedures allow large-area, cost-effective fabrication of superrepellent surfaces on copper substrates for various industrial applications, with the advantage of easy recovery of the surface repellency after damage. PMID:25714008
Gopal, Purva; Shah, Rajal B
2015-09-01
Context: The incidence of syphilis is on the rise, particularly in male patients who are human immunodeficiency virus (HIV) positive and in men who have sex with men. Objective: To describe 4 cases of primary syphilis presenting in the anal canal, to increase awareness of its presentation and morphology in this location, as the diagnosis can easily be overlooked both clinically and by the pathologist. Design: Clinical presentation, hematoxylin-eosin-stained sections, and Treponema pallidum immunohistochemical staining were reviewed in detail in all 4 cases. Results: Three patients presented with anal canal ulcers; one presented with an ulcerated anal mass. All 4 patients were male, of whom 2 were HIV positive. Syphilis was clinically suspected in only 1 case; in 2 cases, confirmatory evaluation and treatment were prompted by the pathologic diagnosis. In the fourth case, syphilis was diagnosed serologically at the time of biopsy; however, the patient had an anal mass, and malignancy was clinically suspected. All 4 cases had bandlike chronic plasma cell-rich inflammation at the junction of the squamous epithelium and lamina propria; 2 cases had poorly formed granulomas. One case had concomitant rectal biopsy specimens with proctitis. Treponema pallidum immunohistochemistry highlighted homing of organisms in a perivascular pattern and at the junction of the squamous epithelium and lamina propria. Conclusions: Syphilis should be considered in the differential diagnosis of anal canal ulcers, anorectal inflammatory masses, and proctitis. Detailed knowledge of the clinical history and recognition of the characteristic pattern of inflammation by the pathologist are important. Treponema pallidum immunohistochemical staining can help avoid a missed diagnosis of syphilis, which, if left unrecognized, can progress to late-stage disease with serious complications. PMID:26317454
Easily-handled method to isolate mesenchymal stem cells from coagulated human bone marrow samples
Wang, Heng-Xiang; Li, Zhi-Yong; Guo, Zhi-Kun; Guo, Zi-Kuan
2015-01-01
AIM: To establish an easily-handled method to isolate mesenchymal stem cells (MSCs) from coagulated human bone marrow samples. METHODS: Thrombin was added to aliquots of seven heparinized human bone marrow samples to mimic marrow coagulation. The clots were untreated, treated with urokinase, or mechanically cut into pieces before culture for MSCs. The un-coagulated samples and the clots were also stored at 4 °C for 8 or 16 h before the treatment. The numbers of colony-forming unit-fibroblast (CFU-F) in the different samples were determined. The adherent cells from the different groups were passaged and their surface profile was analyzed with flow cytometry. Their capacities for in vitro osteogenesis and adipogenesis were observed after the cells were exposed to specific inductive agents. RESULTS: The average CFU-F number of urokinase-treated samples (16.85 ± 11.77 per 10^6 cells) was comparable to that of un-coagulated control samples (20.22 ± 10.65 per 10^6 cells, P = 0.293), which was significantly higher than those of mechanically-cut clots (6.5 ± 5.32 per 10^6 cells, P < 0.01) and untreated clots (1.95 ± 1.86 per 10^6 cells, P < 0.01). The CFU-F numbers decreased after the samples were stored, but those of control and urokinase-treated clots remained higher than those of the other two groups. Consistently, the numbers of attached cells at passage 0 were higher in control and urokinase-treated clots than in mechanically-cut clots and untreated clots. The attached cells were fibroblast-like in morphology and homogeneously positive for CD44, CD73 and CD90, and negative for CD31 and CD45. Also, they could be induced to differentiate into osteoblasts and adipocytes in vitro. CONCLUSION: Urokinase pretreatment is an optimal strategy to isolate MSCs from human bone marrow samples that are poorly aspirated and clotted.
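The group comparisons above can be sanity-checked from the reported summary statistics alone; a minimal sketch using Welch's t-statistic (an assumption: the abstract does not state which test produced P = 0.293, and n = 7 per group is inferred from the seven marrow samples):

```python
# Sketch: Welch's t-statistic computed from reported summary
# statistics (mean, SD, n). Assumptions: n = 7 per group, and that a
# two-sample t-type test underlies the reported P value.
import math

def welch_t(m1, s1, n1, m2, s2, n2):
    """Welch's t-statistic for two groups given means, SDs and sizes."""
    return (m1 - m2) / math.sqrt(s1**2 / n1 + s2**2 / n2)

# Un-coagulated control vs urokinase-treated CFU-F (per 10^6 cells)
t = welch_t(20.22, 10.65, 7, 16.85, 11.77, 7)
print(f"t = {t:.2f}")  # a small t, consistent with the reported P = 0.293
```

The large SDs relative to the mean difference make the non-significant result unsurprising, which is the point of the "comparable" claim in the abstract.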
Clearly written, easily comprehended? The readability of websites providing information on epilepsy.
Brigo, Francesco; Otte, Willem M; Igwe, Stanley C; Tezzon, Frediano; Nardone, Raffaele
2015-03-01
There is a general need for high-quality, easily accessible, and comprehensive health-care information on epilepsy to better inform the general population about this highly stigmatized neurological disorder. The aim of this study was to evaluate the health literacy level of eight popular English-language websites that provide information on epilepsy in quantitative terms of readability. Educational epilepsy material on these websites, including 41 Wikipedia articles, was analyzed for its overall level of readability and the corresponding academic grade level needed to comprehend the published texts on the first reading. The Flesch Reading Ease (FRE) was used to assess ease of comprehension, while the Gunning Fog Index, Coleman-Liau Index, Flesch-Kincaid Grade Level, Automated Readability Index, and Simple Measure of Gobbledygook scales estimated the corresponding academic grade level needed for comprehension. The average readability of the websites yielded results indicative of a difficult-to-fairly-difficult readability level (FRE results: 44.0±8.2), with text readability corresponding to an 11th academic grade level (11.3±1.9). The average FRE score of the Wikipedia articles was indicative of a difficult readability level (25.6±9.5), with the other readability scales yielding results corresponding to a 14th grade level (14.3±1.7). Popular websites providing information on epilepsy, including Wikipedia, often demonstrate a low level of readability. This can be ameliorated by increasing access to clear and concise online information on epilepsy and health in general. Short "basic" summaries targeted to patients and nonmedical users should be added to articles published in specialist websites and Wikipedia to ease readability. PMID:25601720
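The readability indices used above are standard closed-form formulas over word, sentence and syllable counts; a minimal sketch of two of them (the counts below are illustrative, not taken from the study):

```python
# Sketch: Flesch Reading Ease (FRE) and Flesch-Kincaid Grade Level,
# two of the indices used in the study, from raw text counts.
# These are the standard published formulas; the counts below are
# illustrative only.

def flesch_reading_ease(words, sentences, syllables):
    """Higher = easier text (90-100 ~ 5th grade; below 30 ~ graduate)."""
    return 206.835 - 1.015 * (words / sentences) - 84.6 * (syllables / words)

def flesch_kincaid_grade(words, sentences, syllables):
    """US academic grade level needed to comprehend the text."""
    return 0.39 * (words / sentences) + 11.8 * (syllables / words) - 15.59

# Illustrative counts: 100 words, 5 sentences, 150 syllables
fre = flesch_reading_ease(100, 5, 150)
grade = flesch_kincaid_grade(100, 5, 150)
print(f"FRE = {fre:.1f}, grade = {grade:.1f}")
```

For these counts the FRE comes out near 60 ("fairly difficult", close to the 44.0±8.2 average the study reports for websites), and the grade level near 10; in practice the hard part is syllable counting, which readability tools approximate heuristically.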
Hijnen, W A M; Biraud, D; Cornelissen, E R; van der Kooij, D
2009-07-01
One of the major impediments in the application of spiral-wound membranes in water treatment or desalination is clogging of the feed channel by biofouling, which is induced by nutrients in the feedwater. Organic carbon is, under most conditions, limiting for microbial growth. The objective of this study is to assess the relationship between the concentration of an easily assimilable organic compound such as acetate in the feedwater and the pressure drop increase in the feed channel. For this purpose the membrane fouling simulator (MFS) was used as a model for the feed channel of a spiral-wound membrane. This MFS unit was supplied with drinking water enriched with acetate at concentrations ranging from 1 to 1000 µg C/L. The pressure drop (PD) in the feed channel increased at all tested concentrations but not with the blank. The PD increase could be described by a first-order process based on theoretical considerations concerning biofilm formation rate and porosity decline. The relationship between the first-order fouling rate constant R(f) and the acetate concentration is described with a saturation function corresponding with the growth kinetics of bacteria. Under the applied conditions the maximum R(f) (0.555 d^-1) was reached at 25 µg acetate-C/L and the half-saturation constant k(f) was estimated at 15 µg acetate-C/L. This value is higher than k(s) values for suspended bacteria grown on acetate, which is attributed to substrate-limited growth conditions in the biofilm. The threshold concentration for biofouling of the feed channel is about 1 µg acetate-C/L. PMID:19673281
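The saturation function above follows Monod-type growth kinetics; a minimal sketch with the reported constants (the exact functional form R_f = R_max·S/(k_f + S) is an assumption, since the abstract gives only the fitted constants):

```python
# Sketch: Monod-type saturation of the first-order fouling rate
# constant with acetate concentration, using the constants reported
# in the abstract. Assumption: the fitted form is
#   R_f(S) = R_MAX * S / (K_F + S).

R_MAX = 0.555   # d^-1, maximum first-order fouling rate constant
K_F = 15.0      # ug acetate-C/L, half-saturation constant

def fouling_rate(s):
    """Fouling rate constant (d^-1) at acetate concentration s (ug C/L)."""
    return R_MAX * s / (K_F + s)

# At the half-saturation concentration the rate is exactly R_MAX / 2
print(f"R_f(15)   = {fouling_rate(15.0):.4f} d^-1")
# Near the reported ~1 ug C/L biofouling threshold the rate is small
print(f"R_f(1)    = {fouling_rate(1.0):.4f} d^-1")
print(f"R_f(1000) = {fouling_rate(1000.0):.4f} d^-1")  # approaching R_MAX
```

This makes the practical conclusion concrete: above a few tens of µg C/L the fouling rate is effectively saturated, so further nutrient reduction pays off only once the feedwater approaches the µg/L threshold.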
Neti, Girija; Novak, Stefanie M.; Thompson, Valery F.; Goll, Darrel E.
2009-01-01
Myofibrillar proteins must be removed from the myofibril before they can be turned over metabolically in functioning muscle cells. It is uncertain how this removal is accomplished without disruption of the contractile function of the myofibril. It has been proposed that the calpains could remove the outer layer of filaments from myofibrils as a first step in myofibrillar protein turnover. Several studies have found that myofilaments can be removed from myofibrils by trituration in the presence of ATP. These easily releasable myofilaments (ERMs) were proposed to be intermediates in myofibrillar protein turnover. It was unclear, however, whether the ERMs were an identifiable entity in muscle or whether additional trituration would remove more myofilaments until the myofibril was gone and whether calpains could release ERMs from intact myofibrils. The present study shows that few ERMs could be obtained from the residue after the first removal of ERMs, and the yield of ERMs from well-washed myofibrils was reduced, probably because some ERMs had been removed by the washing process. Mild calpain treatment of myofibrils released filaments that had a polypeptide composition and were ultrastructurally similar to ERMs. The yield of calpain-released ERMs was two- to threefold greater than the normal yield. Hence, ERMs are an identifiable entity in myofibrils, and calpain releases filaments that are similar to ERMs. The role of ERMs in myofibrillar protein turnover is unclear, because only filaments on the surface of the myofibril would turn over, and changes in myofibrillar protein isoforms during development could not occur via the ERM mechanism. PMID:19321741
Debenay, J-P; Della Patrona, L; Herbland, A; Goguenheim, H
2009-01-01
This study was carried out in shrimp ponds from New Caledonia, in order to determine the cause of the exceptional proportion of abnormal tests (FAI) (often >50%, sometimes >80%). FAI was positively correlated to the quantity of easily oxidized material (EOM) deposited on the bottom of the ponds and to the sediment oxygen demand, and negatively correlated to redox. These results suggest that a very high FAI is a potential indicator for great accumulations of native organic matter, leading to a high sediment oxygen demand. When studying ancient sediments in core samples, exceptional abundances of abnormal tests may indicate periods of high accumulation of EOM, and therefore of oxygen depletion. This finding should help in better management of aquaculture ponds, but should also allow new insight into the interpretation of sedimentary records, providing a useful proxy for paleoenvironmental reconstructions. PMID:19735926
NSDL National Science Digital Library
2007-10-03
Students will learn about Alcoholics Anonymous and prepare to interpret for a deaf member at a traditional AA meeting. Interpreting for Alcoholics Anonymous (AA) requires that the interpreter have an understanding of the purpose of the meetings, prepare adequately for the frozen text and informal register that will be used, and have respect for the organization and its members. It can be a difficult, but rewarding assignment. Preparing to ...
ERIC Educational Resources Information Center
Christensen, Timothy J.; Labov, Jay B.
1997-01-01
Details the construction of a viewing chamber for fruit flies that connects to a dissecting microscope and features a design that enables students to easily move fruit flies in and out of the chamber. (DDR)
Customizable tool for ecological data entry, assessment, monitoring, and interpretation
Technology Transfer Automated Retrieval System (TEKTRAN)
The Database for Inventory, Monitoring and Assessment (DIMA) is a highly customizable tool for data entry, assessment, monitoring, and interpretation. DIMA is a Microsoft Access database that can easily be used without Access knowledge and is available at no cost. Data can be entered for common, nat...
Natural image statistics for computer graphics
Reinhard, Erik
Natural image statistics for computer graphics. Erik Reinhard, Peter Shirley and Tom Troscianko. UUCS ... of natural images can be statistically modeled, revealing striking regularities. The human visual system ... to interpret images which conform to these statistics. Research has shown that images that do not statistically ...
Psychological testing of sign language interpreters.
Seal, Brenda C
2004-01-01
Twenty-eight sign language interpreters participated in a battery of tests to determine if a profile of cognitive, motor, attention, and personality attributes might distinguish them as a group and at different credential levels. Eight interpreters held Level II and nine held Level III Virginia Quality Assurance Screenings (VQAS); the other 11 held Registry of Interpreters for the Deaf (RID) certification. Six formal tests, the Quick Neurological Screening Test-II, the Wonderlic Personnel Test, the Test of Visual-Motor Skills (TVMS), the d2 Test of Attention, the Integrated Visual and Auditory Continuous Performance Test, and the Sixteen Personality Factor Questionnaire (16PF), were administered to the interpreters. Average scores were high on most of the tests; differences across the three groups were not statistically significant. Results from only one test, the d2 Test of Attention, were significantly correlated with interpreter level. Comparisons between educational and community interpreters also revealed no differences. Personality traits were widely distributed, but one trait, abstract reasoning, tested extremely high in 18 interpreters. Discussion of the potential implications of these results, particularly for educational interpreters, is offered. PMID:15304401
Neural network classification - A Bayesian interpretation
NASA Technical Reports Server (NTRS)
Wan, Eric A.
1990-01-01
The relationship between minimizing a mean squared error and finding the optimal Bayesian classifier is reviewed. This provides a theoretical interpretation for the process by which neural networks are used in classification. A number of confidence measures are proposed to evaluate the performance of the neural network classifier within a statistical framework.
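The MSE-posterior relationship reviewed above can be made concrete with a toy example: minimizing mean squared error against 0/1 class labels recovers the empirical posterior P(class = 1 | x). The data here are synthetic; a trained neural network approximates the same per-x conditional mean with a smooth function.

```python
# Sketch: the minimizer of mean squared error against 0/1 class
# labels is the per-x class frequency, i.e. the empirical posterior
# P(y = 1 | x). Synthetic data; illustrates the Bayesian reading of
# MSE-trained classifiers.
from collections import defaultdict

data = [(0, 1), (0, 1), (0, 0), (0, 1),   # x = 0: class 1 in 3 of 4
        (1, 0), (1, 0), (1, 1), (1, 0)]   # x = 1: class 1 in 1 of 4

# Empirical posterior: per-x frequency of class 1
counts = defaultdict(lambda: [0, 0])      # x -> [n_total, n_class1]
for x, y in data:
    counts[x][0] += 1
    counts[x][1] += y
posterior = {x: n1 / n for x, (n, n1) in counts.items()}

def mse(f):
    """Mean squared error of a per-x predictor f against the labels."""
    return sum((f[x] - y) ** 2 for x, y in data) / len(data)

# The posterior minimizes the MSE: perturbing it only increases error.
assert posterior == {0: 0.75, 1: 0.25}
assert mse(posterior) < mse({0: 0.75, 1: 0.30})
print(posterior)
```

A network trained by squared-error minimization therefore outputs (approximate) posterior class probabilities, which is what licenses the confidence measures the paper proposes.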
Biosignal pattern recognition and interpretation systems
E. J. Ciaccio; S. M. Dunn; M. Akay
1993-01-01
A general framework is given to describe pattern recognition and interpretation. Pattern analysis stages are described, with consideration of difficulties in implementation and uncertainties present at each level. The main forms of pattern analysis, namely statistical, syntactic, and artificial intelligence (connectionist and symbolic) methods, have different strengths and weaknesses, depending on the stage of pattern analysis at which they are used. In general, ...
Interpreting Face Images Using Active Appearance Models
Gareth J. Edwards; Christopher J. Taylor; Timothy F. Cootes
1998-01-01
We demonstrate a fast, robust method of interpreting face images using an Active Appearance Model (AAM). An AAM contains a statistical model of shape and grey-level appearance which can generalise to almost any face. Matching to an image involves finding model parameters which minimise the difference between the image and a synthesised face. We observe that displacing each model parameter ...
Department of Statistics STATISTICS COLLOQUIUM
Joint seminar with the Stevanovich Center. PHILIPPE RIGOLLET, Operations Research and Financial Engineering, Princeton University: "The Statistical Price to Pay ..." ABSTRACT: Computational limitations of statistical problems have largely been ignored or simply overcome ...
ERIC Educational Resources Information Center
Erekson, James A.
2010-01-01
Prosody is a means for "reading with expression" and is one aspect of oral reading competence. This theoretical inquiry asserts that prosody is central to interpreting text, and draws distinctions between "syntactic" prosody (for phrasing) and "emphatic" prosody (for interpretation). While reading with expression appears as a criterion in major…
Collapse challenge for interpretations
Neumaier, Arnold
Collapse challenge for interpretations of quantum mechanics. Arnold Neumaier, Fakultät f..., http://www.mat.univie.ac.at/neum/ Abstract. The collapse challenge for interpretations of quantum ... analysis that completely explains the experimental result. The challenge is explained in detail ...
Harrigan, George G; Harrison, Jay M
2012-01-01
New transgenic (GM) crops are subjected to extensive safety assessments that include compositional comparisons with conventional counterparts as a cornerstone of the process. The influence of germplasm, location, environment, and agronomic treatments on compositional variability is, however, often obscured in these pair-wise comparisons. Furthermore, classical statistical significance testing can often provide an incomplete and over-simplified summary of highly responsive variables such as crop composition. In order to more clearly describe the influence of the numerous sources of compositional variation we present an introduction to two alternative but complementary approaches to data analysis and interpretation. These include i) exploratory data analysis (EDA) with its emphasis on visualization and graphics-based approaches and ii) Bayesian statistical methodology that provides easily interpretable and meaningful evaluations of data in terms of probability distributions. The EDA case-studies include analyses of herbicide-tolerant GM soybean and insect-protected GM maize and soybean. Bayesian approaches are presented in an analysis of herbicide-tolerant GM soybean. Advantages of these approaches over classical frequentist significance testing include the more direct interpretation of results in terms of probabilities pertaining to quantities of interest and no confusion over the application of corrections for multiple comparisons. It is concluded that a standardized framework for these methodologies could provide specific advantages through enhanced clarity of presentation and interpretation in comparative assessments of crop composition. PMID:22616479
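The directly interpretable probability statements the authors advocate can be illustrated with a minimal conjugate normal-normal sketch (this is not their actual model; the prior, the observed difference, and the variances below are all hypothetical numbers for illustration):

```python
import math

# Hypothetical comparison of one analyte's mean between a GM line and its
# conventional counterpart: observed mean difference with known variance,
# diffuse normal prior on the true difference.
prior_mean, prior_var = 0.0, 4.0   # diffuse prior (assumed)
obs_diff, obs_var = 0.8, 0.25      # observed difference and its variance (assumed)

# Conjugate normal posterior for a normal likelihood with known variance
post_var = 1.0 / (1.0 / prior_var + 1.0 / obs_var)
post_mean = post_var * (prior_mean / prior_var + obs_diff / obs_var)

def normal_cdf(x, mu, var):
    return 0.5 * (1.0 + math.erf((x - mu) / math.sqrt(2.0 * var)))

# The Bayesian output is a probability about the quantity of interest,
# e.g. the probability that the true difference is positive:
p_positive = 1.0 - normal_cdf(0.0, post_mean, post_var)
```

Unlike a p-value, `p_positive` answers the question of interest directly, and no multiple-comparison correction debate arises for the posterior itself.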
Groen-Blokhuis, Maria M; Middeldorp, Christel M; M van Beijsterveldt, Catharina E; Boomsma, Dorret I
2011-10-01
In order to estimate the influence of genetic and environmental factors on 'crying without a cause' and 'being easily upset' in 2-year-old children, a large twin study was carried out. Prospective data were available for ~18,000 2-year-old twin pairs from the Netherlands Twin Register. A bivariate genetic analysis was performed using structural equation modeling in the Mx software package. The influence of maternal personality characteristics and demographic and lifestyle factors was tested to identify specific risk factors that may underlie the shared environment of twins. Furthermore, it was tested whether crying without a cause and being easily upset were predictive of later internalizing, externalizing and attention problems. Crying without a cause yielded a heritability estimate of 60% in boys and girls. For easily upset, the heritability was estimated at 43% in boys and 31% in girls. The variance explained by shared environment varied between 35% and 63%. The correlation between crying without a cause and easily upset (r = .36) was explained both by genetic and shared environmental factors. Birth cohort, gestational age, socioeconomic status, parental age, parental smoking behavior and alcohol use during pregnancy did not explain the shared environmental component. Neuroticism of the mother explained a small proportion of the additive genetic, but not of the shared environmental effects for easily upset. Crying without a cause and being easily upset at age 2 were predictive of internalizing, externalizing and attention problems at age 7, with effect sizes of .28-.42. A large influence of shared environmental factors on crying without a cause and easily upset was detected. Although these effects could be specific to these items, we could not explain them by personality characteristics of the mother or by demographic and lifestyle factors, and we recognize that these effects may reflect other maternal characteristics. 
A substantial influence of genetic factors was found for the two items, which are predictive of later behavioral problems. PMID:21962130
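The reported variance components can be illustrated with Falconer's classical twin formulas (the study itself used structural equation modeling in Mx; the twin correlations below are hypothetical values chosen only to be consistent with the reported estimates of ~60% heritability and ~35% shared environment):

```python
# ACE decomposition from monozygotic (MZ) and dizygotic (DZ) twin correlations.
r_mz, r_dz = 0.95, 0.65   # assumed illustrative correlations, not study data

h2 = 2 * (r_mz - r_dz)    # A: additive genetic variance (heritability)
c2 = 2 * r_dz - r_mz      # C: shared environment
e2 = 1 - r_mz             # E: unique environment plus measurement error
```

Structural equation modeling estimates the same A, C, and E components by maximum likelihood and additionally yields confidence intervals and bivariate decompositions, which Falconer's formulas cannot provide.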
The Perils of Provocative Statistics.
ERIC Educational Resources Information Center
Scanlan, James P.
1991-01-01
Demonstrates that conclusions drawn from statistics concerning racial disparities in income, infant mortality, sports participation, and other areas may be partly or wholly wrong, because disparities generally increase as conditions improve. Observes that flawed uses of statistics abound; points out some common errors in interpretation. (DM)
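Scanlan's central observation, that the relative disparity in an adverse outcome can grow even while conditions improve for both groups, can be shown with a two-line arithmetic sketch (all rates hypothetical):

```python
def fail_ratio(pass_a, pass_b):
    """Ratio of adverse-outcome (failure) rates, group B relative to group A."""
    return (1 - pass_b) / (1 - pass_a)

# Both groups improve, yet the relative disparity in failure rates increases.
before = fail_ratio(0.60, 0.40)  # pass rates 60% vs 40%: failure ratio 1.5
after = fail_ratio(0.90, 0.80)   # pass rates 90% vs 80%: failure ratio 2.0
```

The same improvement that shrinks the absolute gap (20 points in both cases here) pushes the adverse-outcome ratio up, which is exactly the interpretive trap Scanlan describes.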
SOCR: Statistics Online Computational Resource
ERIC Educational Resources Information Center
Dinov, Ivo D.
2006-01-01
The need for hands-on computer laboratory experience in undergraduate and graduate statistics education has been firmly established in the past decade. As a result a number of attempts have been undertaken to develop novel approaches for problem-driven statistical thinking, data analysis and result interpretation. In this paper we describe an…
Motivating Play Using Statistical Reasoning
ERIC Educational Resources Information Center
Cross Francis, Dionne I.; Hudson, Rick A.; Lee, Mi Yeon; Rapacki, Lauren; Vesperman, Crystal Marie
2014-01-01
Statistical literacy is essential in everyone's personal lives as consumers, citizens, and professionals. To make informed life and professional decisions, students are required to read, understand, and interpret vast amounts of information, much of which is quantitative. To develop statistical literacy so students are able to make sense of…
How to Avoid Statistical Traps
ERIC Educational Resources Information Center
Bracey, Gerald W.
2006-01-01
Education statistics are rarely neutral; those who collect and analyze them have different purposes. In this article, Bracey discusses several principles of data interpretation to help educators avoid falling into statistical traps. For example, because such reports as A Nation At Risk contain many "selected, spun, distorted, and even manufactured…
NASA Astrophysics Data System (ADS)
Paine, Gregory Harold
1982-03-01
The primary objective of the thesis is to explore the dynamical properties of small nerve networks by means of the methods of statistical mechanics. To this end, a general formalism is developed and applied to elementary groupings of model neurons which are driven by either constant (steady state) or nonconstant (nonsteady state) forces. Neuronal models described by a system of coupled, nonlinear, first-order, ordinary differential equations are considered. A linearized form of the neuronal equations is studied in detail. A Lagrange function corresponding to the linear neural network is constructed which, through a Legendre transformation, provides a constant of motion. By invoking the Maximum-Entropy Principle with the single integral of motion as a constraint, a probability distribution function for the network in a steady state can be obtained. The formalism is implemented for some simple networks driven by a constant force; accordingly, the analysis focuses on a study of fluctuations about the steady state. In particular, a network composed of N noninteracting neurons, termed Free Thinkers, is considered in detail, with a view to interpretation and numerical estimation of the Lagrange multiplier corresponding to the constant of motion. As an archetypical example of a net of interacting neurons, the classical neural oscillator, consisting of two mutually inhibitory neurons, is investigated. It is further shown that in the case of a network driven by a nonconstant force, the Maximum-Entropy Principle can be applied to determine a probability distribution functional describing the network in a nonsteady state. The above examples are reconsidered with nonconstant driving forces which produce small deviations from the steady state. Numerical studies are performed on simplified models of two physical systems: the starfish central nervous system and the mammalian olfactory bulb. 
Discussions are given as to how statistical neurodynamics can be used to gain a better understanding of the behavior of these systems.
Geological interpretation of a Gemini photo
Hemphill, William R.; Danilchik, Walter
1968-01-01
Study of the Gemini V photograph of the Salt Range and Potwar Plateau, West Pakistan, indicates that small-scale orbital photographs permit recognition of the regional continuity of some geologic features, particularly faults and folds that could be easily overlooked on conventional air photographs of larger scale. Some stratigraphic relationships can also be recognized on the orbital photograph, but with only minimal previous geologic knowledge of the area, these interpretations are less conclusive or reliable than the interpretation of structure. It is suggested that improved atmospheric penetration could be achieved through the use of color infrared film. Photographic expression of topography could also be improved by deliberately photographing some areas during periods of low sun angle.
NASA Technical Reports Server (NTRS)
Owre, Sam; Shankar, Natarajan; Butler, Ricky W. (Technical Monitor)
2001-01-01
The purpose of this task was to provide a mechanism for theory interpretations in a prototype verification system (PVS) so that it is possible to demonstrate the consistency of a theory by exhibiting an interpretation that validates the axioms. The mechanization makes it possible to show that one collection of theories is correctly interpreted by another collection of theories under a user-specified interpretation for the uninterpreted types and constants. A theory instance is generated and imported, while the axiom instances are generated as proof obligations to ensure that the interpretation is valid. Interpretations can be used to show that an implementation is a correct refinement of a specification, that an axiomatically defined specification is consistent, or that an axiomatically defined specification captures its intended models. In addition, the theory parameter mechanism has been extended with a notion of theory as parameter so that a theory instance can be given as an actual parameter to an imported theory. Theory interpretations can thus be used to refine an abstract specification or to demonstrate the consistency of an axiomatic theory. In this report we describe the mechanism in detail. This extension is a part of PVS version 3.0, which will be publicly released in mid-2001.
BCSC Screening Performance Benchmarks: Abnormal Interpretations (2007 Data)
Abnormal Interpretations for 4,032,556 Screening Mammography Examinations from 1996-2005, based on BCSC data
Statistical Applets: The Reasoning of a Statistical Test
NSDL National Science Digital Library
Duckworth, William
Created by authors Duckworth, McCabe, Moore and Sclove for W.H. Freeman and Co., this statistical demo is designed to help students master "The Reasoning of a Statistical Test" covered in "The Practice of Business Statistics." This site was intended to accompany that textbook, but the text is not necessary for its use. The applet presents a basketball player shooting free throws. The player takes twenty-five shots, and these data are then analyzed against the null hypothesis. This is an excellent example of hypothesis testing in a format that students can easily comprehend.
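The applet's reasoning can be reproduced with an exact binomial test (the claimed shooting rate and observed count below are hypothetical, not values taken from the applet):

```python
from math import comb

# Null hypothesis: the player is truly an 80% free-throw shooter.
# Observation: 15 makes in 25 shots. The one-sided p-value is the
# probability of 15 or fewer makes if the null were true.
n, p0, made = 25, 0.8, 15

p_value = sum(comb(n, k) * p0**k * (1 - p0)**(n - k) for k in range(made + 1))
```

A small p-value (here well under 0.05) is the applet's punchline: an outcome this far below the claimed rate would be unlikely if the claim were true, so the data cast doubt on the null hypothesis.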
Effects of Antecedent Distance and Intervening Text Structure in the Interpretation of Ellipses.
ERIC Educational Resources Information Center
Garnham, A.
1987-01-01
Investigates the availability of surface representations for the interpretation of verb-phrase ellipsis. Results show that an elliptical verb phrase is most easily interpreted if its antecedent is in the immediately preceding sentence and that this can not be explained in terms of the unnaturalness of the passages with distant antecedents. (MM)
Genetics 2010 SWGDAM Guidelines · In 2010, the Scientific Working Group on DNA Analysis Methods (SWGDAM) ... mixture. Determining the number of contributors to a mixture is one of the first steps in interpretation
ERIC Educational Resources Information Center
Smith, P. Sean; Ford, Brent A.
1994-01-01
Presents a brief introduction of our atmosphere, a guide to reading and interpreting weather maps, and a set of activities to facilitate teachers in helping to enhance student understanding of the Earth's atmosphere. (ZWH)
BIOMONITORING: INTERPRETATION AND USES
With advanced technologies, it is now possible to measure very low levels of many chemicals in biological fluids. However, the appropriate use and interpretation of biomarkers will depend upon many factors associated with the exposure, adsorption, deposition, metabolism, and eli...
Interpretation of Biosphere Reserves.
ERIC Educational Resources Information Center
Merriman, Tim
1994-01-01
Introduces the Man and the Biosphere Programme (MAB) to monitor the 193 biogeographical provinces of the Earth and the creation of biosphere reserves. Highlights the need for interpreters to become familiar or involved with MAB program activities. (LZ)
Programs for Training Interpreters.
ERIC Educational Resources Information Center
American Annals of the Deaf, 2003
2003-01-01
This listing provides directory information on U.S. programs for training interpreters for individuals with deafness. Schools are listed by state and include director and degree information. (Author/CR)
Cancer Statistics
Cancer has a major impact on society in the United States and across the world. Cancer statistics ... prognosis, see the Understanding Cancer Prognosis page.
Interpreter-mediated dentistry.
Bridges, Susan; Drew, Paul; Zayts, Olga; McGrath, Colman; Yiu, Cynthia K Y; Wong, H M; Au, T K F
2015-05-01
The global movements of healthcare professionals and patient populations have increased the complexities of medical interactions at the point of service. This study examines interpreter mediated talk in cross-cultural general dentistry in Hong Kong where assisting para-professionals, in this case bilingual or multilingual Dental Surgery Assistants (DSAs), perform the dual capabilities of clinical assistant and interpreter. An initial language use survey was conducted with Polyclinic DSAs (n = 41) using a logbook approach to provide self-report data on language use in clinics. Frequencies of mean scores using a 10-point visual analogue scale (VAS) indicated that the majority of DSAs spoke mainly Cantonese in clinics and interpreted for postgraduates and professors. Conversation Analysis (CA) examined recipient design across a corpus (n = 23) of video-recorded review consultations between non-Cantonese speaking expatriate dentists and their Cantonese L1 patients. Three patterns of mediated interpreting indicated were: dentist designated expansions; dentist initiated interpretations; and assistant initiated interpretations to both the dentist and patient. The third, rather than being perceived as negative, was found to be framed either in response to patient difficulties or within the specific task routines of general dentistry. The findings illustrate trends in dentistry towards personalized care and patient empowerment as a reaction to product delivery approaches to patient management. Implications are indicated for both treatment adherence and the education of dental professionals. PMID:25828074
Department of Statistics STATISTICS COLLOQUIUM
Department of Statistics STATISTICS COLLOQUIUM PO-LING LOH Department of Statistics University the seminar in Eckhart 110 ABSTRACT Noisy and missing data are prevalent in many real-world statistical, and provide theoretical guarantees for the statistical consistency of our methods. Although our estimators
Department of Statistics STATISTICS COLLOQUIUM
Department of Statistics STATISTICS COLLOQUIUM ERIC KOLACZYK Department of Statistics Boston University Statistical Analysis of Network Data: (Re)visiting the Foundations MONDAY, October 13, 2014, at 4, statistical methods and modeling have been central to these efforts. But how well do we truly understand
32. STATISTICS
Masci, Frank
Revised September 2007 by G. Cowan (RHUL). This chapter gives an overview of statistical methods used in High Energy Physics. In statistics, we are interested in using a sample of data to assess a model's validity or to determine the values of its parameters. There are two main approaches to statistical
Department of Statistics STATISTICS COLLOQUIUM
Department of Statistics STATISTICS COLLOQUIUM SAYAN MUKHERJEE Department of Statistical Science vignettes where topological ideas are explored in statistical models of complex traits, machine learning such as sufficient statistics and dictionary learning will be touched on. I will describe an application
Department of Statistics STATISTICS COLLOQUIUM
Department of Statistics STATISTICS COLLOQUIUM NOUREDDINE EL KAROUI Department of Statistics will discuss the behavior of widely used statistical methods in the high-dimensional setting where the number surprising statistical phenomena occur: for instance, maximum likelihood methods are shown to be (grossly
Department of Statistics STATISTICS COLLOQUIUM
Department of Statistics STATISTICS COLLOQUIUM ERNST WIT Statistics and Probability University devices collect a lot of information, typically about few independent statistical subjects or units statistics. In certain special cases the method can be tweaked to obtain L1-penalized GLM solution paths
Department of Statistics STATISTICS COLLOQUIUM
Department of Statistics STATISTICS COLLOQUIUM GONGJUN XU Department of Statistics Columbia University Statistical Inference for Diagnostic Classification Models MONDAY, February 18, 2013 at 4:00 PM-driven construction (estimation) of the Q-matrix and related statistical issues of DCMs. I will first give
NASA Astrophysics Data System (ADS)
Tarolli, Paolo; Prosdocimi, Massimo; Sofia, Giulia; Dalla Fontana, Giancarlo
2015-04-01
A real opportunity and challenge for hazard mapping is offered by smartphones and the low-cost, flexible photogrammetric technique Structure-from-Motion (SfM). Unlike traditional photogrammetric methods, SfM can reconstruct three-dimensional geometries (Digital Surface Models, DSMs) from randomly acquired images. The images can be acquired by standalone digital cameras (compact or reflex), or even by smartphones' built-in cameras. This represents a "revolutionary" advance compared with more expensive technologies and applications (e.g. Terrestrial Laser Scanner, TLS; airborne lidar) (Tarolli, 2014). Through fast, simple and consecutive field surveys, anyone with a smartphone can take many pictures of the same study area. This way, high-resolution and multi-temporal DSMs may be obtained and used to better monitor and understand erosion and deposition processes. Furthermore, these topographic data can also help quantify the volumes of material eroded by landslides and identify the major critical issues that typically occur during a natural hazard (e.g. river bank erosion and/or collapse due to floods). In this work we considered case studies located in different environmental contexts of Italy, where extensive photosets were obtained using smartphones. TLS data were also considered in the analysis as a benchmark for comparison with the SfM data. Digital Surface Models (DSMs) derived from SfM at centimeter grid-cell resolution proved effective for automatically recognizing areas subject to surface instabilities and for quantitatively estimating erosion and deposition volumes. Morphometric indices such as landform curvature and surface roughness, and statistical thresholds (e.g. standard deviation) of these indices, served as the basis for the proposed analyses.
The results indicate that the SfM technique with smartphones offers a fast, simple and affordable alternative to lidar technology. Anyone with a good smartphone (including farmers, technicians, or Civil Protection staff) can take photographs and easily obtain high-resolution DSMs from them. Therefore, SfM implemented with smartphones can be a very strategic tool for post-event field surveys, increasing the existing knowledge of such events and providing fast technical solutions for risk mitigation (e.g. landslide and flood risk management). The future challenge consists of using only a smartphone for local-scale post-event analyses. This can be further enhanced by the development of specific apps able to quickly build a 3D view of the case study and arrange a preliminary quantitative analysis of the process involved, ready to be sent to Civil Protection for further elaboration. Tarolli, P. (2014). High-resolution topography for understanding Earth surface processes: opportunities and challenges. Geomorphology, 216, 295-312, doi:10.1016/j.geomorph.2014.03.008.
Kreinovich, Vladik
EASILY Driss Misane a,1 and Vladik Kreinovich b; a Université Mohammed V, Faculté des Sciences, Département de Mathématiques, Rabat, Morocco; b Computer Science Department, University of Texas at El Paso, El Paso
Tang, Wei-Jun; Yang, Nian-Fa; Yi, Bing; Deng, Guo-Jun; Huang, Yi-Yong; Fan, Qing-Hua
2004-06-21
A new switched biphasic catalysis system for highly effective olefin dihydroxylation has been described, in which the dendritic osmium catalyst preferred to dissolve in the non-polar organic layer and could be easily separated from the polar diol products through phase separation induced by addition of water at the end of the reaction. PMID:15179473
Carlos Lopez
2015-09-02
A local interpretation of quantum mechanics is presented. Its main ingredients are: first, a label attached to one of the virtual paths in the path integral formalism, determining the output of a measurement of position or momentum; second, a mathematical model for spin states, equivalent to the path integral formalism for point particles in space-time, with the corresponding label. The mathematical machinery of orthodox quantum mechanics is maintained, in particular probability amplitudes and Born's rule; therefore, Bell-type inequality theorems do not apply. It is shown that statistical correlations for pairs of particles with entangled spins have a description completely equivalent to the two-slit experiment; that is, interference (wave-like behaviour) rather than non-locality accounts for the process. The interpretation is grounded in the experimental evidence of the point-like character of electrons, and in the hypothetical existence of a wave-like companion system, the de Broglie wave. A correspondence between the extended Hilbert spaces of hidden physical states and the orthodox quantum mechanical Hilbert space shows the mathematical equivalence of both theories. Paradoxical behaviour with respect to the action-reaction principle is analysed, and an experimental set-up, a modified two-slit experiment, is proposed to look for the companion system.
Considerations When Working with Interpreters.
ERIC Educational Resources Information Center
Hwa-Froelich, Deborah A.; Westby, Carol E.
2003-01-01
This article describes the current training and certification procedures in place for linguistic interpreters, the continuum of interpreter roles, and how interpreters' perspectives may influence the interpretive interaction. The specific skills needed for interpreting in either health care or educational settings are identified. A table compares…
BCSC Performance Benchmarks: Abnormal Interpretations by Indication for Examination (2009 Data)
Abnormal Interpretations by Indication for Examination for 363,048 Diagnostic Mammography Examinations
Benchmarks for Abnormal Screening Mammography Interpretations from 2004-2008, based on BCSC data
Benchmarks for Abnormal Screening Mammography Interpretations from 1996-2005, based on BCSC data
Department of Statistics STATISTICS COLLOQUIUM
Department of Statistics STATISTICS COLLOQUIUM YUVAL BENJAMINI Department of Statistics University) and imaging (e.g. functional MRI) data from the visual cortex. These encoding models are trained to describe
Department of Statistics STATISTICS COLLOQUIUM
Department of Statistics STATISTICS COLLOQUIUM PETER GUTTORP University of Washington and Norwegian Computing Center The Heat Is On! A Statistical Look at the State of the Climate MONDAY, May 6, 2013 at 4
Hospitals as interpretation systems.
Thomas, J B; McDaniel, R R; Anderson, R A
1991-01-01
In this study of 162 hospitals, it was found that the chief executive officer's (CEO's) interpretation of strategic issues is related to the existing hospital strategy and the hospital's information processing structure. Strategy was related to interpretation in terms of the extent to which a given strategic issue was perceived as controllable or uncontrollable. Structure was related to the extent to which an issue was defined as positive or negative, was labeled as controllable or uncontrollable, and was perceived as leading to a gain or a loss. Together, strategy and structure accounted for a significant part of the variance in CEO interpretations of strategic events. The theoretical and managerial implications of these findings are discussed. PMID:1991677
The ADAMS interactive interpreter
Rietscha, E.R.
1990-12-17
The ADAMS (Advanced DAta Management System) project is exploring next generation database technology. Database management does not follow the usual programming paradigm. Instead, the database dictionary provides an additional name space environment that should be interactively created and tested before writing application code. This document describes the implementation and operation of the ADAMS Interpreter, an interactive interface to the ADAMS data dictionary and runtime system. The Interpreter executes individual statements of the ADAMS Interface Language, providing a fast, interactive mechanism to define and access persistent databases. 5 refs.
NSDL National Science Digital Library
Hoffman, Howard
This site contains 100 modules designed to introduce concepts in statistics. The modules are divided into categories such as descriptive statistics, inferential statistics, related measures, enumeration statistics and ANOVA. Click the green button on the side to start the modules, then click "Main Menu" at the top to see a list of topics. Topics include: describing numbers, normal curve, sampling distributions, hypothesis testing, regression and Chi-Square. The site also includes a glossary, statistical tables and simulations, and a personalized progress report.
The National Lakes Assessment (NLA) and other lake survey and monitoring efforts increasingly rely upon biological assemblage data to define lake condition. Information concerning the multiple dimensions of physical and chemical habitat is necessary to interpret this biological ...
Interpreting Carnivore Scent-Station Surveys
NSDL National Science Digital Library
The Northern Prairie Wildlife Research Center (NPWRC) has placed online a 1998 scientific publication from the Journal of Wildlife Management (Vol 62(4):1235-1245) entitled "Interpreting Carnivore Scent-Station Surveys." In it, the authors analyze a subset of data from the Minnesota carnivore scent-station survey collected during 1986-93, "to determine statistical properties and to examine analyses of scent-station data." This resource, of interest to wildlife managers, may be downloaded as a .zip file.
Department of Statistics STATISTICS COLLOQUIUM
Department of Statistics STATISTICS COLLOQUIUM SRIRAM SANKARARAMAN Department of Genetics Harvard Medical School Statistical Models for Analyzing Ancient Human Admixture WEDNESDAY, January 21, 2015, at 4 become available, as well as appropriate statistical models. In the first part of my talk, I will focus
Department of Statistics STATISTICS COLLOQUIUM
Department of Statistics STATISTICS COLLOQUIUM PIOTR ZWIERNIK Department of Mathematics University of Genoa Understanding Statistical Models Through Their Geometry MONDAY, January 26, 2015, at 4:00 PM and Gaussian statistical models have a rich geometric structure and can be often viewed as algebraic sets
Department of Statistics STATISTICS COLLOQUIUM
Department of Statistics STATISTICS COLLOQUIUM ROBERT NOWAK Department of Electrical and Computer-dimensional statistical models to capture the complexity of such problems. Most of the work in this direction has focused of statistical inference. These procedures automatically adapt the measurements in order to focus and optimize
Department of Statistics STATISTICS COLLOQUIUM
Department of Statistics STATISTICS COLLOQUIUM INGRAM OLKIN Department of Statistics Stanford the concept of majorization is called mixing and in physics it is referred to as chaotic (one vector is more probability, statistics, combinatorics and graphs, numerical analysis and matrix theory. Special emphasis
Computer-interpretable guidelines.
Hasman, Arie
2013-01-01
In this contribution the concept of computer-interpretable guidelines is discussed. Several guideline formalisms are presented and the GASTON and GASTINE formalisms are given as examples. Finally the problems associated with the integration of CIGs with EPR systems are mentioned. PMID:23823358
Interpreting the Constitution.
ERIC Educational Resources Information Center
Brennan, William J., Jr.
1987-01-01
Discusses constitutional interpretations relating to capital punishment and protection of human dignity. Points out the document's effectiveness in creating a new society by adapting its principles to current problems and needs. Considers two views of the Constitution that lead to controversy over the legitimacy of judicial decisions. (PS)
Explaining the Interpretive Mind.
ERIC Educational Resources Information Center
Brockmeier, Jens
1996-01-01
Examines two prominent positions in the epistemological foundations of psychology--Piaget's causal explanatory claims and Vygotsky's interpretive understanding; contends that they need to be placed in their wider philosophical contexts. Argues that the danger of causally explaining cultural practices through which human beings construct and…
available for the purpose. Steps in Forensic DNA Testing: sample collection and storage (buccal swab, blood stain); DNA extraction and quantitation; multiplex PCR amplification of STR markers. DNA Mixture Interpretation, John M. Butler, Ph.D., National Institute of Standards and Technology
ERIC Educational Resources Information Center
Layton, Lyn; Miller, Carol
2004-01-01
The National Literacy Strategy (NLS) was introduced into schools in England in 1998 with the aim of raising the literacy attainments of primary-aged children. The Framework for Teaching the Literacy Hour, a key component of the NLS, proposes an interpretation of literacy that emphasises reading, writing and spelling skills. An investigation of the…
Interpretative Dimensions of Art
ERIC Educational Resources Information Center
Mickunas, Algis
1975-01-01
The thesis suggested in this paper points to a worldly way of thinking in which a particular interpretative context provides not only a framework for an understanding of various civilizations, but also, correlatively, a multitude of ways of understanding space, time, and motion. (Author)
Interpreting & Biomechanics. PEPNet Tipsheet
ERIC Educational Resources Information Center
PEPNet-Northeast, 2001
2001-01-01
Cumulative trauma disorder (CTD) refers to a collection of disorders associated with nerves, muscles, tendons, bones, and the neurovascular (nerves and related blood vessels) system. CTD symptoms may involve the neck, back, shoulders, arms, wrists, or hands. Interpreters with CTD may experience a variety of symptoms including: pain, joint…
Listening and Message Interpretation
ERIC Educational Resources Information Center
Edwards, Renee
2011-01-01
Message interpretation, the notion that individuals assign meaning to stimuli, is related to listening presage, listening process, and listening product. As a central notion of communication, meaning includes (a) denotation and connotation, and (b) content and relational meanings, which can vary in ambiguity and vagueness. Past research on message…
Kao, Chung Min
1989-01-01
Contents fragment: test sequence (tests of product lines, output lines, and bit lines, both complemented and true); results; problems; future improvements. ... PLAs exhibit faults other than the common stuck-at faults [4]. Existing testing algorithms, such as the D-algorithm [6], PODEM [7] and FAN [8], cannot efficiently test or localize faults inside a PLA. Note that the testing and diagnosing problems are quite different...
NASA Astrophysics Data System (ADS)
Fink, Thomas
2015-03-01
We introduce a simple class of distribution networks which withstand damage by being repairable instead of redundant. Instead of asking how hard it is to disconnect nodes through damage, we ask how easy it is to reconnect nodes after damage. We prove that optimal networks on regular lattices have an expected cost of reconnection proportional to the lattice length, and that such networks have exactly three levels of structural hierarchy. We extend our results to networks subject to repeated attacks, in which the repairs themselves must be repairable. We find that, in exchange for a modest increase in repair cost, such networks are able to withstand any number of attacks. We acknowledge support from the Defense Threat Reduction Agency, BCG and EU FP7 (Growthcom).
Statistics as a dynamical attractor
Michail Zak
2012-08-30
It is demonstrated that any statistics can be represented by an attractor of the solution to a corresponding system of ODE coupled with its Liouville equation. Such a non-Newtonian representation allows one to reduce the foundations of statistics to the better-established foundations of ODE. In addition, evolution toward the attractor reveals possible micro-mechanisms driving random events to the final distribution of the corresponding statistical law. Special attention is concentrated upon the power law and its dynamical interpretation: it is demonstrated that the underlying dynamics supports the "violent" reputation of power-law statistics.
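As a toy illustration of a statistical law acting as an attractor of an underlying dynamics, the sketch below simulates an overdamped Langevin (Ornstein-Uhlenbeck) process whose stationary distribution is the standard normal: an arbitrary initial ensemble is driven toward that distribution. This is our own minimal analogue, not the paper's ODE/Liouville construction.

```python
import numpy as np

# Euler-Maruyama simulation of dx = -x dt + sqrt(2) dW.
# Whatever the initial ensemble, the distribution is attracted to the
# standard normal (mean 0, variance 1), the stationary law of this SDE.
rng = np.random.default_rng(1)
dt, steps, walkers = 0.01, 2000, 5000
x = rng.uniform(-4.0, 4.0, walkers)   # arbitrary starting distribution
for _ in range(steps):
    x += -x * dt + np.sqrt(2.0 * dt) * rng.normal(size=walkers)

print(round(x.mean(), 2), round(x.var(), 2))  # close to 0.0 and 1.0
```

Starting from a uniform ensemble, the sample mean and variance relax to those of the attractor distribution within a few relaxation times.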
Cosmic statistics of statistics
NASA Astrophysics Data System (ADS)
Szapudi, István; Colombi, Stéphane; Bernardeau, Francis
1999-12-01
The errors on statistics measured in finite galaxy catalogues are exhaustively investigated. The theory of errors on factorial moments by Szapudi & Colombi is applied to cumulants via a series expansion method. All results are subsequently extended to the weakly non-linear regime. Together with previous investigations this yields an analytic theory of the errors for moments and connected moments of counts in cells from highly non-linear to weakly non-linear scales. For non-linear functions of unbiased estimators, such as the cumulants, the phenomenon of cosmic bias is identified and computed. Since it is subdued by the cosmic errors in the range of applicability of the theory, correction for it is inconsequential. In addition, the method of Colombi, Szapudi & Szalay concerning sampling effects is generalized, adapting the theory for inhomogeneous galaxy catalogues. While previous work focused on the variance only, the present article calculates the cross-correlations between moments and connected moments as well for a statistically complete description. The final analytic formulae representing the full theory are explicit but somewhat complicated. Therefore we have made available a fortran program capable of calculating the described quantities numerically (for further details e-mail SC at colombi@iap.fr). An important special case is the evaluation of the errors on the two-point correlation function, for which this should be more accurate than any method put forward previously. This tool will be immensely useful in the future for assessing the precision of measurements from existing catalogues, as well as aiding the design of new galaxy surveys. To illustrate the applicability of the results and to explore the numerical aspects of the theory qualitatively and quantitatively, the errors and cross-correlations are predicted under a wide range of assumptions for the future Sloan Digital Sky Survey. 
The principal results concerning the cumulants ξ̄, Q3 and Q4 are that the relative error is expected to be smaller than 3, 5 and 15 per cent, respectively, in the scale range of 1-10 h⁻¹ Mpc, and that the cosmic bias will be negligible.
Reverse Causation and the Transactional Interpretation of Quantum Mechanics
NASA Astrophysics Data System (ADS)
Cramer, John G.
2006-10-01
In the first part of the paper we present the transactional interpretation of quantum mechanics, a method of viewing the formalism of quantum mechanics that provides a way of visualizing quantum events and experiments. In the second part, we present an EPR gedankenexperiment that appears to lead to observer-level reverse causation. A transactional analysis of the experiment is presented. It easily accounts for the reported observations but does not reveal any barriers to its modification for reverse causation.
Histoplasmosis Statistics (CDC, Fungal Diseases): How common is histoplasmosis in the United States? ...
Injury Statistics (U.S. Consumer Product Safety Commission): home appliances, maintenance and construction; lighters and fireworks; gasoline containers; submersions. ...
On Estimating the Size and Confidence of a Statistical Audit
Rivest, Ronald L.
2007-04-22
We consider the problem of statistical sampling for auditing elections, and we develop a remarkably simple and easily-calculated upper bound for the sample size necessary for determining with probability at least c whether ...
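The flavor of such a bound can be sketched numerically. The sketch below is ours: it assumes the common auditing setup of detecting, with confidence at least c, one or more of b corrupted precincts out of N by uniform random sampling, and uses the simple over-estimate P(miss all) ≤ (1 − n/N)^b rather than the paper's exact formula; the function name is hypothetical.

```python
import math

def audit_sample_size(N, b, c):
    """Smallest n with (1 - n/N)**b <= 1 - c: sampling n of N precincts
    uniformly at random then catches at least one of b corrupted
    precincts with probability >= c (a slight over-estimate of the
    exact without-replacement calculation)."""
    return math.ceil(N * (1 - (1 - c) ** (1.0 / b)))

# Example: 400 precincts, at most 20 corrupted, 95% confidence.
n = audit_sample_size(400, 20, 0.95)
print(n)  # 56
assert (1 - n / 400) ** 20 <= 0.05  # miss probability within bound
```

Note that the bound depends only mildly on N: the required sample grows with the fraction of corrupted precincts, not the absolute number.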
Structural interpretation of seismic data and inherent uncertainties
NASA Astrophysics Data System (ADS)
Bond, Clare
2013-04-01
Geoscience is perhaps unique in its reliance on incomplete datasets and on building knowledge from their interpretation. Interpretation is fundamental at all levels of the science, from the creation of a geological map to the interpretation of remotely sensed data. To teach, and to understand better, the uncertainties in dealing with incomplete data, we need to understand the strategies individual practitioners deploy that make them effective interpreters. The nature of interpretation is such that the interpreter must use their cognitive ability in the analysis of the data to propose a sensible solution that is consistent not only with the original data but also with other knowledge and understanding. In a series of experiments, Bond et al. (2007, 2008, 2011, 2012) investigated the strategies and pitfalls of expert and non-expert interpretation of seismic images. These studies used large numbers of participants to provide a statistically sound basis for analysis of the results. The outcome of these experiments showed that a wide variety of conceptual models were applied to single seismic datasets, highlighting not only spatial variations in fault placements but also whether interpreters thought faults existed at all, or agreed on their sense of movement. Further, statistical analysis suggests that the strategies an interpreter employs are more important than expert knowledge per se in developing successful interpretations; experts are successful because of their application of these techniques. In a new set of experiments, a small number of experts are focused on to determine how they use their cognitive and reasoning skills in the interpretation of 2D seismic profiles. Live video and practitioner commentary were used to track the evolving interpretation and to gain insight into their decision processes.
The outputs of the study allow us to create an educational resource of expert interpretation through online video footage and commentary with associated further interpretation and analysis of the techniques and strategies employed. This resource will be of use to undergraduate, post-graduate, industry and academic professionals seeking to improve their seismic interpretation skills, develop reasoning strategies for dealing with incomplete datasets, and for assessing the uncertainty in these interpretations. Bond, C.E. et al. (2012). 'What makes an expert effective at interpreting seismic images?' Geology, 40, 75-78. Bond, C. E. et al. (2011). 'When there isn't a right answer: interpretation and reasoning, key skills for 21st century geoscience'. International Journal of Science Education, 33, 629-652. Bond, C. E. et al. (2008). 'Structural models: Optimizing risk analysis by understanding conceptual uncertainty'. First Break, 26, 65-71. Bond, C. E. et al., (2007). 'What do you think this is?: "Conceptual uncertainty" In geoscience interpretation'. GSA Today, 17, 4-10.
Evaluation of Psychotherapeutic Interpretations
POGGE, DAVID L.; DOUGHER, MICHAEL J.
1992-01-01
If much psychotherapy literature goes unread and unused by therapists, one reason may be the apparent irrelevance of theory-derived hypotheses to actual practice. Methods that uncover tacit knowledge that practicing therapists already possess can provide the empirical basis for more relevant theories and the testing of more meaningful hypotheses. This study demonstrates application of the phenomenological method to the question of evaluating psychotherapy. To discover how experienced psychotherapists evaluate interpretations made in actual psychotherapy sessions, therapists were asked to evaluate such interpretations from videotapes; analysis of responses yielded a set of 10 dimensions of evaluation. Such methods offer both practical utility and a source of theoretical growth anchored in the real world of the practicing therapist. PMID:22700101
NSDL National Science Digital Library
Braun, Mark
This tutorial is designed to help first- and second-year medical students learn to interpret the urinalysis. It includes material on how the test is done, its general applications, and pitfalls in interpretation. General introductory material is followed by a series of short clinical vignettes illustrating diagnostic application of the test in various medical conditions. The QuickTime movie player and JavaScript runtime plug-ins are required for some pages. The tutorial concludes with a short self-help quiz covering the major points developed. The plug-ins noted above are available free at the following sites: http://www.apple.com/quicktime/download/win.html and http://www.sun.com/. Questions should be directed to Dr. Mark Braun (braunm@indiana.edu).
On statistical aspects of Qjets
NASA Astrophysics Data System (ADS)
Ellis, Stephen D.; Hornig, Andrew; Krohn, David; Roy, Tuhin S.
2015-01-01
The process by which jet algorithms construct jets and subjets is inherently ambiguous and equally well motivated algorithms often return very different answers. The Qjets procedure was introduced by the authors to account for this ambiguity by considering many reconstructions of a jet at once, allowing one to assign a weight to each interpretation of the jet. Employing these weighted interpretations leads to an improvement in the statistical stability of many measurements. Here we explore in detail the statistical properties of these sets of weighted measurements and demonstrate how they can be used to improve the reach of jet-based studies.
National Association for Interpretation
NSDL National Science Digital Library
NAI promotes the advancement of the profession of interpretation, a communication process used in on-site informal education programs at parks, zoos, nature centers, historic sites, museums, and aquaria. This site announces national and regional NAI conferences, workshops on diverse topics, skill certification programs, networking opportunities and job listing service. Includes membership information and application; can order newsletters, professional journals and books. Membership, program and publication fees apply.
Interpreting Geologic Sections
NSDL National Science Digital Library
Paul Morris
Athro, Limited is a for-profit corporation that publishes high school and college level biology, earth science, and geology course supplements and independent learning materials on the Web. This site provides instruction in interpreting the order of events in three hypothetical and one real geological section. For each section there is a list of events and an animation of the history of the section once the student has decided on the order of events.
Quantum entanglement and interference from classical statistics
C. Wetterich
2009-10-06
Quantum mechanics for a four-state-system is derived from classical statistics. Entanglement, interference, the difference between identical fermions or bosons and the unitary time evolution find an interpretation within a classical statistical ensemble. Quantum systems are subsystems of larger classical statistical systems, which include the environment or the vacuum. They are characterized by incomplete statistics in the sense that the classical correlation function cannot be used for sequences of measurements in the subsystem.
Data Interpretation: Using Probability
ERIC Educational Resources Information Center
Drummond, Gordon B.; Vowler, Sarah L.
2011-01-01
Experimental data are analysed statistically to allow researchers to draw conclusions from a limited set of measurements. The hard fact is that researchers can never be certain that measurements from a sample will exactly reflect the properties of the entire group of possible candidates available to be studied (although using a sample is often the…
Query Interpretation and Representation
Chakrabarti, Soumen
Lecture-slide fragment on query interpretation: example queries ("dolly clone institute", "woodrow wilson president university", "mother teresa"); capitalization and quoted phrases are rare in query logs; n-gram statistics from query logs and corpus; phrase dictionaries and click data; what makes w1 w2 a phrase?
Term statistics Zipf's law text statistics
Lu, Jianguo
Lecture-slide outline (October 20, 2014): 1. Term statistics; 2. Zipf's law.
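Zipf's law states that the r-th most frequent term in a corpus has frequency roughly proportional to 1/r. A minimal sketch of the rank-frequency computation (the toy corpus here is far too small for a convincing fit; it only shows the mechanics):

```python
from collections import Counter

text = ("the quick brown fox jumps over the lazy dog "
        "the dog barks and the fox runs the end")
# Frequencies sorted from most to least common give the rank-frequency list.
freqs = sorted(Counter(text.split()).values(), reverse=True)

# Zipf's law: frequency at rank r is roughly freqs[0] / r.
for rank, observed in enumerate(freqs[:5], start=1):
    print(rank, observed, round(freqs[0] / rank, 2))
```

On a large corpus, plotting log-frequency against log-rank from `freqs` yields the near-straight line of slope about −1 that characterizes Zipf's law.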
Cai, Long
on a biological system is compared. Experimental design is used to identify data-collection schemes keeping time and resource costs low. In the next series of columns we will use statistical concepts
Luo,Y.; Tepikian, S.; Fischer, W.; Robert-Demolaize, G.; Trbojevic, D.
2009-01-02
Based on the contributions of the chromatic sextupole families to the half-integer resonance driving terms, we discuss how to sort the chromatic sextupoles in the arcs of the Relativistic Heavy Ion Collider (RHIC) to easily and effectively correct the second-order chromaticities. We propose a method with 4 knobs, corresponding to 4 pairs of chromatic sextupole families, to correct the second-order chromaticities online. Numerical simulation justifies this method, showing that it reduces the unbalance in the correction strengths of the sextupole families and avoids the reversal of sextupole polarities. Therefore, this method yields larger dynamic apertures for the proposed RHIC 2009 100 GeV polarized proton run lattices.
NONE
1999-09-30
The Institute studied the adsorption of cationic pressure-sensitive adhesive (PSA) on wood fiber, and the buildup of PSA in a closed water system during paper recycling; the results are presented. Georgia Tech worked to develop an environmentally friendly polymerization process to synthesize a novel re-dispersible PSA by co-polymerizing an oil-soluble monomer (butyl acrylate) and a cationic monomer MAEPTAC; results are presented. At the University of Georgia at Athens the project focused on the synthesis of water-soluble and easily removable cationic polymer PSAs.
Hold My Calls: An Activity for Introducing the Statistical Process
ERIC Educational Resources Information Center
Abel, Todd; Poling, Lisa
2015-01-01
Working with practicing teachers, this article demonstrates, through the facilitation of a statistical activity, how to introduce and investigate the unique qualities of the statistical process including: formulate a question, collect data, analyze data, and interpret data.
GB 311 130 Business Statistics Spring 2012 (Online)
Diestel, Geoff
Course description fragment: the student learns, masters, and applies knowledge while working within a mastery-based pedagogical approach; covers the foundations of statistics by creating and interpreting basic statistical graphs and charts, and calculating...
Adapting internal statistical models for interpreting visual cues to depth
Knill, David C.
Abstract fragment: visual cues to slant, whose reliability depends on the distribution of aspect ratios in the world; adaptation under manipulation is consistent with a broad class of Bayesian learning models; one of the biggest puzzles in perception is how the brain reliably and accurately estimates properties of the world. Keywords: cue integration, Bayesian priors, adaptation.
Statistical interpretation of topographies and dynamics of multidimensional potentials
Berry, R. Stephen
Abstract fragment: in protein structure, the "wrong" structures so outnumber the "correct" or physiologically active ones, such as particular crystal structures or folded protein structures, that active proteins, and hence organisms, could not possibly exist if random search were the mechanism. © 1995 American Institute of Physics.
The Statistical Literacy Needed to Interpret School Assessment Data
ERIC Educational Resources Information Center
Chick, Helen; Pierce, Robyn
2013-01-01
State-wide and national testing in areas such as literacy and numeracy produces reports containing graphs and tables illustrating school and individual performance. These are intended to inform teachers, principals, and education organisations about student and school outcomes, to guide change and improvement. Given the complexity of the…
Interpretation of psychophysics response curves using statistical physics.
Knani, S; Khalfaoui, M; Hachicha, M A; Mathlouthi, M; Ben Lamine, A
2014-05-15
Experimental gustatory curves have been fitted for four sugars (sucrose, fructose, glucose and maltitol) using a double-layer adsorption model. Three parameters of the model are fitted, namely the number of molecules per site n, the maximum response RM and the concentration at half saturation C1/2. The behaviours of these parameters are discussed in relation to each molecule's characteristics. Starting from the double-layer adsorption model, we additionally determined the adsorption energy of each molecule on taste receptor sites. The use of the threshold expression allowed us to gain information about the adsorption occupation rate of a receptor site that fires a minimal response at a gustatory nerve. Finally, by means of this model we could calculate the configurational entropy of the adsorption system, which can describe the order and disorder of the adsorbent surface. PMID:24423561
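The fitting workflow can be sketched as follows. The paper's double-layer adsorption isotherm is not reproduced here; as a hypothetical stand-in we use a Hill-type response R(C) = RM·C^n/(C1/2^n + C^n), which shares the parameters n, RM and C1/2, and fit n and C1/2 by log-linearization with RM assumed known. All numbers are invented for illustration.

```python
import numpy as np

# Hypothetical Hill-type stand-in sharing the paper's parameters:
#   R(C) = RM * C**n / (Chalf**n + C**n)
# With RM known, log(R / (RM - R)) = n*log(C) - n*log(Chalf) is linear.
RM, n_true, Chalf_true = 100.0, 2.0, 0.3       # invented "true" values
C = np.array([0.05, 0.1, 0.2, 0.4, 0.8, 1.6])  # concentrations
R = RM * C**n_true / (Chalf_true**n_true + C**n_true)

# Degree-1 polynomial fit in log space recovers n and C1/2.
slope, intercept = np.polyfit(np.log(C), np.log(R / (RM - R)), 1)
n_fit, Chalf_fit = slope, np.exp(-intercept / slope)
print(round(n_fit, 2), round(Chalf_fit, 2))  # 2.0 0.3 (exact, noise-free)
```

With real, noisy response data, the same parametrization would typically be fitted by nonlinear least squares over all three parameters rather than by this linearization.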
Interpreting experiments on egg production--statistical considerations.
Billard, L; Song, E; Shim, M Y; Pesti, G M
2013-09-01
A given data set can be analyzed many ways, but only one is the correct analysis based on the design actually used when running the experiment. This work gives a tutorial-like illustration of the effects of the presence of a regression variable (or covariate) on the recorded responses in an experiment set up as a standard factorial design, and shows how the analysis results are to be adjusted for the presence of covariates. An underlying assumption of a factorial model is that each of the treatments (e.g., diets) is randomly allocated to different subjects (hens). When many measurements (e.g., over time) are made on the same subject (hen), this independence assumption is violated; in these cases, the design is an example from the class of repeated measures designs. The difference in analysis between factorial designs and repeated measures designs is also discussed. Then, the 2 concepts are merged wherein the results for a repeated measures analysis have to be adjusted for the presence of covariates. The paper concludes with analyses of egg production responses from an experiment in which repeated measurements were made on the same hens and in which an unanticipated temperature covariate was present. PMID:23960136
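The covariate adjustment described above can be sketched with simulated data (all numbers and variable names are invented): a two-diet experiment with an unanticipated temperature covariate, comparing the raw mean difference between diets with the regression-adjusted diet effect.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
diet = rng.integers(0, 2, n)        # treatment: diet 0 or diet 1
temp = rng.normal(25.0, 3.0, n)     # unanticipated covariate (deg C)
# Invented model: diet 1 adds 5 eggs; each extra degree costs 0.8 eggs.
y = 40.0 + 5.0 * diet - 0.8 * temp + rng.normal(0.0, 1.0, n)

# Factorial-style analysis ignoring the covariate:
unadjusted = y[diet == 1].mean() - y[diet == 0].mean()

# Covariate-adjusted analysis: least squares on [1, diet, temp].
X = np.column_stack([np.ones(n), diet, temp])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
adjusted = beta[1]   # diet effect with temperature held fixed

# The adjusted estimate is typically much less noisy, since the
# temperature variation no longer inflates the residual error.
print(round(unadjusted, 2), round(adjusted, 2))
```

Both estimators are unbiased here because temperature is independent of the diet assignment; when the covariate is correlated with treatment, only the adjusted analysis remains valid.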
Department of Statistics STATISTICS COLLOQUIUM
Colloquium announcement (Department of Statistics): Cristopher Moore, Santa Fe Institute. An accessible explanation of a message-passing algorithm, including its connections with the cavity method of statistical physics, and a phase transition in the detectability of communities.
Brennand, Tracy
Fragments: James Wong writes: "The truth is, I was not born to think that my dream or my future career..."; promotes sound statistical interpretation and the highest possible standards for statistical education and practice.
Rago, Angela; Latagliata, Roberto; Montanaro, Marco; Montefusco, Enrico; Andriani, Alessandro; Crescenzi, Sabrina Leonetti; Mecarocci, Sergio; Spirito, Francesca; Spadea, Antonio; Recine, Umberto; Cicconi, Laura; Avvisati, Giuseppe; Cedrone, Michele; Breccia, Massimo; Porrini, Raffaele; Villivà, Nicoletta; De Gregoris, Cinzia; Alimena, Giuliana; D'Arcangelo, Enzo; Guglielmelli, Paola; Lo-Coco, Francesco; Vannucchi, Alessandro; Cimino, Giuseppe
2015-03-01
To predict leukemic transformation (LT), we evaluated easily detectable diagnostic parameters in 338 patients with primary myelofibrosis (PMF) followed in the Latium region (Italy) between 1981 and 2010. Forty patients (11.8%) progressed to leukemia, with a resulting 10-year leukemia-free survival (LFS) rate of 72%. Hb (<10 g/dL) and circulating blasts (≥1%) were the only two independent prognostic factors for LT at the multivariate analysis. Two hundred fifty patients with both parameters available were grouped as follows: low risk (no or one factor), 216 patients; high risk (both factors), 31 patients. The median LFS times were 269 and 45 months for the low- and high-risk groups, respectively (P<.0001). The LT predictive power of these two parameters was confirmed in an external series of 270 PMF patients from Tuscany, in whom the median LFS was not reached and was 61 months for the low- and high-risk groups, respectively (P<.0001). These results establish anemia and circulating blasts, two easily and universally available parameters, as strong predictors of LT in PMF, and may help to improve prognostic stratification of these patients, particularly in countries with low resources where more sophisticated molecular testing is unavailable. PMID:25636356
Interpreting Paleoenvironments with Microfossils
NSDL National Science Digital Library
Stephen Culver
This activity is constructed to help students gain a better understanding of how scientists can use foraminifera to interpret past environments. Specifically, they will have the opportunity to understand one of the basic tenets of geology: the present is the key to the past, a principle otherwise known as uniformitarianism. Objectives include: distinguishing between planktonic, benthic, hyaline, porcelaneous, and agglutinated foraminifera, calculating the proportion of planktonic specimens in a sample, establishing the species diversity of a sample, establishing the shell-type ratio of a sample, and reconstructing the environment of deposition of the sample.
Interpretation of extragalactic jets
Norman, M.L.
1985-01-01
The nature of extragalactic radio jets is modeled. The basic hypothesis of these models is that extragalactic jets are outflows of matter which can be described within the framework of fluid dynamics and that the outflows are essentially continuous. The discussion is limited to the interpretation of large-scale (i.e., kiloparsec-scale) jets. The central problem is to infer the physical parameters of the jets from observed distributions of total and polarized intensity and angle of polarization as a function of frequency. 60 refs., 6 figs.
[Interpreting medical devices].
Bulygin, V P; Chepaikin, A G
2002-01-01
The specific properties of software for the interpretation of physiological signals are discussed. Problems in the representation of inaccurate knowledge and in the assessment of the quality of software realization of this knowledge are identified. Ways of describing inaccuracies by using certainty factors, fuzzy logic, linguistic approximations, and three-valued logic are considered. Using the well-known approaches to testing ECG and external-respiration analyzing systems as examples, the authors show that it is necessary to develop special methods to evaluate the efficiency, selectivity, and stability of inferences for small-magnitude conclusions. PMID:12063790
LACIE analyst interpretation keys
NASA Technical Reports Server (NTRS)
Baron, J. G.; Payne, R. W.; Palmer, W. F. (principal investigators)
1979-01-01
Two interpretation aids, 'The Image Analysis Guide for Wheat/Small Grains Inventories' and 'The United States and Canadian Great Plains Regional Keys', were developed during LACIE phase 2 and implemented during phase 3 in order to provide analysts with a better understanding of the expected ranges in color variation of signatures for individual biostages and of the temporal sequences of LANDSAT signatures. The keys were tested using operational LACIE data, and the results demonstrate that their use provides improved labeling accuracy in all analyst experience groupings, in all geographic areas within the U.S. Great Plains, and during all periods of crop development.
An intentional interpretive perspective
Neuman, Paul
2004-01-01
To the extent that the concept of intention has been addressed within behavior analysis, descriptions of intention have been general and have not specifically included important distinctions that differentiate a behavior-analytic approach from vernacular definitions of intention. A fundamental difference between a behavior-analytic approach and most other psychological approaches is that other approaches focus on the necessity of intentions to explain behavior, whereas a behavior-analytic approach is directed at understanding the interplay between behavior and environment. Behavior-analytic interpretations include the relations between the observer's behavior and the environment. From a behavior-analytic perspective, an analysis of the observer's interpretations of an individual's behavior is inherent in the subsequent attribution of intention. The present agenda is to provide a behavior-analytic account of attributing intention that identifies the establishing conditions for speaking of intention. Also addressed is the extent to which we speak of intentions when the observed individual's behavior is contingency shaped or under instructional control. PMID:22478417
Statistics and Unsupervised Learning December 17th 2010
Tanner, Jared
Slide fragment (Nema Dean, December 17th 2010): Why study statistics? The information boom creates a need to interpret and present data; statistics allows this and allows for the quantification of uncertainty; a statistics qualification is very employable, one of the few areas showing an increase.
NSDL National Science Digital Library
Kirkman, Thomas
This collection of calculators, created by Thomas Kirkman of the College of Saint Benedict/Saint Joseph, allows users to perform a number of statistical applications. Each provides background on the procedure and an example. Users can compute descriptive statistics and perform t-tests, Chi-square tests, Kolmogorov-Smirnov tests, Fisher's Exact Test, contingency tables, ANOVA, and regression. This is a nice collection of useful applications for a statistics classroom.
Generic interpreters and microprocessor verification
NASA Technical Reports Server (NTRS)
Windley, Phillip J.
1990-01-01
The following topics are covered in viewgraph form: (1) generic interpreters; (2) Viper microprocessors; (3) microprocessor verification; (4) determining correctness; (5) hierarchical decomposition; (6) interpreter theory; (7) AVM-1; (8) phase-level specification; and future work.
GB 311 130 Business Statistics Spring 2014
Diestel, Geoff
Course description fragment: students learn, master, and apply knowledge within a mastery-based pedagogical approach (Hawkes Learning Systems); covers basic statistical graphs and charts, calculating and interpreting measures of central tendency, and applications to process improvement by creating and interpreting control charts.
GB 311 125 Business Statistics Summer 2014
Diestel, Geoff
GB 311 110 Business Statistics Online Section
Diestel, Geoff
GB 311 125 Business Statistics Spring 2014
Diestel, Geoff
GB 311 120 Business Statistics Spring 2014
Diestel, Geoff
Fit Indices Versus Test Statistics
ERIC Educational Resources Information Center
Yuan, Ke-Hai
2005-01-01
Model evaluation is one of the most important aspects of structural equation modeling (SEM). Many model fit indices have been developed. It is not an exaggeration to say that nearly every publication using the SEM methodology has reported at least one fit index. Most fit indices are defined through test statistics. Studies and interpretation of…
A History of Oral Interpretation.
ERIC Educational Resources Information Center
Bahn, Eugene; Bahn, Margaret L.
This historical account of the oral interpretation of literature establishes a chain of events comprehending 25 centuries of verbal tradition from the Homeric Age through 20th Century America. It deals in each era with the viewpoints and contributions of major historical figures to oral interpretation, as well as with oral interpretation's…
NASA Astrophysics Data System (ADS)
Biass, Sébastien; Frischknecht, Corine; Dell'Oro, Luca; Senegas, Olivier; Bonadonna, Costanza
2010-05-01
In order to answer the needs of contingency planning, we present a GIS-based method for risk assessment of tephra deposits, which is flexible enough to work with datasets of variable precision and resolution depending on data availability. Due to the constant increase of population density around volcanoes and the large dispersal of tephra from volcanic plumes, a wide range of threats such as roof collapses, destruction of crops, blockage of vital lifelines and health problems concern even remote communities. In the field of disaster management, there is a general agreement that a global but incomplete method, subject to revision and improvements, is better than no information at all. In this framework, our method is able to provide fast, rough insights on possible eruptive scenarios and their potential consequences on surrounding populations with only a few available data, which can easily be refined later. The knowledge of both the expected hazard (frequency and magnitude) and the vulnerability of elements at risk is required by planners in order to produce efficient emergency planning prior to a crisis. The Cotopaxi volcano, one of Ecuador's most active volcanoes, was used to develop and test this method. Cotopaxi is located 60 km south of Quito and threatens a highly populated valley. Based on field data, historical reports and the Smithsonian catalogue, our hazard assessment was carried out using the numerical model TEPHRA2. We first applied a deterministic approach that evolved towards a fully probabilistic method in order to account for the most likely eruptive scenarios as well as the variability of atmospheric conditions. In parallel, we carried out a vulnerability assessment of the physical (crops and roofs), social (populations) and systemic elements-at-risk by using mainly free and easily accessible data. 
Both hazard and vulnerability assessments were compiled with GIS tools to draw comprehensive and tangible thematic risk maps, thus providing the first necessary step for efficient preparedness planning.
NASA Astrophysics Data System (ADS)
Schepen, Andrew; Wang, Q. J.
2015-03-01
The Australian Bureau of Meteorology produces statistical and dynamic seasonal streamflow forecasts. The statistical and dynamic forecasts are similarly reliable in ensemble spread; however, skill varies by catchment and season. Therefore, it may be possible to optimize forecasting skill by weighting and merging statistical and dynamic forecasts. Two model averaging methods are evaluated for merging forecasts for 12 locations. The first method, Bayesian model averaging (BMA), applies averaging to forecast probability densities (and thus cumulative probabilities) for a given forecast variable value. The second method, quantile model averaging (QMA), applies averaging to forecast variable values (quantiles) for a given cumulative probability (quantile fraction). BMA and QMA are found to perform similarly in terms of overall skill scores and reliability in ensemble spread. Both methods improve forecast skill across catchments and seasons. However, when both the statistical and dynamical forecasting approaches are skillful but produce, on special occasions, very different event forecasts, the BMA merged forecasts for these events can have unusually wide and bimodal distributions. In contrast, the distributions of the QMA merged forecasts for these events are narrower, unimodal and generally more smoothly shaped, and are potentially more easily communicated to and interpreted by the forecast users. Such special occasions are found to be rare. However, every forecast counts in an operational service, and therefore the occasional contrast in merged forecasts between the two methods may be more significant than the indifference shown by the overall skill and reliability performance.
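The contrast between the two merging schemes can be sketched numerically. The sketch below uses two deliberately conflicting Gaussian component forecasts with made-up means, spreads, and weights (not the Bureau's actual forecasts): BMA averages the cumulative probabilities at a given value, while QMA averages the values (quantiles) at a given cumulative probability.

```python
import numpy as np
from scipy.stats import norm

# Hypothetical example: two equally weighted Gaussian component forecasts
# that disagree strongly about the event (all numbers are made up).
f1 = norm(loc=100.0, scale=10.0)   # e.g. the statistical forecast
f2 = norm(loc=200.0, scale=10.0)   # e.g. the dynamical forecast
w1 = w2 = 0.5

p = 0.25  # quantile fraction to compare

# QMA: average the forecast values (quantiles) at a fixed cumulative probability.
qma_q = w1 * f1.ppf(p) + w2 * f2.ppf(p)

# BMA: average the forecast CDFs, then invert numerically to recover the quantile.
grid = np.linspace(0.0, 300.0, 30001)
mixed_cdf = w1 * f1.cdf(grid) + w2 * f2.cdf(grid)
bma_q = grid[np.searchsorted(mixed_cdf, p)]

# QMA pulls the quantile between the two modes (narrow, unimodal forecast);
# BMA stays near the lower mode because its merged density is bimodal.
print(round(qma_q, 1), round(bma_q, 1))
```

For the lower quartile, QMA lands between the two component medians while BMA sits near the lower mode, which mirrors the unimodal-versus-bimodal contrast the abstract describes for strongly disagreeing event forecasts.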
INTRODUCTION TO STATISTICAL DESIGN OF EXPERIMENTS
Alpay, S. Pamir
INTRODUCTION TO STATISTICAL DESIGN OF EXPERIMENTS. An IMS Associates Program short course, October 16, covering how to analyze the results both analytically and graphically. We will also consider using experiments sequentially, so that prior experiments can easily be combined with current results. How to handle hard
Xu Chengjian, E-mail: c.j.xu@umcg.nl [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Schaaf, Arjen van der; Schilstra, Cornelis; Langendijk, Johannes A.; Veld, Aart A. van't [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands)
2012-03-15
Purpose: To study the impact of different statistical learning methods on the prediction performance of multivariate normal tissue complication probability (NTCP) models. Methods and Materials: In this study, three learning methods, stepwise selection, least absolute shrinkage and selection operator (LASSO), and Bayesian model averaging (BMA), were used to build NTCP models of xerostomia following radiotherapy treatment for head and neck cancer. Performance of each learning method was evaluated by a repeated cross-validation scheme in order to obtain a fair comparison among methods. Results: It was found that the LASSO and BMA methods produced models with significantly better predictive power than that of the stepwise selection method. Furthermore, the LASSO method yields an easily interpretable model as the stepwise method does, in contrast to the less intuitive BMA method. Conclusions: The commonly used stepwise selection method, which is simple to execute, may be insufficient for NTCP modeling. The LASSO method is recommended.
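As a rough illustration of why an L1 penalty yields the sparse, interpretable models the study favors, here is a minimal LASSO fit by iterative soft-thresholding on synthetic data. The dataset, penalty value, and iteration count are arbitrary stand-ins, not the authors' NTCP setup.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic stand-in for an NTCP-style dataset (all values hypothetical):
# 100 patients, 8 candidate predictors, only the first two truly matter.
X = rng.normal(size=(100, 8))
y = 2.0 * X[:, 0] - 1.5 * X[:, 1] + 0.1 * rng.normal(size=100)

def lasso_ista(X, y, lam=0.1, n_iter=1000):
    """LASSO via iterative soft-thresholding (proximal gradient descent):
    minimizes (1/2n)||y - Xb||^2 + lam * ||b||_1. The L1 penalty drives
    irrelevant coefficients to exactly zero, which is what makes the
    fitted model easy to read off."""
    n, d = X.shape
    step = n / np.linalg.norm(X, 2) ** 2   # 1 / Lipschitz constant of the loss
    beta = np.zeros(d)
    for _ in range(n_iter):
        grad = X.T @ (X @ beta - y) / n
        z = beta - step * grad
        beta = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)  # prox step
    return beta

beta = lasso_ista(X, y)
selected = np.flatnonzero(np.abs(beta) > 1e-8)
print(selected)   # only the truly predictive features survive the penalty
```

Stepwise selection would instead refit and compare many sub-models; the LASSO reaches a sparse answer in one continuous optimization, which is part of its appeal for NTCP modeling.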
Spinless Quantum Field Theory and Interpretation
Dong-Sheng Wang
2013-03-07
Quantum field theory is widely regarded as the most advanced and well-developed theory in physics, combining quantum mechanics and special relativity consistently. In this work, we study the spinless quantum field theory, namely the Klein-Gordon equation, and we find that there exists a Dirac form of this equation which predicts the existence of a spinless fermion. To understand it, we start from an interpretation of the quantum field based on the concept of quantum scope, and we also extract new meanings of wave-particle duality and quantum statistics. The existence of a spinless fermion is consistent with the spin-statistics theorem and with supersymmetry, and it leads to several new kinds of interactions among elementary particles. Our work contributes to the study of spinless quantum field theory and could have implications for the case of higher spin.
Screencast Tutorials Enhance Student Learning of Statistics
ERIC Educational Resources Information Center
Lloyd, Steven A.; Robertson, Chuck L.
2012-01-01
Although the use of computer-assisted instruction has rapidly increased, there is little empirical research evaluating these technologies, specifically within the context of teaching statistics. The authors assessed the effect of screencast tutorials on learning outcomes, including statistical knowledge, application, and interpretation. Students…
INCREASING SCIENTIFIC POWER WITH STATISTICAL POWER
A brief survey of basic ideas in statistical power analysis demonstrates the advantages and ease of using power analysis throughout the design, analysis, and interpretation of research. The power of a statistical test is the probability of rejecting the null hypothesis of the test...
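That definition lends itself to a short Monte Carlo sketch: simulate many experiments with a known true effect and count how often the test rejects. The effect size, group size, and simulation count below are arbitrary illustrative choices.

```python
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(1)

def estimated_power(effect=0.5, n=64, alpha=0.05, n_sim=2000):
    """Monte Carlo estimate of the power of a two-sample t-test: the
    fraction of simulated experiments, each with a true standardized
    mean difference of `effect`, in which the null hypothesis is
    rejected at level `alpha`."""
    rejections = 0
    for _ in range(n_sim):
        a = rng.normal(0.0, 1.0, n)        # control group
        b = rng.normal(effect, 1.0, n)     # treatment group, shifted mean
        if ttest_ind(a, b).pvalue < alpha:
            rejections += 1
    return rejections / n_sim

power = estimated_power()
print(power)   # close to the textbook ~0.80 for d = 0.5, n = 64 per group
```

Running the same function at the design stage, with the smallest effect worth detecting, shows directly how large n must be to reach a target power.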
Earthquake slip distribution: A statistical model
Yan Y. Kagan
2005-01-01
The purpose of this paper is to interpret slip statistics in a framework of extended earthquake sources. We first discuss the deformation pattern of the Earth's surface from earthquakes and suggest that the continuum versus block motion controversy can be reconciled by a model of the fractal distribution of seismic sources. We consider earthquake slip statistical distributions as they can be inferred
ABSTRACT: Total Petroleum hydrocarbons (TPH) as a lumped parameter can be easily and rapidly measured or monitored. Despite interpretational problems, it has become an accepted regulatory benchmark used widely to evaluate the extent of petroleum product contamination. Three cu...
An interpretation of banded magnetospheric radio emissions
NASA Astrophysics Data System (ADS)
Benson, R. F.; Osherovich, V. A.; Fainberg, J.; Vinas, A.-F.; Ruppert, D. R.
2001-07-01
Recently published Active Magnetospheric Particle Tracer Explorers/Ion Release Module (AMPTE/IRM) banded magnetospheric emissions, commonly referred to as ``(n+1/2)fce'' emissions, where fce is the electron gyrofrequency, are analyzed by treating them as analogous to sounder-stimulated ionospheric emissions. We show that both individual spectra of magnetospheric banded emissions and a statistically derived spectrum observed over the 2-year lifetime of the mission can be interpreted in a self-consistent manner. The analysis, which predicts all spectral peaks within 4% of the observed peaks, interprets the higher-frequency emissions as due to low group velocity Bernstein-mode waves and interprets the lower-frequency emissions as eigenmodes of cylindrical-electromagnetic plasma oscillations. The demarcation between these two classes of emissions is the electron plasma frequency fpe, where an emission is often observed. This fpe emission is not necessarily the strongest. None of the observed banded emissions were attributed to the upper hybrid frequency. We present Alouette 2 and ISIS 1 plasma resonance data and model electron temperature (Te) values to support the argument that the frequency spectrum of ionospheric sounder-stimulated emissions is not strongly temperature dependent, and thus that the interpretation of these emissions in the ionosphere is relevant to other plasmas (such as the magnetosphere) where Ne and Te can be quite different but where the ratio fpe/fce is identical. The Ne values deduced from the spectral interpretation do not agree with the values determined from the AMPTE/IRM three-dimensional plasma instrument. The latter, which represent a lower bound, are found to be higher than the former by a factor of 3.2-3.5. All values were less than 1 cm-3, a domain known for measurement difficulties. One possible explanation is that the wave and plasma techniques respond to different components of a non-Maxwellian magnetospheric electron distribution.
Novel Reordering Approaches in Phrase-Based Statistical Machine Translation
Stephan Kanthak; David Vilar; Evgeny Matusov; Richard Zens; Hermann Ney
2005-01-01
This paper presents novel approaches to reordering in phrase-based statistical machine translation. We perform consistent reordering of source sentences in training and estimate a statistical translation model. Using this model, we follow a phrase-based monotonic machine translation approach, for which we develop an efficient and flexible reordering framework that allows one to easily introduce different reordering
NASA Astrophysics Data System (ADS)
Chang, Chia-Yuan; Ke, Bo-Ting; Su, Hung-Wei; Yen, Wei-Chung; Chen, Shean-Jen
2013-09-01
In this paper, an easily implementable adaptive optics system (AOS), based on a real-time field programmable gate array (FPGA) platform with state-space multichannel control programmed in LabVIEW, has been developed and successfully integrated into a laser focusing system. To meet the requirements of simple programming configuration and easy integration with other devices, the FPGA-based AOS introduces a standard operation procedure comprising AOS identification, computation, and operation. The overall system, with a 32-channel driving signal for a deformable mirror (DM) as input and a Zernike polynomial via a lab-made Shack-Hartmann wavefront sensor (SHWS) as output, is optimally identified to construct a multichannel state-space model off-line. In real-time operation, the FPGA platform first calculates the Zernike polynomial of the optical wavefront measured from the SHWS as the feedback signal. Then, a state-space multichannel controller based on the feedback signal and the identified model is designed and implemented in the FPGA to drive the DM for phase distortion compensation. The current FPGA-based AOS is capable of suppressing low-frequency thermal disturbances with a steady-state phase error of less than 0.1 λ within less than 10 time steps when the control loop is operated at a frequency of 30 Hz.
NSDL National Science Digital Library
Started in 1997, the Badan Pusat Statistik (BPS-Statistics Indonesia) is a non-departmental Indonesian government institution directly responsible to the Indonesian president. As the law that created this valuable institution stipulates, the BPS is intended to provide data to the government and the public, as well as to cooperate with other international statistical institutions. Visitors looking for statistics on any number of topics will not be disappointed, as the areas covered include agriculture, consumer price indices, employment, energy, foreign trade, mining, population, public finance, tourism, and social welfare. Additionally, there are monthly macro-economic statistical reports for the years 1998 to 2001 that can be downloaded and viewed. The site is rounded out by a collection of some 21 papers from the past four years that analyze various economic data from the country, such as earnings data and manufacturing production.
NSDL National Science Digital Library
Dudley, Richard
Created by Richard Dudley of the Massachusetts Institute of Technology, Mathematical Statistics is a graduate-level course featuring book chapters and sections presented as lecture notes, problem sets, exams, and a description of an optional term paper. The course covers decision theory, estimation, confidence intervals, hypothesis testing, asymptotic efficiency of estimates, exponential families, sequential analysis, and large sample theory. This is a comprehensive overview of this upper-level statistics course.
Linking numbers, spin, and statistics of solitons
NASA Technical Reports Server (NTRS)
Wilczek, F.; Zee, A.
1983-01-01
The spin and statistics of solitons in the (2 + 1)- and (3 + 1)-dimensional nonlinear sigma models is considered. For the (2 + 1)-dimensional case, there is the possibility of fractional spin and exotic statistics; for 3 + 1 dimensions, the usual spin-statistics relation is demonstrated. The linking-number interpretation of the Hopf invariant and the use of suspension considerably simplify the analysis.
Fundamentals of interpretation in echocardiography
Harrigan, P.; Lee, R.M.
1985-01-01
This illustrated book provides familiarity with the many clinical, physical, and electronic factors that bear on echocardiographic interpretation. Physical and clinical principles are integrated with considerations of anatomy and physiology to address interpretive problems. This approach yields, for example, sections on the physics and electronics of M-mode, cross-sectional, and Doppler systems which are informal, full of echocardiograms, virtually devoid of mathematics, and rigorously related to common issues faced by echocardiogram interpreters.
Students' Interpretation of a Function Associated with a Real-Life Problem from Its Graph
ERIC Educational Resources Information Center
Mahir, Nevin
2010-01-01
The properties of a function such as limit, continuity, derivative, growth, or concavity can be determined more easily from its graph than by doing any algebraic operation. For this reason, it is important for students of mathematics to interpret some of the properties of a function from its graph. In this study, we investigated the competence of…
Automated, computer interpreted radioimmunoassay results
Hill, J.C.; Nagle, C.E.; Dworkin, H.J.; Fink-Bennett, D.; Freitas, J.E.; Wetzel, R.; Sawyer, N.; Ferry, D.; Hershberger, D.
1984-01-01
Ninety thousand radioimmunoassay results have been interpreted and transcribed automatically using software developed for a Hewlett Packard Model 1000 minicomputer system with conventional dot matrix printers. The computer program correlates the results of a combination of assays, interprets them, and prints a report ready for physician review and signature within minutes of completion of the assay. The authors designed and wrote a computer program to query their patient database for radioassay laboratory results and to produce a computer-generated interpretation of these results using an algorithm that produces normal and abnormal interpretives. Their laboratory assays 50,000 patient samples each year using 28 different radioassays, of which 85% have been interpreted using the computer program. Allowances are made for drug and patient history, and individualized reports are generated with regard to the patient's age and sex. Finalization of reports is still subject to change by the nuclear physician at the time of final review. Automated, computerized interpretations have realized cost savings through reduced personnel and personnel time and provided uniformity of interpretation among the five physicians. Prior to computerization, all radioassay results had to be dictated and reviewed for signing by one of the resident or staff physicians, and turnaround times for reports were generally two to three days, whereas the computerized system allows reports to be issued the day assays are completed.
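A toy version of such rule-based, demographics-aware interpretation logic might look like the following. The assay name, units, and reference limits are hypothetical placeholders, not the values used in the authors' system.

```python
# Hypothetical sex-specific reference ranges; real assay limits depend on
# the method, the laboratory, and patient demographics.
REFERENCE_RANGES = {
    ("T4", "F"): (4.5, 12.0),   # placeholder limits, mcg/dL
    ("T4", "M"): (4.5, 11.5),
}

def interpret_result(assay, value, sex):
    """Return a normal/abnormal interpretive line for one assay result,
    using sex-specific reference limits (an age adjustment would slot in
    the same way, as another key into the lookup table)."""
    low, high = REFERENCE_RANGES[(assay, sex)]
    if value < low:
        return f"{assay} = {value}: below reference range {low}-{high} (abnormal)"
    if value > high:
        return f"{assay} = {value}: above reference range {low}-{high} (abnormal)"
    return f"{assay} = {value}: within reference range {low}-{high} (normal)"

print(interpret_result("T4", 13.2, "F"))
```

In the system described above, lines like these would be assembled into a draft report for the nuclear physician to review and sign.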
Components of Simultaneous Interpreting: Comparing Interpreting with Shadowing and Paraphrasing
ERIC Educational Resources Information Center
Christoffels, Ingrid K.; de Groot, Annette M. B.
2004-01-01
Simultaneous interpreting is a complex task where the interpreter is routinely involved in comprehending, translating and producing language at the same time. This study assessed two components that are likely to be major sources of complexity in SI: The simultaneity of comprehension and production, and transformation of the input. Furthermore,…
Design and the Interpretation of Psychological Research Psychology 530
Hopfinger, Joseph B.
Design and the Interpretation of Psychological Research, Psychology 530. A.T. Panter. Prerequisites: 1. Statistics for the Behavioral Sciences (Psychology 210 or 215); and 2. Laboratory Research in Psychology. General Description: This course covers both theory and applied methods in psychological research, and in particular
An interpretation of the nitrogen deficiency in comets
Nicolas Iro; Daniel Gautier; Franck Hersant; Dominique Bockelée-Morvan; Jonathan I. Lunine
2003-01-01
We propose an interpretation of the composition of volatiles observed in comets based on their trapping in the form of clathrate hydrates in the solar nebula. The formation of clathrates is calculated from the statistical thermodynamics of Lunine and Stevenson (1985, Astrophys. J. Suppl. 58, 493–531), and occurs in an evolutionary turbulent solar nebula described by the model of Hersant
Automatic Interpretation and Coding of Face Images Using Flexible Models
Andreas Lanitis; Christopher J. Taylor; Timothy F. Cootes
1997-01-01
Face images are difficult to interpret because they are highly variable. Sources of variability include individual appearance, 3D pose, facial expression , and lighting. We describe a compact parametrized model of facial appearance which takes into account all these sources of variability. The model represents both shape and gray-level appearance , and is created by performing a statistical analysis over
DeMaio, Joe
, analyzing, interpreting, etc. Statistics also refers to an actual collection of data or the values problems of interpretation. Problem 1 Is the best team in baseball (or football, etc.) the team that wins are qualitative variables. 4 Exercises 1. For the players on the 1992 Dream Tea
Wild, Birgit; Schnecker, Jörg; Alves, Ricardo J. Eloy; Barsukov, Pavel; Bárta, Jiří; Čapek, Petr; Gentsch, Norman; Gittel, Antje; Guggenberger, Georg; Lashchinskiy, Nikolay; Mikutta, Robert; Rusalimova, Olga; Šantrůčková, Hana; Shibistova, Olga; Urich, Tim; Watzka, Margarete; Zrazhevskaya, Galina; Richter, Andreas
2014-08-01
Rising temperatures in the Arctic can affect soil organic matter (SOM) decomposition directly and indirectly, by increasing plant primary production and thus the allocation of plant-derived organic compounds into the soil. Such compounds, for example root exudates or decaying fine roots, are easily available for microorganisms, and can alter the decomposition of older SOM ("priming effect"). We here report on a SOM priming experiment in the active layer of a permafrost soil from the central Siberian Arctic, comparing responses of organic topsoil, mineral subsoil, and cryoturbated subsoil material (i.e., poorly decomposed topsoil material subducted into the subsoil by freeze-thaw processes) to additions of (13)C-labeled glucose, cellulose, a mixture of amino acids, and protein (added at levels corresponding to approximately 1% of soil organic carbon). SOM decomposition in the topsoil was barely affected by higher availability of organic compounds, whereas SOM decomposition in both subsoil horizons responded strongly. In the mineral subsoil, SOM decomposition increased by a factor of two to three after any substrate addition (glucose, cellulose, amino acids, protein), suggesting that the microbial decomposer community was limited in energy to break down more complex components of SOM. In the cryoturbated horizon, SOM decomposition increased by a factor of two after addition of amino acids or protein, but was not significantly affected by glucose or cellulose, indicating nitrogen rather than energy limitation. Since the stimulation of SOM decomposition in cryoturbated material was not connected to microbial growth or to a change in microbial community composition, the additional nitrogen was likely invested in the production of extracellular enzymes required for SOM decomposition. 
Our findings provide a first mechanistic understanding of priming in permafrost soils and suggest that an increase in the availability of organic carbon or nitrogen, e.g., by increased plant productivity, can change the decomposition of SOM stored in deeper layers of permafrost soils, with possible repercussions on the global climate. PMID:25089062
Foolad, Mahsa; Ong, Say Leong; Hu, Jiangyong
2015-11-01
Pharmaceutical and personal care products (PPCPs) and artificial sweeteners (ASs) are emerging organic contaminants (EOCs) in the aquatic environment. The presence of PPCPs and ASs in water bodies poses a potential ecological risk and health concern. It is therefore necessary to detect pollution sources by understanding the transport behavior of sewage molecular markers in the subsurface. The aim of this study was to evaluate the transport of nine selected molecular markers through saturated soil column experiments. The selected sewage molecular markers were six PPCPs, including acetaminophen (ACT), carbamazepine (CBZ), caffeine (CF), crotamiton (CTMT), diethyltoluamide (DEET), and salicylic acid (SA), and three ASs, including acesulfame (ACF), cyclamate (CYC), and saccharin (SAC). Results confirmed that ACF, CBZ, CTMT, CYC and SAC were suitable for use as sewage molecular markers, since they were almost stable against sorption and biodegradation during the soil column experiments. In contrast, transport of ACT, CF and DEET was limited by both sorption and biodegradation, and 100% removal efficiency was achieved in the biotic column. Moreover, this study also examined the effect of different acetate concentrations (0-100 mg/L), as an easily biodegradable primary substrate, on the removal of PPCPs and ASs. Results showed a negative correlation (r(2)>0.75) between the removal of some selected sewage chemical markers, including ACF, CF, ACT, CYC and SAC, and acetate concentration. CTMT also decreased with the addition of acetate, but increasing the acetate concentration did not further affect its removal. CBZ and DEET removal did not depend on the presence of acetate. PMID:26210019
Philosophical perspectives on quantum chaos: Models and interpretations
NASA Astrophysics Data System (ADS)
Bokulich, Alisa Nicole
2001-09-01
The problem of quantum chaos is a special case of the larger problem of understanding how the classical world emerges from quantum mechanics. While we have learned that chaos is pervasive in classical systems, it appears to be almost entirely absent in quantum systems. The aim of this dissertation is to determine what implications the interpretation of quantum mechanics has for attempts to explain the emergence of classical chaos. There are three interpretations of quantum mechanics that have set out programs for solving the problem of quantum chaos: the standard interpretation, the statistical interpretation, and the deBroglie-Bohm causal interpretation. One of the main conclusions of this dissertation is that an interpretation alone is insufficient for solving the problem of quantum chaos and that the phenomenon of decoherence must be taken into account. Although a completely satisfactory solution of the problem of quantum chaos is still outstanding, I argue that the deBroglie-Bohm interpretation with the help of decoherence outlines the most promising research program to pursue. In addition to making a contribution to the debate in the philosophy of physics concerning the interpretation of quantum mechanics, this dissertation reveals two important methodological lessons for the philosophy of science. First, issues of reductionism and intertheoretic relations cannot be divorced from questions concerning the interpretation of the theories involved. Not only is the exploration of intertheoretic relations a central part of the articulation and interpretation of an individual theory, but the very terms used to discuss intertheoretic relations, such as `state' and `classical limit', are themselves defined by particular interpretations of the theory. 
The second lesson that emerges is that, when it comes to characterizing the relationship between classical chaos and quantum mechanics, the traditional approaches to intertheoretic relations, namely reductionism and theoretical pluralism, are inadequate. The fruitful ways in which models have been used in quantum chaos research point to the need for a new framework for addressing intertheoretic relations that focuses on models rather than laws.
NASA Technical Reports Server (NTRS)
2004-01-01
[figure removed for brevity, see original site]
Released July 22, 2004. The atmosphere of Mars is a dynamic system. Water-ice clouds, fog, and hazes can make imaging the surface from space difficult. Dust storms can grow from local disturbances to global sizes, through which imaging is impossible. Seasonal temperature changes are the usual drivers of cloud and dust storm development and growth.
Eons of atmospheric dust storm activity have left their mark on the surface of Mars. Dust carried aloft by the wind has settled out on every available surface; sand dunes have been created and moved by centuries of wind; and continual sand-blasting has modified many regions of Mars, creating yardangs and other unusual surface forms.
It is often difficult to determine whether wind-eroded surfaces represent the youngest activity in a region. Wind-eroded landforms can be covered by later materials and then exhumed long after they were initially formed. This image illustrates how difficult it can be to interpret the surface of Mars.
Image information: VIS instrument. Latitude -6.7, Longitude 174.7 East (185.3 West). 19 meter/pixel resolution.
Note: this THEMIS visual image has not been radiometrically or geometrically calibrated for this preliminary release. An empirical correction has been performed to remove instrumental effects. A linear shift has been applied in the cross-track and down-track direction to approximate spacecraft and planetary motion. Fully calibrated and geometrically projected images will be released through the Planetary Data System in accordance with Project policies at a later time.
NASA's Jet Propulsion Laboratory manages the 2001 Mars Odyssey mission for NASA's Office of Space Science, Washington, D.C. The Thermal Emission Imaging System (THEMIS) was developed by Arizona State University, Tempe, in collaboration with Raytheon Santa Barbara Remote Sensing. The THEMIS investigation is led by Dr. Philip Christensen at Arizona State University. Lockheed Martin Astronautics, Denver, is the prime contractor for the Odyssey project, and developed and built the orbiter. Mission operations are conducted jointly from Lockheed Martin and from JPL, a division of the California Institute of Technology in Pasadena.
Analyzing spike trains with circular statistics
NASA Astrophysics Data System (ADS)
Takeshita, Daisuke; Gale, John T.; Montgomery, Erwin B.; Bahar, Sonya; Moss, Frank
2009-05-01
In neuroscience, specifically electrophysiology, it is common to replace a measured sequence of action potentials, or spike train, with delta functions prior to analysis. We apply a method called circular statistics to a time series of delta functions and show that the method is equivalent to the power spectrum. This technique makes the power spectrum of spike trains easy to visualize and readily reveals oscillatory and stochastic behavior. We provide several illustrations of the method and an example suitable for students, and suggest that the method might be useful for courses in introductory biophysics and neuroscience.
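A minimal sketch of the idea, assuming hypothetical spike times: at each test frequency, every spike is wrapped onto the unit circle as a phasor (the circular-statistics step), and the squared length of the resultant vector is the power at that frequency.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical spike train: 100 spikes at a slightly jittered 10 Hz rhythm.
t_spikes = np.arange(0.0, 10.0, 0.1) + rng.normal(0.0, 0.002, 100)

def delta_train_power(t_spikes, freqs):
    """Power spectrum of a train of delta functions. For each test
    frequency, each spike time is mapped to a unit phasor on the circle;
    the squared magnitude of the summed phasors is the power there."""
    phasors = np.exp(-2j * np.pi * np.outer(freqs, t_spikes))
    return np.abs(phasors.sum(axis=1)) ** 2

freqs = np.linspace(1.0, 30.0, 581)            # 0.05 Hz steps
power = delta_train_power(t_spikes, freqs)
peak_freq = freqs[np.argmax(power)]
print(peak_freq)   # the phasors align only near the underlying firing rate
```

For an oscillatory train the phasors add coherently at the firing rate and its harmonics, producing sharp peaks; for a purely stochastic train the resultant stays short at every frequency, which is exactly the visual contrast the abstract describes.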
Dream Interpretation in Ancient Civilizations
J. Donald Hughes
2000-01-01
Dream interpretation was regarded by ancient peoples in Mesopotamia, Egypt, Greece, and Rome as an art requiring intelligence and, sometimes, divine inspiration. It became a motif in literature. It was treated as a science by philosophers and physicians. Dreams were thought to come either as clear messages, or as symbols requiring interpretation. In a method called incubation, the dreamer could
Interpretation Tasks for Grammar Teaching.
ERIC Educational Resources Information Center
Ellis, Rod
1995-01-01
The traditional approach to grammar teaching provides learners with opportunities to produce specific grammatical structures. This article explores an alternative approach, one based on interpreting input. The rationale for the approach is discussed, as are the principles for designing interpretation tasks for grammar teaching. (Contains 35…
Remote sensing and image interpretation
NASA Technical Reports Server (NTRS)
Lillesand, T. M.; Kiefer, R. W. (principal investigators)
1979-01-01
A textbook prepared primarily for use in introductory courses in remote sensing is presented. Topics covered include concepts and foundations of remote sensing; elements of photographic systems; introduction to airphoto interpretation; airphoto interpretation for terrain evaluation; photogrammetry; radiometric characteristics of aerial photographs; aerial thermography; multispectral scanning and spectral pattern recognition; microwave sensing; and remote sensing from space.
Museum Docents' Understanding of Interpretation
ERIC Educational Resources Information Center
Neill, Amanda C.
2010-01-01
The purpose of this qualitative research study was to explore docents' perceptions of their interpretive role in art museums and determine how those perceptions shape docents' practice. The objective was to better understand how docents conceive of their role and what shapes the interpretation they give on tours to the public. The conceptual…
Statistical ecology comes of age
Gimenez, Olivier; Buckland, Stephen T.; Morgan, Byron J. T.; Bez, Nicolas; Bertrand, Sophie; Choquet, Rémi; Dray, Stéphane; Etienne, Marie-Pierre; Fewster, Rachel; Gosselin, Frédéric; Mérigot, Bastien; Monestiez, Pascal; Morales, Juan M.; Mortier, Frédéric; Munoz, François; Ovaskainen, Otso; Pavoine, Sandrine; Pradel, Roger; Schurr, Frank M.; Thomas, Len; Thuiller, Wilfried; Trenkel, Verena; de Valpine, Perry; Rexstad, Eric
2014-01-01
The desire to predict the consequences of global environmental change has been the driver towards more realistic models embracing the variability and uncertainties inherent in ecology. Statistical ecology has gelled over the past decade as a discipline that moves away from describing patterns towards modelling the ecological processes that generate these patterns. Following the fourth International Statistical Ecology Conference (1–4 July 2014) in Montpellier, France, we analyse current trends in statistical ecology. Important advances in the analysis of individual movement, and in the modelling of population dynamics and species distributions, are made possible by the increasing use of hierarchical and hidden process models. Exciting research perspectives include the development of methods to interpret citizen science data and of efficient, flexible computational algorithms for model fitting. Statistical ecology has come of age: it now provides a general and mathematically rigorous framework linking ecological theory and empirical data. PMID:25540151
Statistical modeling of SAR images: a survey.
Gao, Gui
2010-01-01
Statistical modeling is essential to SAR (Synthetic Aperture Radar) image interpretation. It aims to describe SAR images through statistical methods and reveal the characteristics of these images. Moreover, statistical modeling can provide technical support for a comprehensive understanding of terrain scattering mechanisms, which helps in developing algorithms for effective image interpretation and credible image simulation. Numerous statistical models have been developed to describe SAR image data, and the purpose of this paper is to categorize and evaluate these models. We first summarize the development history and the current state of research in statistical modeling; the different SAR image models developed from the product model are then discussed in detail. Relevant issues are also discussed, and several promising directions for future research are outlined. PMID:22315568
ERIC Educational Resources Information Center
Catley, Alan
2007-01-01
Following the announcement last year that there will be no more math coursework assessment at General Certificate of Secondary Education (GCSE), teachers will in the future be able to devote more time to preparing learners for formal examinations. One of the key things that the author has learned when teaching statistics is that it makes for far…
NSDL National Science Digital Library
Anderson-Cook, C.
This is a collection of applets regarding various topics in statistics. Topics include central limit theorem, probability distributions, hypothesis testing, power, confidence intervals, correlation, control charts, experimental design, data analysis, and regression. Each topic has a description page and links to one or more applets.
NSDL National Science Digital Library
Annis, Charles
A good resource for problems in statistics in engineering. Contains some applets and good textual examples related to engineering. Some topics include the Monte Carlo method, the Central Limit Theorem, Risk, Logistic Regression, Generalized Linear Models, and Confidence. Overall, this is a well-presented site for anyone interested in engineering or mathematics.
Variability of Interpretive Accuracy Among Diagnostic Mammography Facilities
Taplin, Stephen H.; Sickles, Edward A.; Abraham, Linn; Barlow, William E.; Carney, Patricia A.; Geller, Berta; Berns, Eric A.; Cutter, Gary R.; Elmore, Joann G.
2009-01-01
Background Interpretive performance of screening mammography varies substantially by facility, but performance of diagnostic interpretation has not been studied. Methods Facilities performing diagnostic mammography within three registries of the Breast Cancer Surveillance Consortium were surveyed about their structure, organization, and interpretive processes. Performance measurements (false-positive rate, sensitivity, and likelihood of cancer among women referred for biopsy [positive predictive value of biopsy recommendation {PPV2}]) from January 1, 1998, through December 31, 2005, were prospectively measured. Logistic regression and receiver operating characteristic (ROC) curve analyses, adjusted for patient and radiologist characteristics, were used to assess the association between facility characteristics and interpretive performance. All statistical tests were two-sided. Results Forty-five of the 53 facilities completed a facility survey (85% response rate), and 32 of the 45 facilities performed diagnostic mammography. The analyses included 28 100 diagnostic mammograms performed as an evaluation of a breast problem, and data were available for 118 radiologists who interpreted diagnostic mammograms at the facilities. Performance measurements demonstrated statistically significant interpretive variability among facilities (sensitivity, P = .006; false-positive rate, P < .001; and PPV2, P < .001) in unadjusted analyses. However, after adjustment for patient and radiologist characteristics, only false-positive rate variation remained statistically significant and facility traits associated with performance measures changed (false-positive rate = 6.5%, 95% confidence interval [CI] = 5.5% to 7.4%; sensitivity = 73.5%, 95% CI = 67.1% to 79.9%; and PPV2 = 33.8%, 95% CI = 29.1% to 38.5%).
Facilities reporting that concern about malpractice had moderately or greatly increased diagnostic examination recommendations at the facility had a higher false-positive rate (odds ratio [OR] = 1.48, 95% CI = 1.09 to 2.01) and a non–statistically significantly higher sensitivity (OR = 1.74, 95% CI = 0.94 to 3.23). Facilities offering specialized interventional services had a non–statistically significantly higher false-positive rate (OR = 1.97, 95% CI = 0.94 to 4.1). No characteristics were associated with overall accuracy by ROC curve analyses. Conclusions Variation in diagnostic mammography interpretation exists across facilities. Failure to adjust for patient characteristics when comparing facility performance could lead to erroneous conclusions. Malpractice concerns are associated with interpretive performance. PMID:19470953
Pringle, James "Jamie"
Consulting Service. Services Provided: The statistical consulting service provides assistance with experimental design, data collection design, data display and statistical analysis, and interpretation of findings. UNH clients, such as local/regional industries that request assistance for experimentation, clinical trials, and data analysis. Fees: Service is free of charge for
Chi-Square Statistics, Tests of Hypothesis and Technology.
ERIC Educational Resources Information Center
Rochowicz, John A.
The use of technology such as computers and programmable calculators enables students to find p-values and conduct tests of hypotheses in many different ways. Comprehension and interpretation of a research problem become the focus for statistical analysis. This paper describes how to calculate chi-square statistics and p-values for statistical…
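As a concrete illustration of the kind of computation the paper describes (a hedged sketch in plain Python with hypothetical data, not the paper's own materials): the chi-square goodness-of-fit statistic is the sum of (observed - expected)^2 / expected, and technology then supplies the p-value or the critical value.

```python
def chi_square_stat(observed, expected):
    """Pearson chi-square goodness-of-fit statistic."""
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# Hypothetical example: is a die fair over 60 rolls?
observed = [8, 9, 12, 11, 10, 10]   # counts for faces 1-6
expected = [10] * 6                 # fair-die expectation
stat = chi_square_stat(observed, expected)

# With df = 5, the 0.05 critical value from a chi-square table is 11.07;
# a statistic below it gives no evidence against fairness.
fair = stat < 11.07
```

In practice, `scipy.stats.chisquare(observed, f_exp=expected)` returns both the statistic and the p-value directly, which matches the paper's point about letting technology carry the computation.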
Statistical Physics, Mixtures of Distributions, and the EM Algorithm
Alan L. Yuille; Paul E. Stolorz; Joachim Utans
1994-01-01
We show that there are strong relationships between approaches to optimization and learning based on statistical physics or mixtures of experts. In particular, the EM algorithm can be interpreted as converging either to a local maximum of the mixture model or to a saddle point solution of the statistical physics system. An advantage of the statistical physics approach is that
LASER SYSTEM COMPONENTS: Quasi-resonator — a new interpretation of the scattering in a laser
NASA Astrophysics Data System (ADS)
Yurkin, A. V.
1994-04-01
A new interpretation is proposed for the scattering observed in a laser with a plane-plane resonator. Wave and ray models are used simultaneously. A recursive calculation of the distribution of rays is made by geometric and statistical methods.
Three-Dimensional Statistics of Radio Polarimetry
Mark M. McKinnon
2003-06-03
The measurement of radio polarization may be regarded as a three-dimensional statistical problem because the large photon densities at radio wavelengths allow the simultaneous detection of the three Stokes parameters which completely describe the radiation's polarization. The statistical nature of the problem arises from the fluctuating instrumental noise, and possibly from fluctuations in the radiation's polarization. A statistical model is used to derive the general, three-dimensional statistics that govern radio polarization measurements. The statistics are derived for specific cases of source-intrinsic polarization, with an emphasis on the orthogonal polarization modes in pulsar radio emission. The statistics are similar to those commonly found in other fields of the physical, biological, and Earth sciences. Given the highly variable linear and circular polarization of pulsar radio emission, an understanding of the three-dimensional statistics may be an essential prerequisite to a thorough interpretation of pulsar polarization data.
Learning Statistics By Doing Statistics
NSDL National Science Digital Library
Smith, Gary
This article, created by Gary Smith of Pomona College, discusses a project-based approach to teaching statistics. The article focuses on the team aspect of learning, it introduces concepts such as: working with data, learning by doing, learning by writing, learning by speaking, and authentic assessment of material. An appendix contains a list of twenty projects that have been successfully assigned.
ENVIRONMENTAL PHOTOGRAPHIC INTERPRETATION CENTER (EPIC)
The Environmental Sciences Division (ESD) in the National Exposure Research Laboratory (NERL) of the Office of Research and Development provides remote sensing technical support including aerial photograph acquisition and interpretation to the EPA Program Offices, ORD Laboratorie...
Modularity and locality in interpretation
Singh, Raj, Ph. D. Massachusetts Institute of Technology
2008-01-01
This thesis will argue for four broad claims: (1) That local contexts are needed for a descriptively adequate theory of linguistic interpretation, (2) That presupposition accommodation is made with respect to a set of ...
Interpreting microarray expression data
Shavlik, Jude W.
…microarray experiment. We seek models that are (a) accurate, (b) easy to interpret, and (c) …
Bayesian Interpretation of Test Reliability.
ERIC Educational Resources Information Center
Jones, W. Paul
1991-01-01
A Bayesian alternative to interpretations based on classical reliability theory is presented. Procedures are detailed for calculation of a posterior score and credible interval with joint consideration of item sample and occasion error. (Author/SLD)
Interpreting Quantum Parallelism by Sequents
NASA Astrophysics Data System (ADS)
Battilotti, Giulia
2010-12-01
We introduce an interpretation of quantum superposition in predicative sequent calculus, in the framework of basic logic. Then we introduce a new predicative connective for the entanglement. Our aim is to represent quantum parallelism in terms of logical proofs.
INTERPRETATION OF ENVIRONMENTAL ASSESSMENT DATA
The report describes preliminary attempts to formulate viable models for interpreting environmental assessment data. The models are evaluated using data from the four most comprehensive environmental assessments. A format for entering environmental assessment results on FORTRAN c...
Map Interpretation with Google Earth
NSDL National Science Digital Library
Resources in this collection A highly effective, non-traditional approach for using Google Earth to teach strike, dip, and geologic map interpretation, with assignments and activities (Barbara Tewksbury, Hamilton ...
NSDL National Science Digital Library
NRICH
2013-01-01
In this statistics and probability activity students must determine whether each statement is always true, sometimes true, sometimes false, or always false. Students must have a basic understanding of probability statements and the foundation for understanding mean, median, and mode in order to complete this activity for all twelve statements. In addition to the task, tips for getting started, possible solutions, a teacher resource page, and a printable page are provided.
E. Bogomolny; U. Gerland; C. Schmit
2000-12-04
We consider the statistical distribution of zeros of random meromorphic functions whose poles are independent random variables. It is demonstrated that correlation functions of these zeros can be computed analytically and explicit calculations are performed for the 2-point correlation function. This problem naturally appears in e.g. rank-one perturbation of an integrable Hamiltonian and, in particular, when a $\delta$-function potential is added to an integrable billiard.
NSDL National Science Digital Library
2013-06-21
Students will encounter the concept of a distribution, along with parameters that describe a distribution's "typical" values (average) and a distribution's spread (variance). To understand simple distributions and uncertainty propagation in the coming sections, it is necessary to be familiar with the concept of statistical independence. When two variables fluctuate independently, their covariance vanishes, and the variance of their sum is the sum of their variances.
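The additivity claim can be checked with a quick simulation (a minimal sketch in plain Python; the distributions chosen are illustrative): for independently generated samples, the sample variance of the sum lands close to the sum of the sample variances.

```python
import random

random.seed(42)  # reproducible illustration

def var(values):
    """Population variance: mean squared deviation from the mean."""
    m = sum(values) / len(values)
    return sum((v - m) ** 2 for v in values) / len(values)

n = 100_000
x = [random.gauss(0, 1) for _ in range(n)]  # Var(x) close to 1
y = [random.gauss(0, 2) for _ in range(n)]  # Var(y) close to 4

s = [a + b for a, b in zip(x, y)]
# Independence => covariance vanishes => Var(x + y) ~ Var(x) + Var(y) ~ 5
```

For dependent variables (e.g. y built partly from x) the covariance term would reappear and the same check would fail, which is why statistical independence matters for the uncertainty-propagation rules in the following sections.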
This section of the BCSC web site details the information collected and used in research by the BCSC. Statistics includes charts and tables that provide an overview of the data collected. These data are used in a wide range of studies that evaluate the performance of mammography in community settings. Some studies analyze data collected from individual sites; others examine data pooled from two or more sites.
NASA Astrophysics Data System (ADS)
Goodman, Joseph W.
2000-07-01
The Wiley Classics Library consists of selected books that have become recognized classics in their respective fields. With these new unabridged and inexpensive editions, Wiley hopes to extend the life of these important works by making them available to future generations of mathematicians and scientists. Currently available in the Series: T. W. Anderson The Statistical Analysis of Time Series T. S. Arthanari & Yadolah Dodge Mathematical Programming in Statistics Emil Artin Geometric Algebra Norman T. J. Bailey The Elements of Stochastic Processes with Applications to the Natural Sciences Robert G. Bartle The Elements of Integration and Lebesgue Measure George E. P. Box & Norman R. Draper Evolutionary Operation: A Statistical Method for Process Improvement George E. P. Box & George C. Tiao Bayesian Inference in Statistical Analysis R. W. Carter Finite Groups of Lie Type: Conjugacy Classes and Complex Characters R. W. Carter Simple Groups of Lie Type William G. Cochran & Gertrude M. Cox Experimental Designs, Second Edition Richard Courant Differential and Integral Calculus, Volume I Richard Courant Differential and Integral Calculus, Volume II Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume I Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume II D. R. Cox Planning of Experiments Harold S. M. Coxeter Introduction to Geometry, Second Edition Charles W. Curtis & Irving Reiner Representation Theory of Finite Groups and Associative Algebras Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume I Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume II Cuthbert Daniel Fitting Equations to Data: Computer Analysis of Multifactor Data, Second Edition Bruno de Finetti Theory of Probability, Volume I Bruno de Finetti Theory of Probability, Volume 2 W. Edwards Deming Sample Design in Business Research
Nets and Data Flow Interpreters
Alexander Moshe Rabinovich; Boris A. Trakhtenbrot
1989-01-01
The authors investigate and compare two ways of specifying stream relations (in particular, stream functions). The first uses relational programs, i.e., netlike program schemes in which the signature primitives are interpreted as relations over a given CPO. No stream domains are assumed; semantics is in fixed-point style. The second is through data flow nets, i.e., nets whose nodes are interpreted
Calibrated Peer Review for Interpreting Linear Regression Parameters: Results from a Graduate Course
ERIC Educational Resources Information Center
Enders, Felicity B.; Jenkins, Sarah; Hoverman, Verna
2010-01-01
Biostatistics is traditionally a difficult subject for students to learn. While the mathematical aspects are challenging, it can also be demanding for students to learn the exact language to use to correctly interpret statistical results. In particular, correctly interpreting the parameters from linear regression is both a vital tool and a…
An Interpretation of Banded Magnetospheric Radio Emissions
NASA Technical Reports Server (NTRS)
Benson, Robert F.; Osherovich, V. A.; Fainberg, J.; Vinas, A. F.; Ruppert, D. R.; Vondrak, Richard R. (Technical Monitor)
2000-01-01
Recently published Active Magnetospheric Particle Tracer Explorers/Ion Release Module (AMPTE/IRM) banded magnetospheric emissions, commonly referred to as '(n + 1/2)f(sub ce)' emissions, where f(sub ce) is the electron gyrofrequency, are analyzed by treating them as analogous to sounder-stimulated ionospheric emissions. We show that both individual AMPTE/IRM spectra of magnetospheric banded emissions, and statistically derived spectra observed over the two-year lifetime of the mission, can be interpreted in a self-consistent manner. The analysis, which predicts all spectral peaks within 4% of the observed peaks, interprets the higher-frequency emissions as due to low group-velocity Bernstein-mode waves and the lower-frequency emissions as eigenmodes of cylindrical electromagnetic plasma oscillations. The demarcation between these two classes of emissions is the electron plasma frequency f(sub pe), where an emission is often observed. This f(sub pe) emission is not necessarily the strongest. None of the observed banded emissions were attributed to the upper-hybrid frequency. We present Alouette-2 and ISIS-1 plasma-resonance data, and model electron temperature (T(sub e)) values, to support the argument that the frequency spectrum of ionospheric sounder-stimulated emissions is not strongly temperature dependent and thus that the interpretation of these emissions in the ionosphere is relevant to other plasmas (such as the magnetosphere) where N(sub e) and T(sub e) can be quite different but where the ratio f(sub pe)/f(sub ce) is identical.
Statistics and data analysis in geochemical prospecting
Howarth, R.J.
1983-01-01
A source book for geochemical data interpretation. Divided into two parts, the book first provides a series of tutorial papers which give the reader an introduction to methods for data storage and retrieval using the computer; analytical quality control; assessment of variability introduced both by natural variation and laboratory error; single- and multi-element statistical interpretation, and the preparation of geochemical maps. The second part comprises a number of short case histories, particularly related to multi-element treatment, and review papers outlining current activity in geochemical data analysis in many parts of the world. Graphical aids for a number of useful statistical tests are included as an appendix.
Differences in university students' attitudes and anxiety about statistics.
Mji, Andile
2009-06-01
The Statistics Anxiety Rating Scale and the Attitudes Toward Statistics questionnaire were administered to 226 university of technology students. The former scale measures anxiety about learning statistics in terms of Worth of Statistics, Interpretation Anxiety, Test and Class Anxiety, Computational Self-concept, Fear of Asking for Help, and Fear of Statistics Teachers. The latter measures attitudes toward use of statistics and statistics course for which a student was registered. These African students were enrolled in Taxation, Marketing, or Accounting. Participants took a required course in statistics intended to improve statistical skills. There were 150 women and 57 men, chosen because they had no previous mathematics learning. Students' ages ranged between 16 and 26 years (M = 20.1, SD = 2.0). There were no statistically significant sex differences on attitudes and anxiety toward statistics, but there were significant differences among areas of study programs. PMID:19708400
J. Mark Heinzle; Claes Uggla
2012-12-21
In this paper we explore stochastic and statistical properties of so-called recurring spike induced Kasner sequences. Such sequences arise in recurring spike formation, which is needed together with the more familiar BKL scenario to yield a complete description of generic spacelike singularities. In particular we derive a probability distribution for recurring spike induced Kasner sequences, complementing similar available BKL results, which makes comparisons possible. As examples of applications, we derive results for so-called large and small curvature phases and the Hubble-normalized Weyl scalar.
Spin, Statistics, and Reflections, II. Lorentz Invariance
Bernd Kuckert; Reinhard Lorenzen
2005-12-21
The analysis of the relation between modular P$_1$CT-symmetry -- a consequence of the Unruh effect -- and Pauli's spin-statistics relation is continued. The result in the predecessor to this article is extended to the Lorentz symmetric situation. A model $G_L$ of the universal covering $\widetilde{L_+^\uparrow}\cong SL(2,\mathbb{C})$ of the restricted Lorentz group $L_+^\uparrow$ is modelled as a reflection group at the classical level. Based on this picture, a representation of $G_L$ is constructed from pairs of modular P$_1$CT-conjugations, and this representation can easily be verified to satisfy the spin-statistics relation.
College Students' Interpretation of Research Reports on Group Differences: The Tall-Tale Effect
ERIC Educational Resources Information Center
Hogan, Thomas P.; Zaboski, Brian A.; Perry, Tiffany R.
2015-01-01
How does the student untrained in advanced statistics interpret results of research that reports a group difference? In two studies, statistically untrained college students were presented with abstracts or professional associations' reports and asked for estimates of scores obtained by the original participants in the studies. These estimates…
12 CFR 907.5 - Regulatory Interpretations.
Code of Federal Regulations, 2014 CFR
2014-01-01
...2014-01-01 2014-01-01 false Regulatory Interpretations. 907.5 Section 907...Waivers, Approvals, No-Action Letters, and Regulatory Interpretations § 907.5 Regulatory Interpretations. (a) Authority....
12 CFR 907.5 - Regulatory Interpretations.
Code of Federal Regulations, 2011 CFR
2011-01-01
...2011-01-01 2011-01-01 false Regulatory Interpretations. 907.5 Section 907...Waivers, Approvals, No-Action Letters, and Regulatory Interpretations § 907.5 Regulatory Interpretations. (a) Authority....
12 CFR 907.5 - Regulatory Interpretations.
Code of Federal Regulations, 2012 CFR
2012-01-01
...2012-01-01 2012-01-01 false Regulatory Interpretations. 907.5 Section 907...Waivers, Approvals, No-Action Letters, and Regulatory Interpretations § 907.5 Regulatory Interpretations. (a) Authority....
12 CFR 907.5 - Regulatory Interpretations.
Code of Federal Regulations, 2013 CFR
2013-01-01
...2013-01-01 2013-01-01 false Regulatory Interpretations. 907.5 Section 907...Waivers, Approvals, No-Action Letters, and Regulatory Interpretations § 907.5 Regulatory Interpretations. (a) Authority....
Using Playing Cards to Differentiate Probability Interpretations
ERIC Educational Resources Information Center
López Puga, Jorge
2014-01-01
The aprioristic (classical, naïve and symmetric) and frequentist interpretations of probability are commonly known. Bayesian or subjective interpretation of probability is receiving increasing attention. This paper describes an activity to help students differentiate between the three types of probability interpretations.
Default Sarcastic Interpretations: On the Priority of Nonsalient Interpretations
ERIC Educational Resources Information Center
Giora, Rachel; Drucker, Ari; Fein, Ofer; Mendelson, Itamar
2015-01-01
Findings from five experiments support the view that negation generates sarcastic utterance-interpretations by default. When presented in isolation, novel negative constructions ("Punctuality is not his forte," "Thoroughness is not her most distinctive feature"), free of semantic anomaly or internal incongruity, were…
Interpretational Confounding or Confounded Interpretations of Causal Indicators?
ERIC Educational Resources Information Center
Bainter, Sierra A.; Bollen, Kenneth A.
2014-01-01
In measurement theory, causal indicators are controversial and little understood. Methodological disagreement concerning causal indicators has centered on the question of whether causal indicators are inherently sensitive to interpretational confounding, which occurs when the empirical meaning of a latent construct departs from the meaning…
The Interpretive Approach to Religious Education: Challenging Thompson's Interpretation
ERIC Educational Resources Information Center
Jackson, Robert
2012-01-01
In a recent book chapter, Matthew Thompson makes some criticisms of my work, including the interpretive approach to religious education and the research and activity of Warwick Religions and Education Research Unit. Against the background of a discussion of religious education in the public sphere, my response challenges Thompson's account,…
Applications of Statistical Tests in Hand Surgery
Song, Jae W.; Haas, Ann; Chung, Kevin C.
2015-01-01
During the nineteenth century, with the emergence of public health as a goal to improve hygiene and conditions of the poor, statistics established itself as a distinct scientific field important for critically interpreting studies of public health concerns. During the twentieth century, statistics began to evolve mathematically and methodologically with hypothesis testing and experimental design. Today, the design of medical experiments centers around clinical trials and observational studies, and with the use of statistics, the collected data are summarized, weighed, and presented to direct both physicians and the public towards Evidence-Based Medicine. Having a basic understanding of statistics is mandatory in evaluating the validity of published literature and applying it to patient care. In this review, we aim to apply a practical approach in discussing basic statistical tests by providing a guide to choosing the correct statistical test along with examples relevant to hand surgery research. PMID:19969193
Statistical Methods for Material Characterization and Qualification
Kercher, A.K.
2005-04-01
This document describes a suite of statistical methods that can be used to infer lot parameters from the data obtained from inspection/testing of random samples taken from that lot. Some of these methods will be needed to perform the statistical acceptance tests required by the Advanced Gas Reactor Fuel Development and Qualification (AGR) Program. Special focus has been placed on proper interpretation of acceptance criteria and unambiguous methods of reporting the statistical results. In addition, modified statistical methods are described that can provide valuable measures of quality for different lots of material. This document has been written for use as a reference and a guide for performing these statistical calculations. Examples of each method are provided. Uncertainty analysis (e.g., measurement uncertainty due to instrumental bias) is not included in this document, but should be considered when reporting statistical results.
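One of the simplest acceptance-test calculations such a reference covers can be sketched as follows (an illustrative Python example under assumed plan parameters, not a method taken from the document): for an attributes sampling plan "inspect n items from the lot, accept if at most c are defective", the binomial model gives the probability of acceptance as a function of the lot's true defect fraction, i.e. the operating-characteristic curve.

```python
from math import comb

def accept_prob(n, c, p):
    """Probability that a lot with true defect fraction p passes a plan
    that inspects n random items and accepts when defects <= c
    (binomial model, i.e. lot size much larger than the sample)."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(c + 1))

# Assumed plan: inspect 50 items, accept if at most 1 defect is found.
good_lot = accept_prob(50, 1, 0.01)  # 1% defective: usually accepted
bad_lot = accept_prob(50, 1, 0.10)   # 10% defective: usually rejected
```

Evaluating the curve at a few defect fractions shows how the plan trades off producer's risk (rejecting a good lot) against consumer's risk (accepting a bad one), which is what the acceptance criteria in such specifications encode.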
Reversible Object-Oriented Interpreters
NASA Astrophysics Data System (ADS)
Lieberman, Henry
The "programs are data" philosophy of Lisp uses Lisp's S-expressions to represent programs, and permits a program written in Lisp itself to implement the interpreter for the language. Object-oriented languages can take this one step further: we can use objects to represent programs, and an object-oriented interpreter takes the form of responses to a protocol of messages for evaluating programs. Because objects are a richer data structure than simple S-expressions, the representation of programs can have more built-in intelligence than was feasible using simple list manipulation alone.
[Interpretation of liver function tests].
Kim, Yoon Jun
2008-04-01
Liver function tests (LFT) are helpful screening tools to detect hepatic dysfunction. LFT are further used to categorize hepatic dysfunctions, to estimate the severity of hepatic disease, and for the follow-up of liver diseases. Since the liver performs a variety of functions, no single test alone is sufficient to provide a complete estimate of liver function. Effective interpretation of the hepatic function panel requires knowledge of underlying pathophysiology and the characteristics of panel tests. This review includes a classification of liver diseases, which are commonly detected by routine LFT, a list of liver functions with appropriate tests for each function, and a guide to panel interpretation and further laboratory investigation. PMID:18516000
Interpreting Observations of Io Plasma Torus Variation
NASA Astrophysics Data System (ADS)
Herbert, F.
2003-05-01
A useful but hitherto little-used approach for understanding the chemical, energetic, and transport dynamics of the Io plasma torus is time series analysis of torus spectral datasets. Jovian system periodicities (Jovian rotation, Io orbit, ``System IV'', and their beats and harmonics) and randomly varying output from Io's volcanic atmosphere provide fluctuating input to the torus. Each torus component responds to this forcing with its own response time, so that in principle we can observationally constrain torus properties by comparing such responses in repetitive observations of torus spectra. Several such datasets exist, notably those of EUVE, IUE, Voyager UVS, and Cassini UVIS, although their relatively low SNR makes this use challenging. In addition, a Small Explorer mission (JMEX) is being proposed that would greatly expand the dataset applicable to this approach. To disentangle the random inputs from the deterministic dynamics (the part we're interested in), we can estimate statistics of the data such as the Fourier power spectra and cross-correlations of the ion abundances. Such statistics average out the random part of the fluctuations, leaving the interaction network signatures which can be interpreted by computing the corresponding statistics from models. For example, if neutral cloud S is concentrated near Io, its ionization by torus electrons should produce a ~6.5 hr periodicity (the interval between torus midplane crossings of Io) in the power spectrum of S+, as is observed. Another example: S+++ and O++ abundances should co-vary with fluctuations in the supply of superthermal electrons. The scope for and results of further analysis of this type will be discussed. This work was made possible by NASA grants NAG5-12944 and 9079 (Geospace Sciences) and NAG5-8952 (Planetary Atmospheres).
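The disentangling step described here (statistics that average out random forcing and leave the periodic signatures) can be sketched by probing a series' power at candidate periods. The 6.5-sample period and the synthetic series below are illustrative, not torus data.

```python
import cmath, math

def power_at_period(x, period):
    """Squared magnitude of the mean-removed DFT of x at frequency
    1/period (samples assumed evenly spaced; period in sample units)."""
    n = len(x)
    mean = sum(x) / n
    z = sum((v - mean) * cmath.exp(-2j * math.pi * k / period)
            for k, v in enumerate(x))
    return abs(z) ** 2 / n

# Synthetic "ion brightness" series: a 6.5-sample cycle plus slow drift.
series = [math.sin(2 * math.pi * t / 6.5) + 0.01 * t for t in range(130)]
peak = power_at_period(series, 6.5)
off = power_at_period(series, 11.0)
print(peak > 10 * off)  # the injected 6.5-unit period stands out -> True
```

On real, noisy spectra one would of course scan the full spectrum and average over many epochs rather than test two hand-picked periods.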
Hideaki Tei; Akiko Miyazaki; Makoto Iwata; Mikio Osawa; Yoko Nagata; Shoichi Maruyama
1997-01-01
We conducted a neuropsychological study comparing early-stage Alzheimer's disease (AD; n = 22) and multiple subcortical infarction with mild cognitive impairment (MSI; n = 22) using an easily applicable test battery which included 8 tests. Two groups were matched for age, education and score on the Mini-Mental State Examination. Patients with AD had significantly lower scores than MSI patients in
Capdeboscq, Yves
Problem: No image is displayed when connecting the data projector to a Windows laptop. Solution: This is generally easily fixed with a simple key combination on the laptop. The laptop will have a Fn key - this is not the same as a Function Key (eg F1, F2 etc). The Fn key is used for additional functionality on the laptop
Kossin, James P.
As a calibrated and mapped geostationary satellite dataset, GridSat is easily accessible for both research and operational users. As the U.S. weather satellite record grows, questions on how to best use historical satellite data come to the forefront of the discussion on climate observation. There are now over 30 years of globally sampled geostationary and polar satellite data, and while
Kohut's method of interpretation: a critique.
Rubovits-Seitz, P
1988-01-01
To cope with the obscure, complexly overdetermined, and unstable nature of unconscious meanings, Freud developed a pluralistic methodology that employs a wide variety of interpretive strategies and procedures. Conversely, Kohut proposed a radically abbreviated interpretive approach based on the single, subjective method of empathy. This report reevaluates Kohut's monistic interpretive methodology: (1) The principal features of Kohut's interpretive method are reviewed and evaluated. (2) Case material and interpretations from Kohut's final book are used to compare his unidimensional approach with the pluralistic methodology of traditional interpretation. (3) The epistemologic liabilities of Kohut's interpretive method are delineated and discussed. (4) Methodologically more appropriate strategies for improving clinical interpretation are presented. PMID:3069888
Statistical Methods for Imaging System Design
Fessler, Jeffrey A.
Neal H. Clinthorne, Division of Nuclear Medicine. Figure: radiotracer injected into patient → emission tomograph (SPECT or PET cameras · scintillation cameras · radiation detectors · etc.) → image reconstruction → human interpretation.
Classical interpretations of relativistic precessions
NASA Astrophysics Data System (ADS)
Hajra, Sankar
2014-03-01
Relativists have exposed various precessions and developed ingenious experiments to verify those phenomena with extreme precisions. The Gravity Probe B mission was designed to study the precessions of the gyroscopes rotating round the Earth in a nearly circular near-Earth polar orbit to demonstrate the geodetic effect and the Lense-Thirring effect as predicted by the general relativity theory. In this paper, we show in a very simple and novel analysis that the precession of the perihelion of Mercury, the Thomas precession, and the precession data (on the de Sitter and Lense-Thirring precessions) collected from the Gravity Probe B mission could easily be explained from classical physics, too.
A Unified Approach to Coding and Interpreting Face Images
Andreas Lanitis; Christopher J. Taylor; Timothy F. Cootes
1995-01-01
Face images are difficult to interpret because they are highly variable. Sources of variability include individual appearance, 3D pose, facial expression and lighting. We describe a compact parametrised model of facial appearance which takes into account all these sources of variability. The model represents both shape and grey-level appearance and is created by performing a statistical analysis over a training set of face images. A robust
Interpreting Sky-Averaged 21-cm Measurements
NASA Astrophysics Data System (ADS)
Mirocha, Jordan
2015-01-01
Within the first ~billion years after the Big Bang, the intergalactic medium (IGM) underwent a remarkable transformation, from a uniform sea of cold neutral hydrogen gas to a fully ionized, metal-enriched plasma. Three milestones during this epoch of reionization -- the emergence of the first stars, black holes (BHs), and full-fledged galaxies -- are expected to manifest themselves as extrema in sky-averaged ("global") measurements of the redshifted 21-cm background. However, interpreting these measurements will be complicated by the presence of strong foregrounds and non-trivialities in the radiative transfer (RT) modeling required to make robust predictions.I have developed numerical models that efficiently solve the frequency-dependent radiative transfer equation, which has led to two advances in studies of the global 21-cm signal. First, frequency-dependent solutions facilitate studies of how the global 21-cm signal may be used to constrain the detailed spectral properties of the first stars, BHs, and galaxies, rather than just the timing of their formation. And second, the speed of these calculations allows one to search vast expanses of a currently unconstrained parameter space, while simultaneously characterizing the degeneracies between parameters of interest. I find principally that (1) physical properties of the IGM, such as its temperature and ionization state, can be constrained robustly from observations of the global 21-cm signal without invoking models for the astrophysical sources themselves, (2) translating IGM properties to galaxy properties is challenging, in large part due to frequency-dependent effects. For instance, evolution in the characteristic spectrum of accreting BHs can modify the 21-cm absorption signal at levels accessible to first generation instruments, but could easily be confused with evolution in the X-ray luminosity star-formation rate relation. 
Finally, (3) the independent constraints most likely to aid in the interpretation of global 21-cm signal measurements are detections of Lyman Alpha Emitters at high redshifts and constraints on the midpoint of reionization, both of which are among the primary science objectives of ongoing or near-future experiments.
The Armenian Genocide: An Interpretation.
ERIC Educational Resources Information Center
Astourian, Stephan
1990-01-01
Presents an interpretive study of the Armenian genocide of 1915 based on Israel Charny's societal-forces model. Argues genocides follow a pattern of long discriminatory relationships between a dominant and a dominated group. Cites the economic achievements of dominated groups as the basis. Shows the global pattern of genocide. (NL)
Interpreting Remote Sensing NOx Measurements
Denver, University of
Robert Slott, Consultant; Donald Stedman and Saj. How tailpipe emissions (HC, CO, NOx) are changing with time · use remote sensing · measurements in at least 4 of the year at each location · uniform QC/QA and data reporting. Paper # 2001-01-3640
Recent Trends in Oral Interpretation.
ERIC Educational Resources Information Center
Armstrong, Chloe
1974-01-01
The field of oral interpretation has been influenced by both the analytical approach to literature study, with significant emphasis on understanding the literary text, and the interpersonal approach. While oral reading may utilize various performance arts or media such as dance, music, or film, the most popular movement currently is Readers…
Mixture Interpretation. Dr. Michael D. Coble
http://www.cstl.nist.gov/biotech/strbase/mixture.htm. NIJ grant to Boston University funded ~150 state & local lab analysts to attend. Catherine Grgicak. National recommendations of the technical UK DNA working group on mixture interpretation for the NDNAD and for court going purposes. FSI Genetics 2(1): 76-82. · Schneider, P.M., et al. (2009) The German Stain
EKG Interpretation Program. Trainers Manual.
ERIC Educational Resources Information Center
Webb, Sandra M.
This trainer's manual is designed to assist nursing instructors assigned to advanced medical surgical nursing courses in teaching students how to make basic interpretations of their patients' electrocardiographic (EKG) strips. Included in the manual are pre- and posttests and instructional units dealing with the following topics: EKG indicators,…
Game interpretation of Kolmogorov complexity
Andrej A. Muchnik; Ilya Mezhirov; Alexander Shen; Nikolai K. Vereshchagin
2010-01-01
The Kolmogorov complexity function K can be relativized using any oracle A, and most properties of K remain true for the relativized versions K^A. We provide an explanation for this observation by giving a game-theoretic interpretation and showing that all
Smartberries: Interpreting Erdrich's Love Medicine
ERIC Educational Resources Information Center
Treuer, David
2005-01-01
The structure of "Love Medicines" interpreted by Hertha D. Sweet Wong who claims that the book's "multiple narrators confound conventional Western expectations of an autonomous protagonist, a dominant narrative voice, and a consistently chronological narrative". "Love Medicine" is a brilliant use of the Western literary tactics that create the…
Sonar image interpretation and modelling
G. T. Russell; Judith M. Bell; P. O. Holt; S. J. Clarke
1996-01-01
The problem addressed is that of providing a model-based system to interpret sonar data from an autonomous or remotely controlled vehicle instrumentation, to aid navigation, improve sea floor mapping techniques, identify objects, simulate acoustic images for survey and analysis, or to aid the design of sonar systems. Analytic tools have been developed to segment and classify sea-bed and shallow seismic
Use and Interpretation of Radar
NSDL National Science Digital Library
John Nielsen-Gammon
1996-01-01
This undergraduate meteorology tutorial from Texas A&M University discusses the basic principles of operation of weather radars, describes how to interpret radar mosaics, and discusses the use of radar in weather forecasting. Students learn the relationship between range and elevation and how to use radar images and mosaics in short-range forecasting.
ERIC Educational Resources Information Center
Melton, T. R.
A computer-assisted instruction system, called IT1 (Interpretive Tutor), is described which is intended to assist a student's efforts to learn the content of textual material and to evaluate his efforts toward that goal. The text is represented internally in the form of semantic networks with auxiliary structures which relate network nodes to…
Interpretive Reproduction in Children's Play
ERIC Educational Resources Information Center
Corsaro, William A.
2012-01-01
The author looks at children's play from the perspective of interpretive reproduction, emphasizing the way children create their own unique peer cultures, which he defines as a set of routines, artifacts, values, and concerns that children engage in with their playmates. The article focuses on two types of routines in the peer culture of preschool…
Intuitionistic Implication in Abstract Interpretation
Giacobazzi, Roberto
In this paper we introduce the notion of Heyting completion in abstract interpretation, showing that reduced product and disjunctive completion have both a clean and immediate logical interpretation. The reduced product ([14]) of two domains is the most abstract domain which is approximated by both the given
Interpretive Error in Psychological Testing.
ERIC Educational Resources Information Center
Hummel, Thomas J.; Lichtenberg, James W.
When evaluating the utility of a psychological test for clinical decision making, both the psychometric properties of the test (i.e., the reliability and validity of the instrument) and the ambiguity of the language by which test results are interpreted or communicated need to be considered. Although each has been studied independently, to date…
Probabilistic Interpretation of Resonant States
Hatano, Naomichi; Feinberg, Joshua
2009-01-01
After reviewing the definition of resonant states as eigenstates of the Schroedinger equation, we introduce a physical view of resonant states. In particular, we introduce a probabilistic interpretation of resonant states by counting the particle number in an expanding region of space.
Statistical methods suitable for the analysis of plant tissue culture data
Michael E. Compton
1994-01-01
Statistical analyses are an essential part of biological research. Statistical methods are available to biological researchers that range from very simple to extremely complex. Therefore, caution should be used when selecting a statistical method. When possible it is best to avoid complicated statistical procedures that are difficult to interpret and may hinder the researcher's ability to make treatment comparisons. Instead
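As a concrete instance of a simple, easily interpreted treatment comparison, here is a Welch two-sample t statistic computed from scratch on invented shoot-count data (a sketch only; a t table or a statistics package would supply the p-value).

```python
import math
import statistics

def welch_t(a, b):
    """Welch two-sample t statistic and approximate degrees of freedom,
    a common first choice for comparing two treatment means without
    assuming equal variances."""
    va, vb = statistics.variance(a), statistics.variance(b)
    na, nb = len(a), len(b)
    se2 = va / na + vb / nb
    t = (statistics.mean(a) - statistics.mean(b)) / math.sqrt(se2)
    df = se2**2 / ((va / na)**2 / (na - 1) + (vb / nb)**2 / (nb - 1))
    return t, df

# Hypothetical shoots-per-explant counts under two media treatments.
shoots_a = [4.1, 3.8, 4.4, 4.0, 4.3]
shoots_b = [3.1, 2.9, 3.4, 3.0, 3.2]
t, df = welch_t(shoots_a, shoots_b)
print(round(t, 2), round(df, 1))
```

A large t with these few degrees of freedom already makes the treatment difference easy to communicate, which is the spirit of the abstract's advice.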
... Act and Program National Statistics (MQSA): MQSA National Statistics. In the Scorecard, we present the most commonly requested national statistics regarding the MQSA program. These statistics are updated ...
Mammography Facility Characteristics Associated With Interpretive Accuracy of Screening Mammography
Abraham, Linn; Barlow, William E.; Fenton, Joshua J.; Berns, Eric A.; Carney, Patricia A.; Cutter, Gary R.; Sickles, Edward A.; D'Orsi, Carl; Elmore, Joann G.
2008-01-01
Background Although interpretive performance varies substantially among radiologists, such variation has not been examined among mammography facilities. Understanding sources of facility variation could become a foundation for improving interpretive performance. Methods In this cross-sectional study conducted between 1996 and 2002, we surveyed 53 facilities to evaluate associations between facility structure, interpretive process characteristics, and interpretive performance of screening mammography (ie, sensitivity, specificity, positive predictive value [PPV1], and the likelihood of cancer among women who were referred for biopsy [PPV2]). Measures of interpretive performance were ascertained prospectively from mammography interpretations and cancer data collected by the Breast Cancer Surveillance Consortium. Logistic regression and receiver operating characteristic (ROC) curve analyses estimated the association between facility characteristics and mammography interpretive performance or accuracy (area under the ROC curve [AUC]). All P values were two-sided. Results Of the 53 eligible facilities, data on 44 could be analyzed. These 44 facilities accounted for 484,463 screening mammograms performed on 237,669 women, of whom 2686 were diagnosed with breast cancer during follow-up. Among the 44 facilities, mean sensitivity was 79.6% (95% confidence interval [CI] = 74.3% to 84.9%), mean specificity was 90.2% (95% CI = 88.3% to 92.0%), mean PPV1 was 4.1% (95% CI = 3.5% to 4.7%), and mean PPV2 was 38.8% (95% CI = 32.6% to 45.0%). The facilities varied statistically significantly in specificity (P < .001), PPV1 (P < .001), and PPV2 (P = .002) but not in sensitivity (P = .99).
AUC was higher among facilities that offered screening mammograms alone vs those that offered screening and diagnostic mammograms (0.943 vs 0.911, P = .006), had a breast imaging specialist interpreting mammograms vs not (0.932 vs 0.905, P = .004), did not perform double reading vs independent double reading vs consensus double reading (0.925 vs 0.915 vs 0.887, P = .034), or conducted audit reviews two or more times per year vs annually vs at an unknown frequency (0.929 vs 0.904 vs 0.900, P = .018). Conclusion Mammography interpretive performance varies statistically significantly by facility. PMID:18544742
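The interpretive-performance measures compared across facilities reduce to ratios of screening-outcome counts. A sketch with hypothetical counts follows (not BCSC data; PPV2 and the study's audit-window definitions are omitted):

```python
def screening_performance(tp, fp, fn, tn):
    """Sensitivity, specificity, and PPV from screening outcome counts
    (illustrative; not the BCSC's exact audit definitions)."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
    }

# Hypothetical facility: 80 cancers detected, 20 missed, and
# 950 false-positive recalls among 9000 cancer-free screens.
perf = screening_performance(tp=80, fp=950, fn=20, tn=8050)
print({k: round(v, 3) for k, v in perf.items()})
```

Note how a low PPV can coexist with high sensitivity and specificity when disease prevalence is low, which is why the study reports all three.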
ERIC Educational Resources Information Center
Singamsetti, Rao
2007-01-01
In this paper an attempt is made to highlight some issues of interpretation of statistical concepts and interpretation of results as taught in undergraduate Business statistics courses. The use of modern technology in the class room is shown to have increased the efficiency and the ease of learning and teaching in statistics. The importance of…
Statistics 5126 Introduction to Applied Statistics
Barbu, Adrian
Office: 308 BIO UNIT 1. Office hours: T/R 3:00-4:00pm. Textbook: Probability and Statistics With R, Ugarte. Course goal: to confidently carry out statistical analyses using the R system. Prerequisite: one of STA 2122, 2171, 3032, 4322. Fall 2012. Class meeting time
PSIS: 59007000 Frequencies and Statistical Comparisons
Kavanagh, Karen L.
There are several things in the Frequencies and Statistical Comparisons report that are important to keep in mind when interpreting your results. Oversamples and other non-randomly selected students are not included. Item numbers: item numbering follows the instrument. Count and column percentage (%): the Count column contains the number of students who selected
EXPERIMENTAL DESIGN: STATISTICAL CONSIDERATIONS AND ANALYSIS
Technology Transfer Automated Retrieval System (TEKTRAN)
In this book chapter, information on how field experiments in invertebrate pathology are designed and the data collected, analyzed, and interpreted is presented. The practical and statistical issues that need to be considered and the rationale and assumptions behind different designs or procedures ...
Towards a Pedagogic Theory of Interpreting: Learning to Interpret, or Interpreting to Learn?
ERIC Educational Resources Information Center
Pollock, Richard
Interpreting, or oral translation, can be used in undergraduate second language study as a technique for developing oral skills. The technique can produce both sound linguistic judgment and confident oral and written performance that last beyond testing. It integrates four essential elements of oral/aural skills: comprehension, phonology/fluency,…
University Interpreting: Linguistic Issues for Consideration.
ERIC Educational Resources Information Center
Napier, Jemina
2002-01-01
A study investigated 10 Auslan/English interpreters' use of translation style when interpreting for a university lecture. Results found the interpreters predominantly used a free or literal interpretation approach, but switched between translation styles at particular points of a text, leading to the suggestion of the concept of translational…
Interpreting Recoil for Undergraduate Students
NASA Astrophysics Data System (ADS)
Elsayed, Tarek A.
2012-04-01
The phenomenon of recoil is usually explained to students in the context of Newton's third law. Typically, when a projectile is fired, the recoil of the launch mechanism is interpreted as a reaction to the ejection of the smaller projectile. The same phenomenon is also interpreted in the context of the conservation of linear momentum, which is closely related to Newton's third law. Since the actual microscopic causes of recoil differ from one problem to another, some students (and teachers) may not be satisfied with understanding recoil through the principles of conservation of linear momentum and Newton's third law. For these students, the origin of the recoil motion should be presented in more depth.
Inuit interpretations of sleep paralysis.
Law, Samuel; Kirmayer, Laurence J
2005-03-01
Traditional and contemporary Inuit concepts of sleep paralysis were investigated through interviews with elders and young people in Iqaluit, Baffin Island. Sleep paralysis was readily recognized by most respondents and termed uqumangirniq (in the Baffin region) or aqtuqsinniq (Kivalliq region). Traditional interpretations of uqumangirniq referred to a shamanistic cosmology in which the individual's soul was vulnerable during sleep and dreaming. Sleep paralysis could result from attack by shamans or malevolent spirits. Understanding the experience as a manifestation of supernatural power, beyond one's control, served to reinforce the experiential reality and presence of the spirit world. For contemporary youth, sleep paralysis was interpreted in terms of multiple frameworks that incorporated personal, medical, mystical, traditional/shamanistic, and Christian views, reflecting the dynamic social changes taking place in this region. PMID:15881270
Phonological Interpretation into Preordered Algebras
NASA Astrophysics Data System (ADS)
Kubota, Yusuke; Pollard, Carl
We propose a novel architecture for categorial grammar that clarifies the relationship between semantically relevant combinatoric reasoning and semantically inert reasoning that only affects surface-oriented phonological form. To this end, we employ a level of structured phonology that mediates between syntax (abstract combinatorics) and phonology proper (strings). To notate structured phonologies, we employ a lambda calculus analogous to the λ-terms of [8]. However, unlike Oehrle's purely equational λ-calculus, our phonological calculus is inequational, in a way that is strongly analogous to the functional programming language LCF [10]. Like LCF, our phonological terms are interpreted into a Henkin frame of posets, with degree of definedness ('height' in the preorder that interprets the base type) corresponding to degree of pronounceability; only maximal elements are actual strings and therefore fully pronounceable. We illustrate with an analysis (also new) of some complex constituent-order phenomena in Japanese.
An Experimental Approach to Teaching and Learning Elementary Statistical Mechanics
ERIC Educational Resources Information Center
Ellis, Frank B.; Ellis, David C.
2008-01-01
Introductory statistical mechanics is studied for a simple two-state system using an inexpensive and easily built apparatus. A large variety of demonstrations, suitable for students in high school and introductory university chemistry courses, are possible. This article details demonstrations for exothermic and endothermic reactions, the dynamic…
NSDL National Science Digital Library
Dexter Perkins
This short problem set works well as a group activity that can be completed in class. The purpose of the exercise is for students to begin to think about T-X phase diagrams and how they are interpreted. Along the way, students learn that text book authors sometimes make mistakes. The figure in the handout is from Winter's Petrology. But, Winter goofed and left some reactions off of the phase diagram.
Operational interpretations of quantum discord
Cavalcanti, D.; Modi, K.; Aolita, L.; Boixo, S.; Piani, M.; Winter, A.
2011-03-15
Quantum discord quantifies nonclassical correlations beyond the standard classification of quantum states into entangled and unentangled. Although it has received considerable attention, it still lacks any precise interpretation in terms of some protocol in which quantum features are relevant. Here we give quantum discord its first information-theoretic operational meaning in terms of entanglement consumption in an extended quantum-state-merging protocol. We further relate the asymmetry of quantum discord with the performance imbalance in quantum state merging and dense coding.
Interpreted Applications within BOINC Infrastructure
Fernandez, Thomas
Daniel Lombraña González, Francisco Fernández de Vega, L. Trujillo, G. Olague, M. Cárdenas, L. Araujo, P. Castillo, K. Sharman, A. Affiliations: CICESE, Ceta-Ciemat, UNED, University of Granada.
Interpreting neurodynamics: concepts and facts
Harald Atmanspacher; Stefan Rotter
2008-01-01
The dynamics of neuronal systems, briefly neurodynamics, has developed into an attractive and influential research branch within neuroscience. In this paper, we discuss a number of conceptual issues in neurodynamics that are important for an appropriate interpretation and evaluation of its results. We demonstrate their relevance for selected topics of theoretical and empirical work. In particular, we refer to the
To estimate vapor pressure easily
Yaws, C.L.; Yang, H.C.
1989-10-01
Vapor pressures as functions of temperature for approximately 700 major organic chemical compounds are given. The tabulation also gives the temperature range for which the data are applicable. Minimum and maximum temperatures are denoted by TMIN and TMAX. The Antoine equation that correlates vapor pressure as a function of temperature is described. A representative comparison of calculated and actual data values for vapor pressure is shown for ethyl alcohol. The coefficient tabulation is based on both literature (experimental data) and estimated values.
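The Antoine correlation is straightforward to evaluate. A sketch using commonly quoted ethanol coefficients in the mmHg/°C form follows; treat the coefficients and their valid temperature range as assumptions to check against the tabulation (and its TMIN/TMAX limits), not as the paper's own values.

```python
def antoine_pressure_mmhg(A, B, C, t_celsius):
    """Vapor pressure from the Antoine correlation
    log10(P) = A - B / (C + T), with P in mmHg and T in deg C."""
    return 10 ** (A - B / (C + t_celsius))

# Illustrative ethanol coefficients for the mmHg/Celsius form of the
# correlation; verify against the published tabulation before real use.
A, B, C = 8.20417, 1642.89, 230.300
p = antoine_pressure_mmhg(A, B, C, 78.32)  # near the normal boiling point
print(round(p))  # close to 760 mmHg, as expected at the boiling point
```

Extrapolating outside the stated TMIN-TMAX range is the usual way such correlations go wrong, hence the paper's emphasis on reporting the applicable interval.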
ERIC Educational Resources Information Center
Stanly, Pat
2009-01-01
Rough patches occur at both ends of the education pipeline, as students enter community colleges and move on to work or enrollment in four-year institutions. Career pathways--sequences of coherent, articulated, and rigorous career and academic courses that lead to an industry-recognized certificate or a college degree--are a promising approach to…
Attiya, Hagit
Clearly, α′|p_j ≠ α|p_j. It can easily be shown that p_j distinguishes between α and α′, and therefore, α ≠ α′. We now show that α′ is unique. If B_k ≠ {p_j} (Case (1)), then clearly, there is another processor p_i ∈ B_k. If B_k = {p_j} (Case (2)), then there is another processor p_i ∈ B_{k+1}: B
Consistent interpretations of quantum mechanics
Omnes, R. )
1992-04-01
Within the last decade, significant progress has been made towards a consistent and complete reformulation of the Copenhagen interpretation (an interpretation consisting in a formulation of the experimental aspects of physics in terms of the basic formalism; it is consistent if free from internal contradiction and complete if it provides precise predictions for all experiments). The main steps involved decoherence (the transition from linear superpositions of macroscopic states to a mixing), Griffiths histories describing the evolution of quantum properties, a convenient logical structure for dealing with histories, and also some progress in semiclassical physics, which was made possible by new methods. The main outcome is a theory of phenomena, viz., the classically meaningful properties of a macroscopic system. It shows in particular how and when determinism is valid. This theory can be used to give a deductive form to measurement theory, which now covers some cases that were initially devised as counterexamples against the Copenhagen interpretation. These theories are described, together with their applications to some key experiments and some of their consequences concerning epistemology.
ISIS - Gulf interactive interpretation system
Caldwell, J.H.; Valuikas, P.S.; Platek, R.M.; Casavant, R.R.
1984-04-01
For 2 years, Gulf Exploration and Production Co. has successfully used a powerful computer system for interactive graphic interpretation of large and diverse volumes of exploration data. This proprietary system, developed by Gulf Research and Development Co., is called ISIS (Interactive Seismic Interpretation System). Some of the capabilities of ISIS are demonstrated using videotape recordings of 3 actual interpretation sessions. The first session comprises interactive log analysis--editing formation evaluation, and tying between wells. The second session involves regional mapping from a large data base of seismic lines and well logs. Numerous access and display features allow projects exceeding 20,000 line-mi (32,000 line-km) to be instantly available at the interactive station, replacing large volumes of paper records. Horizons can be carried around loops and tied, then posted and contoured automatically. The third session demonstrates detailed reservoir characterization at a mature field. Over 225 digitized well logs are gridded and then analyzed using interactive graphic software originally developed for 3D seismic surveys.
Comparison of a Novel Computerized Analysis Program and Visual Interpretation of Cardiotocography
Chen, Chen-Yu; Yu, Chun; Chang, Chia-Chen; Lin, Chii-Wann
2014-01-01
Objective To compare a novel computerized analysis program with visual cardiotocography (CTG) interpretation results. Methods Sixty-two intrapartum CTG tracings with 20- to 30-minute sections were independently interpreted using a novel computerized analysis program, as well as the visual interpretations of eight obstetricians, to evaluate the baseline fetal heart rate (FHR), baseline FHR variability, number of accelerations, number/type of decelerations, uterine contraction (UC) frequency, and the National Institute of Child Health and Human Development (NICHD) 3-Tier FHR classification system. Results There was no significant difference in interobserver variation after adding the components of computerized analysis to results from the obstetricians' visual interpretations, with excellent agreement for the baseline FHR (ICC 0.91), the number of accelerations (ICC 0.85), UC frequency (ICC 0.97), and NICHD category I (kappa statistic 0.91); good agreement for baseline variability (kappa statistic 0.68), the numbers of early decelerations (ICC 0.78) and late decelerations (ICC 0.67), category II (kappa statistic 0.78), and overall categories (kappa statistic 0.80); and moderate agreement for the number of variable decelerations (ICC 0.60), and category III (kappa statistic 0.50). Conclusions This computerized analysis program is not inferior to visual interpretation, may improve interobserver variations, and could play a vital role in prenatal telemedicine. PMID:25437442
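The agreement statistics used above (kappa) are chance-corrected. Here is a from-scratch Cohen's kappa on invented NICHD-style labels for two raters; this is a sketch of the computation only, not the study's eight-reader analysis.

```python
def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa for two raters over the same cases: observed
    agreement corrected for the agreement expected by chance."""
    n = len(ratings_a)
    cats = set(ratings_a) | set(ratings_b)
    po = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    pe = sum((ratings_a.count(c) / n) * (ratings_b.count(c) / n)
             for c in cats)
    return (po - pe) / (1 - pe)

# Two hypothetical readers classifying ten CTG tracings into categories.
r1 = ["I", "I", "II", "II", "II", "I", "III", "II", "I", "II"]
r2 = ["I", "I", "II", "II", "I", "I", "III", "II", "I", "II"]
print(round(cohens_kappa(r1, r2), 2))  # 0.83, "good" agreement
```

The ICC values in the abstract play the analogous role for continuous measures such as baseline FHR.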
Interpretation of a compositional time series
NASA Astrophysics Data System (ADS)
Tolosana-Delgado, R.; van den Boogaart, K. G.
2012-04-01
Common methods for multivariate time series analysis use linear operations, from the definition of a time-lagged covariance/correlation to the prediction of new outcomes. However, when the time series response is a composition (a vector of positive components showing the relative importance of a set of parts in a total, like percentages and proportions), then linear operations are afflicted by several problems. For instance, it has long been recognised that (auto/cross-)correlations between raw percentages are spurious, more dependent on which other components are being considered than on any natural link between the components of interest. Also, a long-term forecast of a composition in models with a linear trend will ultimately predict negative components. In general terms, compositional data should not be treated on the raw scale, but after a log-ratio transformation (Aitchison, 1986: The statistical analysis of compositional data. Chapman and Hall). This is so because the information conveyed by compositional data is relative, as stated in their definition. The principle of working in coordinates allows one to apply any sort of multivariate analysis to a log-ratio transformed composition, as long as this transformation is invertible. This principle applies fully to time series analysis. We will discuss how results (both auto/cross-correlation functions and predictions) can be back-transformed, viewed and interpreted in a meaningful way. One view is to use the exhaustive set of all possible pairwise log-ratios, which allows one to express the results as D(D - 1)/2 separate, interpretable sets of one-dimensional models showing the behaviour of each possible pairwise log-ratio. Another view is the interpretation of estimated coefficients or correlations back-transformed in terms of compositions. These two views are compatible and complementary. These issues are illustrated with time series of seasonal precipitation patterns at different rain gauges of the USA.
In this data set, the proportion of annual precipitation falling in winter, spring, summer and autumn is considered a 4-component time series. Three invertible log-ratios are defined for calculations, balancing rainfall in autumn vs. winter, in summer vs. spring, and in autumn-winter vs. spring-summer. Results suggest a 2-year correlation range, and certain oscillatory behaviour in the last balance, which does not occur in the other two.
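The invertible log-ratio principle described above can be sketched in a few lines. The paper works with balances (a particular invertible log-ratio basis); this illustration uses the simpler additive log-ratio (alr) transform instead, and the seasonal precipitation shares are made up:

```python
import numpy as np

def alr(x):
    """Additive log-ratio transform: a D-part composition -> R^(D-1),
    using the last part as the common denominator."""
    x = np.asarray(x, dtype=float)
    return np.log(x[..., :-1] / x[..., -1:])

def alr_inv(y):
    """Inverse alr: map coordinates back to a composition summing to 1."""
    y = np.asarray(y, dtype=float)
    parts = np.concatenate([np.exp(y), np.ones(y.shape[:-1] + (1,))], axis=-1)
    return parts / parts.sum(axis=-1, keepdims=True)

# Hypothetical shares of annual precipitation (winter, spring, summer, autumn).
seasons = np.array([0.35, 0.25, 0.15, 0.25])
coords = alr(seasons)            # 3 coordinates for a 4-part composition
restored = alr_inv(coords)
print(np.allclose(restored, seasons))  # True: the transform is invertible
```

Any standard time series model can then be fitted to `coords` and its forecasts mapped back with `alr_inv`, guaranteeing positive components that sum to one.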
Evaluation of Computer Simulated Baseline Statistics for Use in Item Bias Studies. [Revised].
ERIC Educational Resources Information Center
Rogers, H. Jane; Hambleton, Ronald K.
Although item bias statistics are widely recommended for use in test development and test analysis work, problems arise in their interpretation. The purpose of the present research was to evaluate the validity of logistic test models and computer simulation methods for providing a frame of reference for item bias statistic interpretations.…
Assessment of Materials for Engaging Students in Statistical Discovery
ERIC Educational Resources Information Center
Froelich, Amy G.; Duckworth, William M.
2008-01-01
As part of an NSF funded project we developed new course materials for a general introductory statistics course designed to engage students in statistical discovery. The materials were designed to actively involve students in the design and implementation of data collection and the analysis and interpretation of the resulting data. Our overall…
The Effect Size Statistic: Overview of Various Choices.
ERIC Educational Resources Information Center
Mahadevan, Lakshmi
Over the years, methodologists have been recommending that researchers use magnitude of effect estimates in result interpretation to highlight the distinction between statistical and practical significance (cf. R. Kirk, 1996). A magnitude of effect statistic (i.e., effect size) tells to what degree the dependent variable can be controlled,…
Faculty Salary Equity Cases: Combining Statistics with the Law
ERIC Educational Resources Information Center
Luna, Andrew L.
2006-01-01
Researchers have used many statistical models to determine whether an institution's faculty pay structure is equitable, with varying degrees of success. Little attention, however, has been given to court interpretations of statistical significance or to what variables courts have acknowledged should be used in an equity model. This article…
The Insignificance of Statistical Significance Testing, by Douglas H. Johnson
Steury, Todd D.
For example, journals of the American Psychological Association and others such as The Journal of Wildlife Management publish many statistical hypothesis tests, although such tests add very little value to the products of research; the statistical hypothesis is distinct from the research hypothesis, which is what the investigator is actually interested in. The paper describes how statistical hypothesis tests are often viewed, and then contrasts that interpretation with the correct one.
ALISE Library and Information Science Education Statistical Report, 1999.
ERIC Educational Resources Information Center
Daniel, Evelyn H., Ed.; Saye, Jerry D., Ed.
This volume is the twentieth annual statistical report on library and information science (LIS) education published by the Association for Library and Information Science Education (ALISE). Its purpose is to compile, analyze, interpret, and report statistical (and other descriptive) information about library/information science programs offered by…
UNSUPERVISED MINING OF STATISTICAL TEMPORAL STRUCTURES IN VIDEO
Lexing Xie; Shih-Fu Chang; Ajay Divakaran; Huifang Sun
2003-01-01
In this paper, we present algorithms for unsupervised mining of structures in video using multi-scale statistical models. Video structures are repetitive segments in a video stream with consistent statistical characteristics. Such structures can often be interpreted in relation to distinctive semantics, particularly in structured domains like sports. While much work in the literature explores the link between…
Interpreting Central Surface Brightness and Color Profiles in Elliptical Galaxies
NASA Astrophysics Data System (ADS)
Silva, David R.; Wise, Michael W.
1996-01-01
Hubble Space Telescope imagery has revealed dust features in the central regions of many (50%--80%) nearby bright elliptical galaxies. If these features are an indication of an underlying smooth diffuse dust distribution, then the interpretation of central surface brightness and color profiles in elliptical galaxies becomes significantly more difficult. In this Letter, diagnostics for constraining the presence of such an underlying central dust distribution are presented. We show that easily detectable central color gradients and flattened central surface brightness profiles can be induced by even small amounts of smoothly distributed dust (~100 M⊙). Conversely, combinations of flat surface brightness profiles and flat color gradients or steep surface brightness profiles and steep color gradients are unlikely to be caused by dust. Taken as a whole, these results provide a simple observational diagnostic for constraining the existence of smooth diffuse dust distributions in the central regions of elliptical galaxies.
Glaciation of northwestern Wyoming interpreted from ERTS-1
NASA Technical Reports Server (NTRS)
Breckenridge, R. M.
1973-01-01
Analysis of ERTS Imagery has shown a number of alpine glacial features can be recognized and mapped successfully. Although the Wyoming mountains are generally regarded as the type locality for Rocky Mountain glaciation some areas have not been studied from a glacial standpoint because of inaccessibility or lack of topographic control. ERTS imagery provides an excellent base for this type of regional geomorphic study. A map of maximum extent of Wisconsin Ice, flow directions and major glacial features was compiled from interpretation of the ERTS imagery. Features which can be mapped are large moraines, outwash fans and terraces. Present-day glaciers and snowfields are easily discriminated and mapped. Glaciers and glacial deposits which serve as aquifers play a significant role in the hydrologic cycle and are important because of the increasing demand placed on our water resources. ERTS provides a quick and effective method for change detection and inventory of these vital resources.
The wetland continuum: a conceptual framework for interpreting biological studies
Euliss, N.H., Jr.; LaBaugh, J.W.; Fredrickson, L.H.; Mushet, D.M.; Swanson, G.A.; Winter, T.C.; Rosenberry, D.O.; Nelson, R.D.
2004-01-01
We describe a conceptual model, the wetland continuum, which allows wetland managers, scientists, and ecologists to consider simultaneously the influence of climate and hydrologic setting on wetland biological communities. Although multidimensional, the wetland continuum is most easily represented as a two-dimensional gradient, with ground water and atmospheric water constituting the horizontal and vertical axis, respectively. By locating the position of a wetland on both axes of the continuum, the potential biological expression of the wetland can be predicted at any point in time. The model provides a framework useful in the organization and interpretation of biological data from wetlands by incorporating the dynamic changes these systems undergo as a result of normal climatic variation rather than placing them into static categories common to many wetland classification systems. While we developed this model from the literature available for depressional wetlands in the prairie pothole region of North America, we believe the concept has application to wetlands in many other geographic locations.
Statistical Project Work: The Pretoria Experience
NSDL National Science Digital Library
Schoeman, H.S.
This article, created by H.S. Schoeman and A.G.W. Steyn of the University of Pretoria, defines the importance, organization, and evaluation of undergraduate project work. This lesson plan is congruent with the authors' teaching; they state: "Since it is the ability of the statistician to apply his subject in practice which makes him a statistician rather than a mathematician, and the ability to cope with real world situations develops from professional experience and a maturity of outlook, the inclusion of project work as part of the statistics curriculum is seen as important by the Department of Statistics at the University of Pretoria." This is an excellent introduction to the curriculum of different statistics programs around the country. This could easily be implemented in portions of other programs.
ERIC Educational Resources Information Center
van Dijk, Rick; Boers, Eveline; Christoffels, Ingrid; Hermans, Daan
2011-01-01
The quality of interpretations produced by sign language interpreters was investigated. Twenty-five experienced interpreters were instructed to interpret narratives from (a) spoken Dutch to Sign Language of the Netherlands (SLN), (b) spoken Dutch to Sign Supported Dutch (SSD), and (c) SLN to spoken Dutch. The quality of the interpreted narratives…
Computer generation of nuclear spin species and nuclear spin statistical weights
Balasubramanian, K.
1982-01-01
This article develops computer programs for computer generation of nuclear spin species and nuclear spin statistical weights of rovibronic levels. The programs developed here generate nuclear spin species and statistical weights from the group structures known as generalized character cycle indices (GCCIs) which are computed easily from the character table of the PI group of the molecule under consideration. Procedures are illustrated with examples.
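For the simplest case of a homonuclear diatomic with two equivalent nuclei of spin I, the counts the abstract refers to reduce to a closed formula (a sketch of the standard symmetric/antisymmetric pair counting, not the GCCI machinery of the paper): symmetric pair spin functions number (2I+1)(I+1) and antisymmetric ones (2I+1)I.

```python
from fractions import Fraction

def diatomic_spin_weights(I):
    """Nuclear spin statistical weights for a homonuclear diatomic
    with two equivalent nuclei of spin I: (symmetric, antisymmetric)."""
    n = 2 * Fraction(I) + 1          # spin multiplicity of one nucleus
    sym = n * (n + 1) / 2            # symmetric pair spin functions
    antisym = n * (n - 1) / 2        # antisymmetric pair spin functions
    return int(sym), int(antisym)

print(diatomic_spin_weights(Fraction(1, 2)))  # H2: (3, 1), ortho:para = 3:1
print(diatomic_spin_weights(1))               # D2 (I = 1): (6, 3)
```

The GCCI approach generalizes this counting to arbitrary molecular symmetry groups.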
ERROR MODELS FOR LIGHT SENSORS BY STATISTICAL ANALYSIS OF RAW SENSOR MEASUREMENTS
Potkonjak, Miodrag
Koushanfar, F., et al. A light sensor is a silicon solar cell that converts light impulses directly into electrical charges that can easily be measured. Error models for light sensors support sensor-based systems, including calibration, sensor fusion, and power management. We developed a system of statistical error models by analysis of raw sensor measurements.
Statutory Interpretation and Constitutional Legislation
Feldman, David
University of Cambridge, and Fellow of Downing College, Cambridge. For illuminating insights, advice and criticisms, I am grateful to John Allison, John Bell, Damien Bruneau, Nicky Padfield, Anat Scolnicov, Yvonne Tew, Jason Varuhas, Se-shauna Wheatle... "a constitution, a mechanism under which laws are to be made, not a mere act which declares what the law is to be." Anthony Mason, "Trends in constitutional interpretation" (1995) 18 U.N.S.W.L.J. 237-249 at 238; Laurence H. Tribe and Michael C. Dorf...
Interpretations of Elastic Electron Scattering
T. W. Donnelly; D. K. Hasell; R. G. Milner
2015-05-18
Elastic scattering of relativistic electrons from the nucleon yields Lorentz invariant form factors that describe the fundamental distribution of charge and magnetism. The spatial dependence of the nucleon's charge and magnetism is typically interpreted in the Breit reference frame which is related by a Lorentz boost from the laboratory frame, where the nucleon is at rest. We construct a toy model to estimate how the charge and magnetic radii of the nucleon are modified between the Breit and lab. frames. This has implications for the ratio of the proton electric to magnetic elastic form factors as a function of momentum transfer as well as for determinations of the proton charge radius.
The Importance of Statistics Education
Utts, Jessica
Slide presentation. Topics include: examples of statistical decisions in daily life; why statistics education matters; the GAISE Report; Common Core for grades K to 5 (using data displays to ask and answer questions); and teaching strategies.
FISHERY STATISTICS OF THE UNITED STATES
FISHERY STATISTICS OF THE UNITED STATES 1972, STATISTICAL DIGEST NO. 66. Prepared by the Statistics and Market News Division. The data in this edition of "Fishery Statistics of the United States" were collected in cooperation with the various States and tabulated by the staff of the Statistics and Market News Division.
Biostratinomic utility of Archimedes in environmental interpretation
Wulff, J.I.
1990-04-01
Biostratinomic information from the bryozoan Archimedes can be used to infer paleocurrent senses when other more traditional sedimentary structures are lacking. As with other elongate particles, Archimedes zoaria become oriented in the current and, upon settling, preserve a sense of the flow direction. Orientations and lengths were measured on over 200 individuals from bedding plane exposures in the Upper Mississippian Union Limestone (Greenbrier Group) of West Virginia. These were separated into long and short populations and plotted on rose diagrams. The results show that long and short segments become preferentially oriented in the current and the bimodally distributed long segments can be used to infer the current sense. The current sense is defined by the line which bisects the obtuse angle created by the two maxima in the rose diagram for long segments. Statistical evaluation of the long and short populations indicates they are significant to the 99.9 percent level. Elongate fossils such as Archimedes can be used in paleocurrent evaluations and can add more detail to the interpretation of paleodepositional conditions.
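The rose-diagram rule above (current sense = the line bisecting the obtuse angle between the two orientation maxima of the long segments) is easy to state in code. This is a sketch of that geometric rule; the helper name and the example angles are hypothetical, not measurements from the paper:

```python
def obtuse_bisector(theta1, theta2):
    """Bisector, in degrees on [0, 180), of the obtuse angle between
    two orientation maxima of axial (undirected) data."""
    # Angular separation of the two orientations, folded into [0, 180).
    sep = abs(theta1 - theta2) % 180
    acute = min(sep, 180 - sep)
    # The midpoint bisects the sector swept from theta1 to theta2;
    # the perpendicular line bisects the complementary sector.
    mid = ((theta1 + theta2) / 2) % 180
    if sep == acute:               # midpoint bisects the acute sector,
        mid = (mid + 90) % 180     # so rotate to the obtuse-sector bisector
    return mid

print(obtuse_bisector(10, 50))    # maxima 40 deg apart -> bisector at 120.0
print(obtuse_bisector(10, 150))   # maxima 140 deg apart -> bisector at 80.0
```

In practice the two maxima would first be estimated from the binned orientation histogram (the rose diagram) of the long-segment population.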
Testing photons' Bose-Einstein statistics with Compton scattering
Altschul, Brett [Department of Physics and Astronomy, University of South Carolina, Columbia, South Carolina 29208 (United States)
2010-11-15
It is an empirical question whether photons always obey Bose-Einstein statistics, but devising and interpreting experimental tests of photon statistics can be a challenge. The nonrelativistic cross section for Compton scattering illustrates how a small admixture ν of wrong-sign statistics leads to a loss of gauge invariance; there is a large anomalous amplitude for scattering timelike photons. Nevertheless, one can interpret the observed transparency of the solar wind plasma at low frequencies as a bound ν < 10⁻²⁵ if Lorentz symmetry is required. If there is instead a universal preferred frame, the bound is ν < 10⁻¹⁴, still strong compared with previous results.
Finding One Variable Statistics With a Graphing Calculator
NSDL National Science Digital Library
2012-08-28
This quick YouTube video from high school statistics teacher Roger W. Davis explains how to find one-variable statistics using the TI-84 graphing calculator. The demonstration goes through three steps: entering the data, calculating one-variable statistics from the STAT menu, and interpreting the results. The reported statistics include the mean, sum, median, and more. Flash player is required to view this video, and the running time for the clip is 3:12.
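The same one-variable summary the calculator's 1-Var Stats command reports has a direct analogue in Python's standard library. A minimal sketch, using a made-up data set rather than the video's example:

```python
import statistics

# Hypothetical data, as one might enter into list L1 on the TI-84.
data = [12, 15, 11, 19, 15, 22, 18]

print("mean   =", statistics.mean(data))    # x-bar: prints 16
print("sum    =", sum(data))                # sigma-x: prints 112
print("median =", statistics.median(data))  # prints 15
print("stdev  =", round(statistics.pstdev(data), 2))  # population sigma-x
```

`statistics.pstdev` is the population standard deviation (the calculator's σx); `statistics.stdev` gives the sample version (Sx).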
Statistics Poker: Reinforcing Basic Statistical Concepts
ERIC Educational Resources Information Center
Leech, Nancy L.
2008-01-01
Learning basic statistical concepts does not need to be tedious or dry; it can be fun and interesting through cooperative learning in the small-group activity of Statistics Poker. This article describes a teaching approach for reinforcing basic statistical concepts that can help students who have high anxiety and makes learning and reinforcing…
Fluid interpretation of Cardassian expansion
NASA Astrophysics Data System (ADS)
Gondolo, Paolo; Freese, Katherine
2003-09-01
A fluid interpretation of Cardassian expansion is developed. Here, the Friedmann equation takes the form H² = g(ρ_M), where ρ_M contains only matter and radiation (no vacuum). The function g(ρ_M) returns to the usual 8πρ_M/(3m_pl²) during the early history of the Universe, but takes a different form that drives an accelerated expansion after a redshift z ≈ 1. One possible interpretation of this function (and of the right-hand side of Einstein's equations) is that it describes a fluid with total energy density ρ_tot = (3m_pl²/8π)g(ρ_M) = ρ_M + ρ_K, containing not only matter density (mass times number density) but also interaction terms ρ_K. These interaction terms give rise to an effective negative pressure which drives cosmological acceleration. These interactions may be due to interacting dark matter, e.g. with a fifth force between particles F ∼ r^(α−1). Such interactions may be intrinsically four-dimensional or may result from higher-dimensional physics. A fully relativistic fluid model is developed here, with conservation of energy, momentum, and particle number. A modified Poisson's equation is derived. A study of fluctuations in the early Universe is presented, although a fully relativistic treatment of the perturbations including gauge choice is as yet incomplete.
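The relations quoted in the abstract can be collected in display form (a sketch using standard notation, with m_pl the Planck mass; the early-universe limit is the ordinary Friedmann equation):

```latex
H^2 = g(\rho_M), \qquad
g(\rho_M) \;\to\; \frac{8\pi \rho_M}{3 m_{\rm pl}^2}
\quad \text{(early universe)},
\qquad
\rho_{\rm tot} = \frac{3 m_{\rm pl}^2}{8\pi}\, g(\rho_M)
             = \rho_M + \rho_K .
```

Acceleration at z ≲ 1 then corresponds to the interaction term ρ_K contributing an effective negative pressure.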
Interpretation of the Cosmological Metric
Richard J. Cook; M. Shane Burns
2008-09-03
The cosmological Robertson-Walker metric of general relativity is often said to have the consequences that (1) the recessional velocity $v$ of a galaxy at proper distance $\ell$ obeys the Hubble law $v=H\ell$, and therefore galaxies at sufficiently great distance $\ell$ are receding faster than the speed of light $c$; (2) faster than light recession does not violate special relativity theory because the latter is not applicable to the cosmological problem, and because ``space itself is receding'' faster than $c$ at great distance, and it is velocity relative to local space that is limited by $c$, not the velocity of distant objects relative to nearby ones; (3) we can see galaxies receding faster than the speed of light; and (4) the cosmological redshift is not a Doppler shift, but is due to a stretching of photon wavelength during propagation in an expanding universe. We present a particular Robertson-Walker metric (an empty universe metric) for which a coordinate transformation shows that none of these interpretations necessarily holds. The resulting paradoxes of interpretation lead to a deeper understanding of the meaning of the cosmological metric.
NIMH Health & Education Statistics pages (prevalence, disability, suicide, cost, global): ... those affected receive treatment. The information on these statistics pages includes the best statistics currently available on ...
10 CFR 76.6 - Interpretations.
Code of Federal Regulations, 2013 CFR
2013-01-01
...Interpretations. 76.6 Section 76.6 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) CERTIFICATION OF GASEOUS DIFFUSION PLANTS General Provisions § 76.6 Interpretations. Except as specifically authorized by the Commission in...
10 CFR 76.6 - Interpretations.
Code of Federal Regulations, 2014 CFR
2014-01-01
...Interpretations. 76.6 Section 76.6 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) CERTIFICATION OF GASEOUS DIFFUSION PLANTS General Provisions § 76.6 Interpretations. Except as specifically authorized by the Commission in...
10 CFR 76.6 - Interpretations.
Code of Federal Regulations, 2012 CFR
2012-01-01
...Interpretations. 76.6 Section 76.6 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) CERTIFICATION OF GASEOUS DIFFUSION PLANTS General Provisions § 76.6 Interpretations. Except as specifically authorized by the Commission in...
Harrison's interpretation of the cosmological redshift revisited
Valerio Faraoni
2009-08-24
Harrison's argument against the interpretation of the cosmological redshift as a Doppler effect is revisited, exaggerated, and discussed. The context, purpose, and limitations of the interpretations of this phenomenon are clarified.
Scheme86: A System for Interpreting Scheme
Berlin, Andrew A.
1988-04-01
Scheme86 is a computer system designed to interpret programs written in the Scheme dialect of Lisp. A specialized architecture, coupled with new techniques for optimizing register management in the interpreter, allows ...
10 CFR 20.1006 - Interpretations.
Code of Federal Regulations, 2010 CFR
2010-01-01
...2010-01-01 false Interpretations. 20.1006 Section 20.1006 Energy NUCLEAR REGULATORY COMMISSION STANDARDS FOR PROTECTION AGAINST RADIATION General Provisions § 20.1006 Interpretations. Except as specifically...
10 CFR 26.7 - Interpretations.
Code of Federal Regulations, 2012 CFR
2012-01-01
...2012-01-01 2012-01-01 false Interpretations. 26.7 Section 26.7 Energy NUCLEAR REGULATORY COMMISSION FITNESS FOR DUTY PROGRAMS Administrative Provisions § 26.7 Interpretations. Except as specifically authorized by the...
10 CFR 76.6 - Interpretations.
Code of Federal Regulations, 2010 CFR
2010-01-01
...Interpretations. 76.6 Section 76.6 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) CERTIFICATION OF GASEOUS DIFFUSION PLANTS General Provisions § 76.6 Interpretations. Except as specifically authorized by the Commission in...
10 CFR 76.6 - Interpretations.
Code of Federal Regulations, 2011 CFR
2011-01-01
...Interpretations. 76.6 Section 76.6 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) CERTIFICATION OF GASEOUS DIFFUSION PLANTS General Provisions § 76.6 Interpretations. Except as specifically authorized by the Commission in...
10 CFR 39.5 - Interpretations.
Code of Federal Regulations, 2010 CFR
2010-01-01
...COMMISSION LICENSES AND RADIATION SAFETY REQUIREMENTS FOR WELL LOGGING General Provisions § 39.5 Interpretations. Except as specifically authorized by the Commission in writing, no interpretation of the meaning of the regulations in...
10 CFR 36.5 - Interpretations.
Code of Federal Regulations, 2014 CFR
2014-01-01
...2014-01-01 false Interpretations. 36.5 Section 36.5 Energy NUCLEAR REGULATORY COMMISSION LICENSES AND RADIATION SAFETY REQUIREMENTS FOR IRRADIATORS General Provisions § 36.5 Interpretations. Except as specifically...
10 CFR 36.5 - Interpretations.
Code of Federal Regulations, 2013 CFR
2013-01-01
...2013-01-01 false Interpretations. 36.5 Section 36.5 Energy NUCLEAR REGULATORY COMMISSION LICENSES AND RADIATION SAFETY REQUIREMENTS FOR IRRADIATORS General Provisions § 36.5 Interpretations. Except as specifically...
Interpretation and clustering of handwritten student responses
Von Tish, Kelsey Leigh
2012-01-01
This thesis presents an interpretation and clustering framework for handwritten student responses on tablet computers. The ink analysis system is able to capture and interpret digital ink strokes for many types of classroom ...
Department of Statistical Science
Keinan, Alon
Graduate career titles include Corporate Strategy Manager, Senior Data Analyst, Statistician, Risk Analyst, Statistical Modeler, and Consultant; employers include Joslin Diabetes Center, LG Electronics, Monsanto, Risk Management Solutions, and Samsung.
Degree-of-Entailment Interpretations of Probability
Fitelson, Branden
A number of interpretations of probability have been offered which take probability to be a logical relation between an evidence statement (or a set of evidence statements) and a hypothesis. Such interpretations are sometimes regarded as subjectivistic, particularly by frequency theorists, but there is a class of interpretations of probability which treat it as a degree of entailment.
12 CFR 609.920 - Interpretations.
Code of Federal Regulations, 2010 CFR
2010-01-01
...CREDIT SYSTEM ELECTRONIC COMMERCE Interpretations and Definitions § 609.920 Interpretations. (a) E-SIGN preempts...requires that statutes and regulations be interpreted to allow E-commerce as long as the safeguards of E-SIGN are met and...
MANUAL FOR AERIAL PHOTO INTERPRETATION IN THE NATIONAL INVENTORY OF LANDSCAPES IN SWEDEN (NILS)
www-nils.slu.se. SLU, Department of Forest Resource Management and Geomatics, 901 83 Umeå, Sweden.
Comparison of Two Interpretations of Josephson Effect
I. M. Yurin
2008-09-16
This paper puts forward an interpretation of the Josephson effect based on the Alternative Theory of Superconductivity (ATS). A comparison of ATS- and BCS-based interpretations is provided. It is demonstrated that the ATS-based interpretation, unlike that based on BCS theory, does not require a revision of fundamentals of quantum physics.
Comprehension and Error Monitoring in Simultaneous Interpreters
ERIC Educational Resources Information Center
Yudes, Carolina; Macizo, Pedro; Morales, Luis; Bajo, M. Teresa
2013-01-01
In the current study we explored lexical, syntactic, and semantic processes during text comprehension in English monolinguals and Spanish/English (first language/second language) bilinguals with different experience in interpreting (nontrained bilinguals, interpreting students and professional interpreters). The participants performed an…
An Online Synchronous Test for Professional Interpreters
ERIC Educational Resources Information Center
Chen, Nian-Shing; Ko, Leong
2010-01-01
This article is based on an experiment designed to conduct an interpreting test for multiple candidates online, using web-based synchronous cyber classrooms. The test model was based on the accreditation test for Professional Interpreters produced by the National Accreditation Authority of Translators and Interpreters (NAATI) in Australia.…
Court Interpreting: The Anatomy of a Profession.
ERIC Educational Resources Information Center
de Jongh, Elena M.
For both translators and interpreters, language proficiency is only the starting point for professional work. The equivalence of both meaning and style are necessary for faithful translation. The legal interpreter or translator must understand the complex characteristics and style of legal language. Court interpreting is a relatively young…
Two Interpretations of the Discrimination Parameter
ERIC Educational Resources Information Center
Tuerlinckx, Francis; De Boeck, Paul
2005-01-01
In this paper we propose two interpretations for the discrimination parameter in the two-parameter logistic model (2PLM). The interpretations are based on the relation between the 2PLM and two stochastic models. In the first interpretation, the 2PLM is linked to a diffusion model so that the probability of absorption equals the 2PLM. The…
Shen, W.; Wang, Y.; Shi, X.; Shah, N.; Huggins, F.; Bollineni, S.; Seehra, M.; Huffman, G.
2007-01-01
Nonoxidative decomposition of ethane was conducted over monometallic Ni and bimetallic Fe-Ni catalysts on basic Mg(Al)O support to produce H2 free of CO and CO2 and easily purified carbon nanotubes, a potentially valuable byproduct. The Mg(Al)O support was prepared by calcination of synthetic MgAl-hydrotalcite with a Mg to Al ratio of 5. The catalysts were prepared by incipient wetness with total metal loadings of 5 wt %. The dehydrogenation of undiluted ethane was conducted at temperatures of 500, 650, and 700 °C. At 500 °C, the Ni/Mg(Al)O catalyst was highly active and very stable with 100% conversion of ethane to 20 vol % H2 and 80 vol % CH4. However, the bimetallic Fe-Ni/Mg(Al)O exhibited its best performance at 650 °C, yielding 65 vol % H2, 10 vol % CH4, and 25 vol % unreacted ethane. The product carbon was in the form of carbon nanotubes (CNT) at all three reaction temperatures, but the morphology of the CNT depended on both the catalyst composition and reaction temperature. The CNTs were formed by a tip-growth mechanism over the Mg(Al)O supported catalysts and were easily purified by a one-step dilute nitric acid treatment. Mössbauer spectroscopy, X-ray absorption fine structure spectroscopy, N2 adsorption-desorption isotherms, TEM, STEM, TGA, and XRD were used to characterize the catalysts and the CNT, revealing the catalytic mechanisms.
Interpretation of an urban scene using multi-channel radar imagery
NASA Technical Reports Server (NTRS)
Bryan, M. L.
1975-01-01
Four channel, SLAR imagery was studied by a group of individuals having no previous experience with either SLAR imagery or the urban area under scrutiny. This tactic was used because it was desired to define the nature of training needed when introducing people to radar imagery of urban scenes. Responses resulting from interpretations based on standard photointerpretation methods were subjected to a Chi-square analysis to determine the level of significance of the interpretations. For the urban scene studied, and for the two wavelengths (X band, 3.0 cm, and L band, 23.0 cm) and polarizations (HH and HV) used, several types of urban land use were easily and accurately identified. It is shown that little formal training is required for obtaining quite high interpretation accuracies from multi-channel radar images of some urban scenes.
Local kinetic interpretation of entropy production through reversed diffusion
NASA Astrophysics Data System (ADS)
Porporato, A.; Kramer, P. R.; Cassiani, M.; Daly, E.; Mattingly, J.
2011-10-01
The time reversal of stochastic diffusion processes is revisited with emphasis on the physical meaning of the time-reversed drift and the noise prescription in the case of multiplicative noise. The local kinematics and mechanics of free diffusion are linked to the hydrodynamic description. These properties also provide an interpretation of the Pope-Ching formula for the steady-state probability density function along with a geometric interpretation of the fluctuation-dissipation relation. Finally, the statistics of the local entropy production rate of diffusion are discussed in the light of local diffusion properties, and a stochastic differential equation for entropy production is obtained using the Girsanov theorem for reversed diffusion. The results are illustrated for the Ornstein-Uhlenbeck process.
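The Ornstein-Uhlenbeck illustration mentioned above can be reproduced numerically. This is a minimal simulation sketch with made-up parameters, not the paper's calculation; it checks the stationary variance σ²/(2θ) of the process dX = −θX dt + σ dW:

```python
import math
import random

# Euler-Maruyama simulation of an Ornstein-Uhlenbeck process
# dX = -theta * X dt + sigma dW  (hypothetical parameters for illustration)
theta, sigma, dt, n = 1.0, 0.5, 1e-3, 200_000
rng = random.Random(0)

x, xs = 0.0, []
for _ in range(n):
    x += -theta * x * dt + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0)
    xs.append(x)

# Sample second moment; should approach sigma**2 / (2 * theta) = 0.125
var = sum(v * v for v in xs) / len(xs)
print(var)
```

The same trajectory data could be reweighted via the Girsanov factor to study the reversed process, as the abstract describes.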
THINK Back: KNowledge-based Interpretation of High Throughput data.
Farfán, Fernando; Ma, Jun; Sartor, Maureen A; Michailidis, George; Jagadish, Hosagrahar V
2012-01-01
Results of high throughput experiments can be challenging to interpret. Current approaches have relied on bulk processing the set of expression levels, in conjunction with easily obtained external evidence, such as co-occurrence. While such techniques can be used to reason probabilistically, they are not designed to shed light on what any individual gene, or a network of genes acting together, may be doing. Our belief is that today we have the information extraction ability and the computational power to perform more sophisticated analyses that consider the individual situation of each gene. The use of such techniques should lead to qualitatively superior results. The specific aim of this project is to develop computational techniques to generate a small number of biologically meaningful hypotheses based on observed results from high throughput microarray experiments, gene sequences, and next-generation sequences. Through the use of relevant known biomedical knowledge, as represented in published literature and public databases, we can generate meaningful hypotheses that will aid biologists to interpret their experimental data. We are currently developing novel approaches that exploit the rich information encapsulated in biological pathway graphs. Our methods perform a thorough and rigorous analysis of biological pathways, using complex factors such as the topology of the pathway graph and the frequency in which genes appear on different pathways, to provide more meaningful hypotheses to describe the biological phenomena captured by high throughput experiments, when compared to other existing methods that only consider partial information captured by biological pathways. PMID:22536867
NASA Astrophysics Data System (ADS)
Griffiths, J. B.; Krtous, P.; Podolský, J.
2006-12-01
The basic properties of the C-metric are well known. It describes a pair of causally separated black holes which accelerate in opposite directions under the action of forces represented by conical singularities. However, these properties can be demonstrated much more transparently by making use of recently developed coordinate systems for which the metric functions have a simple factor structure. These enable us to obtain explicit Kruskal-Szekeres-type extensions through the horizons and construct two-dimensional conformal Penrose diagrams. We then combine these into a three-dimensional picture which illustrates the global causal structure of the spacetime outside the black hole horizons. Using both the weak field limit and some invariant quantities, we give a direct physical interpretation of the parameters which appear in the new form of the metric. For completeness, relations to other familiar coordinate systems are also discussed.
QUALITATIVE INTERPRETATION OF GALAXY SPECTRA
Sanchez Almeida, J.; Morales-Luis, A. B.; Terlevich, R.; Terlevich, E.; Cid Fernandes, R. E-mail: abml@iac.es E-mail: eterlevi@inaoep.mx
2012-09-10
We describe a simple step-by-step guide to qualitative interpretation of galaxy spectra. Rather than an alternative to existing automated tools, it is put forward as an instrument for quick-look analysis and for gaining physical insight when interpreting the outputs provided by automated tools. Though the recipe is for general application, it was developed for understanding the nature of the Automatic Spectroscopic K-means-based (ASK) template spectra. They resulted from the classification of all the galaxy spectra in the Sloan Digital Sky Survey data release 7, thus being a comprehensive representation of the galaxy spectra in the local universe. Using the recipe, we give a description of the properties of the gas and the stars that characterize the ASK classes, from those corresponding to passively evolving galaxies, to H II galaxies undergoing a galaxy-wide starburst. The qualitative analysis is found to be in excellent agreement with quantitative analyses of the same spectra. We compare the mean ages of the stellar populations with those inferred using the code STARLIGHT. We also examine the estimated gas-phase metallicity with the metallicities obtained using electron-temperature-based methods. A number of byproducts follow from the analysis. There is a tight correlation between the age of the stellar population and the metallicity of the gas, which is stronger than the correlations between galaxy mass and stellar age, and galaxy mass and gas metallicity. The galaxy spectra are known to follow a one-dimensional sequence, and we identify the luminosity-weighted mean stellar age as the affine parameter that describes the sequence. All ASK classes happen to have a significant fraction of old stars, although spectrum-wise they are outshined by the youngest populations. Old stars are metal-rich or metal-poor depending on whether they reside in passive galaxies or in star-forming galaxies.
Hunting Down Interpretations of the HERA Large-Q^2 data
John Ellis
1997-12-11
Possible interpretations of the HERA large-Q^2 data are reviewed briefly. The possibility of statistical fluctuations cannot be ruled out, and it seems premature to argue that the H1 and ZEUS anomalies are incompatible. The data cannot be explained away by modifications of parton distributions, nor do contact interactions help. A leptoquark interpretation would need a large tau-q branching ratio. Several R-violating squark interpretations are still viable despite all the constraints, and offer interesting experimental signatures, but please do not hold your breath.
Interface problems: Structural constraints on interpretation?
Frazier, Lyn; Clifton, Charles; Rayner, Keith; Deevy, Patricia; Koh, Sungryong; Bader, Markus
2006-01-01
Five experiments investigated the interpretation of quantified noun phrases in relation to discourse structure. They demonstrated, using questionnaire and on-line reading techniques, that readers in English prefer to give a quantified noun phrase in (VP-external) subject position a presuppositional interpretation, in which the noun phrase limits or restricts the interpretation of an already available set, rather than giving it a nonpresuppositional or existential interpretation, in which it introduces completely new entities into the discourse. Experiment 1 showed that readers prefer a presuppositional interpretation of three ships over the existential interpretation in Five ships appeared on the horizon. Three ships sank. Experiment 2 showed longer reading times in sentences that are disambiguated toward the existential interpretation than in sentences that permit the presuppositional interpretation. Experiment 3 suggested that the presuppositional preference is greater when the phrase three ships occurs outside the verb phrase than when it occurs inside the verb phrase. Experiment 4 showed that Korean subjects marked with a topic marker received more presuppositional interpretations than subjects marked with a nominative marker. Experiment 5 showed that German subjects in VP-external (but nontopic) position received more presuppositional interpretations than VP-internal subjects. The results suggest the syntactic position of a phrase is one determinant of its interpretation, as expected according to the mapping hypothesis of Diesing (1990). PMID:16050443
American Statistical Association: Statistics in Sports
NSDL National Science Digital Library
This section of the American Statistical Association website covers Statistics in Sports. Available here are a few older articles dealing with sports statistics and links to websites containing data for several professional and amateur sports, as well as websites with general news and information about sports, and a listing of official team websites for pro teams. A section called Statistics on the Web provides links to academic departments, conferences, and employers, while another section answers some frequently asked questions about sports statistics as a career. The website also provides an explanation of the Player Game Percentage (PGP) technique and uses the 2004 World Series as an example to demonstrate the technique. Educators will find a link to a website that offers suggestions of ways to incorporate sports statistics in the classroom.
Statistical Reference Datasets
National Institute of Standards and Technology Data Gateway
Statistical Reference Datasets (Web, free access) The Statistical Reference Datasets project is also supported by the Standard Reference Data Program. The purpose of this project is to improve the accuracy of statistical software by providing reference datasets with certified computational results that enable the objective evaluation of statistical software.
Physics 630 Statistical Physics
Kioussis, Nicholas
Physics 630 Statistical Physics, Spring 2005. Logistics: Lecture Room 1100 (Science I, 1st floor)… strongly the issue of problem solving and understanding of the main concepts in Statistical Physics and Statistical Mechanics at the 430 level. Textbook: Statistical Mechanics, by Kerson Huang, Wiley, 2nd Edition.
Festing, Michael F. W.
2014-01-01
The safety of chemicals, drugs, novel foods and genetically modified crops is often tested using repeat-dose sub-acute toxicity tests in rats or mice. It is important to avoid misinterpretations of the results as these tests are used to help determine safe exposure levels in humans. Treated and control groups are compared for a range of haematological, biochemical and other biomarkers which may indicate tissue damage or other adverse effects. However, the statistical analysis and presentation of such data poses problems due to the large number of statistical tests which are involved. Often, it is not clear whether a “statistically significant” effect is real or a false positive (type I error) due to sampling variation. The authors' conclusions appear to be reached somewhat subjectively from the pattern of statistical significances, discounting those which they judge to be type I errors and ignoring any biomarker where the p-value is greater than p = 0.05. However, by using standardised effect sizes (SESs), a range of graphical methods and an over-all assessment of the mean absolute response can be made. The approach is an extension, not a replacement, of existing methods. It is intended to assist toxicologists and regulators in the interpretation of the results. Here, the SES analysis has been applied to data from nine published sub-acute toxicity tests in order to compare the findings with those of the original authors. Line plots, box plots and bar plots show the pattern of response. Dose-response relationships are easily seen. A “bootstrap” test compares the mean absolute differences across dose groups. In four out of seven papers where the no observed adverse effect level (NOAEL) was estimated by the authors, it was set too high according to the bootstrap test, suggesting that possible toxicity is under-estimated. PMID:25426843
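The standardized-effect-size approach described above can be sketched as follows. This is a minimal illustration assuming a (treated mean - control mean) / pooled SD definition of the SES and a simple bootstrap over animals; the function names and resampling scheme are ours, not the author's actual code.

```python
import numpy as np

def standardized_effect_sizes(control, treated):
    """SES per biomarker: (treated mean - control mean) / pooled SD.
    `control` and `treated` are (animals, biomarkers) arrays."""
    m_c, m_t = control.mean(axis=0), treated.mean(axis=0)
    v_c, v_t = control.var(axis=0, ddof=1), treated.var(axis=0, ddof=1)
    n_c, n_t = len(control), len(treated)
    pooled_sd = np.sqrt(((n_c - 1) * v_c + (n_t - 1) * v_t) / (n_c + n_t - 2))
    return (m_t - m_c) / pooled_sd

def bootstrap_mean_abs_ses(control, treated, n_boot=1000, seed=0):
    """Bootstrap distribution of the mean absolute SES across biomarkers,
    resampling animals with replacement within each group."""
    rng = np.random.default_rng(seed)
    out = np.empty(n_boot)
    for b in range(n_boot):
        c = control[rng.integers(0, len(control), len(control))]
        t = treated[rng.integers(0, len(treated), len(treated))]
        out[b] = np.abs(standardized_effect_sizes(c, t)).mean()
    return out
```

A bootstrap distribution centred well above zero, across dose groups, is the kind of over-all evidence of treatment effect the abstract describes.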
NEW INTERPRETIVE TRAINING GUIDE PUBLISHED Meaningful Interpretation: How to Connect Hearts and Minds
Coble, Theresa G.
…as a profession. "I believe that this 'Meaningful Interpretation' book is just the kind of professional literature that we need," said Corky Mayo, Chief of Interpretation of the National Park Service.
ERIC Educational Resources Information Center
Napier, Jemina; Barker, Roz
2004-01-01
This article presents the findings of the first linguistic analysis of sign language interpreting carried out in Australia. A study was conducted on 10 Australian Sign Language/English interpreters to determine the rate and occurrence of interpreting omissions and the interpreters' level of metalinguistic awareness in relation to their production…
ERIC Educational Resources Information Center
Diemer, Roberta A.; And Others
1996-01-01
Twenty-five distressed adult clients received 2 sessions each of dream and event interpretation using the Hill model during 12 sessions of successful therapy. No differences were found in depth, insight, and working alliance among dream interpretation, event interpretation, and unstructured sessions, suggesting that dream interpretation is as…
Measuring statistical heterogeneity: The Pietra index
NASA Astrophysics Data System (ADS)
Eliazar, Iddo I.; Sokolov, Igor M.
2010-01-01
There are various ways of quantifying the statistical heterogeneity of a given probability law: Statistics uses variance, which measures the law's dispersion around its mean; Physics and Information Theory use entropy, which measures the law's randomness; Economics uses the Gini index, which measures the law's egalitarianism. In this research we explore an alternative to the Gini index, the Pietra index, which is a counterpart of the Kolmogorov-Smirnov statistic. The Pietra index is shown to be a natural and elemental measure of statistical heterogeneity, which is especially useful in the case of asymmetric and skewed probability laws, and in the case of asymptotically Paretian laws with finite mean and infinite variance. Moreover, the Pietra index is shown to have immediate and fundamental interpretations within the following applications: renewal processes and continuous-time random walks; infinite-server queueing systems and shot-noise processes; financial derivatives. The interpretation of the Pietra index within the context of financial derivatives implies that derivative markets, in effect, use the Pietra index as their benchmark measure of statistical heterogeneity.
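For a nonnegative sample, the Pietra index can be computed two equivalent ways: as half the mean absolute deviation about the mean divided by the mean, or as the maximal vertical gap between the Lorenz curve and the equality diagonal. The sketch below uses our own naming and is not the authors' code.

```python
import numpy as np

def pietra_index(x):
    """Pietra index: half the mean absolute deviation about the mean,
    divided by the mean (x must be nonnegative with positive mean)."""
    x = np.asarray(x, dtype=float)
    mu = x.mean()
    return np.abs(x - mu).mean() / (2 * mu)

def pietra_index_lorenz(x):
    """Same index as the maximal gap between the equality line and the
    Lorenz curve (cross-check of the first definition)."""
    x = np.sort(np.asarray(x, dtype=float))
    cum = np.cumsum(x) / x.sum()                 # Lorenz ordinates at i/n
    frac = np.arange(1, len(x) + 1) / len(x)     # equality-line ordinates
    return np.max(frac - cum)
```

The two definitions agree exactly on a finite sample because the maximal Lorenz gap occurs where the order statistics cross the mean.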
ERIC Educational Resources Information Center
Nolan, Meaghan M.; Beran, Tanya; Hecker, Kent G.
2012-01-01
Students with positive attitudes toward statistics are likely to show strong academic performance in statistics courses. Multiple surveys measuring students' attitudes toward statistics exist; however, a comparison of the validity and reliability of interpretations based on their scores is needed. A systematic review of relevant electronic…
Relations Between Random Coding Exponents and the Statistical Physics of Random Codes
Merhav, Neri
…incorrect codewords. We show that the statistical physics associated with the two latter phases can be mapped onto (and interpreted as) analogous problems in the area of the statistical physics of disordered systems.
Correlated adaptation of agents in a simple market: a statistical physics perspective
Moro, Esteban
…of the statistical physics of many-body systems. The only information about other agents available to any one… an interpretation of the results from the point of view of the statistical physics of disordered systems.
Hill, Jeffrey E.
…-test to data. · Understand the problems associated with multiple statistical testing. 9. ANOVA: · Identify the types of data and experiments that an ANOVA is appropriate for. · Run an ANOVA in R. · Calculate an F-statistic. · Test hypotheses with ANOVA. · Interpret an ANOVA table and report ANOVA statistics. · Graphically…
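The course objectives above reference R; the F-statistic they describe can also be computed by hand. This is a generic one-way ANOVA sketch, not tied to the course materials.

```python
import numpy as np

def one_way_anova(*groups):
    """One-way ANOVA: returns (F, df_between, df_within), where
    F = between-group mean square / within-group mean square."""
    groups = [np.asarray(g, dtype=float) for g in groups]
    n = sum(len(g) for g in groups)
    k = len(groups)
    grand = np.concatenate(groups).mean()
    ss_between = sum(len(g) * (g.mean() - grand) ** 2 for g in groups)
    ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
    df_b, df_w = k - 1, n - k
    return (ss_between / df_b) / (ss_within / df_w), df_b, df_w
```

The F value is then compared against the F distribution with (df_between, df_within) degrees of freedom to test the null hypothesis of equal group means.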
Devanathan, S; Dahl, T A; Midden, W R; Neckers, D C
1990-01-01
Fluorescein-labeled antibodies have little, if any, photodynamic effect because energy acquired by light absorption is rapidly dissipated in fluorescence. However, they can be easily and efficiently converted to selective photodynamic sensitizers by iodination under mild conditions. We have outlined general experimental procedures that can be used to turn a fluorescein-labeled anti-Escherichia coli antibody into a photodynamic sensitizer that selectively kills E. coli while sparing closely related Salmonella typhimurium. These results demonstrate that iodination did not destroy the specificity or activity of the antibody. This technique should be applicable to the large number of fluoresceinated antibodies that are commercially available. Thus, this strategy provides a simple way to rapidly prepare a large number of targeted phototoxic agents that can be used for the selective destruction with light of nearly any type of tissue or organism. PMID:2109321
Armbruster, D A
1989-01-01
A primary responsibility of the laboratory manager is to obtain analytical instrumentation for his or her facility. Traditionally, analyzers are purchased as capital investments. An increasingly popular alternative is the reagent lease/rental agreement. A manufacturer provides a laboratory with an analyzer with the provision that the laboratory will purchase the reagents from the manufacturer. Reagents are purchased at a set cost per test, which varies with test volume, and this price incorporates a charge for the use of the instrument. The laboratory manager can acquire a state-of-the-art analyzer more quickly and easily than through a purchase and can maintain the flexibility to switch to a more suitable system as technology and service requirements change. Significant advantages and disadvantages accrue to both the laboratory and the manufacturer. The reagent lease/rental agreement is an option definitely worth considering by laboratory managers. PMID:10294063
Environmental statistics and optimal regulation
NASA Astrophysics Data System (ADS)
Sivak, David; Thomson, Matt
2015-03-01
The precision with which an organism can detect its environment, and the timescale for and statistics of environmental change, will affect the suitability of different strategies for regulating protein levels in response to environmental inputs. We propose a general framework--here applied to the enzymatic regulation of metabolism in response to changing nutrient concentrations--to predict the optimal regulatory strategy given the statistics of fluctuations in the environment and measurement apparatus, and the costs associated with enzyme production. We find: (i) relative convexity of enzyme expression cost and benefit influences the fitness of thresholding or graded responses; (ii) intermediate levels of measurement uncertainty call for a sophisticated Bayesian decision rule; and (iii) in dynamic contexts, intermediate levels of uncertainty call for retaining memory of the past. Statistical properties of the environment, such as variability and correlation times, set optimal biochemical parameters, such as thresholds and decay rates in signaling pathways. Our framework provides a theoretical basis for interpreting molecular signal processing algorithms and a classification scheme that organizes known regulatory strategies and may help conceptualize heretofore unknown ones.
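Finding (ii) can be illustrated with a toy two-state version of such a decision rule: infer the nutrient state from a noisy measurement, then express the enzyme only when the expected benefit exceeds the production cost. The state values, noise level, benefit and cost below are hypothetical numbers of ours, not parameters from the paper.

```python
import math

def posterior_high(m, mu_low=0.0, mu_high=1.0, sigma=0.5, p_high=0.5):
    """Posterior probability that the environment is in the high-nutrient
    state, given measurement m with Gaussian noise of width sigma."""
    def lik(mu):
        return math.exp(-(m - mu) ** 2 / (2 * sigma ** 2))
    num = p_high * lik(mu_high)
    return num / (num + (1 - p_high) * lik(mu_low))

def express(m, benefit=1.0, cost=0.4, **kw):
    """Bayesian decision rule: produce the enzyme iff the expected
    benefit (benefit * P(high | m)) exceeds the production cost."""
    return benefit * posterior_high(m, **kw) > cost
```

With low measurement noise this rule degenerates to a hard threshold on m; with high noise it ignores the measurement, which is the regime dependence the abstract describes.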
Statistical analysis of planetary surfaces
NASA Astrophysics Data System (ADS)
Schmidt, Frederic; Landais, Francois; Lovejoy, Shaun
2015-04-01
In the last decades, a huge amount of topographic data has been obtained by several techniques (laser and radar altimetry, DTM…) for different bodies in the solar system, including the Earth, Mars, the Moon, etc. In each case, topographic fields exhibit an extremely high variability with details at each scale, from millimeters to thousands of kilometers. This complexity seems to prohibit global descriptions or global topography models. Nevertheless, this topographic complexity is well known to exhibit scaling laws that establish a similarity between scales and permit simpler descriptions and models. Indeed, efficient simulations can be made using the statistical properties of scaling fields (fractals). But realistic simulations of global topographic fields must exhibit multi- (not mono-) scaling behaviour, reflecting the extreme variability and intermittency observed in real fields that cannot be generated by simple scaling models. A multiscaling theory has been developed in order to model high variability and intermittency. This theory is a good statistical candidate to model the topography field with a limited number of parameters (called the multifractal parameters). In our study, we show that the statistical properties of the Martian topography are accurately reproduced by this model, leading to new interpretations of geomorphological processes.
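A common way to test for such multi-scaling in practice is via structure functions of a profile: S_q(l) = <|h(x+l) - h(x)|^q> ~ l^zeta(q), with nonlinearity of zeta(q) in q the signature of multifractality. The sketch below is our own construction with arbitrary lag and moment choices, not the authors' analysis pipeline.

```python
import numpy as np

def structure_exponents(h, qs=(1, 2, 3), lags=(1, 2, 4, 8, 16, 32)):
    """Estimate scaling exponents zeta(q) of a 1-D profile h by log-log
    regression of the structure functions S_q(l) against the lag l."""
    h = np.asarray(h, dtype=float)
    zetas = {}
    for q in qs:
        logS = [np.log(np.mean(np.abs(h[l:] - h[:-l]) ** q)) for l in lags]
        zetas[q] = np.polyfit(np.log(lags), logS, 1)[0]  # slope = zeta(q)
    return zetas
```

For a monofractal profile such as Brownian motion, zeta(q) = q/2 is linear in q; a concave zeta(q) would indicate the intermittent, multifractal statistics discussed above.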
Statistical Applets: Animated Exercise
NSDL National Science Digital Library
Duckworth, William
This collection of statistical applets is designed to accompany the textbook, "Practice of Business Statistics." The applets can be used without the textbook and cover many introductory statistics concepts including mean, normal curve, correlation and regression, probability, the law of large numbers, the central limit theorem, confidence intervals, statistical significance, power, and ANOVA. This is a great collection of interactive materials for either instructors or students studying statistics.
QUANTUM STATISTICAL CORRECTIONS TO ASTROPHYSICAL PHOTODISINTEGRATION RATES
Mathews, G. J.; Pehlivan, Yamac; Kajino, Toshitaka; Balantekin, A. B.; Kusakabe, Motohiko E-mail: yamac@physics.wisc.edu E-mail: baha@physics.wisc.edu
2011-01-20
Tabulated rates for astrophysical photodisintegration reactions make use of Boltzmann statistics for the photons involved as well as the interacting nuclei. Here, we derive analytic corrections for the Planck-spectrum quantum statistics of the photon energy distribution. These corrections can be deduced directly from the detailed balance condition without the assumption of equilibrium as long as the photons are represented by a Planck spectrum. Moreover, we show that these corrections affect not only the photodisintegration rates but also modify the conditions of nuclear statistical equilibrium as represented in the Saha equation. We deduce new analytic corrections to the classical Maxwell-Boltzmann statistics which can easily be added to the reverse reaction rates of existing reaction network tabulations. We show that the effects of quantum statistics, though generally quite small, always tend to speed up photodisintegration rates and are largest for nuclei and environments for which Q/kT ≈ 1. As an illustration, we examine possible effects of these corrections on the r-process, the rp-process, explosive silicon burning, the γ-process, and big bang nucleosynthesis. We find that in most cases one is quite justified in neglecting these corrections. The correction is largest for reactions near the drip line for an r-process with very high neutron density, or an rp-process at high temperature.
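At a single photon energy E, the direction of the correction can be seen from the ratio of the Planck (Bose-Einstein) occupancy to its Maxwell-Boltzmann approximation, which is 1/(1 - exp(-E/kT)) > 1. This one-liner illustrates that ratio only; it is not the paper's full thermally averaged rate integral.

```python
import math

def quantum_correction(E_over_kT):
    """Planck-to-Boltzmann photon occupancy ratio at energy E:
    [1/(exp(E/kT) - 1)] / exp(-E/kT) = 1 / (1 - exp(-E/kT)).
    Always > 1 (quantum statistics enhance the photon number, hence the
    photodisintegration rate) and -> 1 as E/kT -> infinity."""
    return 1.0 / (1.0 - math.exp(-E_over_kT))
```

Since the rate integral is dominated by photons near the threshold Q, and the enhancement fades for E/kT >> 1, the net correction is most significant for environments with Q/kT of order one, consistent with the abstract.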
Psychoanalytic interpretation and cognitive transformation.
Basch, M F
1981-01-01
Psychoanalytic metapsychology should be recognized for what it is, namely a theory of cognition and affect that is not derived directly from clinical data but is advanced in order to provide the developmental background that will let us deal with the clinical findings of psychoanalysis as aberrations of and deviations from the normal and expected evolution of the thinking process. Its cornerstone is Freud's belief that thought depends on the forging of links between the sensory perception of objects and their appropriate verbal descriptions. He made no secret of his dissatisfaction with his metapsychology and repeatedly revised it in an attempt to encompass those clinical discoveries of psychoanalysis that outstripped the explanatory power of that metapsychology and demonstrated its shortcomings. Using what we now know about normal development in infancy and childhood through the work of Piaget, Vygotsky and other investigators, it is possible to formulate an explanatory theory that does justice to the varied and complex findings uncovered by the application of the psychoanalytic method. For example, the significance of Freud's postulated second censorship between the preconscious and consciousness, as well as the importance of the defence of disavowal that Freud emphasized in his writings after 1927, can now be accounted for with a theory of thought formation that was not available to the founder of psychoanalysis. The implications of this proposed reformulation for psychoanalytic interpretation and for the application of psychoanalysis to an increasingly wide range of psychopathology are discussed in some detail. PMID:7275491
Visualization techniques for statistical circuit design
Sengupta, Manidip
1992-01-01
…the use of visualization techniques to interpret the behavior of the circuit in a qualitative fashion. The objects of visualization can be listed as follows: · Nominal design point for designable parameters: the optimization process continually shifts…
NASA Astrophysics Data System (ADS)
Hu, Chongqing; Li, Aihua; Zhao, Xingyang
2011-02-01
This paper proposes a multivariate statistical analysis approach to processing the instantaneous engine speed signal for the purpose of locating multiple misfire events in internal combustion engines. The state of each cylinder is described with a characteristic vector extracted from the instantaneous engine speed signal following a three-step procedure. These characteristic vectors are considered as the values of various procedure parameters of an engine cycle. Therefore, determination of occurrence of misfire events and identification of misfiring cylinders can be accomplished by a principal component analysis (PCA) based pattern recognition methodology. The proposed algorithm can be implemented easily in practice because the threshold can be defined adaptively without the information of operating conditions. Besides, the effect of torsional vibration on the engine speed waveform is interpreted as the presence of a "super powerful" cylinder, which is also isolated by the algorithm. The misfiring cylinder and the super powerful cylinder are often adjacent in the firing sequence, thus missed detections and false alarms can be avoided effectively by checking the relationship between the cylinders.
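A minimal sketch of the PCA-based pattern-recognition step: project one characteristic vector per cylinder onto the leading principal components and flag cylinders whose score distance exceeds an adaptive, data-driven threshold. The paper does not specify its threshold rule, so the median/MAD rule here is illustrative only.

```python
import numpy as np

def pca_anomalies(X, n_components=2, k=3.0):
    """Flag anomalous rows of X (one characteristic vector per cylinder)
    by PCA score distance, thresholded at median + k * 1.4826 * MAD so
    no operating-condition-specific limit is needed."""
    X = np.asarray(X, dtype=float)
    Xc = X - X.mean(axis=0)
    # principal directions from the SVD of the centred data
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = Xc @ Vt[:n_components].T
    d = np.linalg.norm(scores, axis=1)
    med = np.median(d)
    mad = np.median(np.abs(d - med))
    return np.where(d > med + k * 1.4826 * mad)[0]
```

A misfiring (or "super powerful") cylinder stands far from the cluster of normal cylinders in score space and is returned by the rule.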
A bird's eye view: the cognitive strategies of experts interpreting seismic profiles
NASA Astrophysics Data System (ADS)
Bond, C. E.; Butler, R.
2012-12-01
Geoscience is perhaps unique in its reliance on incomplete datasets and building knowledge from their interpretation. This interpretation basis for the science is fundamental at all levels; from creation of a geological map to interpretation of remotely sensed data. To teach and understand better the uncertainties in dealing with incomplete data we need to understand the strategies individual practitioners deploy that make them effective interpreters. The nature of interpretation is such that the interpreter needs to use their cognitive ability in the analysis of the data to propose a sensible solution in their final output that is consistent not only with the original data but also with other knowledge and understanding. In a series of experiments Bond et al. (2007, 2008, 2011, 2012) investigated the strategies and pitfalls of expert and non-expert interpretation of seismic images. These studies focused on large numbers of participants to provide a statistically sound basis for analysis of the results. The outcome of these experiments showed that techniques and strategies are more important than expert knowledge per se in developing successful interpretations. Experts are successful because of their application of these techniques. In a new set of experiments we have focused on a small number of experts to determine how they use their cognitive and reasoning skills in the interpretation of 2D seismic profiles. Live video and practitioner commentary were used to track the evolving interpretation and to gain insight into their decision processes. The outputs of the study allow us to create an educational resource of expert interpretation through online video footage and commentary with associated further interpretation and analysis of the techniques and strategies employed.
This resource will be of use to undergraduate, post-graduate, industry and academic professionals seeking to improve their seismic interpretation skills, develop reasoning strategies for dealing with incomplete datasets, and for assessing the uncertainty in these interpretations. Bond, C.E. et al. (2012). 'What makes an expert effective at interpreting seismic images?' Geology, 40, 75-78. Bond, C. E. et al. (2011). 'When there isn't a right answer: interpretation and reasoning, key skills for 21st century geoscience'. International Journal of Science Education, 33, 629-652. Bond, C. E. et al. (2008). 'Structural models: Optimizing risk analysis by understanding conceptual uncertainty'. First Break, 26, 65-71. Bond, C. E. et al., (2007). 'What do you think this is?: "Conceptual uncertainty" In geoscience interpretation'. GSA Today, 17, 4-10.
STATISTICAL THINKING IN NEUROSCIENCE Department of Statistics
Spirtes, Peter
…for Statistics in Neuroscience. Report of ASA Working Group on Statistics and Brain… of the properties of vision that are familiar to us from behavioral experiments on animals, from psychophysical… from the lateral eye of Limulus, stimulated by illumination of the facet associated with its…
A new statistical tool for NOAA local climate studies
NASA Astrophysics Data System (ADS)
Timofeyeva, M. M.; Meyers, J. C.; Hollingshead, A.
2011-12-01
The National Weather Service (NWS) Local Climate Analysis Tool (LCAT) is evolving out of a need to support and enhance the National Oceanic and Atmospheric Administration (NOAA) NWS field offices' ability to efficiently access, manipulate, and interpret local climate data and characterize climate variability and change impacts. LCAT will enable NOAA's staff to conduct regional and local climate studies using state-of-the-art station and reanalysis gridded data and various statistical techniques for climate analysis. The analysis results will be used for climate services to guide local decision makers in weather and climate sensitive actions and to deliver information to the general public. LCAT will augment current climate reference materials with information pertinent to the local and regional levels as they apply to diverse variables appropriate to each locality. The main emphasis of LCAT is to enable studies of extreme meteorological and hydrological events such as tornadoes, floods, droughts, severe storms, etc. LCAT will close a very critical gap in NWS local climate services because it will allow climate variables beyond average temperature and total precipitation to be addressed. NWS external partners and government agencies will benefit from the LCAT outputs, which could be easily incorporated into their own analysis and/or delivery systems. Presently, we have identified five existing requirements for local climate: (1) local impacts of climate change; (2) local impacts of climate variability; (3) drought studies; (4) attribution of severe meteorological and hydrological events; and (5) climate studies for water resources. The methodologies for the first three requirements will be included in the LCAT first-phase implementation.
Local rate of climate change is defined as a slope of the mean trend estimated from the ensemble of three trend techniques: (1) hinge, (2) Optimal Climate Normals (running mean for optimal time periods), (3) exponentially-weighted moving average. Root mean squared error is used to determine the best fit of trend to the observations with the least error. The studies of climate variability impacts on local extremes use composite techniques applied to various definitions of local variables: from specified percentiles to critical thresholds. Drought studies combine visual capabilities of Google maps with statistical estimates of drought severity indices. The process of development will be linked to local office interactions with users to ensure the tool will meet their needs as well as provide adequate training. A rigorous internal and tiered peer-review process will be implemented to ensure the studies are scientifically-sound that will be published and submitted to the local studies catalog (database) and eventually to external sources, such as the Climate Portal.
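Two of the three trend techniques above, and the RMSE-based selection among them, can be sketched generically as follows. The hinge fit is omitted, and the window and smoothing parameters are illustrative choices of ours, not LCAT's.

```python
import numpy as np

def running_mean(x, w):
    """Centred running mean of window w (Optimal-Climate-Normals style).
    np.convolve with mode='same' zero-pads, so the ends are biased."""
    return np.convolve(x, np.ones(w) / w, mode="same")

def ewma(x, alpha):
    """Exponentially-weighted moving average trend."""
    out = np.empty(len(x))
    out[0] = x[0]
    for i in range(1, len(x)):
        out[i] = alpha * x[i] + (1 - alpha) * out[i - 1]
    return out

def best_trend(obs, candidates):
    """Pick the candidate trend with the lowest RMSE against the
    observations, as in the ensemble selection described above."""
    rmse = {name: float(np.sqrt(np.mean((obs - fit) ** 2)))
            for name, fit in candidates.items()}
    return min(rmse, key=rmse.get), rmse
```

On a steadily warming series, for example, the heavily weighted EWMA tracks the observations more closely than a wide running mean, and the RMSE criterion selects it.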
Primer of statistics in dental research: part I.
Shintani, Ayumi
2014-01-01
Statistics plays essential roles in evidence-based dentistry (EBD) practice and research, ranging widely from formulating scientific questions, designing studies, and collecting and analyzing data to interpreting, reporting, and presenting study findings. Mastering statistical concepts appears to be an unreachable goal among many dental researchers, in part due to statistical authorities' limitations in explaining statistical principles to health researchers without resorting to complex mathematical concepts. This series of 2 articles aims to introduce dental researchers, with intuitive examples, to 9 essential topics in statistics for conducting EBD. Part I of the series covers the first 5 topics: (1) statistical graphs, (2) how to deal with outliers, (3) p-values and confidence intervals, (4) testing equivalence, and (5) multiplicity adjustment. Part II will follow, covering the remaining topics: (6) selecting the proper statistical tests, (7) repeated measures analysis, (8) epidemiological considerations for causal association, and (9) analysis of agreement. PMID:24461958
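Topic (5), multiplicity adjustment, can be illustrated with the Holm step-down procedure, one standard choice (the articles may of course discuss others):

```python
def holm_adjust(pvals):
    """Holm step-down adjusted p-values: sort p-values ascending,
    multiply the r-th smallest by (m - r + 1), and enforce monotonicity.
    Controls the family-wise error rate with no independence assumption."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    adjusted = [0.0] * m
    running_max = 0.0
    for rank, i in enumerate(order):
        running_max = max(running_max, (m - rank) * pvals[i])
        adjusted[i] = min(1.0, running_max)
    return adjusted
```

An adjusted p-value below the nominal level (say 0.05) remains significant after accounting for the number of tests, which addresses exactly the many-biomarkers problem raised in dental and toxicological studies.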
Working with interpreters: practical advice for use of an interpreter in healthcare.
Hadziabdic, Emina; Hjelm, Katarina
2013-03-01
The aim of this descriptive commentary is to improve communication in healthcare when an interpreter is used, by providing practical advice to healthcare staff when they consider using interpreters. This descriptive commentary considered the issues of preparation and implementation of interpretation sessions to reveal the complexities and dilemmas of an effective healthcare encounter with interpreters. Using the design of a discursive paper, this article seeks to explore and position what is published in the literature on the topic and, on the basis of previous studies, to provide practical advice on the use of interpreters. The descriptive commentary showed that the interpreter should be used not only as a communication aid but also as a practical and informative guide in the healthcare system. In preparing the interpretation session, it is important to consider the type (trained professional interpreter, family member or bilingual healthcare staff as interpreters) and mode (face to face and telephone) of interpreting. Furthermore, it is important to consider the interpreter's ethnic origin, religious background, gender, language or dialect, social group, clothes, appearance and attitude. During the healthcare encounter, the interpreter should follow the recommendations given in guidelines for interpreters. Healthcare staff should choose an appropriate room and be aware of their own behaviour, appearance and attitude during the healthcare encounter. Good planning is needed, with carefully considered choices concerning the right kind of interpreter, mode of interpretation and individual preferences for the interpretation, in order to deliver high-quality and cost-effective healthcare.
Depending on the nature of the healthcare encounter, healthcare staff need to plan interpreting carefully and in accordance with the individuals' desires and choose the type of interpreter and mode of interpreting that best suits the need in the actual healthcare situation in order to deliver high-quality healthcare. PMID:23448332
Concurrent behavior: Are the interpretations mutually exclusive?
Lyon, David O.
1982-01-01
The experimental literature is replete with examples of behaviors that occur concurrently with a schedule of reinforcement. These concurrent behaviors, often with similar topographies and occurring under like circumstances, may be interpreted as functionally autonomous, collateral, adjunctive, superstitious, or mediating behavior. The degree to which the interaction of concurrent and schedule-controlled behavior figures in the interpretation of behavior illustrates the importance of distinguishing among these interpretations by experimental procedure. The present paper reviews the characteristics of these interpretations and discusses the experimental procedures necessary to distinguish among them. The paper concludes that the interpretations are mutually exclusive and refer to distinct behaviors, but that distinguishing between any two of the interpretations requires more than one experimental procedure. PMID:22478568
Interpretations of Quantum Mechanics: a critical survey
Caponigro, Michele
2008-01-01
This brief survey analyzes the epistemological implications of the role of the observer in interpretations of quantum mechanics. As we know, the goal of most interpretations of quantum mechanics is to avoid the apparent intrusion of the observer into the measurement process. At the same time, there are implicit and hidden assumptions about the observer's role. In fact, most interpretations that take as their ontic level one of the fundamental concepts of information, physical law, or matter lead us to new problematic questions. We think that no interpretation of quantum theory can avoid this intrusion until we clarify the nature of the observer.
Torres-Verdín, Carlos
[Abstract fragment] ... making the interpretation of PNC logs challenging in economic pay zones and hindering the accurate appraisal of layer properties. The statistical process of modeling the interactions of individual nuclear particles with materials is achieved with flux sensitivity functions (FSFs) together with a 1D (vertical) neutron-diffusion correction, improving speed and accuracy.
Distributed data collection for a database of radiological image interpretations
NASA Astrophysics Data System (ADS)
Long, L. Rodney; Ostchega, Yechiam; Goh, Gin-Hua; Thoma, George R.
1997-01-01
The National Library of Medicine, in collaboration with the National Center for Health Statistics and the National Institute for Arthritis and Musculoskeletal and Skin Diseases, has built a system for collecting radiological interpretations for a large set of x-ray images acquired as part of the data gathered in the second National Health and Nutrition Examination Survey. This system is capable of delivering across the Internet 5- and 10-megabyte x-ray images to Sun workstations equipped with X Window-based 2048 × 2560 image displays, for the purpose of having these images interpreted for the degree of presence of particular osteoarthritic conditions in the cervical and lumbar spines. The collected interpretations can then be stored in a database at the National Library of Medicine, under control of the Illustra DBMS. This system is a client/server database application that integrates (1) distributed server processing of client requests, (2) a customized image transmission method for faster Internet data delivery, (3) distributed client workstations with high-resolution displays, image processing functions, and an on-line digital atlas, and (4) relational database management of the collected data.
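The interpretation-collection flow described above can be sketched in miniature. This is a hypothetical illustration only: sqlite3 stands in for the Illustra DBMS, and the table, function, and record names are ours, not the system's.

```python
# Minimal, hypothetical sketch of the collection workflow: a reader's
# interpretation of an x-ray image is recorded in a relational table.
# sqlite3 is a stand-in for the Illustra DBMS used by the real system.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE interpretations (
    image_id TEXT,   -- identifier of the NHANES II x-ray image
    reader   TEXT,   -- radiologist performing the interpretation
    region   TEXT,   -- anatomical region, e.g. cervical or lumbar spine
    grade    INTEGER -- degree of osteoarthritic change (illustrative scale)
)""")

def record_interpretation(image_id, reader, region, grade):
    """Store one reader's interpretation of one image region."""
    conn.execute("INSERT INTO interpretations VALUES (?, ?, ?, ?)",
                 (image_id, reader, region, grade))
    conn.commit()

record_interpretation("NHANES2-00042", "reader01", "cervical spine", 2)
rows = conn.execute("SELECT * FROM interpretations").fetchall()
```

The design choice the abstract highlights, separating heavyweight image delivery (custom transmission to high-resolution clients) from lightweight structured storage of the resulting interpretations, is what makes a relational backend a natural fit for the collected data.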