Graham, Daniel J; Field, David J
2008-01-01
Two recent studies suggest that natural scenes and paintings show similar statistical properties. But does the content or region of origin of an artwork affect its statistical properties? We addressed this question by having judges place works from a large, diverse collection of paintings into one of three subject-matter categories using a forced-choice paradigm. Basic statistics for images whose categorization was agreed by all judges showed no significant differences between those judged to be 'landscape' and 'portrait/still-life', but these two classes differed from paintings judged to be 'abstract'. All categories showed basic spatial statistical regularities similar to those typical of natural scenes. A test of the full painting collection (140 images) with respect to the works' place of origin (provenance) showed significant differences between Eastern works and Western ones, differences which we find are likely related to the materials and the choice of background color. Although artists deviate slightly from reproducing natural statistics in abstract art (compared to representational art), the great majority of human art likely shares basic statistical limitations. We argue that statistical regularities in art are rooted in the need to make art visible to the eye, not in the inherent aesthetic value of natural-scene statistics, and we suggest that variability in spatial statistics may be generally imposed by manufacture.
Thermodynamics and statistical mechanics: thermodynamic properties of gases
NASA Technical Reports Server (NTRS)
1976-01-01
The basic thermodynamic properties of gases are reviewed and the relations between them are derived from the first and second laws. The elements of statistical mechanics are then formulated and the partition function is derived. The classical form of the partition function is used to obtain the Maxwell-Boltzmann distribution of kinetic energies in the gas phase and the equipartition of energy theorem is given in its most general form. The thermodynamic properties are all derived as functions of the partition function. Quantum statistics are reviewed briefly and the differences between the Boltzmann distribution function for classical particles and the Fermi-Dirac and Bose-Einstein distributions for quantum particles are discussed.
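The chain from partition function to thermodynamic properties summarized above can be written compactly; these are standard textbook relations, with symbols chosen here rather than quoted from the report:

```latex
Z = \sum_i e^{-E_i/k_B T}, \qquad
F = -k_B T \ln Z, \qquad
U = k_B T^2 \,\frac{\partial \ln Z}{\partial T}, \qquad
S = \frac{U - F}{T},
\\[4pt]
f(v)\,dv = 4\pi \left(\frac{m}{2\pi k_B T}\right)^{3/2} v^2\,
           e^{-m v^2 / 2 k_B T}\, dv ,
```

the last line being the Maxwell-Boltzmann speed distribution obtained from the classical partition function, with equipartition assigning a mean energy of (1/2) k_B T to each quadratic degree of freedom.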
Statistical regularities of art images and natural scenes: spectra, sparseness and nonlinearities.
Graham, Daniel J; Field, David J
2007-01-01
Paintings are the product of a process that begins with ordinary vision in the natural world and ends with manipulation of pigments on canvas. Because artists must produce images that can be seen by a visual system that is thought to take advantage of statistical regularities in natural scenes, artists are likely to replicate many of these regularities in their painted art. We have tested this notion by computing basic statistical properties and modeled cell response properties for a large set of digitized paintings and natural scenes. We find that both representational and non-representational (abstract) paintings from our sample (124 images) show basic similarities to a sample of natural scenes in terms of their spatial frequency amplitude spectra, but the paintings and natural scenes show significantly different mean amplitude spectrum slopes. We also find that the intensity distributions of paintings show a lower skewness and sparseness than natural scenes. We account for this by considering the range of luminances found in the environment compared to the range available in the medium of paint. A painting's range is limited by the reflective properties of its materials. We argue that artists do not simply scale the intensity range down but use a compressive nonlinearity. In our studies, modeled retinal and cortical filter responses to the images were less sparse for the paintings than for the natural scenes. But when a compressive nonlinearity was applied to the images, both the paintings' sparseness and the modeled responses to the paintings showed the same or greater sparseness compared to the natural scenes. This suggests that artists achieve some degree of nonlinear compression in their paintings. Because paintings have captivated humans for millennia, finding basic statistical regularities in paintings' spatial structure could grant insights into the range of spatial patterns that humans find compelling.
ERIC Educational Resources Information Center
Rupp, Andre A.
2007-01-01
One of the most revolutionary advances in psychometric research during the last decades has been the systematic development of statistical models that allow for cognitive psychometric research (CPR) to be conducted. Many of the models currently available for such purposes are extensions of basic latent variable models in item response theory…
Assembly of Ultra-Dense Nanowire-Based Computing Systems
2006-06-30
* characterized basic device element properties and statistics
* demonstrated product of sums (POS) validating assembled 2-bit adder structures
* demonstrated ... linear region (Vds = 10 mV) from the peak g = 3 μS at |Vg - VT| = 0.13 V using the charge control model, represents more than a factor of 10 improvement over ... disrupted by ionizing particles or thermal fluctuation. Further, when working with such small charges, it is statistically possible that logic
Reinventing Biostatistics Education for Basic Scientists
Weissgerber, Tracey L.; Garovic, Vesna D.; Milin-Lazovic, Jelena S.; Winham, Stacey J.; Obradovic, Zoran; Trzeciakowski, Jerome P.; Milic, Natasa M.
2016-01-01
Numerous studies demonstrating that statistical errors are common in basic science publications have led to calls to improve statistical training for basic scientists. In this article, we sought to evaluate statistical requirements for PhD training and to identify opportunities for improving biostatistics education in the basic sciences. We provide recommendations for improving statistics training for basic biomedical scientists, including: 1. Encouraging departments to require statistics training, 2. Tailoring coursework to the students’ fields of research, and 3. Developing tools and strategies to promote education and dissemination of statistical knowledge. We also provide a list of statistical considerations that should be addressed in statistics education for basic scientists. PMID:27058055
DOE Office of Scientific and Technical Information (OSTI.GOV)
Woods, Sarah
2015-12-01
The dual objectives of this project were improving our basic understanding of processes that control cirrus microphysical properties and improvement of the representation of these processes in the parameterizations. A major effort in the proposed research was to integrate, calibrate, and better understand the uncertainties in all of these measurements.
NASA Astrophysics Data System (ADS)
Gogu, C.; Haftka, R.; LeRiche, R.; Molimard, J.; Vautrin, A.; Sankar, B.
2008-11-01
The basic formulation of the least squares method, based on the L2 norm of the misfit, is still widely used today for identifying elastic material properties from experimental data. An alternative statistical approach is the Bayesian method. We seek here situations with significant difference between the material properties found by the two methods. For a simple three bar truss example we illustrate three such situations in which the Bayesian approach leads to more accurate results: different magnitude of the measurements, different uncertainty in the measurements and correlation among measurements. When all three effects add up, the Bayesian approach can have a large advantage. We then compared the two methods for identification of elastic constants from plate vibration natural frequencies.
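As a toy illustration of the contrast the abstract describes, a minimal sketch is given below. This is not the authors' three-bar truss model; the one-parameter observation operator, noise levels, and seed are invented for illustration. With a flat prior and Gaussian noise, the Bayesian posterior mean reduces to weighted least squares, which is where the advantage from different measurement magnitudes and uncertainties comes from.

```python
import numpy as np

# Hypothetical identification problem: recover a stiffness-like parameter
# theta from two measurements of very different magnitude and uncertainty.
rng = np.random.default_rng(0)
theta_true = 2.0
H = np.array([1.0, 10.0])                  # linear observation operator
sigma = np.array([0.1, 5.0])               # very different uncertainties
y = H * theta_true + sigma * rng.standard_normal(2)

# Plain least squares: minimize sum((y - H*theta)^2), equal weights.
theta_ls = (H @ y) / (H @ H)

# Bayesian estimate (flat prior, Gaussian noise) = weighted least squares
# with weights 1/sigma^2, so the precise measurement dominates.
w = 1.0 / sigma**2
theta_bayes = ((w * H) @ y) / ((w * H) @ H)

print(theta_ls, theta_bayes)  # theta_bayes lands much closer to 2.0 here
```

With this seed the unweighted estimate is pulled off by the noisy large-magnitude measurement, while the weighted (Bayesian) estimate stays near the true value.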
Quantifying randomness in real networks
NASA Astrophysics Data System (ADS)
Orsini, Chiara; Dankulov, Marija M.; Colomer-de-Simón, Pol; Jamakovic, Almerima; Mahadevan, Priya; Vahdat, Amin; Bassler, Kevin E.; Toroczkai, Zoltán; Boguñá, Marián; Caldarelli, Guido; Fortunato, Santo; Krioukov, Dmitri
2015-10-01
Represented as graphs, real networks are intricate combinations of order and disorder. Fixing some of the structural properties of network models to their values observed in real networks, many other properties appear as statistical consequences of these fixed observables, plus randomness in other respects. Here we employ the dk-series, a complete set of basic characteristics of the network structure, to study the statistical dependencies between different network properties. We consider six real networks--the Internet, US airport network, human protein interactions, technosocial web of trust, English word network, and an fMRI map of the human brain--and find that many important local and global structural properties of these networks are closely reproduced by dk-random graphs whose degree distributions, degree correlations and clustering are as in the corresponding real network. We discuss important conceptual, methodological, and practical implications of this evaluation of network randomness, and release software to generate dk-random graphs.
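The simplest member of the dk-series (d = 1) fixes only the degree sequence. A minimal stdlib sketch of that 1k-randomization step is shown below; this is not the authors' released software, just the standard double-edge-swap construction, which rewires edges while preserving every node's degree.

```python
import random

def one_k_randomize(edges, n_swaps, seed=0):
    """Sample a degree-preserving randomization of a simple graph by
    repeatedly swapping endpoints of two random edges: (a,b),(c,d) -> (a,d),(c,b)."""
    rng = random.Random(seed)
    edges = [tuple(e) for e in edges]
    edge_set = {frozenset(e) for e in edges}
    for _ in range(n_swaps):
        i, j = rng.sample(range(len(edges)), 2)
        (a, b), (c, d) = edges[i], edges[j]
        if len({a, b, c, d}) < 4:
            continue  # swap would create a self-loop
        if frozenset((a, d)) in edge_set or frozenset((c, b)) in edge_set:
            continue  # swap would create a multi-edge
        edge_set -= {frozenset((a, b)), frozenset((c, d))}
        edge_set |= {frozenset((a, d)), frozenset((c, b))}
        edges[i], edges[j] = (a, d), (c, b)
    return edges

def degrees(edges):
    deg = {}
    for a, b in edges:
        deg[a] = deg.get(a, 0) + 1
        deg[b] = deg.get(b, 0) + 1
    return deg

g = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
g2 = one_k_randomize(g, 100)
print(degrees(g) == degrees(g2))  # True: every accepted swap preserves degrees
```

Higher-order dk-random graphs additionally constrain degree correlations (2k) and clustering (2.1k), which is the level at which the paper finds real networks are closely reproduced.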
Information-Theoretic Properties of Auditory Sequences Dynamically Influence Expectation and Memory
ERIC Educational Resources Information Center
Agres, Kat; Abdallah, Samer; Pearce, Marcus
2018-01-01
A basic function of cognition is to detect regularities in sensory input to facilitate the prediction and recognition of future events. It has been proposed that these implicit expectations arise from an internal predictive coding model, based on knowledge acquired through processes such as statistical learning, but it is unclear how different…
ERIC Educational Resources Information Center
Snell, Joel C.; Marsh, Mitchell
2011-01-01
The authors have over the years tried to revise meta-analysis, because its basic premise is to add apples and oranges together and analyze them. In other words, various data on the same subject are chosen using different samples, research strategies, and number properties. The findings are then homogenized and a statistical analysis is used (Snell, J.…
A Wave Chaotic Study of Quantum Graphs with Microwave Networks
NASA Astrophysics Data System (ADS)
Fu, Ziyuan
Quantum graphs provide a setting to test the hypothesis that all ray-chaotic systems show universal wave chaotic properties. I study quantum graphs with a wave chaotic approach: an experimental setup consisting of a microwave coaxial cable network is used to simulate quantum graphs. Some basic features and the distributions of impedance statistics are analyzed from experimental data on an ensemble of tetrahedral networks. The random coupling model (RCM) is applied in an attempt to uncover the universal statistical properties of the system. Deviations from RCM predictions have been observed in that the statistics of diagonal and off-diagonal impedance elements differ. Waves trapped by multiple reflections on bonds between nodes in the graph most likely cause the deviations from universal behavior in this finite-size realization of a quantum graph. In addition, I have carried out further investigations of the random coupling model that are useful for future research.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stoilova, N. I.
Generalized quantum statistics, such as paraboson and parafermion statistics, are characterized by triple relations which are related to Lie (super)algebras of type B. The correspondence of the Fock spaces of parabosons and parafermions, as well as the Fock space of a combined system of parafermions and parabosons, to irreducible representations of (super)algebras of type B will be pointed out. An example of generalized quantum statistics connected to the basic classical Lie superalgebra B(1|1) ≡ osp(3|2), with interesting physical properties such as noncommutative coordinates, will be given. The article therefore focuses on a question already addressed by Wigner in 1950: do the equations of motion determine the quantum mechanical commutation relations?
Watanabe, Hayafumi; Sano, Yukie; Takayasu, Hideki; Takayasu, Misako
2016-11-01
To elucidate the nontrivial empirical statistical properties of fluctuations of a typical nonsteady time series representing the appearance of words in blogs, we investigated approximately 3×10^9 Japanese blog articles over a period of six years and analyzed some corresponding mathematical models. First, we introduce a solvable nonsteady extension of the random diffusion model, which can be deduced by modeling the behavior of heterogeneous random bloggers. Next, we deduce theoretical expressions for both the temporal and ensemble fluctuation scalings of this model, and demonstrate that these expressions can reproduce all empirical scalings over eight orders of magnitude. Furthermore, we show that the model can reproduce other statistical properties of time series representing the appearance of words in blogs, such as functional forms of the probability density and correlations in the total number of blogs. As an application, we quantify the abnormality of special nationwide events by measuring the fluctuation scalings of 1771 basic adjectives.
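A hedged sketch of how a fluctuation-scaling exponent can be estimated is given below: group series by mean activity, then fit the slope of log(variance) against log(mean) ("Taylor's law", variance ~ mean^alpha). The synthetic Bernoulli-sum counts are illustrative stand-ins for word-appearance counts, not the paper's data or its random diffusion model.

```python
import math
import random

# Synthetic "word count" series at several activity levels mu, generated
# as sums of Bernoulli trials (approximately Poisson for small p).
random.seed(1)
means, variances = [], []
for mu in [2, 8, 32, 128, 512]:
    p, trials = mu / 5000, 5000
    series = [sum(random.random() < p for _ in range(trials))
              for _ in range(200)]
    m = sum(series) / len(series)
    v = sum((x - m) ** 2 for x in series) / len(series)
    means.append(m)
    variances.append(v)

# Least-squares slope of log(variance) against log(mean) = scaling exponent.
lx, ly = [math.log(m) for m in means], [math.log(v) for v in variances]
mx, my = sum(lx) / len(lx), sum(ly) / len(ly)
alpha = (sum((x - mx) * (y - my) for x, y in zip(lx, ly))
         / sum((x - mx) ** 2 for x in lx))
print(round(alpha, 2))  # near 1, as expected for independent Poisson-like counts
```

Empirical series with correlated fluctuations typically give alpha well above 1, which is one of the deviations such analyses are designed to detect.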
ERIC Educational Resources Information Center
Noser, Thomas C.; Tanner, John R.; Shah, Situl
2008-01-01
The purpose of this study was to measure the comprehension of basic mathematical skills of students enrolled in statistics classes at a large regional university, and to determine if the scores earned on a basic math skills test are useful in forecasting student performance in these statistics classes, and to determine if students' basic math…
A κ-generalized statistical mechanics approach to income analysis
NASA Astrophysics Data System (ADS)
Clementi, F.; Gallegati, M.; Kaniadakis, G.
2009-02-01
This paper proposes a statistical mechanics approach to the analysis of income distribution and inequality. A new distribution function, having its roots in the framework of κ-generalized statistics, is derived that is particularly suitable for describing the whole spectrum of incomes, from the low-middle income region up to the high income Pareto power-law regime. Analytical expressions for the shape, moments and some other basic statistical properties are given. Furthermore, several well-known econometric tools for measuring inequality, which all exist in a closed form, are considered. A method for parameter estimation is also discussed. The model is shown to fit remarkably well the data on personal income for the United States, and the analysis of inequality performed in terms of its parameters is revealed as very powerful.
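For reference, the κ-exponential underlying this family has the standard form (symbols as commonly used in the κ-statistics literature, not quoted from the abstract):

```latex
\exp_\kappa(x) = \left(\sqrt{1+\kappa^2 x^2} + \kappa x\right)^{1/\kappa},
\qquad
P(X > x) = \exp_\kappa\!\left(-\beta x^\alpha\right),
```

which reduces to the stretched-exponential (Weibull) form as κ → 0 and develops a Pareto power-law tail, P(X > x) ∝ x^{-α/κ} for large x, thereby covering both the low-middle income region and the high-income Pareto regime with one closed-form distribution.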
An Assessment of the Impact of the Department of Defense Very-High-Speed Integrated Circuit Program.
1982-01-01
analysis, statistical inference, device physics and other such products of basic research. Examples of such information would be: analyses of properties of... The base transit time τB for an n-p-n silicon transistor with 10^18 cm^-3 base doping, τB = Wb^2/2D, becomes 0.4 ps in this limit, so that the base contributes little to delay
NASA Astrophysics Data System (ADS)
Athiyamaan, V.; Mohan Ganesh, G.
2017-11-01
Self-compacting concrete is a special concrete that can flow and consolidate under its own weight and completely fill the formwork even in the presence of dense reinforcement, while maintaining its homogeneity throughout the formwork without any need for vibration. Researchers all over the world are developing high-performance concrete by adding various fibers and admixtures in different proportions. Fibers such as glass, steel, carbon, polypropylene and aramid improve concrete properties such as tensile strength, fatigue characteristics, durability, shrinkage, impact and erosion resistance, and serviceability [6]. This work includes a fundamental study of fiber-reinforced self-compacting concrete with admixtures, covering its rheological and mechanical properties, together with an overview of design methodology and statistical approaches to optimizing concrete performance. The study is organized into seven basic chapters: introduction; a phenomenological study of material properties; a review of self-compacting concrete; an overview of fiber-reinforced self-compacting concrete containing admixtures; a review of design and analysis of experiments as a statistical approach; a summary of existing work on FRSCC and statistical modeling; literature review; and conclusion. Knowledge of recent studies on polymer-based binder materials (fly ash, metakaolin, GGBS, etc.), fiber-reinforced concrete and SCC is essential for effective research on fiber-reinforced self-compacting concrete containing admixtures. The key aim of the study is to identify the research gap and to gain a complete understanding of polymer-based self-compacting fiber-reinforced concrete.
Niclasen, Janni; Keilow, Maria; Obel, Carsten
2018-05-01
Well-being is considered a prerequisite for learning. The Danish Ministry of Education initiated the development of a new 40-item student well-being questionnaire in 2014 to monitor well-being among all Danish public school students on a yearly basis. The aim of this study was to investigate the basic psychometric properties of this questionnaire. We used the data from the 2015 Danish student well-being survey for 268,357 students in grades 4-9 (about 85% of the study population). Descriptive statistics, exploratory factor analyses, confirmatory factor analyses and Cronbach's α reliability measures were used in the analyses. The factor analyses did not unambiguously support one particular factor structure. However, based on the basic descriptive statistics, exploratory factor analyses, confirmatory factor analyses, the semantics of the individual items and Cronbach's α, we propose a four-factor structure including 27 of the 40 items originally proposed. The four scales measure school connectedness, learning self-efficacy, learning environment and classroom management. Two bullying items and two psychosomatic items should be considered separately, leaving 31 items in the questionnaire. The proposed four-factor structure addresses central aspects of well-being, which, if used constructively, may support public schools' work to increase levels of student well-being.
NASA Astrophysics Data System (ADS)
Eliaš, Peter; Frič, Roman
2017-12-01
Categorical approach to probability leads to better understanding of basic notions and constructions in generalized (fuzzy, operational, quantum) probability, where observables—dual notions to generalized random variables (statistical maps)—play a major role. First, to avoid inconsistencies, we introduce three categories L, S, and P, the objects and morphisms of which correspond to basic notions of fuzzy probability theory and operational probability theory, and describe their relationships. To illustrate the advantages of categorical approach, we show that two categorical constructions involving observables (related to the representation of generalized random variables via products, or smearing of sharp observables, respectively) can be described as factorizing a morphism into composition of two morphisms having desired properties. We close with a remark concerning products.
Properties of JP=1/2+ baryon octets at low energy
NASA Astrophysics Data System (ADS)
Kaur, Amanpreet; Gupta, Pallavi; Upadhyay, Alka
2017-06-01
The statistical model in combination with the detailed balance principle is able to phenomenologically calculate and analyze spin- and flavor-dependent properties like magnetic moments (with effective masses, with effective charge, or with both), quark spin polarization and distribution, the strangeness suppression factor, and the d̄-ū asymmetry incorporating the strange sea. The ss̄ pairs in the sea are taken to be generated via the basic quark mechanism but suppressed by the strange quark mass, m_s > m_u,d. The magnetic moments of the octet baryons are analyzed within the statistical model, with emphasis on the SU(3) symmetry-breaking effects generated by the mass difference between the strange and non-strange quarks. The work presented here assumes hadrons with a sea having an admixture of quark-gluon Fock states. The results obtained are compared with theoretical models and experimental data.
Photon Limited Images and Their Restoration
1976-03-01
arises from noise inherent in the detected image data. In the first part of this report a model is developed which can be used to mathematically and statistically describe an image detected at low light levels. This model serves to clarify some basic properties of photon noise, and provides a basis for the analysis of image restoration. In the second part the problem of linear least-squares restoration of imagery limited by photon noise is
Exploration in free word association networks: models and experiment.
Ludueña, Guillermo A; Behzad, Mehran Djalali; Gros, Claudius
2014-05-01
Free association is a task that requires a subject to express the first word to come to mind when presented with a certain cue; it can be used to expose the basic mechanisms by which humans connect memories. In this work, we have made use of a publicly available database of free associations to model the exploration of the averaged network of associations using a statistical model and the adaptive control of thought-rational (ACT-R) model. In addition, we performed an online experiment asking participants to navigate the averaged network using their individual preferences for word associations, and investigated the statistics of word repetitions in this guided association task. We find that the considered models mimic some of the statistical properties of the experiment, viz. the probability of word repetitions, the distance between repetitions and the distribution of association chain lengths, with the ACT-R model showing a particularly good fit to the experimental data for the more intricate properties such as the ratio of repetitions per length of association chain.
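A minimal sketch of exploring a weighted association network and counting word repetitions is shown below. The four-word graph and its weights are hypothetical, and this is not the authors' statistical or ACT-R implementation; it only illustrates the kind of guided walk and repetition statistic the abstract describes.

```python
import random

graph = {  # hypothetical association strengths (cue -> {response: weight})
    "dog": {"cat": 3, "bone": 1},
    "cat": {"dog": 2, "milk": 1},
    "bone": {"dog": 1},
    "milk": {"cat": 1},
}

def walk(start, steps, seed=0):
    """Follow associations, choosing the next word proportionally to weight."""
    rng = random.Random(seed)
    word, chain = start, [start]
    for _ in range(steps):
        nbrs = graph[word]
        word = rng.choices(list(nbrs), weights=list(nbrs.values()))[0]
        chain.append(word)
    return chain

chain = walk("dog", 50)
repetitions = len(chain) - len(set(chain))
print(repetitions)  # at least 47: 51 visits drawn from a four-word vocabulary
```

On real association networks the vocabulary is large, so the probability and spacing of repetitions become informative statistics rather than a forced outcome as in this tiny example.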
Spontaneous ultraweak photon emission from biological systems and the endogenous light field.
Schwabl, Herbert; Klima, Herbert
2005-04-01
Still one of the most astonishing biological electromagnetic phenomena is the ultraweak photon emission (UPE) from living systems. Organisms and tissues spontaneously emit measurable intensities of light, i.e. photons in the visible part of the electromagnetic spectrum (380-780 nm), in the range from 1 to 1,000 photons s^-1 cm^-2, depending on their condition and vitality. It is important not to confuse UPE from living systems with other biogenic light-emitting processes such as bioluminescence or chemiluminescence. This article examines the empirical phenomenon of UPE using basic physical considerations on the quantum nature of photons, leading to a description of the non-thermal origin of this radiation. This is in good correspondence with the modern understanding of life phenomena as dissipative processes far from thermodynamic equilibrium. UPE also supports the understanding of life-sustaining processes as basically driven by electromagnetic fields. The basic features of UPE, like intensity and spectral distribution, are known in principle for many experimental situations. The UPE of human leukocytes contributes to an endogenous light field of about 10^11 photons s^-1 which can be influenced by certain factors. Further research is needed to reveal the statistical properties of UPE and, in consequence, to answer questions about the underlying mechanics of the biological system. In principle, the statistical properties of UPE allow one to reconstruct the phase-space dynamics of the light-emitting structures. Many open questions remain before a proper understanding of the electromagnetic interaction of the human organism can be achieved: which structures act as receptors and emitters for electromagnetic radiation? How is electromagnetic information received and processed within cells?
Watersheds in disordered media
NASA Astrophysics Data System (ADS)
Andrade, José, Jr.; Araújo, Nuno; Herrmann, Hans; Schrenk, Julian
2015-02-01
What is the best way to divide a rugged landscape? Since ancient times, watersheds separating adjacent water systems that flow, for example, toward different seas, have been used to delimit boundaries. Interestingly, serious and even tense border disputes between countries have relied on the subtle geometrical properties of these tortuous lines. For instance, slight and even anthropogenic modifications of landscapes can produce large changes in a watershed, and the effects can be highly nonlocal. Although the watershed concept arises naturally in geomorphology, where it plays a fundamental role in water management, landslide, and flood prevention, it also has important applications in seemingly unrelated fields such as image processing and medicine. Despite the far-reaching consequences of the scaling properties on watershed-related hydrological and political issues, it was only recently that a more profound and revealing connection has been disclosed between the concept of watershed and statistical physics of disordered systems. This review initially surveys the origin and definition of a watershed line in a geomorphological framework to subsequently introduce its basic geometrical and physical properties. Results on statistical properties of watersheds obtained from artificial model landscapes generated with long-range correlations are presented and shown to be in good qualitative and quantitative agreement with real landscapes.
Imprints of magnetic power and helicity spectra on radio polarimetry statistics
NASA Astrophysics Data System (ADS)
Junklewitz, H.; Enßlin, T. A.
2011-06-01
The statistical properties of turbulent magnetic fields in radio-synchrotron sources should be imprinted on the statistics of polarimetric observables. In search of these imprints, i.e. characteristic modifications of the polarimetry statistics caused by magnetic field properties, we calculate correlation and cross-correlation functions from a set of observables that contain total intensity I, polarized intensity P, and Faraday depth φ. The correlation functions are evaluated for all combinations of observables up to fourth order in the magnetic field B. We derive these analytically as far as possible and from first principles, using only some basic assumptions such as Gaussian statistics for the underlying magnetic field in the observed region and statistical homogeneity. We further make some simplifications to reduce the complexity of the calculations, since for a start we were interested in a proof of concept. Using this statistical approach, we show that it is possible to gain information about the helical part of the magnetic power spectrum via the correlation functions < P(k⊥) φ(k'⊥) φ(k''⊥) >_B and < I(k⊥) φ(k'⊥) φ(k''⊥) >_B. Using this insight, we construct an easy-to-use test for helicity called LITMUS (Local Inference Test for Magnetic fields which Uncovers heliceS), which gives a spectrally integrated measure of helicity. For now, all calculations are given in a Faraday-free case, but set up so that Faraday rotation effects can be included later.
Near Earth Asteroid Characteristics for Asteroid Threat Assessment
NASA Technical Reports Server (NTRS)
Dotson, Jessie
2015-01-01
Information about the physical characteristics of Near Earth Asteroids (NEAs) is needed to model behavior during atmospheric entry, to assess the risk of an impact, and to model possible mitigation techniques. The intrinsic properties of interest to entry and mitigation modelers, however, rarely are directly measureable. Instead we measure other properties and infer the intrinsic physical properties, so determining the complete set of characteristics of interest is far from straightforward. In addition, for the majority of NEAs, only the basic measurements exist so often properties must be inferred from statistics of the population of more completely characterized objects. We will provide an assessment of the current state of knowledge about the physical characteristics of importance to asteroid threat assessment. In addition, an ongoing effort to collate NEA characteristics into a readily accessible database for use by the planetary defense community will be discussed.
Mueller matrix mapping of biological polycrystalline layers using reference wave
NASA Astrophysics Data System (ADS)
Dubolazov, A.; Ushenko, O. G.; Ushenko, Yu. O.; Pidkamin, L. Y.; Sidor, M. I.; Grytsyuk, M.; Prysyazhnyuk, P. V.
2018-01-01
The paper consists of two parts. The first part is devoted to the short theoretical basics of the method of differential Mueller-matrix description of the properties of partially depolarizing layers. Experimentally measured maps of the first-order differential matrix of the polycrystalline structure of a histological section of brain tissue are provided, and the statistical moments of the 1st-4th orders, which characterize the distribution of matrix elements, are defined. The second part of the paper presents a statistical analysis of birefringence and dichroism of histological sections of mice liver tissue (normal and diabetic), and defines objective criteria for the differential diagnostics of diabetes.
Determining significant material properties: A discovery approach
NASA Technical Reports Server (NTRS)
Karplus, Alan K.
1992-01-01
The following is a laboratory experiment designed to further understanding of materials science. The experiment itself can be informative for persons of any age past elementary school, and even for some in elementary school. The preparation of the plastic samples is readily accomplished by persons with reasonable dexterity in cutting paper designs. The completion of the statistical Design of Experiments, which uses Yates' Method, requires basic math (addition and subtraction). Interpretive work requires plotting of data and making observations. Knowledge of statistical methods would be helpful. The purpose of this experiment is to acquaint students with the seven classes of recyclable plastics, and provide hands-on learning about the response of these plastics to mechanical tensile loading.
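Yates' method itself is simple enough to sketch. Assuming the responses of a 2^k factorial design are listed in standard order, each of k passes forms pairwise sums (first half) then pairwise differences (second half); the result is the grand total followed by the effect contrasts. This sketch is illustrative and not part of the original lab handout; the example numbers are invented.

```python
def yates(responses):
    """Yates' algorithm for a 2^k factorial design in standard order.
    Returns [grand total, A contrast, B contrast, AB contrast, ...]."""
    y = list(responses)
    n = len(y)
    k = n.bit_length() - 1
    assert n == 2 ** k, "needs 2^k responses in standard order"
    for _ in range(k):
        y = ([y[i] + y[i + 1] for i in range(0, n, 2)] +   # pairwise sums
             [y[i + 1] - y[i] for i in range(0, n, 2)])    # pairwise differences
    return y

# 2^2 example, runs in standard order (1), a, b, ab:
print(yates([10, 14, 12, 20]))  # -> [56, 12, 8, 4]
```

Dividing each contrast by 2^(k-1) gives the effect estimates, which is all the addition and subtraction the experiment requires.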
NASA Astrophysics Data System (ADS)
Aligholi, Saeed; Lashkaripour, Gholam Reza; Ghafoori, Mohammad
2017-01-01
This paper sheds further light on the fundamental relationships between simple methods, rock strength, and brittleness of igneous rocks. In particular, the relationship between mechanical (point load strength index I_s(50) and brittleness value S_20), basic physical (dry density and porosity), and dynamic properties (P-wave velocity and Schmidt rebound values) for a wide range of Iranian igneous rocks is investigated. First, 30 statistical models (including simple and multiple linear regression analyses) were built to identify the relationships between mechanical properties and simple methods. The results imply that rocks with different Schmidt hardness (SH) rebound values have different physicomechanical properties or relations. Second, using these results, it was proved that dry density, P-wave velocity, and SH rebound value provide a fine complement to mechanical properties classification of rock materials. Further, a detailed investigation was conducted on the relationships between mechanical and simple tests, which are established with limited ranges of P-wave velocity and dry density. The results show that strength values decrease with the SH rebound value. In addition, there is a systematic trend between dry density, P-wave velocity, rebound hardness, and brittleness value of the studied rocks, and rocks with medium hardness have a higher brittleness value. Finally, a strength classification chart and a brittleness classification table are presented, providing reliable and low-cost methods for the classification of igneous rocks.
Stochastic geometry in disordered systems, applications to quantum Hall transitions
NASA Astrophysics Data System (ADS)
Gruzberg, Ilya
2012-02-01
A spectacular success in the study of random fractal clusters and their boundaries in statistical mechanics systems at or near criticality using Schramm-Loewner Evolutions (SLE) naturally calls for extensions in various directions. Can this success be repeated for disordered and/or non-equilibrium systems? Naively, when one thinks about disordered systems and their average correlation functions, one of the very basic assumptions of SLE, the so-called domain Markov property, is lost. Also, in some lattice models of Anderson transitions (the network models) there are no natural clusters to consider. Nevertheless, in this talk I will argue that one can apply the so-called conformal restriction, a notion of stochastic conformal geometry closely related to SLE, to study the integer quantum Hall transition and its variants. I will focus on the Chalker-Coddington network model and will demonstrate that its average transport properties can be mapped to a classical problem where the basic objects are geometric shapes (loosely speaking, the current paths) that obey an important restriction property. At the transition point this allows one to use the theory of conformal restriction to derive exact expressions for point contact conductances in the presence of various non-trivial boundary conditions.
Teaching Basic Probability in Undergraduate Statistics or Management Science Courses
ERIC Educational Resources Information Center
Naidu, Jaideep T.; Sanford, John F.
2017-01-01
Standard textbooks in core Statistics and Management Science classes present various examples to introduce basic probability concepts to undergraduate business students. These include tossing of a coin, throwing a die, and examples of that nature. While these are good examples to introduce basic probability, we use improvised versions of Russian…
Progress in Turbulence Detection via GNSS Occultation Data
NASA Technical Reports Server (NTRS)
Cornman, L. B.; Goodrich, R. K.; Axelrad, P.; Barlow, E.
2012-01-01
The increased availability of radio occultation (RO) data offers the ability to detect and study turbulence in the Earth's atmosphere. An analysis of how RO data can be used to determine the strength and location of turbulent regions is presented. This includes the derivation of a model for the power spectrum of the log-amplitude and phase fluctuations of the permittivity (or index of refraction) field. The bulk of the paper is then concerned with the estimation of the model parameters. Parameter estimators are introduced and some of their statistical properties are studied. These estimators are then applied to simulated log-amplitude RO signals. This includes the analysis of global statistics derived from a large number of realizations, as well as case studies that illustrate various specific aspects of the problem. Improvements to the basic estimation methods are discussed, and their beneficial properties are illustrated. The estimation techniques are then applied to real occultation data. Only two cases are presented, but they illustrate some of the salient features inherent in real data.
NASA Astrophysics Data System (ADS)
Duarte-Cabral, A.; Acreman, D. M.; Dobbs, C. L.; Mottram, J. C.; Gibson, S. J.; Brunt, C. M.; Douglas, K. A.
2015-03-01
We present CO, H2, H I and HISA (H I self-absorption) distributions from a set of simulations of grand design spirals including stellar feedback, self-gravity, heating and cooling. We replicate the emission of the second galactic quadrant by placing the observer inside the modelled galaxies and post-process the simulations using a radiative transfer code, so as to create synthetic observations. We compare the synthetic data cubes to observations of the second quadrant of the Milky Way to test the ability of the current models to reproduce the basic chemistry of the Galactic interstellar medium (ISM), as well as to test how sensitive such galaxy models are to different recipes of chemistry and/or feedback. We find that models which include feedback and self-gravity can reproduce the production of CO with respect to H2 as observed in our Galaxy, as well as the distribution of the material perpendicular to the Galactic plane. While changes in the chemistry/feedback recipes do not have a huge impact on the statistical properties of the chemistry in the simulated galaxies, we find that the inclusion of both feedback and self-gravity are crucial ingredients, as our test without feedback failed to reproduce all of the observables. Finally, even though the transition from H2 to CO seems to be robust, we find that all models seem to underproduce molecular gas, and have a lower molecular to atomic gas fraction than is observed. Nevertheless, our fiducial model with feedback and self-gravity has shown to be robust in reproducing the statistical properties of the basic molecular gas components of the ISM in our Galaxy.
Statistics for nuclear engineers and scientists. Part 1. Basic statistical inference
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beggs, W.J.
1981-02-01
This report is intended for the use of engineers and scientists working in the nuclear industry, especially at the Bettis Atomic Power Laboratory. It serves as the basis for several Bettis in-house statistics courses. The objectives of the report are to introduce the reader to the language and concepts of statistics and to provide a basic set of techniques to apply to problems of the collection and analysis of data. Part 1 covers subjects of basic inference. The subjects include: descriptive statistics; probability; simple inference for normally distributed populations, and for non-normal populations as well; comparison of two populations; the analysis of variance; quality control procedures; and linear regression analysis.
Statistical characterization of thermal plumes in turbulent thermal convection
NASA Astrophysics Data System (ADS)
Zhou, Sheng-Qi; Xie, Yi-Chao; Sun, Chao; Xia, Ke-Qing
2016-09-01
We report an experimental study on the statistical properties of the thermal plumes in turbulent thermal convection. A method has been proposed to extract the basic characteristics of thermal plumes from temporal temperature measurements inside the convection cell. It has been found that both the plume amplitude A and the cap width w, in the time domain, approximately follow a log-normal distribution. In particular, the normalized most probable front width is found to be a characteristic scale of thermal plumes, which is much larger than the thermal boundary layer thickness. Over a wide range of the Rayleigh number, the statistical characterizations of the thermal fluctuations of plumes and the turbulent background, the plume front width, and the plume spacing have been discussed and compared with theoretical predictions and morphological observations. For the most part, good agreement has been found with the direct observations.
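Testing a log-normal hypothesis like the one above amounts to fitting a normal distribution to the logarithms of the samples. A minimal sketch of that idea (not the authors' actual analysis pipeline; the function name is illustrative):

```python
import math

def fit_lognormal(samples):
    """Moment fit of a log-normal: normal fit to log-samples.

    Returns (mu, sigma2, mode), where mode = exp(mu - sigma2) is the
    most probable value of a log-normal distribution.
    """
    logs = [math.log(s) for s in samples]
    n = len(logs)
    mu = sum(logs) / n
    sigma2 = sum((l - mu) ** 2 for l in logs) / n
    mode = math.exp(mu - sigma2)
    return mu, sigma2, mode
```

In practice one would compare the fitted log-normal density against a histogram of plume amplitudes to judge the quality of the distributional assumption.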
Khadra, Ibrahim; Zhou, Zhou; Dunn, Claire; Wilson, Clive G; Halbert, Gavin
2015-01-25
A drug's solubility and dissolution behaviour within the gastrointestinal tract is a key property for successful administration by the oral route and one of the key factors in the biopharmaceutics classification system. This property can be determined by investigating drug solubility in human intestinal fluid (HIF) but this is difficult to obtain and highly variable, which has led to the development of multiple simulated intestinal fluid (SIF) recipes. Using a statistical design of experiment (DoE) technique this paper has investigated the effects and interactions on equilibrium drug solubility of seven typical SIF components (sodium taurocholate, lecithin, sodium phosphate, sodium chloride, pH, pancreatin and sodium oleate) within concentration ranges relevant to human intestinal fluid values. A range of poorly soluble drugs with acidic (naproxen, indomethacin, phenytoin, and piroxicam), basic (aprepitant, carvedilol, zafirlukast, tadalafil) or neutral (fenofibrate, griseofulvin, felodipine and probucol) properties have been investigated. The equilibrium solubility results determined are comparable with literature studies of the drugs in either HIF or SIF indicating that the DoE is operating in the correct space. With the exception of pancreatin, all of the factors individually had a statistically significant influence on equilibrium solubility with variations in magnitude of effect between the acidic and basic or neutral compounds and drug specific interactions were evident. Interestingly for the neutral compounds pH was the factor with the second largest solubility effect. Around one third of all the possible factor combinations showed a significant influence on equilibrium solubility with variations in interaction significance and magnitude of effect between the acidic and basic or neutral compounds. 
The fewest significant media-component interactions were noted for the acidic compounds (three) and the most for the neutral compounds (seven), with again drug-specific effects evident. This indicates that a drug's equilibrium solubility in SIF is influenced, depending upon drug type, by between eight and fourteen individual media components or combinations of components, some of them drug specific. This illustrates the complex nature of these fluids and provides, for individual drugs, a visualisation of the possible solubility envelope within the gastrointestinal tract, which may be of importance for modelling in vivo behaviour. In addition, the results indicate that the design of experiment approach can be employed to provide greater detail of drug solubility behaviour, possible drug-specific interactions, and the influence of variations in gastrointestinal media components due to disease. The approach is also feasible and amenable to adaptation for high-throughput screening of drug candidates. Copyright © 2014 Elsevier B.V. All rights reserved.
Anandakrishnan, Ramu; Onufriev, Alexey
2008-03-01
In statistical mechanics, the equilibrium properties of a physical system of particles can be calculated as the statistical average over accessible microstates of the system. In general, these calculations are computationally intractable since they involve summations over an exponentially large number of microstates. Clustering algorithms are one of the methods used to numerically approximate these sums. The most basic clustering algorithms first sub-divide the system into a set of smaller subsets (clusters). Then, interactions between particles within each cluster are treated exactly, while all interactions between different clusters are ignored. These smaller clusters have far fewer microstates, making the summation over these microstates tractable. These algorithms have been previously used for biomolecular computations, but remain relatively unexplored in this context. Presented here is a theoretical analysis of the error and computational complexity for the two most basic clustering algorithms that were previously applied in the context of biomolecular electrostatics. We derive a tight, computationally inexpensive, error bound for the equilibrium state of a particle computed via these clustering algorithms. For some practical applications, it is the root mean square error, which can be significantly lower than the error bound, that may be more important. We show that there is a strong empirical relationship between the error bound and the root mean square error, suggesting that the error bound could be used as a computationally inexpensive metric for predicting the accuracy of clustering algorithms for practical applications. An example of error analysis for such an application, the computation of the average charge of ionizable amino acids in proteins, is given, demonstrating that the clustering algorithm can be accurate enough for practical purposes.
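The basic clustering idea described above, treat interactions within a cluster exactly and drop interactions between clusters, can be sketched on a toy pairwise spin model (this stand-in model and all names are illustrative, not the biomolecular system of the paper). When the coupling matrix happens to be block-diagonal over the clusters, the approximation is exact:

```python
import itertools
import math

def pair_energy(state, J):
    """Energy of a spin configuration under pairwise couplings J[i][j]."""
    n = len(state)
    return sum(J[i][j] * state[i] * state[j]
               for i in range(n) for j in range(i + 1, n))

def exact_average_energy(J, beta=1.0):
    """Boltzmann average of the energy over all 2**n microstates (exact sum)."""
    n = len(J)
    Z = 0.0
    acc = 0.0
    for s in itertools.product([-1, 1], repeat=n):
        E = pair_energy(s, J)
        w = math.exp(-beta * E)
        Z += w
        acc += w * E
    return acc / Z

def clustered_average_energy(J, clusters, beta=1.0):
    """Clustering approximation: exact within each cluster, inter-cluster couplings ignored."""
    total = 0.0
    for cl in clusters:
        sub = [[J[i][j] for j in cl] for i in cl]  # restrict couplings to the cluster
        total += exact_average_energy(sub, beta)
    return total
```

Each cluster of size m needs only 2**m terms instead of 2**n, which is the source of the tractability; the error bound of the paper controls what is lost when the ignored inter-cluster couplings are nonzero.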
ERIC Educational Resources Information Center
Primi, Caterina; Donati, Maria Anna; Chiesi, Francesca
2016-01-01
Among the wide range of factors related to the acquisition of statistical knowledge, competence in basic mathematics, including basic probability, has received much attention. In this study, a mediation model was estimated to derive the total, direct, and indirect effects of mathematical competence on statistics achievement taking into account…
Code of Federal Regulations, 2010 CFR
2010-07-01
... safety and environmental management policies for real property? 102-80.10 Section 102-80.10 Public... MANAGEMENT REGULATION REAL PROPERTY 80-SAFETY AND ENVIRONMENTAL MANAGEMENT General Provisions § 102-80.10 What are the basic safety and environmental management policies for real property? The basic safety and...
Water-resources investigations in Wisconsin, 1993
Maertz, D.E.
1993-01-01
OBJECTIVE: The objectives of this study are to provide continuous discharge records for selected rivers at specific sites to supply the needs for: regulation, analytical studies, definition of statistical properties, trends analysis, determination of the occurrence, and distribution of water in streams for planning. The project is also designed to determine lake levels and to provide discharge for floods, low-flow conditions, and for water-quality investigations. Requests for streamflow data and information relating to streamflow in Wisconsin are answered. Basic data are published annually in "Water Resources Data Wisconsin."
Differential 3D Mueller-matrix mapping of optically anisotropic depolarizing biological layers
NASA Astrophysics Data System (ADS)
Ushenko, O. G.; Grytsyuk, M.; Ushenko, V. O.; Bodnar, G. B.; Vanchulyak, O.; Meglinskiy, I.
2018-01-01
The paper consists of two parts. The first part is devoted to the short theoretical basics of the method of differential Mueller-matrix description of the properties of partially depolarizing layers. Experimentally measured maps of the second-order differential matrix of the polycrystalline structure of a histological section of rectum wall tissue are provided. The values of statistical moments of the 1st-4th orders, which characterize the distribution of matrix elements, were determined. The second part of the paper provides data from the statistical analysis of birefringence and dichroism of histological sections of the connective tissue of the vaginal wall (normal and with prolapse). Objective criteria for the differential diagnostics of pathologies of the vaginal wall were defined.
Colizza, Vittoria; Barrat, Alain; Barthélemy, Marc; Vespignani, Alessandro
2006-02-14
The systematic study of large-scale networks has unveiled the ubiquitous presence of connectivity patterns characterized by large-scale heterogeneities and unbounded statistical fluctuations. These features affect dramatically the behavior of the diffusion processes occurring on networks, determining the ensuing statistical properties of their evolution pattern and dynamics. In this article, we present a stochastic computational framework for the forecast of global epidemics that considers the complete worldwide air travel infrastructure complemented with census population data. We address two basic issues in global epidemic modeling: (i) we study the role of the large scale properties of the airline transportation network in determining the global diffusion pattern of emerging diseases; and (ii) we evaluate the reliability of forecasts and outbreak scenarios with respect to the intrinsic stochasticity of disease transmission and traffic flows. To address these issues we define a set of quantitative measures able to characterize the level of heterogeneity and predictability of the epidemic pattern. These measures may be used for the analysis of containment policies and epidemic risk assessment.
Statistical Analysis of Bus Networks in India
2016-01-01
In this paper, we model the bus networks of six major Indian cities as graphs in L-space, and evaluate their various statistical properties. While airline and railway networks have been extensively studied, a comprehensive study on the structure and growth of bus networks is lacking. In India, where bus transport plays an important role in day-to-day commutation, it is of significant interest to analyze its topological structure and answer basic questions on its evolution, growth, robustness and resiliency. Although the common feature of the small-world property is observed, our analysis reveals a wide spectrum of network topologies arising due to significant variation in the degree-distribution patterns in the networks. We also observe that these networks, although robust and resilient to random attacks, are particularly degree-sensitive. Unlike virtual real-world networks, such as the Internet, the WWW and airline networks, bus networks are physically constrained. Our findings therefore throw light on the evolution of such geographically constrained networks, which will help us in designing more efficient bus networks in the future. PMID:27992590
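The basic L-space statistics discussed above, degree distribution and characteristic path length, can be computed directly from an adjacency list. A small self-contained sketch (not the authors' code; the ring-graph example below is purely illustrative):

```python
from collections import deque

def degrees(adj):
    """Degree of every node in an undirected adjacency-list graph."""
    return {v: len(nbrs) for v, nbrs in adj.items()}

def average_path_length(adj):
    """Mean shortest-path length over ordered node pairs.

    Runs a BFS from every node; assumes an unweighted, connected graph.
    """
    total, pairs = 0, 0
    for src in adj:
        dist = {src: 0}
        q = deque([src])
        while q:
            u = q.popleft()
            for w in adj[u]:
                if w not in dist:
                    dist[w] = dist[u] + 1
                    q.append(w)
        total += sum(d for v, d in dist.items() if v != src)
        pairs += len(adj) - 1
    return total / pairs
```

For a 4-node ring every stop has degree 2 and the average path length is 4/3; on a real bus network one would compare such values against random-graph baselines to diagnose the small-world property.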
Interpretation of the results of statistical measurements. [search for basic probability model
NASA Technical Reports Server (NTRS)
Olshevskiy, V. V.
1973-01-01
For random processes, the calculated probability characteristic, and the measured statistical estimate are used in a quality functional, which defines the difference between the two functions. Based on the assumption that the statistical measurement procedure is organized so that the parameters for a selected model are optimized, it is shown that the interpretation of experimental research is a search for a basic probability model.
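The quality-functional idea above, choose the model whose calculated probability characteristic is closest to the measured statistical estimate, can be illustrated on a toy Bernoulli family (the squared-difference functional and the parameter grid are illustrative assumptions, not the paper's specific functional):

```python
def quality(model_probs, estimated_probs):
    """Quality functional: squared difference between model and measured estimate."""
    return sum((m - e) ** 2 for m, e in zip(model_probs, estimated_probs))

def select_model(candidate_params, estimated_probs):
    """Search for the basic probability model: pick the Bernoulli parameter p
    whose predicted distribution [p, 1-p] minimizes the quality functional."""
    return min(candidate_params,
               key=lambda p: quality([p, 1 - p], estimated_probs))
```

With a measured estimate of [0.3, 0.7] for the two outcomes, a grid search over candidate parameters selects p = 0.3, the model that reproduces the estimate exactly.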
Greene, Dina N; Schmidt, Robert L; Wilson, Andrew R; Freedman, Mark S; Grenache, David G
2012-08-01
Diagnosis of multiple sclerosis (MS) is facilitated by analyzing biochemical properties of cerebrospinal fluid (CSF). Oligoclonal bands (OCBs) and immunoglobulin G (IgG) index are well-established markers for evaluating patients suspected of having MS. Myelin basic protein (MBP) is also ordered frequently, but its usefulness remains questionable. OCB, IgG index, and MBP were measured in 16,690 consecutive CSF samples. Samples were divided into 2 groups based on MS status known (n = 71) or unknown (n = 16,118). Medical charts of the MS status known group were reviewed to determine their MS status. OCBs have a stronger association to IgG index results than does MBP. Importantly, MBP does not add a statistically significant increase in diagnostic sensitivity or specificity when used in combination with OCB and/or IgG index. The data indicate that MBP is an unnecessary and overused test.
Seo, Seongho; Kim, Su Jin; Lee, Dong Soo; Lee, Jae Sung
2014-10-01
Tracer kinetic modeling in dynamic positron emission tomography (PET) has been widely used to investigate the characteristic distribution patterns or dysfunctions of neuroreceptors in brain diseases. Its practical goal has progressed from regional data quantification to parametric mapping that produces images of kinetic-model parameters by fully exploiting the spatiotemporal information in dynamic PET data. Graphical analysis (GA) is a major parametric mapping technique that is independent on any compartmental model configuration, robust to noise, and computationally efficient. In this paper, we provide an overview of recent advances in the parametric mapping of neuroreceptor binding based on GA methods. The associated basic concepts in tracer kinetic modeling are presented, including commonly-used compartment models and major parameters of interest. Technical details of GA approaches for reversible and irreversible radioligands are described, considering both plasma input and reference tissue input models. Their statistical properties are discussed in view of parametric imaging.
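A representative GA method for reversible radioligands is the Logan plot, in which the slope of the late-time linear portion of (integral of CT)/CT versus (integral of Cp)/CT estimates the total distribution volume VT. A minimal numerical sketch with trapezoidal integration (synthetic curves only, not a validated PET pipeline):

```python
def logan_slope(ct, cp, dt):
    """Logan graphical analysis on uniformly sampled curves.

    ct: tissue time-activity curve, cp: plasma input curve, dt: sample spacing.
    Returns the least-squares slope of int(CT)/CT vs int(Cp)/CT, an estimate of VT.
    """
    def cumtrapz(v):
        # cumulative trapezoidal integral, same length as v
        out = [0.0]
        for i in range(1, len(v)):
            out.append(out[-1] + 0.5 * (v[i] + v[i - 1]) * dt)
        return out

    ict, icp = cumtrapz(ct), cumtrapz(cp)
    xs = [icp[i] / ct[i] for i in range(1, len(ct))]
    ys = [ict[i] / ct[i] for i in range(1, len(ct))]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))
```

As a consistency check, if the tissue curve is exactly twice the plasma curve at every sample, the Logan transform is exactly linear and the recovered slope is 2. In practice the regression is restricted to frames after the plot becomes linear, and the noise-induced bias discussed in the review must be kept in mind.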
Foundations of radiation hydrodynamics
NASA Astrophysics Data System (ADS)
Mihalas, D.; Mihalas, B. W.
This book is the result of an attempt, over the past few years, to gather the basic tools required to do research on radiating flows in astrophysics. The microphysics of gases is discussed, taking into account the equation of state of a perfect gas, the first and second law of thermodynamics, the thermal properties of a perfect gas, the distribution function and Boltzmann's equation, the collision integral, the Maxwellian velocity distribution, Boltzmann's H-theorem, the time of relaxation, and aspects of classical statistical mechanics. Other subjects explored are related to the dynamics of ideal fluids, the dynamics of viscous and heat-conducting fluids, relativistic fluid flow, waves, shocks, winds, radiation and radiative transfer, the equations of radiation hydrodynamics, and radiating flows. Attention is given to small-amplitude disturbances, nonlinear flows, the interaction of radiation and matter, the solution of the transfer equation, acoustic waves, acoustic-gravity waves, basic concepts of special relativity, and equations of motion and energy.
Many roads to synchrony: natural time scales and their algorithms.
James, Ryan G; Mahoney, John R; Ellison, Christopher J; Crutchfield, James P
2014-04-01
We consider two important time scales, the Markov and cryptic orders, that monitor how an observer synchronizes to a finitary stochastic process. We show how to compute these orders exactly and that they are most efficiently calculated from the ε-machine, a process's minimal unifilar model. Surprisingly, though the Markov order is a basic concept from stochastic process theory, it is not a probabilistic property of a process. Rather, it is a topological property and, moreover, it is not computable from any finite-state model other than the ε-machine. Via an exhaustive survey, we close by demonstrating that infinite Markov and infinite cryptic orders are a dominant feature in the space of finite-memory processes. We draw out the roles played in statistical mechanical spin systems by these two complementary length scales.
41 CFR 102-77.10 - What basic Art-in-Architecture policy governs Federal agencies?
Code of Federal Regulations, 2012 CFR
2012-01-01
... 41 Public Contracts and Property Management 3 2012-01-01 2012-01-01 false What basic Art-in-Architecture policy governs Federal agencies? 102-77.10 Section 102-77.10 Public Contracts and Property... PROPERTY 77-ART-IN-ARCHITECTURE General Provisions § 102-77.10 What basic Art-in-Architecture policy...
Hidden Statistics Approach to Quantum Simulations
NASA Technical Reports Server (NTRS)
Zak, Michail
2010-01-01
Recent advances in quantum information theory have inspired an explosion of interest in new quantum algorithms for solving hard computational (quantum and non-quantum) problems. The basic principle of quantum computation is that quantum properties can be used to represent structured data, and that quantum mechanisms can be devised and built to perform operations with this data. Three basic non-classical properties of quantum mechanics (superposition, entanglement, and direct-product decomposability) were the main reasons for optimism about the capabilities of quantum computers, which promised simultaneous processing of large masses of highly correlated data. Unfortunately, these advantages of quantum mechanics came with a high price. One major problem is keeping the components of the computer in a coherent state, as the slightest interaction with the external world would cause the system to decohere. That is why the hardware implementation of a quantum computer is still unsolved. The basic idea of this work is to create a new kind of dynamical system that would preserve the three main properties of quantum physics (superposition, entanglement, and direct-product decomposability) while allowing one to measure its state variables using classical methods. In other words, such a system would reinforce the advantages and minimize the limitations of both quantum and classical aspects. Based upon a concept of hidden statistics, a new kind of dynamical system for simulation of the Schroedinger equation is proposed. The system represents a modified Madelung version of the Schroedinger equation. It preserves superposition, entanglement, and direct-product decomposability while allowing one to measure its state variables using classical methods. Such an optimal combination of characteristics is a perfect match for simulating quantum systems. The model includes a transitional component of quantum potential (that has been overlooked in previous treatments of the Madelung equation).
The role of the transitional potential is to provide a jump from a deterministic state to a random state with prescribed probability density. This jump is triggered by a blowup instability due to violation of the Lipschitz condition generated by the quantum potential. As a result, the dynamics attains quantum properties on a classical scale. The model can be implemented physically as an analog VLSI-based (very-large-scale integration-based) computer, or numerically on a digital computer. This work opens a way of developing fundamentally new algorithms for quantum simulations of exponentially complex problems that expand NASA's capabilities in conducting space activities. It has been illustrated that the complexity of simulations of particle interactions can be reduced from exponential to polynomial.
Random bursts determine dynamics of active filaments.
Weber, Christoph A; Suzuki, Ryo; Schaller, Volker; Aranson, Igor S; Bausch, Andreas R; Frey, Erwin
2015-08-25
Constituents of living or synthetic active matter have access to a local energy supply that serves to keep the system out of thermal equilibrium. The statistical properties of such fluctuating active systems differ from those of their equilibrium counterparts. Using the actin filament gliding assay as a model, we studied how nonthermal distributions emerge in active matter. We found that the basic mechanism involves the interplay between local and random injection of energy, acting as an analog of a thermal heat bath, and nonequilibrium energy dissipation processes associated with sudden jump-like changes in the system's dynamic variables. We show here how such a mechanism leads to a nonthermal distribution of filament curvatures with a non-Gaussian shape. The experimental curvature statistics and filament relaxation dynamics are reproduced quantitatively by stochastic computer simulations and a simple kinetic model.
ERIC Educational Resources Information Center
Zetterqvist, Lena
2017-01-01
Researchers and teachers often recommend motivating exercises and use of mathematics or statistics software for the teaching of basic courses in probability and statistics. Our courses are given to large groups of engineering students at Lund Institute of Technology. We found that the mere existence of real-life data and technology in a course…
A basic introduction to statistics for the orthopaedic surgeon.
Bertrand, Catherine; Van Riet, Roger; Verstreken, Frederik; Michielsen, Jef
2012-02-01
Orthopaedic surgeons should review the orthopaedic literature in order to keep pace with the latest insights and practices. A good understanding of basic statistical principles is of crucial importance to the ability to read articles critically, to interpret results, and to arrive at correct conclusions. This paper explains some of the key concepts in statistics, including hypothesis testing, Type I and Type II errors, testing of normality, sample size, and p values.
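Hypothesis testing and p values, two of the concepts listed above, can be demonstrated with a two-sided z-test for a mean with known standard deviation. This is a teaching sketch only, not a recommendation for any particular clinical analysis:

```python
import math

def z_test(sample_mean, mu0, sigma, n):
    """Two-sided z-test of H0: mu == mu0, with known sigma and sample size n.

    Returns (z, p). Rejecting H0 when p < alpha fixes the Type I error
    rate at alpha; failing to reject a false H0 is a Type II error.
    """
    z = (sample_mean - mu0) / (sigma / math.sqrt(n))
    phi = 0.5 * (1 + math.erf(abs(z) / math.sqrt(2)))  # standard normal CDF at |z|
    return z, 2 * (1 - phi)
```

For example, a z statistic of 1.96 yields a two-sided p value of about 0.05, the conventional significance threshold; a sample mean equal to mu0 yields p = 1.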
Debecker, Damien P; Gaigneaux, Eric M; Busca, Guido
2009-01-01
Basic catalysis! The basic properties of hydrotalcites (see picture) make them attractive for numerous catalytic applications. Probing the basicity of the catalysts is crucial to understand the base-catalysed processes and to optimise the catalyst preparation. Various parameters can be employed to tune the basic properties of hydrotalcite-based catalysts towards the basicity demanded by each target chemical reaction. Hydrotalcites offer unique basic properties that make them very attractive for catalytic applications. It is of primary interest to make use of accurate tools for probing the basicity of hydrotalcite-based catalysts for the purpose of 1) fundamental understanding of base-catalysed processes with hydrotalcites and 2) optimisation of the catalytic performance achieved in reactions of industrial interest. Techniques based on probe molecules, titration techniques and test reactions along with physicochemical characterisation are overviewed in the first part of this review. The aim is to provide the tools for understanding how series of parameters involved in the preparation of hydrotalcite-based catalytic materials can be employed to control and adapt the basic properties of the catalyst towards the basicity demanded by each target chemical reaction. An overview of recent and significant achievements in that perspective is presented in the second part of the paper.
Maertz, D.E.
1992-01-01
OBJECTIVE: The objectives of this study are to provide continuous discharge records for selected rivers at specific sites to supply the needs for: regulation, analytical studies, definition of statistical properties, trends analysis, determination of the occurrence, and distribution of water in streams for planning. The project is also designed to determine lake levels and to provide discharge for floods, low-flow conditions, and for water-quality investigations. Requests for streamflow data and information relating to streamflow in Wisconsin are answered. Basic data are published annually in "Water Resources Data Wisconsin."
Water-resources investigations in Wisconsin
Maertz, D.E.
1996-01-01
OBJECTIVE: The objectives of this study are to provide continuous discharge records for selected rivers at specific sites to supply the needs for regulation, analytical studies, definition of statistical properties, trends analysis, determination of the occurrence, and distribution of water in streams for planning. The project is also designed to determine lake levels and to provide discharge for floods, low-flow conditions, and for water-quality investigations. Requests for streamflow data and information relating to streamflow in Wisconsin are answered. Basic data are published annually in the report "Water Resources Data-Wisconsin." LOCATION: Statewide. PROJECT CHIEF: Barry K. Holmstrom. PERIOD OF PROJECT: July 1913-Continuing.
Measures of accuracy and performance of diagnostic tests.
Drobatz, Kenneth J
2009-05-01
Diagnostic tests are integral to the practice of veterinary cardiology, every other specialty, and general veterinary medicine, and developing and understanding diagnostic tests is one of the cornerstones of clinical research. This manuscript describes the diagnostic test properties, including sensitivity, specificity, predictive value, likelihood ratio, and the receiver operating characteristic curve, drawing on a review of practical book chapters and standard statistics manuscripts. Measures such as sensitivity, specificity, predictive value, likelihood ratio, and the receiver operating characteristic curve are described and illustrated. A basic understanding of how diagnostic tests are developed and interpreted is essential for reviewing clinical scientific papers and understanding evidence-based medicine.
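The measures named in this abstract all follow directly from a 2x2 contingency table of test result versus disease status. A minimal sketch in Python, with hypothetical counts (not taken from the paper):

```python
def diagnostic_measures(tp, fp, fn, tn):
    """Basic diagnostic-test measures from a 2x2 table of counts."""
    sensitivity = tp / (tp + fn)              # P(test+ | disease present)
    specificity = tn / (tn + fp)              # P(test- | disease absent)
    ppv = tp / (tp + fp)                      # positive predictive value
    npv = tn / (tn + fn)                      # negative predictive value
    lr_pos = sensitivity / (1 - specificity)  # positive likelihood ratio
    lr_neg = (1 - sensitivity) / specificity  # negative likelihood ratio
    return dict(sensitivity=sensitivity, specificity=specificity,
                ppv=ppv, npv=npv, lr_pos=lr_pos, lr_neg=lr_neg)

# Hypothetical study: 90 true positives, 10 false negatives,
# 20 false positives, 80 true negatives.
m = diagnostic_measures(tp=90, fp=20, fn=10, tn=80)
```

Note that sensitivity and specificity are properties of the test itself, while the predictive values also depend on disease prevalence in the study population, which is why the likelihood ratios are often preferred when carrying a result over to a different population.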
Katayama, R; Sakai, S; Sakaguchi, T; Maeda, T; Takada, K; Hayabuchi, N; Morishita, J
2008-07-20
PURPOSE/AIM OF THE EXHIBIT: The purposes of this exhibit are: 1. To explain "resampling", an image data processing step performed by digital radiographic systems based on flat panel detectors (FPD). 2. To show the influence of "resampling" on the basic imaging properties. 3. To present accurate measurement methods for the basic imaging properties of the FPD system. 1. The relationship between the matrix sizes of the output image and the image data acquired on the FPD, which changes automatically depending on the selected image size (FOV). 2. An explanation of the "resampling" image data processing. 3. Evaluation results for the basic imaging properties of the FPD system, using two types of DICOM image to which "resampling" was applied: characteristic curves, presampled MTFs, noise power spectra, and detective quantum efficiencies. CONCLUSION/SUMMARY: The major points of the exhibit are as follows: 1. The influence of "resampling" should not be disregarded in the evaluation of the basic imaging properties of the flat panel detector system. 2. The basic imaging properties should be measured using DICOM images to which no "resampling" has been applied.
Using Data Mining to Teach Applied Statistics and Correlation
ERIC Educational Resources Information Center
Hartnett, Jessica L.
2016-01-01
This article describes two class activities that introduce the concept of data mining and very basic data mining analyses. Assessment data suggest that students learned some of the conceptual basics of data mining, understood some of the ethical concerns related to the practice, and were able to perform correlations via the Statistical Package for…
Simple Data Sets for Distinct Basic Summary Statistics
ERIC Educational Resources Information Center
Lesser, Lawrence M.
2011-01-01
It is important to avoid ambiguity with numbers because unfortunate choices of numbers can inadvertently make it possible for students to form misconceptions or make it difficult for teachers to tell if students obtained the right answer for the right reason. Therefore, it is important to make sure when introducing basic summary statistics that…
Statistical Extremes of Turbulence and a Cascade Generalisation of Euler's Gyroscope Equation
NASA Astrophysics Data System (ADS)
Tchiguirinskaia, Ioulia; Scherzer, Daniel
2016-04-01
Turbulence refers to a rather well defined hydrodynamical phenomenon uncovered by Reynolds. Nowadays, the word turbulence is used to designate the loss of order in many different geophysical fields and the related fundamental extreme variability of environmental data over a wide range of scales. Classical statistical techniques for estimating extremes, being largely limited to statistical distributions, do not take into account the mechanisms generating such extreme variability. Alternative approaches to nonlinear variability are based on a fundamental property of the non-linear equations: scale invariance, which means that these equations are formally invariant under given scale transforms. Its specific framework is that of multifractals. In this framework extreme variability builds up scale by scale, leading to non-classical statistics. Although multifractals are increasingly understood as a basic framework for handling such variability, there is still a gap between their potential and their actual use. In this presentation we discuss how to deal with highly theoretical problems of mathematical physics together with a wide range of geophysical applications. We use Euler's gyroscope equation as a basic element in constructing a complex deterministic system that preserves not only the scale symmetry of the Navier-Stokes equations, but also some more of their symmetries. Euler's equation has not only been the object of many theoretical investigations of the gyroscope device, but has also been generalised enough to become the basic equation of fluid mechanics. It is therefore no surprise that a cascade generalisation of this equation can be used to characterise the intermittency of turbulence and to better understand the links between the multifractal exponents and the structure of a simplified, but not simplistic, version of the Navier-Stokes equations.
In a sense, this approach is similar to that of Lorenz, who studied how the flap of a butterfly wing could generate a cyclone with the help of a 3D ordinary differential system. Well supported by extensive numerical results, the cascade generalisation of Euler's gyroscope equation opens new horizons for the predictability and prediction of processes having long-range dependence.
Code of Federal Regulations, 2014 CFR
2014-01-01
... 41 Public Contracts and Property Management 3 2014-01-01 2014-01-01 false What are the basic safety and environmental management policies for real property? 102-80.10 Section 102-80.10 Public Contracts and Property Management Federal Property Management Regulations System (Continued) FEDERAL...
NASA Astrophysics Data System (ADS)
Ma, Ning; Zhao, Juan; Hanson, Steen G.; Takeda, Mitsuo; Wang, Wei
2016-10-01
Laser speckle has received extensive study of its basic properties and associated applications. In the majority of research on speckle phenomena, the random optical field has been treated as a scalar optical field, and the main interest has been concentrated on the statistical properties and applications of its intensity distribution. Recently, the statistical properties of random electric vector fields, referred to as polarization speckle, have come to attract new interest because of their importance in a variety of areas with practical applications such as biomedical optics and optical metrology. Statistical phenomena of random electric vector fields have close relevance to the theories of speckle, polarization and coherence. In this paper, we investigate the correlation tensor for stochastic electromagnetic fields modulated by a depolarizer consisting of a rough-surfaced retardation plate. Under the assumption that the microstructure of the scattering surface on the depolarizer is so fine as to be unresolvable in our observation region, we have derived a relationship between the polarization matrix/coherency matrix for the modulated electric fields behind the rough-surfaced retardation plate and the coherence matrix under the free-space geometry. This relation is entirely analogous to the van Cittert-Zernike theorem of classical coherence theory. Within the paraxial approximation as represented by the ABCD-matrix formalism, the three-dimensional structure of the generated polarization speckle is investigated on the basis of the correlation tensor, revealing a typical carrot structure with a much longer axial dimension than transverse extent.
HyphArea--automated analysis of spatiotemporal fungal patterns.
Baum, Tobias; Navarro-Quezada, Aura; Knogge, Wolfgang; Douchkov, Dimitar; Schweizer, Patrick; Seiffert, Udo
2011-01-01
In phytopathology, quantitative measurements are rarely used to assess crop plant disease symptoms; instead, qualitative valuation by eye is often the method of choice. In order to close the gap between subjective human inspection and objective quantitative results, an automated analysis system capable of recognizing and characterizing the growth patterns of fungal hyphae in micrograph images was developed. This system should enable the efficient screening of different host-pathogen combinations (e.g., barley-Blumeria graminis, barley-Rhynchosporium secalis) using different microscopy technologies (e.g., bright field, fluorescence). An image segmentation algorithm was developed for gray-scale image data that achieved good results with several microscope imaging protocols. Furthermore, adaptability towards different host-pathogen systems was obtained by using a classification based on a genetic algorithm. The developed software system was named HyphArea, since the quantification of the area covered by a hyphal colony is the basic task and prerequisite for all further morphological and statistical analyses in this context. A typical use case demonstrates the utilization and basic properties of HyphArea: it was possible to detect statistically significant differences between the growth of an R. secalis wild-type strain and a virulence mutant. Copyright © 2010 Elsevier GmbH. All rights reserved.
Basic statistics (the fundamental concepts).
Lim, Eric
2014-12-01
An appreciation and understanding of statistics is important to all practising clinicians, not simply researchers, because mathematics is the fundamental basis on which we base clinical decisions, usually with reference to benefit in relation to risk. Unless a clinician has a basic understanding of statistics, he or she will never be in a position to question healthcare management decisions that have been handed down from generation to generation, and will not be able to conduct research effectively or evaluate the validity of published evidence (usually making the assumption that most published work is either all good or all bad). This article provides a brief introduction to basic statistical methods and illustrates their use in common clinical scenarios. In addition, pitfalls of incorrect usage are highlighted. However, it is not meant to be a substitute for formal training, or for consultation with a qualified and experienced medical statistician before starting any research project.
41 CFR 102-74.10 - What is the basic facility management policy?
Code of Federal Regulations, 2010 CFR
2010-07-01
... 41 Public Contracts and Property Management 3 2010-07-01 2010-07-01 false What is the basic facility management policy? 102-74.10 Section 102-74.10 Public Contracts and Property Management Federal Property Management Regulations System (Continued) FEDERAL MANAGEMENT REGULATION REAL PROPERTY 74-FACILITY...
ERIC Educational Resources Information Center
Averitt, Sallie D.
This instructor guide, which was developed for use in a manufacturing firm's advanced technical preparation program, contains the materials required to present a learning module that is designed to prepare trainees for the program's statistical process control module by improving their basic math skills and instructing them in basic calculator…
From Research to Practice: Basic Mathematics Skills and Success in Introductory Statistics
ERIC Educational Resources Information Center
Lunsford, M. Leigh; Poplin, Phillip
2011-01-01
Based on previous research of Johnson and Kuennen (2006), we conducted a study to determine factors that would possibly predict student success in an introductory statistics course. Our results were similar to Johnson and Kuennen in that we found students' basic mathematical skills, as measured on a test created by Johnson and Kuennen, were a…
Excitons in Single-Walled Carbon Nanotubes and Their Dynamics
NASA Astrophysics Data System (ADS)
Amori, Amanda R.; Hou, Zhentao; Krauss, Todd D.
2018-04-01
Understanding exciton dynamics in single-walled carbon nanotubes (SWCNTs) is essential to unlocking the many potential applications of these materials. This review summarizes recent progress in understanding exciton photophysics and, in particular, exciton dynamics in SWCNTs. We outline the basic physical and electronic properties of SWCNTs, as well as bright and dark transitions within the framework of a strongly bound one-dimensional excitonic model. We discuss the many facets of ultrafast carrier dynamics in SWCNTs, including both single-exciton states (bright and dark) and multiple-exciton states. Photophysical properties that directly relate to excitons and their dynamics, including exciton diffusion lengths, chemical and structural defects, environmental effects, and photoluminescence photon statistics as observed through photon antibunching measurements, are also discussed. Finally, we identify a few key areas for advancing further research in the field of SWCNT excitons and photonics.
Development of high temperature nickel-base alloys for jet engine turbine bucket applications
NASA Technical Reports Server (NTRS)
Quigg, R. J.; Scheirer, S. T.
1965-01-01
A program has been initiated to develop a material with superior properties at elevated temperatures for use in turbine blade applications. A nickel-base superalloy can provide the necessary high-temperature strength by using the maximum capability of the three available strengthening mechanisms: intermetallic gamma-prime precipitation (Ni3Al), solid-solution strengthening with refractory and precious metals, and stable carbide formation through the addition of strong carbide-forming elements. A stress rupture test at 2000 deg F and 15,000 psi was formulated to approximate the desired properties. By adding varying amounts of refractory metals (Mo, W and Ta), it was possible to statistically analyze the effect of each in a basic superalloy composition containing fixed amounts of Co, Cr, C, B, Sr, and Ni at three separate levels of Al and Ta. Metallographic analysis correlated with the mechanical properties of the alloys: those with few strengthening phases were weak and ductile, and those with excessive amounts of intermetallic phases present in undesirable morphologies were brittle.
NASA Astrophysics Data System (ADS)
Wei, Yaochi; Kim, Seokpum; Horie, Yasuyuki; Zhou, Min
2017-06-01
A computational approach is developed to predict the probabilistic ignition thresholds of polymer-bonded explosives (PBXs). The simulations explicitly account for microstructure, constituent properties, and interfacial responses and capture the processes responsible for the development of hotspots and damage. The specific damage mechanisms considered include viscoelasticity, viscoplasticity, fracture, post-fracture contact, frictional heating, and heat conduction. The probabilistic analysis uses sets of statistically similar microstructure samples to mimic relevant experiments for statistical variations of material behavior due to inherent material heterogeneities. The ignition thresholds and corresponding ignition probability maps are predicted for PBX 9404 and PBX 9501 for the impact loading regime of Up = 200-1200 m/s. James and Walker-Wasley relations are utilized to establish explicit analytical expressions for the ignition probability as a function of load intensity. The predicted results are in good agreement with available experimental measurements. The capability to computationally predict the macroscopic response from material microstructures and basic constituent properties lends itself to the design of new materials and the analysis of existing materials. The authors gratefully acknowledge the support of the Air Force Office of Scientific Research (AFOSR) and the Defense Threat Reduction Agency (DTRA).
Verhagen, H; Aruoma, O I; van Delft, J H M; Dragsted, L O; Ferguson, L R; Knasmüller, S; Pool-Zobel, B L; Poulsen, H E; Williamson, G; Yannai, S
2003-05-01
There is increasing evidence that chemicals/test substances can not only have adverse effects, but that many substances can (also) have a beneficial effect on health. As this journal regularly publishes papers in this area and has every intention of continuing to do so in the near future, it has become essential that studies reported in this journal reflect an adequate level of scientific scrutiny. Therefore a set of essential characteristics of studies has been defined. These basic requirements are default properties rather than non-negotiables: deviations are possible and useful, provided they can be justified on scientific grounds. The 10 basic requirements for a scientific paper reporting antioxidant, antimutagenic or anticarcinogenic potential of test substances in in vitro experiments and in vivo animal studies concern the following areas: (1) hypothesis-driven study design; (2) the nature of the test substance; (3) valid and invalid test systems; (4) the selection of dose levels and gender; (5) reversal of the effects induced by oxidants, carcinogens and mutagens; (6) route of administration; (7) number and validity of test variables; (8) repeatability and reproducibility; (9) statistics; and (10) quality assurance.
Simulation of sovereign CDS market based on interaction between market participant
NASA Astrophysics Data System (ADS)
Ko, Bonggyun; Kim, Kyungwon
2017-08-01
The distributional properties of financial assets are a subject of intense interest, not only for financial theory but also for practitioners, and the CDS market is no exception. The CDS market, which began to receive attention after the global financial debacle, remains under-researched despite the clear need for such research. This study models the creation of the CDS market using an Ising system, with the market's defining characteristic (the shifting of risk) as an important factor; the results should therefore be of assistance to both financial theory and practice. Beyond the distributional properties of the CDS market, various statistics such as multifractal characteristics can promote understanding of the market. A salient finding is that countries cluster mainly into two groups, likely because of the market situation and geographical characteristics of each country. Based on this understanding of the CDS market, two simulation parameters representing the market are suggested. The estimated parameters are suited to high-risk and low-risk events in the CDS market respectively; the two are complementary and together cover not only the basic statistics but also the multifractal properties of most countries. These estimated parameters can therefore be used in research preparing for a particular event (high or low risk). Finally, the results serve as an indirect double-check of the performance of the Ising system.
Thompson, Cheryl Bagley
2009-01-01
This 13th article of the Basics of Research series is first in a short series on statistical analysis. These articles will discuss creating your statistical analysis plan, levels of measurement, descriptive statistics, probability theory, inferential statistics, and general considerations for interpretation of the results of a statistical analysis.
How do energetic ions damage metallic surfaces?
Osetskiy, Yury N.; Calder, Andrew F.; Stoller, Roger E.
2015-02-20
Surface modification under bombardment by energetic ions is observed under different conditions in structural and functional materials; it can be either an unavoidable effect of the conditions or a targeted modification to enhance material properties. Understanding the basic mechanisms is necessary for predicting property changes. The mechanisms activated during ion irradiation operate at the atomic scale, and atomic-scale modeling is the most suitable tool to study these processes. In this paper we present results of an extensive simulation program aimed at developing an understanding of primary surface damage in iron by energetic particles. We simulated 25 keV self-ion bombardment of Fe thin films with (100) and (110) surfaces at room temperature. A large number of simulations, ~400, were carried out to allow a statistically significant treatment of the results. The particular mechanism of surface damage depends on how the destructive supersonic shock wave generated by the displacement cascade interacts with the free surface. Three basic scenarios were observed, with the limiting cases being damage created far below the surface with little or no impact on the surface itself, and extensive direct surface damage on the timescale of a few picoseconds. In some instances, formation of large <100> vacancy loops beneath the free surface was observed, which may explain some earlier experimental observations.
Effective constitutive relations for large repetitive frame-like structures
NASA Technical Reports Server (NTRS)
Nayfeh, A. H.; Hefzy, M. S.
1981-01-01
Effective mechanical properties for large repetitive frame-like structures are derived using combinations of strength-of-materials and orthogonal transformation techniques. Symmetry considerations are used to identify the independent property constants. The actual values of these constants are constructed according to a building-block format carried out in three consecutive steps: (1) all basic planar lattices are identified; (2) effective continuum properties are derived for each of these basic planar grids using matrix structural analysis methods; and (3) orthogonal transformations are used to determine the contribution of each basic set to the overall effective continuum properties of the structure.
Statistical methods of estimating mining costs
Long, K.R.
2011-01-01
Until it was defunded in 1995, the U.S. Bureau of Mines maintained a Cost Estimating System (CES) for prefeasibility-type economic evaluations of mineral deposits and estimating costs at producing and non-producing mines. This system had a significant role in mineral resource assessments to estimate costs of developing and operating known mineral deposits and predicted undiscovered deposits. For legal reasons, the U.S. Geological Survey cannot update and maintain CES. Instead, statistical tools are under development to estimate mining costs from basic properties of mineral deposits such as tonnage, grade, mineralogy, depth, strip ratio, distance from infrastructure, rock strength, and work index. The first step was to reestimate "Taylor's Rule" which relates operating rate to available ore tonnage. The second step was to estimate statistical models of capital and operating costs for open pit porphyry copper mines with flotation concentrators. For a sample of 27 proposed porphyry copper projects, capital costs can be estimated from three variables: mineral processing rate, strip ratio, and distance from nearest railroad before mine construction began. Of all the variables tested, operating costs were found to be significantly correlated only with strip ratio.
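The reestimated coefficients from this work are not given in the abstract, but the classic published form of Taylor's Rule relates mine life (in years) to ore tonnage (in tonnes) as life ≈ 0.2 × T^0.25, from which an operating rate follows. A minimal sketch using those traditional coefficients, not the USGS reestimates:

```python
def taylor_mine_life(tonnage_t):
    """Taylor's Rule: expected mine life in years from ore tonnage (tonnes).

    Classic published form: life ~= 0.2 * T**0.25. These are the
    traditional coefficients, not the reestimated ones referred to
    in the abstract.
    """
    return 0.2 * tonnage_t ** 0.25

def taylor_daily_capacity(tonnage_t, operating_days=350):
    """Implied operating rate (tonnes/day) over the mine life,
    assuming a hypothetical 350 operating days per year."""
    return tonnage_t / (taylor_mine_life(tonnage_t) * operating_days)

# A hypothetical 100-million-tonne deposit:
life = taylor_mine_life(100e6)        # 0.2 * 100e6**0.25 = 20 years
rate = taylor_daily_capacity(100e6)   # about 14,300 t/day
```

The quarter-power exponent is the point of the rule: doubling the resource raises the optimal operating rate much less than proportionally, which is why capacity and tonnage must both enter any cost regression of the kind described above.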
A Multidisciplinary Approach for Teaching Statistics and Probability
ERIC Educational Resources Information Center
Rao, C. Radhakrishna
1971-01-01
The author presents a syllabus for an introductory (first year after high school) course in statistics and probability and some methods of teaching statistical techniques. The description comes basically from the procedures used at the Indian Statistical Institute, Calcutta. (JG)
Applications of statistics to medical science (1) Fundamental concepts.
Watanabe, Hiroshi
2011-01-01
The conceptual framework of statistical tests and statistical inference is discussed, and the epidemiological background of statistics is briefly reviewed. This study is one of a series in which we survey the basics of statistics and the practical methods used in medical statistics. Arguments related to actual statistical analysis procedures will be made in subsequent papers.
Density Functionals of Chemical Bonding
Putz, Mihai V.
2008-01-01
The behavior of electrons in general many-electron systems is reviewed through the density functionals of energy. The basic physico-chemical concepts of density functional theory are employed to highlight the role of energy in chemical structure, while its extended influence on the electronic localization function aids the understanding of chemical bonding. In this context, energy functionals accompanied by electronic localization functions may provide a comprehensive description of electronic structure at both global and local levels in general, and of chemical bonds in particular. The Becke-Edgecombe and the author's Markovian electronic localization functions are discussed at the atomic, molecular and solid-state levels. An analytical survey of the main workable kinetic, exchange, and correlation density functionals within the local and gradient density approximations is then undertaken. A hierarchy of the various energy functionals is formulated by examining both their parabolic and statistical correlation with the electronegativity and chemical hardness indices, by means of quantitative structure-property relationship (QSPR) analysis for basic atomic and molecular systems. PMID:19325846
NASA Technical Reports Server (NTRS)
Parker, Peter A.; Geoffrey, Vining G.; Wilson, Sara R.; Szarka, John L., III; Johnson, Nels G.
2010-01-01
The calibration of measurement systems is a fundamental but under-studied problem within industrial statistics. The origins of this problem go back to basic chemical analysis based on NIST standards. In today's world these issues extend to mechanical, electrical, and materials engineering. Often, these new scenarios do not provide "gold standards" such as the standard weights provided by NIST. This paper considers the classic "forward regression followed by inverse regression" approach. In this approach the initial experiment treats the "standards" as the regressor and the observed values as the response to calibrate the instrument. The analyst then must invert the resulting regression model in order to use the instrument to make actual measurements in practice. This paper compares this classical approach to "reverse regression," which treats the standards as the response and the observed measurements as the regressor in the calibration experiment. Such an approach is intuitively appealing because it avoids the need for the inverse regression. However, it also violates some of the basic regression assumptions.
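The two approaches compared in this abstract can be sketched with ordinary least squares on synthetic data. The instrument model, noise level, and numbers below are all assumptions for illustration, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic calibration experiment: known standards x, noisy readings y,
# under an assumed linear instrument model y = 0.5 + 2.0*x + noise.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])          # reference standards
y = 0.5 + 2.0 * x + rng.normal(0, 0.05, x.size)  # observed readings

# Forward regression: fit y = a + b*x, then invert to make measurements.
b, a = np.polyfit(x, y, 1)          # polyfit returns [slope, intercept]
def measure_forward(y_new):
    return (y_new - a) / b

# Reverse regression: fit x = c + d*y and use it directly.
d, c = np.polyfit(y, x, 1)
def measure_reverse(y_new):
    return c + d * y_new

# Both recover roughly the same unknown from a new reading of 6.5
# (true value is 3.0 under the assumed model).
est_f = measure_forward(6.5)
est_r = measure_reverse(6.5)
```

With a well-designed calibration experiment the two estimates nearly coincide; the statistical issue raised in the abstract is that reverse regression treats the fixed standards as a random response, violating the usual regression assumptions even when the numerical answers agree.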
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Chien-Hsiu, E-mail: leech@naoj.org
Eclipsing binaries offer a unique opportunity to determine basic stellar properties. With the advent of wide-field cameras and all-sky time-domain surveys, thousands of eclipsing binaries have been charted via light-curve classification, yet their fundamental properties remain unexplored, mainly due to the extensive effort needed for spectroscopic follow-up. In this paper, we present the discovery of a short-period (P = 0.313 day), double-lined M-dwarf eclipsing binary, CSSJ114804.3+255132/SDSSJ114804.35+255132.6, by cross-matching binary light curves from the Catalina Sky Survey with spectroscopically classified M dwarfs from the Sloan Digital Sky Survey. We obtain follow-up spectra using the Gemini telescope, enabling us to determine the masses, radii, and temperatures of the primary and secondary components to be M1 = 0.47 ± 0.03 (statistical) ± 0.03 (systematic) M⊙, M2 = 0.46 ± 0.03 (statistical) ± 0.03 (systematic) M⊙, R1 = 0.52 ± 0.08 (statistical) ± 0.07 (systematic) R⊙, R2 = 0.60 ± 0.08 (statistical) ± 0.08 (systematic) R⊙, T1 = 3560 ± 100 K, and T2 = 3040 ± 100 K, respectively. The systematic error was estimated using the difference between eccentric and non-eccentric fits. Our analysis also indicates that there is definite third-light contamination (66%) in the CSS photometry. The secondary star seems inflated, probably due to tidal locking of the close secondary companion, which is common for very short-period binary systems. Future high-resolution spectroscopic observations will narrow down the uncertainties in the stellar parameters of both components, rendering this system a benchmark for studying the fundamental properties of M dwarfs.
ERIC Educational Resources Information Center
Ragasa, Carmelita Y.
2008-01-01
The objective of the study is to determine if there is a significant difference in the effects of the treatment and control groups on achievement as well as on attitude as measured by the posttest. A class of 38 sophomore college students in the basic statistics taught with the use of computer-assisted instruction and another class of 15 students…
Back to basics: an introduction to statistics.
Halfens, R J G; Meijers, J M M
2013-05-01
In the second article in the series, Professor Ruud Halfens and Dr Judith Meijers give an overview of statistics, both descriptive and inferential. They describe the first principles of statistics, including some relevant inferential tests.
41 CFR 102-77.10 - What basic Art-in-Architecture policy governs Federal agencies?
Code of Federal Regulations, 2011 CFR
2011-01-01
... 41 Public Contracts and Property Management 3 2011-01-01 2011-01-01 false What basic Art-in... PROPERTY 77-ART-IN-ARCHITECTURE General Provisions § 102-77.10 What basic Art-in-Architecture policy governs Federal agencies? Federal agencies must incorporate fine arts as an integral part of the total...
41 CFR 102-77.10 - What basic Art-in-Architecture policy governs Federal agencies?
Code of Federal Regulations, 2010 CFR
2010-07-01
... 41 Public Contracts and Property Management 3 2010-07-01 2010-07-01 false What basic Art-in... PROPERTY 77-ART-IN-ARCHITECTURE General Provisions § 102-77.10 What basic Art-in-Architecture policy governs Federal agencies? Federal agencies must incorporate fine arts as an integral part of the total...
41 CFR 102-76.10 - What basic design and construction policy governs Federal agencies?
Code of Federal Regulations, 2011 CFR
2011-01-01
... 41 Public Contracts and Property Management 3 2011-01-01 2011-01-01 false What basic design and construction policy governs Federal agencies? 102-76.10 Section 102-76.10 Public Contracts and Property Management Federal Property Management Regulations System (Continued) FEDERAL MANAGEMENT REGULATION REAL...
The Faces in Infant-Perspective Scenes Change over the First Year of Life
Jayaraman, Swapnaa; Fausey, Caitlin M.; Smith, Linda B.
2015-01-01
Mature face perception has its origins in the face experiences of infants. However, little is known about the basic statistics of faces in early visual environments. We used head cameras to capture and analyze over 72,000 infant-perspective scenes from 22 infants aged 1-11 months as they engaged in daily activities. The frequency of faces in these scenes declined markedly with age: for the youngest infants, faces were present for 15 minutes in every waking hour, but for the oldest infants only 5 minutes. In general, the available faces were well characterized by three properties: (1) they belonged to relatively few individuals; (2) they were close and visually large; and (3) they presented views showing both eyes. These three properties most strongly characterized the face corpora of our youngest infants and constitute environmental constraints on the early development of the visual system. PMID:26016988
Bayesian models: A statistical primer for ecologists
Hobbs, N. Thompson; Hooten, Mevin B.
2015-01-01
Bayesian modeling has become an indispensable tool for ecological research because it is uniquely suited to deal with complexity in a statistically coherent way. This textbook provides a comprehensive and accessible introduction to the latest Bayesian methods, in language ecologists can understand. Unlike other books on the subject, this one emphasizes the principles behind the computations, giving ecologists a big-picture understanding of how to implement this powerful statistical approach. Bayesian Models is an essential primer for non-statisticians. It begins with a definition of probability and develops a step-by-step sequence of connected ideas, including basic distribution theory, network diagrams, hierarchical models, Markov chain Monte Carlo, and inference from single and multiple models. This unique book places less emphasis on computer coding, favoring instead a concise presentation of the mathematical statistics needed to understand how and why Bayesian analysis works. It also explains how to write out properly formulated hierarchical Bayesian models and use them in computing, research papers, and proposals. This primer enables ecologists to understand the statistical principles behind Bayesian modeling and apply them to research, teaching, policy, and management. The book presents the mathematical and statistical foundations of Bayesian modeling in language accessible to non-statisticians; covers basic distribution theory, network diagrams, hierarchical models, Markov chain Monte Carlo, and more; deemphasizes computer coding in favor of basic principles; and explains how to write out properly factored statistical expressions representing Bayesian models.
ERIC Educational Resources Information Center
Haas, Stephanie W.; Pattuelli, Maria Cristina; Brown, Ron T.
2003-01-01
Describes the Statistical Interactive Glossary (SIG), an enhanced glossary of statistical terms supported by the GovStat ontology of statistical concepts. Presents a conceptual framework whose components articulate different aspects of a term's basic explanation that can be manipulated to produce a variety of presentations. The overarching…
ERIC Educational Resources Information Center
Montana State Univ., Bozeman. Dept. of Agricultural and Industrial Education.
This curriculum guide is designed for use in teaching a course in basic soils that is intended for college freshmen. Addressed in the individual lessons of the unit are the following topics: the way in which soil is formed, the physical properties of soil, the chemical properties of soil, the biotic properties of soil, plant-soil-water…
41 CFR 102-78.10 - What basic historic preservation policy governs Federal agencies?
Code of Federal Regulations, 2014 CFR
2014-01-01
To protect, enhance and preserve historic and cultural property under their... (41 CFR 102-78.10, Public Contracts and Property Management 3, 2014-01-01).
Probability, statistics, and computational science.
Beerenwinkel, Niko; Siebourg, Juliane
2012-01-01
In this chapter, we review basic concepts from probability theory and computational statistics that are fundamental to evolutionary genomics. We provide a very basic introduction to statistical modeling and discuss general principles, including maximum likelihood and Bayesian inference. Markov chains, hidden Markov models, and Bayesian network models are introduced in more detail as they occur frequently and in many variations in genomics applications. In particular, we discuss efficient inference algorithms and methods for learning these models from partially observed data. Several simple examples are given throughout the text, some of which point to models that are discussed in more detail in subsequent chapters.
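As a concrete instance of the maximum-likelihood and Markov chain material this chapter introduces, here is a minimal sketch (not code from the chapter; all names are mine) that estimates a two-state Markov chain's transition matrix from a simulated path by normalized transition counts, which is the maximum-likelihood estimator for fully observed chains:

```python
import random

# Transition matrix for a two-state chain: P[i][j] is the
# probability of moving from state i to state j.
P_TRUE = [[0.9, 0.1],
          [0.2, 0.8]]

def simulate_chain(P, n, seed=0):
    """Simulate n steps of a two-state Markov chain starting in state 0."""
    rng = random.Random(seed)
    state, path = 0, [0]
    for _ in range(n - 1):
        state = 0 if rng.random() < P[state][0] else 1
        path.append(state)
    return path

def mle_transitions(path):
    """Maximum-likelihood estimate of the transition matrix:
    normalized counts of observed one-step transitions."""
    counts = [[0, 0], [0, 0]]
    for a, b in zip(path, path[1:]):
        counts[a][b] += 1
    return [[c / sum(row) for c in row] for row in counts]

path = simulate_chain(P_TRUE, 50_000)
P_hat = mle_transitions(path)
```

With enough observed transitions, P_hat converges to P_TRUE; the hidden Markov models discussed in the chapter arise when the states themselves are only partially observed, which is what makes inference algorithms such as EM necessary.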
Khalil, T T; Boulanouar, O; Heintz, O; Fromm, M
2017-02-01
We have investigated the ability of diamines as well as basic amino acids to condense DNA onto highly ordered pyrolytic graphite (HOPG) with minimum damage after re-dissolution in water. Based on a bibliographic survey, we briefly summarize DNA binding properties with diamines as compared to basic amino acids. Solutions of DNA complexed with these linkers were drop-cast to deposit ultra-thin layers on the surface of HOPG in the absence or presence of Tris buffer. Atomic Force Microscopy analyses showed that, at a fixed ligand-DNA mixing ratio of 16, the mean thickness of the layers can be statistically predicted to lie in the range 0-50 nm with a maximum standard deviation of ±6 nm, using a simple linear law depending on the DNA concentration. The morphology of the layers appears to be ligand-dependent. While the layers containing diamines present holes, those formed in the presence of basic amino acids, except for lysine, are much more compact and dense. X-ray Photoelectron Spectroscopy measurements provide compositional information indicating that, compared to the maximum number of DNA sites to which the ligands may bind, the basic amino acids Arg and His are present in large excess. Conservation of the supercoiled topology of the DNA plasmids was studied after recovery of the complex layers in water. Remarkably, arginine has the best protection capabilities whether or not Tris was present in the initial solution.
Fish: A New Computer Program for Friendly Introductory Statistics Help
ERIC Educational Resources Information Center
Brooks, Gordon P.; Raffle, Holly
2005-01-01
All introductory statistics students must master certain basic descriptive statistics, including means, standard deviations and correlations. Students must also gain insight into such complex concepts as the central limit theorem and standard error. This article introduces and describes the Friendly Introductory Statistics Help (FISH) computer…
NASA Astrophysics Data System (ADS)
Stück, H. L.; Siegesmund, S.
2012-04-01
Sandstones are a popular natural stone due to their wide occurrence and availability. The different applications for these stones have led to an increase in demand. From the viewpoint of conservation and the natural stone industry, an understanding of the material behaviour of this construction material is very important. Sandstones are a highly heterogeneous material. Based on statistical analyses with a sufficiently large dataset, a systematic approach to predicting the material behaviour should be possible. Since the literature already contains a large volume of data concerning the petrographical and petrophysical properties of sandstones, a large dataset could be compiled for the statistical analyses. The aim of this study is to develop constraints on the material behaviour and especially on the weathering behaviour of sandstones. Approximately 300 samples from historical and presently mined natural sandstones in Germany, as well as others described worldwide, were included in the statistical approach. The mineralogical composition and fabric characteristics were determined from detailed thin section analyses and descriptions in the literature. Particular attention was paid to evaluating the compositional and textural maturity, grain contacts and contact thickness, type of cement, degree of alteration and the intergranular volume. Statistical methods were used to test for normal distributions and to calculate linear regressions of the basic petrophysical properties: density, porosity, water uptake and strength. The sandstones were classified into three different pore size distributions and evaluated together with the other petrophysical properties. Weathering behaviour, such as hygric swelling and salt loading tests, was also included. To identify similarities between individual sandstones or to define groups of specific sandstone types, principal component analysis, cluster analysis and factor analysis were applied.
Our results show that composition and porosity evolution during diagenesis are very important controls on the petrophysical properties of a building stone. The relationship between intergranular volume, cementation and grain contacts can also provide valuable information for predicting the strength properties. Since the samples investigated mainly originate from the Triassic German epicontinental basin, arkoses and feldspar-arenites are underrepresented. In general, the sandstones can be grouped as follows: i) quartzites, highly mature, with a primary porosity of about 40%; ii) quartzites, highly mature, showing a primary porosity of 40% but with early clay infiltration; iii) sublitharenites-lithic arenites exhibiting a lower primary porosity and higher cementation with quartz and ferritic Fe-oxides; and iv) sublitharenites-lithic arenites with a higher content of pseudomatrix. However, in the last two groups the feldspar and lithoclasts can also show considerable alteration. All sandstone groups differ with respect to the pore space and strength data, as well as the water uptake properties, which were obtained by linear regression analysis. Similar petrophysical properties are discernible for each type when using principal component analysis. Furthermore, both the strength and the porosity of the sandstones show distinct differences with respect to their stratigraphic ages and compositions. The relationships between porosity, strength and salt resistance could also be verified. Hygric swelling shows an interrelation with pore size type, porosity and strength, but also with the degree of alteration (e.g. lithoclasts, pseudomatrix). To summarize, the different regression analyses and the calculated confidence regions provide a significant tool for classifying the petrographical and petrophysical parameters of sandstones. Based on this, the durability and the weathering behaviour of the sandstone groups can be constrained.
Keywords: sandstones, petrographical & petrophysical properties, predictive approach, statistical investigation
Statistical inference of the generation probability of T-cell receptors from sequence repertoires.
Murugan, Anand; Mora, Thierry; Walczak, Aleksandra M; Callan, Curtis G
2012-10-02
Stochastic rearrangement of germline V-, D-, and J-genes to create variable coding sequence for certain cell surface receptors is at the origin of immune system diversity. This process, known as "VDJ recombination", is implemented via a series of stochastic molecular events involving gene choices and random nucleotide insertions between, and deletions from, genes. We use large sequence repertoires of the variable CDR3 region of human CD4+ T-cell receptor beta chains to infer the statistical properties of these basic biochemical events. Because any given CDR3 sequence can be produced in multiple ways, the probability distribution of hidden recombination events cannot be inferred directly from the observed sequences; we therefore develop a maximum likelihood inference method to achieve this end. To separate the properties of the molecular rearrangement mechanism from the effects of selection, we focus on nonproductive CDR3 sequences in T-cell DNA. We infer the joint distribution of the various generative events that occur when a new T-cell receptor gene is created. We find a rich picture of correlation (and absence thereof), providing insight into the molecular mechanisms involved. The generative event statistics are consistent between individuals, suggesting a universal biochemical process. Our probabilistic model predicts the generation probability of any specific CDR3 sequence by the primitive recombination process, allowing us to quantify the potential diversity of the T-cell repertoire and to understand why some sequences are shared between individuals. We argue that the use of formal statistical inference methods, of the kind presented in this paper, will be essential for quantitative understanding of the generation and evolution of diversity in the adaptive immune system.
CADDIS Volume 4. Data Analysis: Basic Principles & Issues
Use of inferential statistics in causal analysis, introduction to data independence and autocorrelation, methods for identifying and controlling for confounding variables, and references for the Basic Principles section of Data Analysis.
Are We Able to Pass the Mission of Statistics to Students?
ERIC Educational Resources Information Center
Hindls, Richard; Hronová, Stanislava
2015-01-01
The article illustrates our long-term experience in teaching statistics to non-statisticians, especially students of economics and the humanities. It focuses on problems in the basic course that can weaken interest in statistics or lead to misuse of statistical methods.
The structural properties of PbF2 by molecular dynamics
NASA Astrophysics Data System (ADS)
Chergui, Y.; Nehaoua, N.; Telghemti, B.; Guemid, S.; Deraddji, N. E.; Belkhir, H.; Mekki, D. E.
2010-08-01
This work presents the use of molecular dynamics (MD) and the DL_POLY code to study the structure of fluoride glass after melting and quenching. We modelled the liquid-to-solid phase transformation, simulating rapid quenching at different rates to see the effect of quenching rate on devitrification. This simulation technique has become a powerful tool for investigating the microscopic behaviour of matter as well as for calculating macroscopic observable quantities. As basic results, we calculated the interatomic distances, angles and their statistics, which help us determine the geometric form and the structure of PbF2. These results are in agreement with experimental values reported in the literature.
Research progress on expansive soil cracks under changing environment.
Shi, Bei-xiao; Zheng, Cheng-feng; Wu, Jin-kun
2014-01-01
Engineering problems previously set aside are gradually coming to the fore as human activity reshapes the natural world, and cracking of expansive soil under a changing environment has become a controlling factor in expansive soil slope stability. Expansive soil cracking has gradually become a research hotspot. This paper elaborates the occurrence and development of cracks in terms of the basic properties of expansive soil and points out the role cracks play in controlling the strength of expansive soil. We summarize the existing research methods and results on expansive soil crack characteristics. Improving crack measurement and calculation methods, and researching crack depth measurement, statistical analysis methods, and the relationship between crack depth and surface features, will be directions for future work.
Generalised Central Limit Theorems for Growth Rate Distribution of Complex Systems
NASA Astrophysics Data System (ADS)
Takayasu, Misako; Watanabe, Hayafumi; Takayasu, Hideki
2014-04-01
We introduce a solvable model of randomly growing systems consisting of many independent subunits. Scaling relations and growth rate distributions in the limit of infinite subunits are analysed theoretically. Various types of scaling properties and distributions reported for growth rates of complex systems in a variety of fields can be derived from this basic physical model. Statistical data of growth rates for about 1 million business firms are analysed as a real-world example of randomly growing systems. Not only are the scaling relations consistent with the theoretical solution, but the entire functional form of the growth rate distribution is fitted with a theoretical distribution that has a power-law tail.
The uniform quantized electron gas revisited
NASA Astrophysics Data System (ADS)
Lomba, Enrique; Høye, Johan S.
2017-11-01
In this article we continue and extend our recent work on the correlation energy of the quantized electron gas of uniform density at temperature T=0 . As before, we utilize the methods, properties, and results obtained by means of classical statistical mechanics. These were extended to quantized systems via the Feynman path integral formalism. The latter translates the quantum problem into a classical polymer problem in four dimensions. Again, the well known RPA (random phase approximation) is recovered as a basic result which we then modify and improve upon. Here we analyze the condition of thermodynamic self-consistency. Our numerical calculations exhibit a remarkable agreement with well known results of a standard parameterization of Monte Carlo correlation energies.
Application of statistical mechanical methods to the modeling of social networks
NASA Astrophysics Data System (ADS)
Strathman, Anthony Robert
With the recent availability of large-scale social data sets, social networks have become open to quantitative analysis via the methods of statistical physics. We examine the statistical properties of a real large-scale social network, generated from cellular phone call-trace logs. We find this network, like many other social networks, to be assortative (r = 0.31) and clustered (i.e., strongly transitive, C = 0.21). We measure fluctuation scaling to identify the presence of internal structure in the network and find that structural inhomogeneity effectively disappears at the scale of a few hundred nodes, though there is no sharp cutoff. We introduce an agent-based model of social behavior, designed to model the formation and dissolution of social ties. The model is a modified Metropolis algorithm containing agents operating under the basic sociological constraints of reciprocity, communication need and transitivity. The model introduces the concept of a social temperature. We go on to show that this simple model reproduces the global statistical network features (incl. assortativity, connected fraction, mean degree, clustering, and mean shortest path length) of the real network data and undergoes two phase transitions, one being from a "gas" to a "liquid" state and the second from a liquid to a glassy state as a function of this social temperature.
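The clustering value quoted above (C = 0.21) is the standard global transitivity of a graph. A minimal pure-Python sketch of that measure on a toy adjacency dictionary (illustrative only, not the authors' code; the function name and example graph are mine):

```python
from itertools import combinations

def transitivity(adj):
    """Global clustering coefficient: ratio of closed triples
    (triangle corners) to all connected triples centred on a node."""
    closed = triples = 0
    for v, nbrs in adj.items():
        k = len(nbrs)
        triples += k * (k - 1) // 2        # open triples centred at v
        for a, b in combinations(nbrs, 2):
            if b in adj[a]:                # two neighbours of v also linked
                closed += 1
    return closed / triples

# Toy undirected graph: a triangle (1-2-3) with a pendant node 4.
adj = {1: {2, 3, 4}, 2: {1, 3}, 3: {1, 2}, 4: {1}}
C = transitivity(adj)
```

Here each triangle contributes three closed triples (one per corner), so no extra factor of 3 is needed; for the toy graph above the value is 3/5 = 0.6.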
PET image reconstruction: a robust state space approach.
Liu, Huafeng; Tian, Yi; Shi, Pengcheng
2005-01-01
Statistical iterative reconstruction algorithms have shown improved image quality over conventional nonstatistical methods in PET by using accurate system response models and measurement noise models. Strictly speaking, however, PET measurements, pre-corrected for accidental coincidences, are neither Poisson nor Gaussian distributed and thus do not meet basic assumptions of these algorithms. In addition, the difficulty in determining the proper system response model also greatly affects the quality of the reconstructed images. In this paper, we explore the usage of state space principles for the estimation of the activity map in tomographic PET imaging. The proposed strategy formulates the organ activity distribution through tracer kinetics models, and the photon-counting measurements through observation equations, thus making it possible to unify the dynamic and static reconstruction problems into a general framework. Further, it coherently treats the uncertainties of the statistical model of the imaging system and the noisy nature of the measurement data. Since the H(infinity) filter seeks minimax (minimum maximum-error) estimates without any assumptions on the system and data noise statistics, it is particularly suited for PET image reconstruction, where the statistical properties of the measurement data and the system model are very complicated. The performance of the proposed framework is evaluated using Shepp-Logan simulated phantom data and real phantom data with favorable results.
Universal sequence map (USM) of arbitrary discrete sequences
2002-01-01
Background For over a decade the idea of representing biological sequences in a continuous coordinate space has maintained its appeal but not been fully realized. The basic idea is that any sequence of symbols may define trajectories in the continuous space conserving all its statistical properties. Ideally, such a representation would allow scale-independent sequence analysis – without the context of fixed memory length. A simple example would consist of being able to infer the homology between two sequences solely by comparing the coordinates of any two homologous units. Results We have successfully identified such an iterative function for bijective mapping ψ of discrete sequences into objects of continuous state space that enables scale-independent sequence analysis. The technique, named Universal Sequence Mapping (USM), is applicable to sequences with an arbitrary length and arbitrary number of unique units and generates a representation where map distance estimates sequence similarity. The novel USM procedure is based on earlier work by these and other authors on the properties of Chaos Game Representation (CGR). The latter enables the representation of 4-unit-type sequences (like DNA) as an order-free Markov Chain transition table. The properties of USM are illustrated with test data and can be verified for other data by using the accompanying web-based tool:http://bioinformatics.musc.edu/~jonas/usm/. Conclusions USM is shown to enable a statistical mechanics approach to sequence analysis. The scale independent representation frees sequence analysis from the need to assume a memory length in the investigation of syntactic rules. PMID:11895567
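The iterated midpoint map underlying CGR, which USM generalizes, can be sketched in a few lines. This assumes the conventional unit-square corner assignment for DNA symbols and is illustrative, not the authors' implementation (function names are mine):

```python
# Corner of the unit square assigned to each nucleotide (the usual
# CGR convention; any fixed assignment works).
CORNERS = {'A': (0.0, 0.0), 'C': (0.0, 1.0), 'G': (1.0, 1.0), 'T': (1.0, 0.0)}

def cgr_point(seq):
    """Iterate the CGR map: move halfway toward the corner of each
    successive symbol. The final point encodes the whole sequence,
    with the most recent symbols weighted most heavily."""
    x, y = 0.5, 0.5
    for s in seq:
        cx, cy = CORNERS[s]
        x, y = (x + cx) / 2, (y + cy) / 2
    return x, y

def dist(p, q):
    """Chebyshev distance between two CGR points."""
    return max(abs(p[0] - q[0]), abs(p[1] - q[1]))

# Sequences sharing a k-symbol suffix land within 2**-k of each
# other, so map distance estimates sequence similarity.
d_similar = dist(cgr_point("GATTACA"), cgr_point("CCTTACA"))  # share "TTACA"
d_distant = dist(cgr_point("GATTACA"), cgr_point("GGGGGGG"))
```

Because each step halves the distance to a corner, two sequences with a common 5-symbol suffix end up within 2**-5 of each other, illustrating the homology-by-coordinates idea described in the abstract.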
NASA Astrophysics Data System (ADS)
Kim, Seokpum; Wei, Yaochi; Horie, Yasuyuki; Zhou, Min
2018-05-01
The design of new materials requires establishment of macroscopic measures of material performance as functions of microstructure. Traditionally, this process has been an empirical endeavor. An approach to computationally predict the probabilistic ignition thresholds of polymer-bonded explosives (PBXs) using mesoscale simulations is developed. The simulations explicitly account for microstructure, constituent properties, and interfacial responses and capture processes responsible for the development of hotspots and damage. The specific mechanisms tracked include viscoelasticity, viscoplasticity, fracture, post-fracture contact, frictional heating, and heat conduction. The probabilistic analysis uses sets of statistically similar microstructure samples to directly mimic relevant experiments for quantification of statistical variations of material behavior due to inherent material heterogeneities. The particular thresholds and ignition probabilities predicted are expressed in James type and Walker-Wasley type relations, leading to the establishment of explicit analytical expressions for the ignition probability as a function of loading. Specifically, the ignition thresholds corresponding to any given level of ignition probability and ignition probability maps are predicted for PBX 9404 for the loading regime of Up = 200-1200 m/s, where Up is the particle speed. The predicted results are in good agreement with available experimental measurements. A parametric study also shows that binder properties can significantly affect the macroscopic ignition behavior of PBXs. The capability to computationally predict the macroscopic engineering material response relations from material microstructures and basic constituent and interfacial properties lends itself to the design of new materials as well as the analysis of existing materials.
Universal biology and the statistical mechanics of early life.
Goldenfeld, Nigel; Biancalani, Tommaso; Jafarpour, Farshid
2017-12-28
All known life on the Earth exhibits at least two non-trivial common features: the canonical genetic code and biological homochirality, both of which emerged prior to the Last Universal Common Ancestor state. This article describes recent efforts to provide a narrative of this epoch using tools from statistical mechanics. During the emergence of self-replicating life far from equilibrium in a period of chemical evolution, minimal models of autocatalysis show that homochirality would have necessarily co-evolved along with the efficiency of early-life self-replicators. Dynamical system models of the evolution of the genetic code must explain its universality and its highly refined error-minimization properties. These have both been accounted for in a scenario where life arose from a collective, networked phase where there was no notion of species and perhaps even individuality itself. We show how this phase ultimately terminated during an event sometimes known as the Darwinian transition, leading to the present epoch of tree-like vertical descent of organismal lineages. These examples illustrate concrete examples of universal biology: the quest for a fundamental understanding of the basic properties of living systems, independent of precise instantiation in chemistry or other media.This article is part of the themed issue 'Reconceptualizing the origins of life'. © 2017 The Author(s).
Universal biology and the statistical mechanics of early life
NASA Astrophysics Data System (ADS)
Goldenfeld, Nigel; Biancalani, Tommaso; Jafarpour, Farshid
2017-11-01
All known life on the Earth exhibits at least two non-trivial common features: the canonical genetic code and biological homochirality, both of which emerged prior to the Last Universal Common Ancestor state. This article describes recent efforts to provide a narrative of this epoch using tools from statistical mechanics. During the emergence of self-replicating life far from equilibrium in a period of chemical evolution, minimal models of autocatalysis show that homochirality would have necessarily co-evolved along with the efficiency of early-life self-replicators. Dynamical system models of the evolution of the genetic code must explain its universality and its highly refined error-minimization properties. These have both been accounted for in a scenario where life arose from a collective, networked phase where there was no notion of species and perhaps even individuality itself. We show how this phase ultimately terminated during an event sometimes known as the Darwinian transition, leading to the present epoch of tree-like vertical descent of organismal lineages. These examples illustrate concrete examples of universal biology: the quest for a fundamental understanding of the basic properties of living systems, independent of precise instantiation in chemistry or other media. This article is part of the themed issue 'Reconceptualizing the origins of life'.
Lee, Seul Gi; Shin, Yun Hee
2016-04-01
This study was done to verify the effects of self-directed feedback practice using smartphone videos on nursing students' basic nursing skills, confidence in performance, and learning satisfaction. This experimental study used a post-test-only control group design. Twenty-nine students were assigned to the experimental group and 29 to the control group. The experimental treatment consisted of exchanging feedback on deficiencies through smartphone-recorded videos of the nursing practice process, taken by peers during self-directed practice. Basic nursing skills scores were higher for all items in the experimental group compared to the control group, and the differences were statistically significant ["Measuring vital signs" (t=-2.10, p=.039); "Wearing protective equipment when entering and exiting the quarantine room and the management of waste materials" (t=-4.74, p<.001); "Gavage tube feeding" (t=-2.70, p=.009)]. Confidence in performance was higher in the experimental group compared to the control group, but the differences were not statistically significant. However, after the complete practice, there was a statistically significant difference in overall performance confidence (t=-3.07, p=.003). Learning satisfaction was higher in the experimental group compared to the control group, but the difference was not statistically significant (t=-1.67, p=.100). Results of this study indicate that self-directed feedback practice using smartphone videos can improve basic nursing skills. The significance is that it can help nursing students gain confidence in their nursing skills for the future through improvement of basic nursing skills and performance of quality care, thus providing patients with safer care.
Directly observable optical properties of sprites in Central Europe
NASA Astrophysics Data System (ADS)
Bór, József
2013-04-01
Luminous optical emissions accompanying streamer-based natural electric breakdown processes initiating in the mesosphere are called sprites. A total of 489 sprite events were observed with a TV frame rate video system in Central Europe from Sopron (47.68N, 16.58E, 230 m MSL), Hungary between 2007 and 2009. On the basis of these observations, characteristic morphological properties of sprites, i.e. basic forms (e.g. column, carrot, angel, etc.) as well as common morphological features (e.g. tendrils, glows, puffs, beads, etc.), have been identified. Probable time sequences of streamer propagation directions were associated with each of the basic sprite forms. It is speculated that different sequences of streamer propagation directions can result in very similar final sprite shapes. The number and type variety of sprite elements appearing in an event, as well as the total optical duration of an event, were analyzed statistically. Jellyfish and dancing sprite events were considered as special subsets of sprite clusters. It was found that more than 90% of the recorded sprite elements appeared in clusters rather than alone, and more than half of the clusters contained more than one basic sprite form. The analysis showed that jellyfish sprites and clusters of column sprites featuring glows and tendrils do not tend to have optical lifetimes longer than 80 ms. Such very long optical lifetimes have not been observed in sprite clusters containing more than 25 elements of any type, either. In contrast to clusters containing sprite entities of only one form, sprite events showing several forms seem more likely to have extended optical durations. The need for further investigation and for finding theoretical concepts to link these observations to the electric conditions ambient for sprite formation is emphasized.
Single photon sources with single semiconductor quantum dots
NASA Astrophysics Data System (ADS)
Shan, Guang-Cun; Yin, Zhang-Qi; Shek, Chan Hung; Huang, Wei
2014-04-01
In this contribution, we briefly recall the basic concepts of quantum optics and the properties of semiconductor quantum dots (QDs) which are necessary for understanding the physics of single-photon generation with single QDs. Firstly, we address the theory of the quantum emitter-cavity system, the fluorescence and optical properties of semiconductor QDs, and the photon statistics of the QDs. We then review the localization of single semiconductor QDs in quantum-confined optical microcavity systems to control their overall optical properties and performance in terms of strong-coupling regime, efficiency, directionality, and polarization. Furthermore, we discuss recent progress on the fabrication of single photon sources and various approaches for embedding single QDs into microcavities or photonic crystal nanocavities, and show how to extend the wavelength range. We focus in particular on new generations of electrically driven QD single photon sources offering high repetition rates, the strong coupling regime, and high collection efficiencies at elevated operating temperatures. In addition, new developments in room temperature single photon emission in the strong coupling regime are reviewed. The generation of indistinguishable photons and remaining challenges for practical single-photon sources are also discussed.
Provision of Pre-Primary Education as a Basic Right in Tanzania: Reflections from Policy Documents
ERIC Educational Resources Information Center
Mtahabwa, Lyabwene
2010-01-01
This study sought to assess provision of pre-primary education in Tanzania as a basic right through analyses of relevant policy documents. Documents which were published over the past decade were considered, including educational policies, action plans, national papers, the "Basic Education Statistics in Tanzania" documents, strategy…
Sormaz, Mladen; Watson, David M; Smith, William A P; Young, Andrew W; Andrews, Timothy J
2016-04-01
The ability to perceive facial expressions of emotion is essential for effective social communication. We investigated how the perception of facial expression emerges from the image properties that convey this important social signal, and how neural responses in face-selective brain regions might track these properties. To do this, we measured the perceptual similarity between expressions of basic emotions, and investigated how this is reflected in image measures and in the neural response of different face-selective regions. We show that the perceptual similarity of different facial expressions (fear, anger, disgust, sadness, happiness) can be predicted by both surface and feature shape information in the image. Using block design fMRI, we found that the perceptual similarity of expressions could also be predicted from the patterns of neural response in the face-selective posterior superior temporal sulcus (STS), but not in the fusiform face area (FFA). These results show that the perception of facial expression is dependent on the shape and surface properties of the image and on the activity of specific face-selective regions.
Test bench for measurements of NOvA scintillator properties at JINR
NASA Astrophysics Data System (ADS)
Velikanova, D. S.; Antoshkin, A. I.; Anfimov, N. V.; Samoylov, O. B.
2018-04-01
The NOvA experiment was built to study oscillation parameters, the mass hierarchy, the CP-violation phase in the lepton sector and the θ23 octant, via νe appearance and νμ disappearance modes in both neutrino and antineutrino beams. These scientific goals require good knowledge of the basic properties of the NOvA scintillator. A new test bench was constructed and upgraded at JINR. The main goal of this bench is to measure scintillator properties (for solid and liquid scintillators), namely α/β discrimination and Birks coefficients for protons and other hadrons (quenching factors). This knowledge will be crucial for recovering the energy of the hadronic part of neutrino interactions with scintillator nuclei. α/β discrimination was performed on the first version of the bench for LAB-based and NOvA scintillators, and again on the upgraded version with higher statistics and precision. A preliminary result for proton quenching factors was obtained. A technical description of both versions of the bench and current results of the measurements and analysis are presented in this work.
A crash course on data analysis in asteroseismology
NASA Astrophysics Data System (ADS)
Appourchaux, Thierry
2014-02-01
In this course, I try to provide a few basics required for performing data analysis in asteroseismology. First, I address how one can properly treat time series: the sampling, the filtering effect, the use of the Fourier transform, and the associated statistics. Second, I address how one can apply statistics to decision making and parameter estimation in either a frequentist or a Bayesian framework. Last, I review how these basic principles have been applied (or not) in asteroseismology.
Reis, Matthias; Kromer, Justus A; Klipp, Edda
2018-01-20
Multimodality is a phenomenon which complicates the analysis of statistical data based exclusively on the mean and variance. Here, we present criteria for multimodality in hierarchic first-order reaction networks consisting of catalytic and splitting reactions. These networks are characterized by independent and dependent subnetworks. First, we prove the general solvability of the Chemical Master Equation (CME) for this type of reaction network and thereby extend the class of solvable CMEs. Our general solution is analytical in the sense that it allows for a detailed analysis of its statistical properties. Given Poisson/deterministic initial conditions, we then prove the independent species to be Poisson/binomially distributed, while the dependent species exhibit generalized Poisson/Khatri Type B distributions. Generalized Poisson/Khatri Type B distributions are multimodal for an appropriate choice of parameters. We illustrate our criteria for multimodality with several basic models, as well as the well-known two-stage transcription-translation network and Bateman's model from nuclear physics. For both examples, multimodality was not previously reported.
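The sense of multimodality used in the abstract can be made concrete with a small numerical check: a discrete distribution is multimodal when its pmf has more than one strict local maximum, even though its mean and variance reveal nothing of this. A minimal sketch in Python; the mixture-of-Poissons example is my own illustration, not one of the paper's reaction networks:

```python
import math

def pois(k, lam):
    """Poisson pmf with mean lam evaluated at integer k."""
    return math.exp(-lam) * lam ** k / math.factorial(k)

def modes(pmf):
    """Indices of strict local maxima of a pmf given as a list of probabilities."""
    found = []
    for i, p in enumerate(pmf):
        left = pmf[i - 1] if i > 0 else -1.0
        right = pmf[i + 1] if i + 1 < len(pmf) else -1.0
        if p > left and p > right:
            found.append(i)
    return found

# A 50/50 mixture of Poisson(1) and Poisson(12.5) is bimodal: one peak near
# each component mean, with a trough in between.
mix = [0.5 * pois(k, 1.0) + 0.5 * pois(k, 12.5) for k in range(30)]
```

Running `modes(mix)` finds two peaks, at k = 1 and k = 12, which is exactly the kind of structure that a mean-and-variance summary would miss.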
NASA Astrophysics Data System (ADS)
Casas-Castillo, M. Carmen; Rodríguez-Solà, Raúl; Navarro, Xavier; Russo, Beniamino; Lastra, Antonio; González, Paula; Redaño, Angel
2018-01-01
The fractal behavior of extreme rainfall intensities registered between 1940 and 2012 by the Retiro Observatory of Madrid (Spain) has been examined, and a simple scaling regime ranging from 25 min to 3 days of duration has been identified. Thus, an intensity-duration-frequency (IDF) master equation for the location has been constructed in terms of the simple scaling formulation. The scaling behavior of probable maximum precipitation (PMP) for durations between 5 min and 24 h has also been verified. For the statistical estimation of the PMP, an envelope curve of the frequency factor (k_m) based on a total of 10,194 station-years of annual maximum rainfall from 258 stations in Spain has been developed. This curve could be useful for estimating suitable values of PMP at any point of the Iberian Peninsula from basic statistical parameters (mean and standard deviation) of its rainfall series.
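The statistical PMP estimate from mean, standard deviation and frequency factor is a one-line (Hershfield-type) formula; the sketch below assumes that form, and the numeric inputs are hypothetical, not values from the paper:

```python
def pmp_estimate(mean_amax, std_amax, k_m):
    """Statistical (Hershfield-type) PMP estimate:
    PMP = mean + k_m * standard deviation of the annual-maximum series.
    k_m would be read off an envelope curve such as the one developed
    in the paper for Spanish stations."""
    return mean_amax + k_m * std_amax

# Hypothetical annual-maximum statistics (mm) and frequency factor:
pmp = pmp_estimate(55.0, 18.0, 15.0)   # 55 + 15*18 = 325 mm
```

This is what makes the envelope curve practical: once k_m is known for a duration, only the mean and standard deviation of the local rainfall series are needed.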
On Ruch's Principle of Decreasing Mixing Distance in classical statistical physics
NASA Astrophysics Data System (ADS)
Busch, Paul; Quadt, Ralf
1990-10-01
Ruch's Principle of Decreasing Mixing Distance is reviewed as a principle of statistical physics, and its basic support and geometric interpretation, the Ruch-Schranner-Seligman theorem, is generalized to apply to a large representative class of classical statistical systems.
[Flavouring estimation of quality of grape wines with use of methods of mathematical statistics].
Yakuba, Yu F; Khalaphyan, A A; Temerdashev, Z A; Bessonov, V V; Malinkin, A D
2016-01-01
The questions of forming an integral estimation of a wine's flavour during tasting are discussed, and the advantages and disadvantages of the procedures are described. As test materials we used natural white and red wines from Russian manufacturers, made with traditional technologies from Vitis vinifera, direct hybrids, and blends, as well as experimental wines (more than 300 different samples). The aim of the research was to establish the correlation between the content of a wine's nonvolatile matter and its tasting quality rating by methods of mathematical statistics. The contents of organic acids, amino acids and cations in the wines were considered the main factors influencing flavour; they largely define the beverage's quality. These components were determined in the wine samples by the capillary electrophoresis system «CAPEL». Alongside the analytical quality checks of the wine samples, a representative group of specialists simultaneously carried out a tasting estimation of the wines using a 100-point system. The possibility of statistically modelling the correlation of the tasting estimation with analytical data on amino acid and cation contents, which reasonably describe the wine's flavour, was examined. The statistical modelling of the correlation between the tasting estimation and the content of the major cations (ammonium, potassium, sodium, magnesium, calcium) and free amino acids (proline, threonine, arginine), taking into account their level of influence on flavour and the analytical valuation within fixed limits of quality accordance, was done with Statistica. Adequate statistical models have been constructed that can predict the tasting estimation, that is, determine a wine's quality, from the content of the components that form its flavour properties.
It is emphasized that along with aromatic (volatile) substances, the nonvolatile matter (mineral substances and amino acids such as proline, threonine and arginine) influences a wine's flavour properties. It has been shown that the nonvolatile components contribute to the organoleptic and flavour quality estimation of wines as the aromatic volatile substances do, and that they take part in forming the expert's evaluation.
NASA Astrophysics Data System (ADS)
Anishchenko, V. S.; Boev, Ya. I.; Semenova, N. I.; Strelkova, G. I.
2015-07-01
We review rigorous and numerical results on the statistics of Poincaré recurrences which are related to the modern development of the Poincaré recurrence problem. We analyze and describe the rigorous results which have been achieved both in the classical (local) approach and in the recently developed global approach. These results are illustrated by numerical simulation data for simple chaotic and ergodic systems. It is shown that the basic theoretical laws can be applied to noisy systems if the probability measure is ergodic and stationary. Poincaré recurrences are studied numerically in nonautonomous systems. Statistical characteristics of recurrences are analyzed in the framework of the global approach for the cases of positive and zero topological entropy. We show that for positive entropy, there is a relationship between the Afraimovich-Pesin dimension, the Lyapunov exponents and the Kolmogorov-Sinai entropy, both without and in the presence of external noise. The case of zero topological entropy is exemplified by numerical results for the Poincaré recurrence statistics in the circle map. We show and prove that the dependence of minimal recurrence times on the return region size demonstrates universal properties for the golden and the silver ratio. The behavior of Poincaré recurrences is analyzed at the critical point of Feigenbaum attractor birth. We explore Poincaré recurrences for an ergodic set which is generated in the stroboscopic section of a nonautonomous oscillator and is similar to a circle shift. Based on the obtained results we show how the Poincaré recurrence statistics can be applied to solving a number of nonlinear dynamics issues. We propose and illustrate alternative methods for diagnosing effects of external and mutual synchronization of chaotic systems in the context of the local and global approaches. The properties of the recurrence time probability density can be used to detect the stochastic resonance phenomenon.
We also discuss how the fractal dimension of chaotic attractors can be estimated using the Poincaré recurrence statistics.
Transparent Conducting Oxides: Status and Opportunities in Basic Research
DOE Office of Scientific and Technical Information (OSTI.GOV)
Coutts, T. J.; Perkins, J. D.; Ginley, D.S.
1999-08-01
In this paper, we begin by discussing the historical background of transparent conducting oxides and then make some general remarks about their typical properties. This is followed by a short discussion of the desired properties for future applications (particularly photovoltaic devices). These are ambitious objectives but they provide targets for future basic research and development. Although it may be possible to obtain these properties in the laboratory, it is vital to ensure that account is taken of industrial perceptions in the development of the next generation of materials. Hence, we spend some time discussing industrial criteria. Next, we discuss key physical properties that determine the macroscopic physical properties that, in turn, affect the performance of devices. Finally, we select several key topics that ought to be included in future basic research programs.
Characterization of basic physical properties of Sb2Se3 and its relevance for photovoltaics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Chao; Bobela, David C.; Yang, Ye
Antimony selenide (Sb2Se3) is a promising absorber material for thin film photovoltaics because of its attractive material, optical and electrical properties. In recent years, the power conversion efficiency (PCE) of Sb2Se3 thin film solar cells has gradually increased to 5.6%. In this article, we systematically study the basic physical properties of Sb2Se3 such as the dielectric constant, anisotropic mobility, carrier lifetime, diffusion length, defect depth, defect density and optical band tail states. We believe such a comprehensive characterization of the basic physical properties of Sb2Se3 lays a solid foundation for further optimization of solar device performance.
Center for Prostate Disease Research
Basic Aerospace Education Library
ERIC Educational Resources Information Center
Journal of Aerospace Education, 1975
1975-01-01
Lists the most significant resource items on aerospace education which are presently available. Includes source books, bibliographies, directories, encyclopedias, dictionaries, audiovisuals, curriculum/planning guides, aerospace statistics, aerospace education statistics and newsletters. (BR)
Multiple-solution problems in a statistics classroom: an example
NASA Astrophysics Data System (ADS)
Chu, Chi Wing; Chan, Kevin L. T.; Chan, Wai-Sum; Kwong, Koon-Shing
2017-11-01
The mathematics education literature shows that encouraging students to develop multiple solutions for given problems has a positive effect on students' understanding and creativity. In this paper, we present an example of multiple-solution problems in statistics involving a set of non-traditional dice. In particular, we consider the exact probability mass function for the sum of face values. Four different ways of solving the problem are discussed. The solutions span basic concepts in different mathematical disciplines (the sample space in probability theory, the probability generating function in statistics, integer partitions in basic combinatorics and the individual risk model in actuarial science) and thus promote upper undergraduate students' awareness of knowledge connections between their courses. All solutions of the example are implemented in the R statistical software package.
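The first of the four approaches (enumerating the sample space) can be sketched in a few lines; since the faces of the paper's non-traditional dice are not given in the abstract, ordinary six-sided dice stand in here, but the same function accepts any lists of faces:

```python
from itertools import product
from fractions import Fraction

def sum_pmf(dice):
    """Exact pmf of the sum of face values, by enumerating the sample space
    (the Cartesian product of the face lists, all outcomes equally likely)."""
    outcomes = list(product(*dice))
    pmf = {}
    for faces in outcomes:
        s = sum(faces)
        pmf[s] = pmf.get(s, 0) + Fraction(1, len(outcomes))
    return dict(sorted(pmf.items()))

# Two standard dice as a stand-in for the paper's non-traditional set:
pmf = sum_pmf([range(1, 7), range(1, 7)])
```

The generating-function approach mentioned in the abstract gives the same answer: the pmf coefficients are those of the product of the dice's probability generating functions.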
Tuning the acid/base properties of nanocarbons by functionalization via amination.
Arrigo, Rosa; Hävecker, Michael; Wrabetz, Sabine; Blume, Raoul; Lerch, Martin; McGregor, James; Parrott, Edward P J; Zeitler, J Axel; Gladden, Lynn F; Knop-Gericke, Axel; Schlögl, Robert; Su, Dang Sheng
2010-07-21
The surface chemical properties and the electronic properties of vapor grown carbon nanofibers (VGCNFs) have been modified by treatment of the oxidized CNFs with NH(3). The effect of treatment temperature on the types of nitrogen functionalities introduced was evaluated by synchrotron based X-ray photoelectron spectroscopy (XPS), while the impact of the preparation methods on the surface acid-base properties was investigated by potentiometric titration, microcalorimetry, and zeta potential measurements. The impact of the N-functionalization on the electronic properties was measured by THz time-domain spectroscopy (THz-TDS). The samples functionalized via amination are characterized by the coexistence of acidic and basic O and N sites. The population of O and N species is temperature dependent. In particular, at 873 K nitrogen is stabilized in substitutional positions within the graphitic structure, as heterocyclic-like moieties. The surface presents heterogeneously distributed and energetically different basic sites. A small amount of strong basic sites gives rise to a differential heat of CO(2) adsorption of 150 kJ mol(-1). However, when functionalization is carried out at 473 K, nitrogen moieties with basic character are introduced and the maximum heat of adsorption is significantly lower, at approximately 90 kJ mol(-1). In the latter sample, energetically different basic sites coexist with acidic oxygen groups introduced during the oxidative step. Under these conditions, a bifunctional acidic and basic surface is obtained with high hydrophilic character. N-functionalization carried out at higher temperature changes the electronic properties of the CNFs as evaluated by THz-TDS. The functionalization procedure presented in this work allows for high versatility and flexibility in tailoring the surface chemistry of nanocarbon materials to specific needs.
This work shows the potential of the N-containing nanocarbon materials obtained via amination in catalysis as well as in electronic device materials.
The Statistical Power of Planned Comparisons.
ERIC Educational Resources Information Center
Benton, Roberta L.
Basic principles underlying statistical power are examined, and issues pertaining to effect size, sample size, error variance, and significance level are highlighted via specific hypothetical examples. Analysis of variance (ANOVA) and related methods remain popular, although other procedures sometimes have more statistical power against…
Design and basic properties of ternary gypsum-based mortars
NASA Astrophysics Data System (ADS)
Doleželová, M.; Vimmrová, A.
2017-10-01
Ternary mortars prepared from gypsum, hydrated lime and three types of pozzolan were designed and tested. Crushed ceramic, silica fume and granulated blast-furnace slag were used as pozzolan admixtures. The amount of pozzolan in the mixtures was determined according to the molar weight of amorphous SiO2 in the material. The samples were stored under water. The basic physical and mechanical properties were measured and compared with those of the material without pozzolan. The best results in the water environment were achieved by the samples with silica fume.
NASA Astrophysics Data System (ADS)
Cocherie, A.; Rossi, Ph.; Le Bel, L.
1984-10-01
Petrographic and structural observations on the calc-alkalic plutonism of western Corsica revealed the existence of several successively emplaced units associated with large basic bodies. The present mineralogical and geochemical study deals with the genesis, evolution and relationships of these different units. Basic plutonism is represented by three genetically linked types of rock: norites and troctolites with cumulate textures, characterized by low REE contents and either no Eu anomaly or a positive Eu anomaly; gabbros with LREE-enriched (relative to HREE) patterns, probably close to an initial basaltic liquid; and diorites ranging up to charnockites, which represent liquids evolved to varying degrees, mainly by fractional crystallization. Trace element data and studies on the evolution of pyroxene pairs demonstrate the consanguinity of these calc-alkaline basic rocks, which are derived from a high-alumina basaltic melt. The various granitoids (granodiorites, monzogranites and leucocratic monzogranites, i.e., adamellites) have distinct evolution trends as shown by the composition of their mafic minerals and by trace element distributions. They cannot be considered derivatives of the basic suite and they cannot be related by a common fractionation sequence. Rather, they represent distinctive batches of crustal anatexis. In addition, hybridization phenomena with the basic melt are noticed in granodiorites. The particular problem of the low La/Yb, Eu/Eu∗ and high U, Th, Cs leucocratic monzogranites is discussed in detail. In addition to more conventional trace element diagrams, the simultaneous statistical treatment of all the geochemical data by correspondence factor analysis is shown to be a very useful tool for distinguishing between the different units and for classifying the elements according to their geochemical properties.
THE DISCOUNTED REPRODUCTIVE NUMBER FOR EPIDEMIOLOGY
Reluga, Timothy C.; Medlock, Jan; Galvani, Alison
2013-01-01
The basic reproductive number and the effective reproductive number are commonly used in mathematical epidemiology as summary statistics for the size and controllability of epidemics. However, these commonly used reproductive numbers can be misleading when applied to predict pathogen evolution because they do not incorporate the impact of the timing of events in the life-history cycle of the pathogen. To study evolution problems where the host population size is changing, measures like the ultimate proliferation rate must be used. A third measure of reproductive success, which combines properties of both the basic reproductive number and the ultimate proliferation rate, is the discounted reproductive number. The discounted reproductive number is a measure of reproductive success given by an individual's expected lifetime offspring production discounted by the background population growth rate. Here, we draw attention to the discounted reproductive number by providing an explicit definition and a systematic application framework. We describe how the discounted reproductive number overcomes the limitations of both the standard reproductive numbers and proliferation rates, and show that it is closely connected to Fisher's reproductive values for different life-history stages. PMID:19364158
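For intuition only, here is my own minimal constant-rates sketch of the discounting idea, not the paper's general framework: with a constant transmission rate beta, removal rate gamma, and background population growth rate r, discounting lifetime offspring production at rate r turns the familiar beta/gamma into beta/(gamma + r):

```python
def basic_R0(beta, gamma):
    """Expected lifetime secondary infections with constant rates:
    integral of beta * exp(-gamma * t) dt over t >= 0 = beta / gamma."""
    return beta / gamma

def discounted_Rd(beta, gamma, r):
    """Lifetime offspring production discounted by background growth rate r:
    integral of beta * exp(-(gamma + r) * t) dt = beta / (gamma + r)."""
    return beta / (gamma + r)

# With no background growth (r = 0) the two measures coincide:
assert discounted_Rd(0.5, 0.25, 0.0) == basic_R0(0.5, 0.25)
```

The sketch shows the qualitative point of the abstract: events late in the life-history cycle count for less when the host population is growing (r > 0), so the discounted measure is smaller than the undiscounted one.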
Mitomycin C and endoscopic sinus surgery: where are we?
Tabaee, Abtin; Brown, Seth M; Anand, Vijay K
2007-02-01
Mitomycin C has been used successfully in various ophthalmologic and, more recently, otolaryngologic procedures. Its modulation of fibroblast activity allows for decreased scarring and fibrosis. Several recent trials have examined the efficacy of mitomycin C in reducing synechia and stenosis following endoscopic sinus surgery. Basic science studies using fibroblast cell lines have demonstrated a dose-dependent suppression of activity with the use of mitomycin C. This is further supported by animal studies that have shown lower rates of maxillary ostial restenosis following application of mitomycin C. No human trial, however, has demonstrated a statistically significant impact of mitomycin C on the incidence of postoperative synechia or stenosis following sinus surgery. The limitations of the literature are discussed. The antiproliferative properties of mitomycin C may theoretically decrease the incidence of synechia and stenosis following endoscopic sinus surgery. Although this is supported by basic science studies and its successful use in other fields, the clinical evidence to date has not shown the application of mitomycin C to be effective in preventing stenosis after endoscopic sinus surgery. Future prospective studies are required before definitive conclusions can be made.
29 CFR 1904.42 - Requests from the Bureau of Labor Statistics for data.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 29 Labor 5 2010-07-01 2010-07-01 false Requests from the Bureau of Labor Statistics for data. 1904... Statistics for data. (a) Basic requirement. If you receive a Survey of Occupational Injuries and Illnesses Form from the Bureau of Labor Statistics (BLS), or a BLS designee, you must promptly complete the form...
78 FR 34101 - Agency Information Collection Activities: Proposed Collection; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-06
... and basic descriptive statistics on the quantity and type of consumer-reported patient safety events... conduct correlations, cross tabulations of responses and other statistical analysis. Estimated Annual...
ERIC Educational Resources Information Center
Williams, Immanuel James; Williams, Kelley Kim
2016-01-01
Understanding summary statistics and graphical techniques are building blocks to comprehending concepts beyond basic statistics. It's known that motivated students perform better in school. Using examples that students find engaging allows them to understand the concepts at a deeper level.
Agundu, Prince Umor C
2003-01-01
Public health dispensaries in Nigeria have in recent times demonstrated the poise to boost corporate productivity in the new millennium and to drive the nation closer to concretising the lofty goal of health-for-all. This is very pronounced considering the face-lift given to the physical environment, the increase in the recruitment and development of professionals, and the upward review of financial subventions. However, there is little or no emphasis on basic statistical appreciation/application, which enhances the decision-making ability of corporate executives. This study used the responses of 120 senior public health officials in Nigeria and analyzed them with the chi-square statistical technique. The results established low statistical aptitude, inadequate statistical training programmes, and little or no emphasis on statistical literacy compared to computer literacy, among others. Consequently, it was recommended that these lapses be promptly addressed to enhance executive performance in the establishments. Basic statistical data presentation typologies have been articulated in this study to serve as first-aid instructions to the target group, as they represent the contributions of eminent scholars in this area of intellectualism.
ERIC Educational Resources Information Center
Shihua, Peng; Rihui, Tan
2009-01-01
Employing statistical analysis, this study has made a preliminary exploration of promoting the equitable development of basic education in underdeveloped counties through the case study of Cili county. The unequally developed basic education in the county has been made clear, the reasons for the inequitable education have been analyzed, and,…
41 CFR 102-76.10 - What basic design and construction policy governs Federal agencies?
Code of Federal Regulations, 2010 CFR
2010-07-01
... 41 Public Contracts and Property Management 3 2010-07-01 2010-07-01 false What basic design and... Management Federal Property Management Regulations System (Continued) FEDERAL MANAGEMENT REGULATION REAL... must be timely, efficient, and cost effective. (b) Use a distinguished architectural style and form in...
Barrett, Harrison H; Myers, Kyle J; Caucci, Luca
2014-08-17
A fundamental way of describing a photon-limited imaging system is in terms of a Poisson random process in spatial, angular and wavelength variables. The mean of this random process is the spectral radiance. The principle of conservation of radiance then allows a full characterization of the noise in the image (conditional on viewing a specified object). To elucidate these connections, we first review the definitions and basic properties of radiance as defined in terms of geometrical optics, radiology, physical optics and quantum optics. The propagation and conservation laws for radiance in each of these domains are reviewed. Then we distinguish four categories of imaging detectors that all respond in some way to the incident radiance, including the new category of photon-processing detectors. The relation between the radiance and the statistical properties of the detector output is discussed and related to task-based measures of image quality and the information content of a single detected photon.
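The Poisson description above implies the familiar square-root noise law for photon counting. A minimal sketch follows; the units and the factorisation of the mean count are illustrative assumptions of mine, not the paper's formalism:

```python
import math

def mean_counts(radiance, etendue, bandwidth, exposure):
    """Mean detected photon count as spectral radiance times the geometric
    (etendue), spectral (bandwidth) and temporal (exposure) extent of the
    measurement; detector quantum efficiency is folded into the radiance
    here for simplicity."""
    return radiance * etendue * bandwidth * exposure

def poisson_snr(mean):
    """For a Poisson process the variance equals the mean,
    so the signal-to-noise ratio is mean / sqrt(mean) = sqrt(mean)."""
    return math.sqrt(mean)
```

Because conservation of radiance fixes the mean of the process through the optical system, the detector-plane noise (conditional on the object) follows directly, which is the connection the abstract draws to task-based image quality.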
Chinese Mainland Movie Network
NASA Astrophysics Data System (ADS)
Liu, Ai-Fen; Xue, Yu-Hua; He, Da-Ren
2008-03-01
We propose describing a large class of cooperation-competition networks by bipartite graphs and their unipartite projections. In the graphs, the topological structure describes the cooperation-competition configuration of the basic elements, and the vertex weights describe their different roles in cooperation or the results of competition. This complex network description may be helpful for finding and understanding common properties of cooperation-competition systems. As an example, we performed an empirical investigation of the movie cooperation-competition network of the past 80 years in the Chinese mainland. In the network, movies are defined as nodes, and two nodes are connected by a link if a common main actor performs in both. An edge represents the competition between two movies for a larger audience within a special audience colony. We obtained statistical properties such as the degree distribution, act degree distribution, act size distribution, and the distribution of the total node weight, and explored the factors influencing the intensity of competition among Chinese mainland movies.
A Bridge for Accelerating Materials by Design
Sumpter, Bobby G.; Vasudevan, Rama K.; Potok, Thomas E.; ...
2015-11-25
Recent technical advances in the area of nanoscale imaging, spectroscopy, and scattering/diffraction have led to unprecedented capabilities for investigating materials structural, dynamical and functional characteristics. In addition, recent advances in computational algorithms and computer capacities that are orders of magnitude larger/faster have enabled large-scale simulations of materials properties starting with nothing but the identity of the atomic species and the basic principles of quantum- and statistical-mechanics and thermodynamics. Along with these advances, an explosion of high-resolution data has emerged. This confluence of capabilities and rise of big data offer grand opportunities for advancing materials sciences but also introduce several challenges. In this editorial we identify challenges impeding progress towards advancing materials by design (e.g., the design/discovery of materials with improved properties/performance), possible solutions, and provide examples of scientific issues that can be addressed by using a tightly integrated approach where theory and experiments are linked through big-deep data.
Replication of Cancellation Orders Using First-Passage Time Theory in Foreign Currency Market
NASA Astrophysics Data System (ADS)
Boilard, Jean-François; Kanazawa, Kiyoshi; Takayasu, Hideki; Takayasu, Misako
Our research focuses on the annihilation dynamics of limit orders in a spot foreign currency market for various currency pairs. We analyze the cancellation order distribution conditioned on the normalized distance from the mid-price, where the normalized distance is defined as the final distance divided by the initial distance. To reproduce the real data, we introduce two simple models that assume the market price moves randomly and that cancellation occurs either after a fixed time t or following a Poisson process. Our models qualitatively reproduce the basic statistical properties of the cancellation orders in the data when limit orders are cancelled according to the Poisson process. We briefly discuss the implications of our findings for the construction of more detailed microscopic models.
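The second of the two models (Poisson cancellation over a randomly moving price) can be sketched as a short Monte Carlo; the parameter names and the one-step-per-unit-time random walk are my own assumptions, not the paper's calibration:

```python
import random

def normalized_distance(d0=10.0, cancel_rate=0.01, step=1.0, rng=random):
    """One sample of (final distance)/(initial distance) for a limit order
    placed d0 away from the mid-price: the order is cancelled after an
    exponentially distributed lifetime (Poisson cancellation), while the
    mid-price makes one symmetric +/-step random-walk move per unit time."""
    lifetime = rng.expovariate(cancel_rate)
    price = 0.0
    t = 0.0
    while t < lifetime:
        price += rng.choice((-step, step))
        t += 1.0
    return abs(d0 - price) / d0

random.seed(1)
samples = [normalized_distance() for _ in range(1000)]
```

A histogram of `samples` is the model analogue of the conditional cancellation distribution the abstract compares against market data.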
Educating the Educator: U.S. Government Statistical Sources for Geographic Research and Teaching.
ERIC Educational Resources Information Center
Fryman, James F.; Wilkinson, Patrick J.
Appropriate for college geography students and researchers, this paper briefly introduces basic federal statistical publications and corresponding finding aids. General references include "Statistical Abstract of the United States," and three complementary publications: "County and City Data Book,""State and Metropolitan Area Data Book," and…
Statistical Cost Estimation in Higher Education: Some Alternatives.
ERIC Educational Resources Information Center
Brinkman, Paul T.; Niwa, Shelley
Recent developments in econometrics that are relevant to the task of estimating costs in higher education are reviewed. The relative effectiveness of alternative statistical procedures for estimating costs are also tested. Statistical cost estimation involves three basic parts: a model, a data set, and an estimation procedure. Actual data are used…
ERIC Educational Resources Information Center
Norris, John M.
2015-01-01
Traditions of statistical significance testing in second language (L2) quantitative research are strongly entrenched in how researchers design studies, select analyses, and interpret results. However, statistical significance tests using "p" values are commonly misinterpreted by researchers, reviewers, readers, and others, leading to…
Ethical Statistics and Statistical Ethics: Making an Interdisciplinary Module
ERIC Educational Resources Information Center
Lesser, Lawrence M.; Nordenhaug, Erik
2004-01-01
This article describes an innovative curriculum module the first author created on the two-way exchange between statistics and applied ethics. The module, having no particular mathematical prerequisites beyond high school algebra, is part of an undergraduate interdisciplinary ethics course which begins with a 3-week introduction to basic applied…
The space of ultrametric phylogenetic trees.
Gavryushkin, Alex; Drummond, Alexei J
2016-08-21
The reliability of a phylogenetic inference method from genomic sequence data is ensured by its statistical consistency. Bayesian inference methods produce a sample of phylogenetic trees from the posterior distribution given sequence data. Hence the question of statistical consistency of such methods is equivalent to the consistency of the summary of the sample. More generally, statistical consistency is ensured by the tree space used to analyse the sample. In this paper, we consider two standard parameterisations of phylogenetic time-trees used in evolutionary models: inter-coalescent interval lengths and absolute times of divergence events. For each of these parameterisations we introduce a natural metric space on ultrametric phylogenetic trees. We compare the introduced spaces with existing models of tree space, formulate several formal requirements that a metric space on phylogenetic trees must possess in order to be a satisfactory space for statistical analysis, and justify them. We show that only a few known constructions of the space of phylogenetic trees satisfy these requirements. However, our results suggest that these basic requirements are not enough to distinguish between the two metric spaces we introduce, and that the choice between them requires additional properties to be considered. In particular, the summary tree minimising the squared distance to the trees in the sample may differ between the two parameterisations. This suggests that further fundamental insight is needed into the problem of statistical consistency of phylogenetic inference methods. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
Orton, Larry
2009-01-01
This document outlines the definitions and the typology now used by Statistics Canada's Centre for Education Statistics to identify, classify and delineate the universities, colleges and other providers of postsecondary and adult education in Canada for which basic enrollments, graduates, professors and finance statistics are produced. These new…
ERIC Educational Resources Information Center
North, Delia; Gal, Iddo; Zewotir, Temesgen
2014-01-01
This paper aims to contribute to the emerging literature on capacity-building in statistics education by examining issues pertaining to the readiness of teachers in a developing country to teach basic statistical topics. The paper reflects on challenges and barriers to building statistics capacity at grass-roots level in a developing country,…
County-by-County Financial and Staffing I-M-P-A-C-T. FY 1994-95 Basic Education Program.
ERIC Educational Resources Information Center
North Carolina State Dept. of Public Instruction, Raleigh.
This publication provides the basic statistics needed to illustrate the impact of North Carolina's Basic Education Program (BEP), an educational reform effort begun in 1985. Over 85% of the positions in the BEP are directly related to teaching and student-related activities. The new BEP programs result in smaller class sizes in kindergartens and…
Double-row vs single-row rotator cuff repair: a review of the biomechanical evidence.
Wall, Lindley B; Keener, Jay D; Brophy, Robert H
2009-01-01
A review of the current literature will show a difference between the biomechanical properties of double-row and single-row rotator cuff repairs. Rotator cuff tears commonly necessitate surgical repair; however, the optimal technique for repair continues to be investigated. Recently, double-row repairs have been considered an alternative to single-row repair, allowing a greater coverage area for healing and a possibly stronger repair. We reviewed the literature of all biomechanical studies comparing double-row vs single-row repair techniques. Inclusion criteria were studies using cadaveric, animal, or human models that directly compared double-row vs single-row repair techniques, written in the English language, and published in peer-reviewed journals. Identified articles were reviewed to provide a comprehensive conclusion of the biomechanical strength and integrity of the repair techniques. Fifteen studies were identified and reviewed. Nine studies showed a statistically significant advantage to a double-row repair with regard to biomechanical strength, failure, and gap formation. Three studies produced results that did not show any statistical advantage. Five studies that directly compared footprint reconstruction all demonstrated that the double-row repair was superior to a single-row repair in restoring anatomy. The current literature reveals that the biomechanical properties of a double-row rotator cuff repair are superior to a single-row repair. Basic Science Study, SRH = Single vs. Double Row RCR.
Process property studies of melt blown thermoplastic polyurethane polymers
NASA Astrophysics Data System (ADS)
Lee, Youn Eung
The primary goal of this research was to determine optimum processing conditions to produce commercially acceptable melt blown (MB) thermoplastic polyurethane (TPU) webs. The 6-inch MB line and the 20-inch wide Accurate Products MB pilot line at the Textiles and Nonwovens Development Center (TANDEC), The University of Tennessee, Knoxville, were utilized for this study. The MB TPU trials were performed in four phases: Phase 1 focused on the envelope of MB operating conditions for different TPU polymers; Phase 2 focused on the production of commercially acceptable MB TPU webs; Phase 3 focused on the optimization of the processing conditions of MB TPU webs and the determination of significant relationships between processing parameters and web properties using statistical analyses. Based on the first three phases, a more extensive study of fiber and web formation in the MB TPU process was made, and a multiple linear regression model for MB TPU process conditions versus properties was developed in Phase 4. In conclusion, the basic MB process was fundamentally valid for MB TPU; however, the process was more complicated for TPU than for PP, because the web structures and properties of MB TPUs are very sensitive to MB process conditions. Furthermore, different TPU grades responded very differently to MB processing and exhibited different web structures and properties. In Phases 3 and 4, fiber diameters of less than 5 μm were produced from TPU237, TPU245 and TPU280 pellets, and the mechanical strengths of the MB TPU webs, including tensile strength, tear strength, abrasion resistance and tensile elongation, were notably good. In addition, the statistical model revealed useful interaction trends for processing parameters versus properties of MB TPU webs. Die and air temperature showed multicollinearity problems, and fiber diameter was notably affected by air flow rate, throughput and die/air temperature.
It was also shown that most of the MB TPU web properties including mechanical strength, air permeability and fiber diameters were affected by air velocity and die temperature.
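Multicollinearity of the kind reported for die and air temperature can be checked with variance inflation factors. A minimal sketch with synthetic, hypothetical process data (the variable names and values are illustrative, not the study's measurements):

```python
import numpy as np

def vif(X):
    """Variance inflation factor for each column of a predictor matrix X.

    VIF_j = 1 / (1 - R_j^2), where R_j^2 comes from regressing
    column j on the remaining columns (with an intercept).
    """
    n, p = X.shape
    out = []
    for j in range(p):
        y = X[:, j]
        others = np.column_stack([np.ones(n), np.delete(X, j, axis=1)])
        beta, *_ = np.linalg.lstsq(others, y, rcond=None)
        resid = y - others @ beta
        r2 = 1.0 - resid.var() / y.var()
        out.append(1.0 / (1.0 - r2))
    return np.array(out)

rng = np.random.default_rng(0)
die_temp = rng.normal(230.0, 5.0, 50)            # hypothetical die temperature, deg C
air_temp = die_temp + rng.normal(0.0, 1.0, 50)   # tracks die temperature -> collinear
airflow = rng.normal(250.0, 30.0, 50)            # independent predictor
X = np.column_stack([die_temp, air_temp, airflow])
print(vif(X))  # first two VIFs are large, the third is near 1
```

A VIF well above ~5-10 for a pair of predictors is the usual warning sign that their regression coefficients cannot be separated reliably.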
NASA Astrophysics Data System (ADS)
Hartmann, Alexander K.; Weigt, Martin
2005-10-01
A concise, comprehensive introduction to the topic of statistical physics of combinatorial optimization, bringing together theoretical concepts and algorithms from computer science with analytical methods from physics. The result bridges the gap between statistical physics and combinatorial optimization, investigating problems taken from theoretical computing, such as the vertex-cover problem, with the concepts and methods of theoretical physics. The authors cover rapid developments and analytical methods that are both extremely complex and spread by word-of-mouth, providing all the necessary basics in required detail. Throughout, the algorithms are shown with examples and calculations, while the proofs are given in a way suitable for graduate students, post-docs, and researchers. Ideal for newcomers to this young, multidisciplinary field.
ERIC Educational Resources Information Center
Center for Education Statistics (ED/OERI), Washington, DC.
The Financial Statistics machine-readable data file (MRDF) is a subfile of the larger Higher Education General Information Survey (HEGIS). It contains basic financial statistics for over 3,000 institutions of higher education in the United States and its territories. The data are arranged sequentially by institution, with institutional…
The Greyhound Strike: Using a Labor Dispute to Teach Descriptive Statistics.
ERIC Educational Resources Information Center
Shatz, Mark A.
1985-01-01
A simulation exercise of a labor-management dispute is used to teach psychology students some of the basics of descriptive statistics. Using comparable data sets generated by the instructor, students work in small groups to develop a statistical presentation that supports their particular position in the dispute. (Author/RM)
Tzonev, Svilen
2018-01-01
Current commercially available digital PCR (dPCR) systems and assays are capable of detecting individual target molecules with considerable reliability. As tests are developed and validated for use on clinical samples, the need to understand and develop robust statistical analysis routines increases. This chapter covers the fundamental processes and limitations of detecting and reporting on single-molecule detection. We cover the basics of target quantification and sources of imprecision. We describe the basic test concepts in the context of dPCR: sensitivity, specificity, limit of blank, limit of detection, and limit of quantification. We provide basic guidelines on how to determine these, how to choose and interpret the operating point, and what factors may influence overall test performance in practice.
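The quantification step in dPCR rests on Poisson statistics of random partitioning: the fraction of negative partitions estimates exp(-λ). A minimal sketch of that calculation (the droplet count and volume below are illustrative and not tied to any particular dPCR platform):

```python
import math

def dpcr_lambda(positive, total):
    """Mean target copies per partition from the fraction of positive partitions.

    Under Poisson partitioning, P(partition is negative) = exp(-lambda),
    so lambda = -ln(1 - p) with p = positive / total.
    """
    p = positive / total
    if p >= 1.0:
        raise ValueError("all partitions positive: above the quantifiable range")
    return -math.log(1.0 - p)

def concentration(positive, total, partition_volume_ul):
    """Estimated copies per microliter of reaction volume."""
    return dpcr_lambda(positive, total) / partition_volume_ul

# e.g. 5000 of 20000 droplets positive, with an assumed 0.00085 uL per droplet
lam = dpcr_lambda(5000, 20000)               # ~0.288 copies per partition
conc = concentration(5000, 20000, 0.00085)   # copies per uL
```

Note that simply counting positives would undercount here (5000 positives vs. roughly 5750 expected copies), because some partitions hold more than one molecule; the Poisson correction accounts for that.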
NASA Astrophysics Data System (ADS)
Zhang, Y. M.; Evans, J. R. G.; Yang, S. F.
2010-11-01
The authors have discovered a systematic, intelligent and potentially automatic method to detect errors in handbooks and stop their transmission using unrecognised relationships between materials properties. The scientific community relies on the veracity of scientific data in handbooks and databases, some of which have a long pedigree covering several decades. Although various outlier-detection procedures are employed to detect and, where appropriate, remove contaminated data, errors, which had not been discovered by established methods, were easily detected by our artificial neural network in tables of properties of the elements. We started using neural networks to discover unrecognised relationships between materials properties and quickly found that they were very good at finding inconsistencies in groups of data. They reveal variations from 10 to 900% in tables of property data for the elements and point out those that are most probably correct. Compared with the statistical method adopted by Ashby and co-workers [Proc. R. Soc. Lond. Ser. A 454 (1998) p. 1301, 1323], this method locates more inconsistencies and could be embedded in database software for automatic self-checking. We anticipate that our suggestion will be a starting point to deal with this basic problem that affects researchers in every field. The authors believe it may eventually moderate the current expectation that data field error rates will persist at between 1 and 5%.
Simple route for nano-hydroxyapatite properties expansion.
Rojas, L; Olmedo, H; García-Piñeres, A J; Silveira, C; Tasic, L; Fraga, F; Montero, M L
2015-10-20
Simple surface modification of nano-hydroxyapatite, through acid-base reactions, allows the properties of this material to be expanded. Introduction of organic groups such as hydrophobic alkyl chains, carboxylic acid, and amide or amine basic groups on the hydroxyapatite surface systematically changes the polarity, surface area, and reactivity of hydroxyapatite without modifying its phase. Physical and chemical properties of the new derivative particles were analyzed. The biocompatibility of the modified nano-hydroxyapatite on Raw 264.7 cells was also assessed.
Hydraulic and thermal soil Parameter combined with TEM data at quaternary coastal regions
NASA Astrophysics Data System (ADS)
Grabowski, Ima; Kirsch, Reinhard; Scheer, Wolfgang
2014-05-01
In order to generate a more efficient method for planning and dimensioning small- and medium-sized geothermal power plants in Quaternary subsurface settings, a basic approach has been attempted. Within the EU project CLIWAT, the coastal regions of Denmark, Germany, the Netherlands and Belgium were investigated and airborne electromagnetic data were collected. In this work the regional focus was put on the isle of Föhr. To describe the subsurface with relevant parameters, one needs information from drillings and geophysical well-logging data. The approach of minimizing costs and using existing data from state agencies led the investigation to the combination of specific electrical resistivity data with hydraulic and thermal conductivity. We worked out a basic soil/hydraulic-conductivity statistic for the isle of Föhr by gathering all well-logging data from the island and assigning the existing soil materials to associated kf values. We combined specific electrical resistivity with hydraulic soil properties to generate thermal conductivity values by extracting porosity. So far we have generated a set of rough data for kf values and thermal conductivity. The airborne TEM data sets are reliable up to 150 m below the surface, depending on the conductivity of the layers, so we can assume the same for the derived parameters. Since this is a very rough statistic of kf values, further investigation is needed; in any case, a close connection to each area of investigation, either through existing logging data or laboratory soil-property values, will remain necessary. Literature: Ahmed S, de Marsily G, Talbot A (1988): Combined Use of Hydraulic and Electrical Properties of an Aquifer in a Geostatistical Estimation of Transmissivity. - Groundwater, vol. 26 (1). Burschil T, Scheer W, Wiederhold H, Kirsch R (2012): Groundwater situation on a glacially affected barrier island.
Submitted to Hydrology and Earth System Sciences - an Interactive Open Access Journal of the European Geosciences Union Burval Working Group (2006) Groundwater Resources in buried valleys- a challenge for Geosciences. - Leibniz-Institut für Angewandte Geophysik, Hannover Scheer W, König B, Steinmann F (2012): Die Grundwasserverhältnisse von Föhr. - In: Der Untergrund von Föhr: Geologie, Grundwasser und Erdwärme - Ergebnisse des INTERREG-Projektes CLIWAT. - Landesamt für Landwirtschaft, Umwelt und ländliche Räume Schleswig-Holstein, Flintbek
Nurses' foot care activities in home health care.
Stolt, Minna; Suhonen, Riitta; Puukka, Pauli; Viitanen, Matti; Voutilainen, Päivi; Leino-Kilpi, Helena
2013-01-01
This study described the basic foot care activities performed by nurses and the factors associated with these in the home care of older people. Data were collected from nurses (n=322) working in nine public home care agencies in Finland using the Nurses' Foot Care Activities Questionnaire (NFAQ). Data were analyzed statistically using descriptive statistics and multivariate linear models. Although some of the basic foot care activities nurses reported using were outdated, the majority of foot care activities were consistent with recommendations in the foot care literature. Longer working experience, referring patients with foot problems to a podiatrist and physiotherapist, and patient education in wart and nail care were associated with a high score for adequate foot care activities. Continuing education should focus on updating basic foot care activities and increasing the use of evidence-based foot care methods. Also, geriatric nursing research should focus on intervention research to improve the use of evidence-based basic foot care activities. Copyright © 2013 Mosby, Inc. All rights reserved.
Alternative Fuels Characterization | Transportation Research | NREL
Research at NREL focuses on the basic properties of these fuels, on determining what levels of oxygen can be tolerated in conventional cars, and on understanding the performance of flex-fuel vehicles that can operate on a range of ethanol levels.
41 CFR 102-83.10 - What basic location of space policy governs an Executive agency?
Code of Federal Regulations, 2014 CFR
2014-01-01
... space policy governs an Executive agency? 102-83.10 Section 102-83.10 Public Contracts and Property... PROPERTY 83-LOCATION OF SPACE General Provisions § 102-83.10 What basic location of space policy governs an... delineated area within which it wishes to locate specific activities, consistent with its mission and program...
41 CFR 102-83.10 - What basic location of space policy governs an Executive agency?
Code of Federal Regulations, 2012 CFR
2012-01-01
... space policy governs an Executive agency? 102-83.10 Section 102-83.10 Public Contracts and Property... PROPERTY 83-LOCATION OF SPACE General Provisions § 102-83.10 What basic location of space policy governs an... delineated area within which it wishes to locate specific activities, consistent with its mission and program...
41 CFR 102-83.10 - What basic location of space policy governs an Executive agency?
Code of Federal Regulations, 2013 CFR
2013-07-01
... space policy governs an Executive agency? 102-83.10 Section 102-83.10 Public Contracts and Property... PROPERTY 83-LOCATION OF SPACE General Provisions § 102-83.10 What basic location of space policy governs an... delineated area within which it wishes to locate specific activities, consistent with its mission and program...
41 CFR 102-83.10 - What basic location of space policy governs an Executive agency?
Code of Federal Regulations, 2011 CFR
2011-01-01
... space policy governs an Executive agency? 102-83.10 Section 102-83.10 Public Contracts and Property... PROPERTY 83-LOCATION OF SPACE General Provisions § 102-83.10 What basic location of space policy governs an... delineated area within which it wishes to locate specific activities, consistent with its mission and program...
41 CFR 102-83.10 - What basic location of space policy governs an Executive agency?
Code of Federal Regulations, 2010 CFR
2010-07-01
... space policy governs an Executive agency? 102-83.10 Section 102-83.10 Public Contracts and Property... PROPERTY 83-LOCATION OF SPACE General Provisions § 102-83.10 What basic location of space policy governs an... delineated area within which it wishes to locate specific activities, consistent with its mission and program...
John F. Hunt
1998-01-01
The following results are preliminary, but they show some basic information that will be used in an attempt to model pulp molded structures, so that by measuring several fundamental properties of a fiber furnish and specifying process conditions, a molded structure could be designed for a particular performance need.
ERIC Educational Resources Information Center
Opfer, John E.; Siegler, Robert S.
2004-01-01
Many preschoolers know that plants and animals share basic biological properties, but this knowledge does not usually lead them to conclude that plants, like animals, are living things. To resolve this seeming paradox, we hypothesized that preschoolers largely base their judgments of life status on a biological property, capacity for teleological…
Balas, Benjamin
2016-11-01
Peripheral visual perception is characterized by reduced information about appearance due to constraints on how image structure is represented. Visual crowding is a consequence of excessive integration in the visual periphery. The basic phenomenology of visual crowding and other tasks has been successfully accounted for by a summary-statistic model of pooling, suggesting that texture-like processing underlies how information is reduced in peripheral vision. I attempt to extend the scope of this model by examining a property of peripheral vision: reduced perceived numerosity in the periphery. I demonstrate that a summary-statistic model of peripheral appearance accounts for reduced numerosity in peripherally viewed arrays of randomly placed dots, but does not account for observed effects of dot clustering within such arrays. The model thus offers a limited account of how numerosity is perceived in the visual periphery. I also demonstrate that the model predicts that numerosity estimation is sensitive to element shape, which represents a novel prediction regarding the phenomenology of peripheral numerosity perception. Finally, I discuss ways to extend the model to a broader range of behavior and the potential for using the model to make further predictions about how number is perceived in untested scenarios in peripheral vision.
Developing Competency of Teachers in Basic Education Schools
ERIC Educational Resources Information Center
Yuayai, Rerngrit; Chansirisira, Pacharawit; Numnaphol, Kochaporn
2015-01-01
This study aims to develop the competency of teachers in basic education schools. The research instruments included a semi-structured in-depth interview form, a questionnaire, a competency-development program, and a competency-evaluation form. The statistics used for data analysis were percentage, mean, and standard deviation. The research found that…
Tarasova, Irina A; Goloborodko, Anton A; Perlova, Tatyana Y; Pridatchenko, Marina L; Gorshkov, Alexander V; Evreinov, Victor V; Ivanov, Alexander R; Gorshkov, Mikhail V
2015-07-07
The theory of critical chromatography for biomacromolecules (BioLCCC) describes polypeptide retention in reversed-phase HPLC using the basic principles of statistical thermodynamics. However, whether this theory correctly depicts a variety of empirical observations and laws introduced for peptide chromatography over the last decades remains to be determined. In this study, by comparing theoretical results with experimental data, we demonstrate that the BioLCCC: (1) fits the empirical dependence of the polypeptide retention on the amino acid sequence length with R² > 0.99 and allows in silico determination of the linear regression coefficients of the log-length correction in the additive model for arbitrary sequences and lengths and (2) predicts the distribution coefficients of polypeptides with an accuracy from 0.98 to 0.99 R². The latter enables direct calculation of the retention factors for given solvent compositions and modeling of the migration dynamics of polypeptides separated under isocratic or gradient conditions. The obtained results demonstrate that the suggested theory correctly relates the main aspects of polypeptide separation in reversed-phase HPLC.
Monte Carlo investigation of thrust imbalance of solid rocket motor pairs
NASA Technical Reports Server (NTRS)
Sforzini, R. H.; Foster, W. A., Jr.
1976-01-01
The Monte Carlo method of statistical analysis is used to investigate the theoretical thrust imbalance of pairs of solid rocket motors (SRMs) firing in parallel. Sets of the significant variables are selected using a random sampling technique and the imbalance calculated for a large number of motor pairs using a simplified, but comprehensive, model of the internal ballistics. The treatment of burning surface geometry allows for the variations in the ovality and alignment of the motor case and mandrel as well as those arising from differences in the basic size dimensions and propellant properties. The analysis is used to predict the thrust-time characteristics of 130 randomly selected pairs of Titan IIIC SRMs. A statistical comparison of the results with test data for 20 pairs shows the theory underpredicts the standard deviation in maximum thrust imbalance by 20% with variability in burning times matched within 2%. The range in thrust imbalance of Space Shuttle type SRM pairs is also estimated using applicable tolerances and variabilities and a correction factor based on the Titan IIIC analysis.
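The Monte Carlo approach described above can be sketched in miniature: draw random perturbations of the significant variables for each motor of a pair, push them through a ballistics model, and summarize the imbalance distribution. The surrogate thrust model and tolerance values below are purely illustrative and are not the paper's internal-ballistics model:

```python
import numpy as np

rng = np.random.default_rng(42)
n_pairs = 10_000

def thrust(burn_rate, throat_area):
    # toy surrogate: thrust scales with burn rate and throat area
    # (an arbitrary 1e7 N reference level stands in for real ballistics)
    return 1.0e7 * burn_rate * throat_area

def sample_motor(n):
    # each motor draws its own small perturbations around nominal values
    burn_rate = rng.normal(1.0, 0.01, n)    # normalized burn rate, 1% sigma (assumed)
    throat = rng.normal(1.0, 0.005, n)      # normalized throat area, 0.5% sigma (assumed)
    return thrust(burn_rate, throat)

f_a = sample_motor(n_pairs)
f_b = sample_motor(n_pairs)
imbalance = np.abs(f_a - f_b)

print(imbalance.mean(), imbalance.std(), np.percentile(imbalance, 99))
```

The 99th-percentile line is the kind of statistic used to bound the worst-case imbalance a vehicle's control system must tolerate.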
NASA Astrophysics Data System (ADS)
Saez, Núria; Ruiz, Xavier; Pallarés, Jordi; Shevtsova, Valentina
2013-04-01
An accelerometric record from the IVIDIL experiment (ESA Columbus module) has been exhaustively studied. The analysis involved the determination of basic statistical properties such as the auto-correlation and the power spectrum (second-order statistical analyses). Also, taking into account the shape of the associated histograms, we address another important question, the non-Gaussian nature of the time series, using the bispectrum and the bicoherence of the signals. Extrapolating the above-mentioned results, a computational model of a high-temperature shear cell has been developed. A scalar indicator has been used to quantify the accuracy of the diffusion coefficient measurements in the case of binary mixtures involving photovoltaic silicon or liquid Al-Cu binary alloys. Three different initial arrangements have been considered: the so-called interdiffusion, the centred thick layer and the lateral thick layer. Results allow us to conclude that, under the conditions of the present work, the diffusion coefficient is insensitive to the environmental conditions, that is to say, accelerometric disturbances and the initial shear cell arrangement.
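The second-order analyses mentioned, auto-correlation and power spectrum, can be sketched on a synthetic record; this is a generic illustration, not the IVIDIL data or processing chain:

```python
import numpy as np

def autocorr(x):
    """Sample autocorrelation, normalized so lag 0 equals 1."""
    x = x - x.mean()
    full = np.correlate(x, x, mode="full")
    ac = full[full.size // 2:]          # keep non-negative lags
    return ac / ac[0]

def power_spectrum(x, fs):
    """One-sided periodogram estimate of the power spectrum."""
    n = x.size
    spec = np.abs(np.fft.rfft(x - x.mean())) ** 2 / (fs * n)
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    return freqs, spec

# synthetic 'accelerometric' record: a 2 Hz tone buried in noise, 100 Hz sampling
fs = 100.0
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(1)
x = 0.5 * np.sin(2 * np.pi * 2.0 * t) + rng.normal(0, 1, t.size)

freqs, spec = power_spectrum(x, fs)
peak = freqs[np.argmax(spec)]  # recovers the 2 Hz component despite the noise
```

Higher-order quantities such as the bispectrum follow the same pattern but correlate Fourier components at frequency pairs, which is what makes them sensitive to non-Gaussianity.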
Self-Organization: Complex Dynamical Systems in the Evolution of Speech
NASA Astrophysics Data System (ADS)
Oudeyer, Pierre-Yves
Human vocalization systems are characterized by complex structural properties. They are combinatorial, based on the systematic reuse of phonemes, and the set of repertoires in human languages is characterized by both strong statistical regularities—universals—and a great diversity. Moreover, they are conventional codes culturally shared in each community of speakers. What are the origins of the forms of speech? What are the mechanisms that permitted their evolution in the course of phylogenesis and cultural evolution? How can a shared speech code be formed in a community of individuals? This chapter focuses on the way the concept of self-organization, and its interaction with natural selection, can throw light on these three questions. In particular, a computational model is presented which shows that a basic neural equipment for adaptive holistic vocal imitation, coupling directly motor and perceptual representations in the brain, can generate spontaneously shared combinatorial systems of vocalizations in a society of babbling individuals. Furthermore, we show how morphological and physiological innate constraints can interact with these self-organized mechanisms to account for both the formation of statistical regularities and diversity in vocalization systems.
ERIC Educational Resources Information Center
Ministerio de Educacion Nacional, Bogota (Colombia). Instituto Colombiano de Pedagogia.
This document provides statistical data on the distribution and education of teaching personnel working in the elementary schools of Cordoba, Colombia, between 1958 and 1967. The statistics cover the number of men and women, public and private schools, urban and rural location, and the amount of education of the teachers. For overall statistics in…
ERIC Educational Resources Information Center
Ministerio de Educacion Nacional, Bogota (Colombia). Instituto Colombiano de Pedagogia.
This document provides statistical data on the distribution and education of teaching personnel working in the elementary schools of Narino, Colombia, between 1958 and 1967. The statistics cover the number of men and women, public and private schools, urban and rural location, and the amount of education of the teachers. For overall statistics in…
ERIC Educational Resources Information Center
Ministerio de Educacion Nacional, Bogota (Colombia). Instituto Colombiano de Pedagogia.
This document provides statistical data on the distribution and education of teaching personnel working in the elementary schools of Cauca, Colombia, between 1958 and 1967. The statistics cover the number of men and women, public and private schools, urban and rural location, and the amount of education of the teachers. For overall statistics in…
ERIC Educational Resources Information Center
Ministerio de Educacion Nacional, Bogota (Colombia). Instituto Colombiano de Pedagogia.
This document provides statistical data on the distribution and education of teaching personnel working in the elementary schools of Caldas, Colombia, between 1958 and 1967. The statistics cover the number of men and women, public and private schools, urban and rural location, and the amount of education of the teachers. For overall statistics in…
ERIC Educational Resources Information Center
Ministerio de Educacion Nacional, Bogota (Colombia). Instituto Colombiano de Pedagogia.
This document provides statistical data on the distribution and education of teaching personnel working in the elementary schools of Boyaca, Colombia, between 1958 and 1967. The statistics cover the number of men and women, public and private schools, urban and rural location, and the amount of education of the teachers. For overall statistics in…
ERIC Educational Resources Information Center
Ministerio de Educacion Nacional, Bogota (Colombia). Instituto Colombiano de Pedagogia.
This document provides statistical data on the distribution and education of teaching personnel working in the elementary schools of Huila, Colombia, between 1958 and 1967. The statistics cover the number of men and women, public and private schools, urban and rural location, and the amount of education of the teachers. For overall statistics in…
ERIC Educational Resources Information Center
National Center for Health Statistics (DHEW/PHS), Hyattsville, MD.
This report is a part of the program of the National Center for Health Statistics to provide current statistics as baseline data for the evaluation, planning, and administration of health programs. Part I presents data concerning the occupational fields: (1) administration, (2) anthropology and sociology, (3) data processing, (4) basic sciences,…
ERIC Educational Resources Information Center
Ministerio de Educacion Nacional, Bogota (Colombia). Instituto Colombiano de Pedagogia.
This document provides statistical data on the distribution and education of teacher personnel working in Colombian elementary schools between 1940 and 1968. The statistics cover the number of men and women, public and private schools, urban and rural location, and the amount of education of teachers. (VM)
Explorations in Statistics: Standard Deviations and Standard Errors
ERIC Educational Resources Information Center
Curran-Everett, Douglas
2008-01-01
Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This series in "Advances in Physiology Education" provides an opportunity to do just that: we will investigate basic concepts in statistics using the free software package R. Because this series uses R solely as a vehicle…
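The core distinction this series explores, standard deviation versus standard error, can be illustrated numerically. A minimal sketch in Python (the series itself uses R), with an assumed Gaussian population:

```python
import math
import random

random.seed(0)
# assumed population: roughly Gaussian with mean 100, SD 15
population = [random.gauss(100, 15) for _ in range(100_000)]

def sd(xs):
    """Sample standard deviation (n - 1 in the denominator)."""
    m = sum(xs) / len(xs)
    return math.sqrt(sum((x - m) ** 2 for x in xs) / (len(xs) - 1))

def se(xs):
    """Standard error of the mean: the spread of the sample mean, not of the data."""
    return sd(xs) / math.sqrt(len(xs))

sample = random.sample(population, 100)
print(sd(sample))  # estimates the population SD (~15), regardless of n
print(se(sample))  # shrinks as n grows (~15 / sqrt(100) = 1.5 here)

# repeated sampling: the SD of many sample means matches the SE
means = [sum(random.sample(population, 100)) / 100 for _ in range(2000)]
print(sd(means))   # close to 1.5, confirming the SE interpretation
```

The last line is the "active exploration" the series advocates: the standard error is not a formula to memorize but the empirically observable spread of the sample mean.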
ERIC Educational Resources Information Center
Cassel, Russell N.
This paper relates educational and psychological statistics to certain "Research Statistical Tools" (RSTs) necessary to accomplish and understand general research in the behavioral sciences. Emphasis is placed on acquiring an effective understanding of the RSTs, and to this end they are ordered on a continuum scale in terms of individual…
Estimates of School Statistics, 1971-72.
ERIC Educational Resources Information Center
Flanigan, Jean M.
This report presents public school statistics for the 50 States, the District of Columbia, and the regions and outlying areas of the United States. The text presents national data for each of the past 10 years and defines the basic series of statistics. Tables present the revised estimates by State and region for 1970-71 and the preliminary…
Combining statistical inference and decisions in ecology
Williams, Perry J.; Hooten, Mevin B.
2016-01-01
Statistical decision theory (SDT) is a sub-field of decision theory that formally incorporates statistical investigation into a decision-theoretic framework to account for uncertainties in a decision problem. SDT provides a unifying analysis of three types of information: statistical results from a data set, knowledge of the consequences of potential choices (i.e., loss), and prior beliefs about a system. SDT links the theoretical development of a large body of statistical methods including point estimation, hypothesis testing, and confidence interval estimation. The theory and application of SDT have mainly been developed and published in the fields of mathematics, statistics, operations research, and other decision sciences, but have had limited exposure in ecology. Thus, we provide an introduction to SDT for ecologists and describe its utility for linking the conventionally separate tasks of statistical investigation and decision making in a single framework. We describe the basic framework of both Bayesian and frequentist SDT, its traditional use in statistics, and discuss its application to decision problems that occur in ecology. We demonstrate SDT with two types of decisions: Bayesian point estimation, and an applied management problem of selecting a prescribed fire rotation for managing a grassland bird species. Central to SDT, and decision theory in general, are loss functions. Thus, we also provide basic guidance and references for constructing loss functions for an SDT problem.
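The link between loss functions and point estimates that is central to SDT can be verified numerically: under squared-error loss the optimal estimate is the posterior mean, under absolute-error loss it is the posterior median. A minimal sketch with an assumed, illustrative posterior (not the grassland-bird example):

```python
import numpy as np

rng = np.random.default_rng(7)
# stand-in for a right-skewed posterior over some ecological parameter
posterior = rng.gamma(shape=2.0, scale=1.5, size=100_000)

def expected_loss(estimate, draws, loss):
    """Monte Carlo estimate of posterior expected loss for a point estimate."""
    return np.mean(loss(draws - estimate))

squared = lambda e: e ** 2
absolute = np.abs

# minimize expected loss over a grid of candidate estimates
grid = np.linspace(0.5, 6.0, 500)
best_sq = grid[np.argmin([expected_loss(a, posterior, squared) for a in grid])]
best_abs = grid[np.argmin([expected_loss(a, posterior, absolute) for a in grid])]

print(best_sq, posterior.mean())        # squared loss -> posterior mean (~3.0)
print(best_abs, np.median(posterior))   # absolute loss -> posterior median (< mean here)
```

Because the assumed posterior is right-skewed, the two losses give visibly different decisions, which is exactly why SDT insists that the loss function be chosen explicitly.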
The statistical average of optical properties for alumina particle cluster in aircraft plume
NASA Astrophysics Data System (ADS)
Li, Jingying; Bai, Lu; Wu, Zhensen; Guo, Lixin
2018-04-01
We establish a lognormal-distribution model for the monomer radius and monomer number of alumina particle clusters in a plume. Using Multi-Sphere T-Matrix (MSTM) theory, we provide a method for finding the statistical average of the optical properties of alumina particle clusters in a plume, analyze the effect of different distributions and different detection wavelengths on this statistical average, and compare the statistically averaged optical properties under the cluster model established in this study with those under three simplified alumina particle models. The calculation results show that the monomer number of an alumina particle cluster and its size distribution have a considerable effect on its statistically averaged optical properties. These averages exhibit clear differences at common detection wavelengths, and the differences strongly affect modeling of the IR and UV radiation properties of the plume. Compared with the three simplified models, the cluster model presented here features both higher extinction and scattering efficiencies. An accurate description of the scattering properties of alumina particles in an aircraft plume is therefore of great significance in the study of plume radiation properties.
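The statistical-averaging step can be illustrated independently of the MSTM machinery. In the sketch below all parameter values are illustrative, and a Rayleigh-like r^6 stand-in replaces the actual per-cluster MSTM cross-section calculation; it averages a size-dependent scattering quantity over a lognormal monomer-radius distribution, showing why the distribution, not a single representative radius, controls the ensemble optics.

```python
import numpy as np

rng = np.random.default_rng(1)

# Lognormal monomer-radius distribution (hypothetical median 0.5 um, sigma_g ~ 1.6).
median_r, sigma_g = 0.5, 1.6
radii = rng.lognormal(np.log(median_r), np.log(sigma_g), size=100_000)

def sigma_sca(r):
    # Stand-in scattering quantity: Rayleigh regime scales as r^6.
    # (A real calculation would run an MSTM-type solver per sampled cluster.)
    return r ** 6

# Ensemble (statistical) average versus the single-size value at the median radius:
avg = sigma_sca(radii).mean()
at_median = sigma_sca(median_r)
# The heavy tail of the size distribution inflates the ensemble average far
# above the single-size value, which is why simplified single-particle models
# can misestimate plume radiative properties.
```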
Wu, Yazhou; Zhou, Liang; Li, Gaoming; Yi, Dali; Wu, Xiaojiao; Liu, Xiaoyu; Zhang, Yanqi; Liu, Ling; Yi, Dong
2015-01-01
Background Although a substantial number of studies focus on the teaching and application of medical statistics in China, few studies comprehensively evaluate the recognition of and demand for medical statistics. In addition, the results of these various studies differ and are insufficiently comprehensive and systematic. Objectives This investigation aimed to evaluate the general cognition of and demand for medical statistics by undergraduates, graduates, and medical staff in China. Methods We performed a comprehensive database search related to the cognition of and demand for medical statistics from January 2007 to July 2014 and conducted a meta-analysis of non-controlled studies with sub-group analysis for undergraduates, graduates, and medical staff. Results There are substantial differences with respect to the cognition of theory in medical statistics among undergraduates (73.5%), graduates (60.7%), and medical staff (39.6%). The demand for theory in medical statistics is high among graduates (94.6%), undergraduates (86.1%), and medical staff (88.3%). Regarding specific statistical methods, the cognition of basic statistical methods is higher than that of advanced statistical methods. The demand for certain advanced statistical methods, including (but not limited to) multiple analysis of variance (ANOVA), multiple linear regression, and logistic regression, is higher than that for basic statistical methods. The use rates of the Statistical Package for the Social Sciences (SPSS) software and statistical analysis software (SAS) are only 55% and 15%, respectively. Conclusion The overall statistical competence of undergraduates, graduates, and medical staff is insufficient, and their ability to practically apply their statistical knowledge is limited, which constitutes an unsatisfactory state of affairs for medical statistics education. Because the demand for skills in this area is increasing, the need to reform medical statistics education in China has become urgent.
PMID:26053876
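Pooling percentages across studies, as in the meta-analysis above, can be sketched with inverse-variance weighting. The per-study counts below are hypothetical, and real analyses typically transform the proportions (e.g., logit or Freeman-Tukey) and consider random-effects models; this is only the fixed-effect skeleton.

```python
import math

# Hypothetical per-study data: (events, sample size) for one survey item.
studies = [(172, 200), (262, 300), (441, 500)]

# Inverse-variance fixed-effect pooling of raw proportions (minimal sketch).
weights, estimates = [], []
for events, n in studies:
    p = events / n
    var = p * (1 - p) / n          # binomial variance of the proportion
    weights.append(1 / var)
    estimates.append(p)

pooled = sum(w * p for w, p in zip(weights, estimates)) / sum(weights)
se = math.sqrt(1 / sum(weights))   # standard error of the pooled estimate
```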
Senior Computational Scientist | Center for Cancer Research
The Basic Science Program (BSP) pursues independent, multidisciplinary research in basic and applied molecular biology, immunology, retrovirology, cancer biology, and human genetics. Research efforts and support are an integral part of the Center for Cancer Research (CCR) at the Frederick National Laboratory for Cancer Research (FNLCR). The Cancer & Inflammation Program (CIP), Basic Science Program, HLA Immunogenetics Section, under the leadership of Dr. Mary Carrington, studies the influence of human leukocyte antigens (HLA) and specific KIR/HLA genotypes on risk of and outcomes to infection, cancer, autoimmune disease, and maternal-fetal disease. Recent studies have focused on the impact of HLA gene expression in disease, the molecular mechanism regulating expression levels, and the functional basis for the effect of differential expression on disease outcome. The lab's further focus is on the genetic basis for resistance/susceptibility to disease conferred by immunogenetic variation. KEY ROLES/RESPONSIBILITIES: The Senior Computational Scientist will provide research support to the CIP-BSP-HLA Immunogenetics Section, performing biostatistical design, analysis, and reporting of research projects conducted in the lab, and will be involved in the implementation of statistical models and data preparation. The successful candidate should have five or more years of competent, innovative biostatistics/bioinformatics research experience beyond doctoral training; considerable experience with statistical software such as SAS, R, and S-Plus; and sound knowledge and demonstrated experience of theoretical and applied statistics. Responsibilities include writing program code to analyze data using statistical analysis software and contributing to the interpretation and publication of research results.
Memcapacitor model and its application in chaotic oscillator with memristor.
Wang, Guangyi; Zang, Shouchi; Wang, Xiaoyuan; Yuan, Fang; Iu, Herbert Ho-Ching
2017-01-01
Memristors and memcapacitors are two new nonlinear elements with memory. In this paper, we present a Hewlett-Packard memristor model and a charge-controlled memcapacitor model and design a new chaotic oscillator based on the two models for exploring the characteristics of memristors and memcapacitors in nonlinear circuits. Furthermore, many basic dynamical behaviors of the oscillator, including equilibrium sets, Lyapunov exponent spectrums, and bifurcations with various circuit parameters, are investigated theoretically and numerically. Our analysis results show that the proposed oscillator possesses complex dynamics such as an infinite number of equilibria, coexistence oscillation, and multi-stability. Finally, a discrete model of the chaotic oscillator is given and the main statistical properties of this oscillator are verified via Digital Signal Processing chip experiments and National Institute of Standards and Technology tests.
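The memristive building block of such oscillators can be illustrated with the linear dopant-drift Hewlett-Packard model the paper builds on. The sketch below is a minimal Euler integration with illustrative parameter values, not the authors' oscillator: it drives a single memristor with a sinusoid and tracks the internal state that gives the element its memory.

```python
import numpy as np

# Linear dopant-drift HP memristor sketch (parameter values are illustrative).
Ron, Roff, D, mu = 100.0, 16e3, 10e-9, 1e-14
dt = 1e-5
t = np.arange(0, 0.02, dt)
v = np.sin(2 * np.pi * 100 * t)        # 100 Hz sinusoidal drive

x = 0.1                                 # normalized dopant-boundary position w/D
i = np.empty_like(t)
for k, vk in enumerate(v):
    M = Ron * x + Roff * (1 - x)        # state-dependent memristance
    i[k] = vk / M
    x += mu * Ron / D**2 * i[k] * dt    # dopant drift (Euler step)
    x = min(max(x, 0.0), 1.0)           # hard bounds on the state
```

Because the resistance depends on the accumulated charge history, plotting i against v would trace the pinched hysteresis loop characteristic of memristive elements.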
Semantic memory: a feature-based analysis and new norms for Italian.
Montefinese, Maria; Ambrosini, Ettore; Fairfield, Beth; Mammarella, Nicola
2013-06-01
Semantic norms for properties produced by native speakers are valuable tools for researchers interested in the structure of semantic memory and in category-specific semantic deficits in individuals following brain damage. The aims of this study were threefold. First, we sought to extend existing semantic norms by adopting an empirical approach to category (Exp. 1) and concept (Exp. 2) selection, in order to obtain a more representative set of semantic memory features. Second, we extensively outlined a new set of semantic production norms collected from Italian native speakers for 120 artifactual and natural basic-level concepts, using numerous measures and statistics following a feature-listing task (Exp. 3b). Finally, we aimed to create a new publicly accessible database, since only a few existing databases are publicly available online.
mrpy: Renormalized generalized gamma distribution for HMF and galaxy ensemble properties comparisons
NASA Astrophysics Data System (ADS)
Murray, Steven G.; Robotham, Aaron S. G.; Power, Chris
2018-02-01
mrpy calculates the MRP parameterization of the Halo Mass Function. It calculates basic statistics of the truncated generalized gamma distribution (TGGD) with the TGGD class, including mean, mode, variance, skewness, pdf, and cdf. It generates MRP quantities with the MRP class, such as differential number counts and cumulative number counts, and offers various methods for generating normalizations. It can generate the MRP-based halo mass function as a function of physical parameters via the mrp_b13 function, and fit MRP parameters to data in the form of arbitrary curves and in the form of a sample of variates with the SimFit class. mrpy also calculates analytic hessians and jacobians at any point, and allows the user to alternate parameterizations of the same form via the reparameterize module.
Simulated cosmic microwave background maps at 0.5 deg resolution: Basic results
NASA Technical Reports Server (NTRS)
Hinshaw, G.; Bennett, C. L.; Kogut, A.
1995-01-01
We have simulated full-sky maps of the cosmic microwave background (CMB) anisotropy expected from cold dark matter (CDM) models at 0.5 deg and 1.0 deg angular resolution. Statistical properties of the maps are presented as a function of sky coverage, angular resolution, and instrument noise, and the implications of these results for observability of the Doppler peak are discussed. The rms fluctuations in a map are not a particularly robust probe of the existence of a Doppler peak; however, a full correlation analysis can provide reasonable sensitivity. We find that sensitivity to the Doppler peak depends primarily on the fraction of sky covered, and only secondarily on the angular resolution and noise level. Color plates of the simulated maps are presented to illustrate the anisotropies.
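The map rms statistic discussed above can be illustrated on a toy patch. The sketch below uses white Gaussian fields with illustrative amplitudes (a real CDM sky has a correlated power spectrum, which is precisely why the abstract finds a full correlation analysis more sensitive than plain rms); it shows that independent signal and noise contributions add in quadrature.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy flat patch: Gaussian "CMB" signal (70 uK rms) plus instrument noise (30 uK rms).
signal = rng.normal(0.0, 70.0, size=(256, 256))
noise = rng.normal(0.0, 30.0, size=(256, 256))
sky = signal + noise

rms = sky.std()
# Independent fields: variances add, so the map rms is sqrt(70^2 + 30^2) ~ 76 uK.
expected = np.sqrt(70.0**2 + 30.0**2)
```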
Brennan, Jennifer Sousa
2010-01-01
This chapter is an introductory reference guide highlighting some of the most common statistical topics, broken down into both command-line syntax and graphical interface point-and-click commands. This chapter serves to supplement more formal statistics lessons and expedite using Stata to compute basic analyses.
System analysis for the Huntsville Operational Support Center distributed computer system
NASA Technical Reports Server (NTRS)
Ingels, E. M.
1983-01-01
A simulation model was developed and programmed in three languages: BASIC, PASCAL, and SLAM. Two of the programs, the BASIC and PASCAL versions, are included in this report; SLAM is not supported by NASA/MSFC facilities and hence was not included. Statistical comparisons of simulations of the same HOSC system configurations are in good agreement with one another and with the operational statistics obtained for HOSC. Three variations of the most recent HOSC configuration were run, and some conclusions were drawn as to system performance under these variations.
NASA Astrophysics Data System (ADS)
Haven, Emmanuel; Khrennikov, Andrei
2013-01-01
Preface; Part I. Physics Concepts in Social Science? A Discussion: 1. Classical, statistical and quantum mechanics: all in one; 2. Econophysics: statistical physics and social science; 3. Quantum social science: a non-mathematical motivation; Part II. Mathematics and Physics Preliminaries: 4. Vector calculus and other mathematical preliminaries; 5. Basic elements of quantum mechanics; 6. Basic elements of Bohmian mechanics; Part III. Quantum Probabilistic Effects in Psychology: Basic Questions and Answers: 7. A brief overview; 8. Interference effects in psychology - an introduction; 9. A quantum-like model of decision making; Part IV. Other Quantum Probabilistic Effects in Economics, Finance and Brain Sciences: 10. Financial/economic theory in crisis; 11. Bohmian mechanics in finance and economics; 12. The Bohm-Vigier Model and path simulation; 13. Other applications to economic/financial theory; 14. The neurophysiological sources of quantum-like processing in the brain; Conclusion; Glossary; Index.
Chekmarev, Sergei F
2013-03-01
The transition from laminar to turbulent fluid motion occurring at large Reynolds numbers is generally associated with the instability of the laminar flow. On the other hand, since the turbulent flow characteristically appears in the form of spatially localized structures (e.g., eddies) filling the flow field, a tendency to occupy such a structured state of the flow cannot be ruled out as a driving force for turbulent transition. To examine this possibility, we propose a simple analytical model that treats the flow as a collection of localized spatial structures, each of which consists of elementary cells in which the behavior of the particles (atoms or molecules) is uncorrelated. This allows us to introduce the Reynolds number, associating it with the ratio between the total phase volume for the system and that for the elementary cell. Using the principle of maximum entropy to calculate the most probable size distribution of the localized structures, we show that as the Reynolds number increases, the elementary cells group into the localized structures, which successfully explains turbulent transition and some other general properties of turbulent flows. An important feature of the present model is that a bridge between the spatial-statistical description of the flow and hydrodynamic equations is established. We show that the basic assumptions underlying the model, i.e., that the particles are indistinguishable and elementary volumes of phase space exist in which the state of the particles is uncertain, are involved in the derivation of the Navier-Stokes equation. Taking into account that the model captures essential features of turbulent flows, this suggests that the driving force for the turbulent transition is basically the same as in the present model, i.e., the tendency of the system to occupy a statistically dominant state plays a key role. 
The instability of the flow at high Reynolds numbers can then be a mechanism to initiate structural rearrangement of the flow to find this state.
NASA Astrophysics Data System (ADS)
Cardall, Christian Y.; Budiardja, Reuben D.
2017-05-01
GenASiS Basics provides Fortran 2003 classes furnishing extensible object-oriented utilitarian functionality for large-scale physics simulations on distributed memory supercomputers. This functionality includes physical units and constants; display to the screen or standard output device; message passing; I/O to disk; and runtime parameter management and usage statistics. This revision (Version 2 of Basics) makes mostly minor additions to functionality and includes some simplifying name changes.
NASA Astrophysics Data System (ADS)
Pukhovskaya, S. G.; Ivanova, Yu. B.; Nam, Dao The; Vashurin, A. S.
2014-10-01
Spectrophotometric titration is used to study the basic properties of a series of porphyrins with a continuously increasing degree of macrocycle deformation resulting from the introduction of strong electron-withdrawing substituents: 2,3,7,8,12,13,17,18-octaethylporphyrin (I), 5-nitro-2,3,7,8,12,13,17,18-octaethylporphyrin (II), 5,15-dinitro-2,3,7,8,12,13,17,18-octaethylporphyrin (III), 5,10,15-trinitro-2,3,7,8,12,13,17,18-octaethylporphyrin (IV), and 5,10,15,20-tetranitro-2,3,7,8,12,13,17,18-octaethylporphyrin (V). It is found that the values of log Kb (total basicity constants) obtained for the investigated compounds consistently diminish with an increase in the number of meso-substituents: 11.85 (I) > 10.45 (II) > 10.31 (III) > 10.23 (IV) > 9.56 (V). It is shown that two opposing factors, the steric and electronic effects of the substituents, change the basic properties of the above series of compounds.
Basic Facts and Figures about the Educational System in Japan.
ERIC Educational Resources Information Center
National Inst. for Educational Research, Tokyo (Japan).
Tables, charts, and graphs convey supporting data that accompany text on various aspects of the Japanese educational system presented in this booklet. There are seven chapters: (1) Fundamental principles of education; (2) Organization of the educational system; (3) Basic statistics of education; (4) Curricula, textbooks, and instructional aids;…
ERIC Educational Resources Information Center
Yantz, Jennifer
2013-01-01
The attainment and retention of later algebra skills in high school has been identified as a factor significantly impacting the postsecondary success of students majoring in STEM fields. Researchers maintain that learners develop meaning for algebraic procedures by forming connections to the basic number system properties. The present study…
Technetium-99m: basic nuclear physics and chemical properties.
Castronovo, F P
1975-05-01
The nuclear physics and chemical properties of technetium-99m are reviewed. The review of basic nuclear physics includes: classification of nuclides, nuclear stability, production of radionuclides, artificial production of molybdenum-99, production of technetium-99m, and 99Mo-99mTc generators. The discussion of the chemistry of technetium includes a profile of several 99mTc-labeled radiopharmaceuticals.
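The 99Mo-99mTc generator mentioned above is governed by the Bateman equation for daughter in-growth. A minimal sketch, using standard textbook values (66 h and 6.0 h half-lives, ~87.6% of 99Mo decays feeding the metastable state):

```python
import math

# Daughter in-growth in a 99Mo/99mTc generator (Bateman equation sketch).
t_half_mo, t_half_tc, branch = 66.0, 6.0, 0.876   # hours, hours, branching fraction
lam1 = math.log(2) / t_half_mo                     # 99Mo decay constant
lam2 = math.log(2) / t_half_tc                     # 99mTc decay constant

def tc99m_activity(t, a_mo0=1.0):
    """99mTc activity at time t (hours) per unit initial 99Mo activity."""
    return branch * a_mo0 * lam2 / (lam2 - lam1) * (
        math.exp(-lam1 * t) - math.exp(-lam2 * t))

# In-growth peaks near ~23 h, which is why generators are typically eluted daily.
peak_t = math.log(lam2 / lam1) / (lam2 - lam1)
```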
Current and Future X-ray Studies of High-Redshift AGNs and the First Supermassive Black Holes
NASA Astrophysics Data System (ADS)
Brandt, Niel
2016-01-01
X-ray observations of high-redshift AGNs at z = 4-7 have played a critical role in understanding the physical processes at work in these objects as well as their basic demographics. Since 2000, Chandra and XMM-Newton have provided new X-ray detections for more than 120 such objects, and well-defined samples of z > 4 AGNs now allow reliable X-ray population studies. Once luminosity effects are considered, the basic X-ray continuum properties of most high-redshift AGNs appear remarkably similar to those of local AGNs, although there are some notable apparent exceptions (e.g., highly radio-loud quasars). Furthermore, the X-ray absorption found in some objects has been used as a diagnostic of outflowing winds and circumnuclear material. Demographically, the X-ray data now support an exponential decline in the number density of luminous AGNs above z ~ 3, and quantitative space-density comparisons for optically selected and X-ray selected quasars indicate basic statistical agreement. The current X-ray discoveries point the way toward the future breakthroughs that will be possible with, e.g., Athena and the X-ray Surveyor. These missions will execute powerful blank-field surveys to elucidate the demographics of the first growing supermassive black holes (SMBHs), including highly obscured systems, up to z ~ 10. They will also carry out complementary X-ray spectroscopic and variability investigations of high-redshift AGNs by targeting the most-luminous z = 7-10 quasars found in wide-field surveys by, e.g., Euclid, LSST, and WFIRST. X-ray spectroscopic and variability studies of the X-ray continuum and reflection signatures will help determine Eddington ratios and disk/corona properties; measuring these will clarify how the first quasars grew so quickly. Furthermore, absorption line/edge studies will reveal how outflows from the first SMBHs influenced the growth of the first galaxies. I will suggest some efficient observational strategies for Athena and the X-ray Surveyor.
Order-of-magnitude physics of neutron stars. Estimating their properties from first principles
NASA Astrophysics Data System (ADS)
Reisenegger, Andreas; Zepeda, Felipe S.
2016-03-01
We use basic physics and simple mathematics accessible to advanced undergraduate students to estimate the main properties of neutron stars. We set the stage and introduce relevant concepts by discussing the properties of "everyday" matter on Earth, degenerate Fermi gases, white dwarfs, and scaling relations of stellar properties with polytropic equations of state. Then, we discuss various physical ingredients relevant for neutron stars and how they can be combined in order to obtain a couple of different simple estimates of their maximum mass, beyond which they would collapse, turning into black holes. Finally, we use the basic structural parameters of neutron stars to briefly discuss their rotational and electromagnetic properties.
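One such back-of-envelope estimate is the Chandrasekhar-like mass scale (hbar*c/G)^(3/2)/m_n^2, which lands near two solar masses. A quick numerical check with standard constants (this is an order-of-magnitude argument in the spirit of the paper, not a structure calculation):

```python
# Order-of-magnitude maximum neutron-star mass from fundamental constants.
hbar = 1.0546e-34    # reduced Planck constant, J s
c = 2.998e8          # speed of light, m/s
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
m_n = 1.675e-27      # neutron mass, kg
M_sun = 1.989e30     # solar mass, kg

M_max = (hbar * c / G) ** 1.5 / m_n ** 2
ratio = M_max / M_sun    # comes out near ~2 solar masses
```

That a combination of only fundamental constants reproduces the observed mass scale of neutron stars is the pedagogical point of such estimates.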
Statistical Analysis of CMC Constituent and Processing Data
NASA Technical Reports Server (NTRS)
Fornuff, Jonathan
2004-01-01
Ceramic Matrix Composites (CMCs) are the next "big thing" in high-temperature structural materials. In the case of jet engines, it is widely believed that the metallic superalloys currently being utilized for hot structures (combustors, shrouds, turbine vanes and blades) are nearing their potential limits of improvement. In order to allow for increased turbine temperatures to increase engine efficiency, material scientists have begun looking toward advanced CMCs, and SiC/SiC composites in particular. Ceramic composites provide greater strength-to-weight ratios at higher temperatures than metallic alloys, but at the same time pose greater challenges in micro-structural optimization, which in turn increases the cost of the material as well as the risk of variability in the material's thermo-structural behavior. The work undertaken this summer explores, in general, the key properties needed to model various potential CMC engine materials, examines the current variability in these properties due to variability in component processing conditions and constituent materials, and then assesses how processing and constituent variations affect key strength, stiffness, and thermal properties of the finished components. Basically, this means trying to model variations in a component's behavior by knowing what went into creating it. In this study, SiC/SiC composites of varying architectures, utilizing a boron-nitride (BN) inter-phase and manufactured by chemical vapor infiltration (CVI) and melt infiltration (MI), were considered. Examinations of (1) the percent constituents by volume, (2) the inter-phase thickness, (3) variations in the total porosity, and (4) variations in the chemical composition of the SiC fiber are carried out and modeled using various codes used at NASA Glenn (PCGina, NASALife, CEMCAN, etc.). The effects of these variations and the ranking of their respective influences on the various thermo-mechanical material properties are studied and compared to available test data. The properties of the materials, as well as minor changes to geometry, are then varied in the computer model and the resulting effects observed using statistical analysis software. The ultimate purpose of this study is to determine which variations in material processing can lead to the most critical changes in the material properties.
Operational health physics training
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1992-06-01
The initial four sections treat basic information concerning atomic structure and other useful physical quantities, natural radioactivity, the properties of alpha, beta, gamma, x rays and neutrons, and the concepts and units of radiation dosimetry (including SI units). Section 5 deals with biological effects and the risks associated with radiation exposure. Background radiation and man-made sources are discussed next. The basic recommendations of the ICRP concerning dose limitations: justification, optimization (ALARA concepts and applications) and dose limits are covered in Section 7. Section 8 is an expanded version of shielding, and the internal dosimetry discussion has been extensively revised to reflect the concepts contained in the MIRD methodology and ICRP 30. The remaining sections discuss the operational health physics approach to monitoring radiation. Individual sections include radiation detection principles, instrument operation and counting statistics, health physics instruments and personnel monitoring devices. The last five sections deal with the nature of, operation principles of, health physics aspects of, and monitoring approaches to air sampling, reactors, nuclear safety, gloveboxes and hot cells, accelerators and x ray sources. Decontamination, waste disposal and transportation of radionuclides are added topics. Several appendices containing constants, symbols, selected mathematical topics, and the Chart of the Nuclides, and an index have been included.
Patient-reported outcomes in idiopathic pulmonary fibrosis research.
Swigris, Jeffrey J; Fairclough, Diane
2012-08-01
Patient-reported outcomes (PROs) include questionnaires or surveys that ask patients for their perceptions about things like symptoms they are experiencing or quality of life. For incurable, morbid, life-shortening conditions like idiopathic pulmonary fibrosis (IPF), PROs are particularly germane: They elucidate for clinicians and researchers what it is like for patients to live with such a disease, and they may detect important treatment effects not captured by other metrics (eg, pulmonary physiology). However, a relative paucity of research on PROs in IPF has left significant knowledge gaps in this area and contributed to the timidity investigators have about using PROs as prominent outcomes in IPF drug trials. Additional research on existing instruments is needed to establish or bolster their basic psychometric properties in IPF. When PROs are used as end points in therapeutic trials, analyzing PRO response data can be challenging, but these challenges can be overcome with a transparent, thoughtful, and sophisticated statistical approach. In this article, we discuss some of the basics of PRO assessment, existing knowledge gaps in IPF-related PRO research, and the potential usefulness of using PROs in IPF trials and conclude by offering specific recommendations for an approach to analyzing repeated-measures PRO data from IPF trials.
Analysis of Meniscus Fluctuation in a Continuous Casting Slab Mold
NASA Astrophysics Data System (ADS)
Zhang, Kaitian; Liu, Jianhua; Cui, Heng; Xiao, Chao
2018-06-01
A water model of a slab mold was established to analyze the microscopic and macroscopic fluctuation of the meniscus. The fast Fourier transform and wavelet entropy were adopted to analyze the wave amplitude, frequency, and components of the fluctuation. The flow patterns under the meniscus were measured using particle image velocimetry, and the mechanisms of meniscus fluctuation were then discussed. The results showed that wavelet entropy has multi-scale and statistical properties and is well suited to studying the details of meniscus fluctuation in both the time and frequency domains. A basic wave, with a frequency exceeding 1 Hz in the absence of mold oscillation, was demonstrated in this work. In fact, three basic waves were found: a long wave with low frequency, a middle wave with middle frequency, and a short wave with high frequency. In addition, the upper roll flow in the mold had a significant effect on meniscus fluctuation. When the flow impingement position was far from the meniscus, the long wave dominated the fluctuation and the stability of the meniscus was enhanced. However, when the flow velocity was increased, the short wave dominated the meniscus fluctuation and meniscus stability decreased.
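The FFT decomposition of a meniscus level signal into long-, middle-, and short-wave components can be sketched on synthetic data. The frequencies and amplitudes below are illustrative, not the measured ones:

```python
import numpy as np

fs = 100.0                                   # sampling rate, Hz
t = np.arange(0, 20, 1 / fs)                 # 20 s record

# Toy meniscus level: three superposed wave components (illustrative values).
level = (3.0 * np.sin(2 * np.pi * 0.2 * t)   # long wave, low frequency
         + 1.5 * np.sin(2 * np.pi * 1.0 * t) # middle wave
         + 0.5 * np.sin(2 * np.pi * 4.0 * t))# short wave, high frequency

spectrum = np.abs(np.fft.rfft(level))        # amplitude spectrum
freqs = np.fft.rfftfreq(level.size, 1 / fs)
dominant = freqs[int(np.argmax(spectrum))]   # strongest component (0.2 Hz here)
```

The wavelet-entropy analysis in the paper goes further by localizing such components in time as well as frequency, which a plain FFT cannot do.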
Annual statistical report 2008 : based on data from CARE/EC
DOT National Transportation Integrated Search
2008-10-31
This Annual Statistical Report provides the basic characteristics of road accidents in 19 member states of : the European Union for the period 1997-2006, on the basis of data collected and processed in the CARE : database, the Community Road Accident...
Country Education Profiles: Algeria.
ERIC Educational Resources Information Center
International Bureau of Education, Geneva (Switzerland).
One of a series of profiles prepared by the Cooperative Educational Abstracting Service, this brief outline provides basic background information on educational principles, system of administration, structure and organization, curricula, and teacher training in Algeria. Statistics provided by the Unesco Office of Statistics show enrollment at all…
78 FR 23158 - Organization and Delegation of Duties
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-18
... management actions of major significance, such as those relating to changes in basic organization pattern... regard to rulemaking, enforcement, vehicle safety research and statistics and data analysis, provides... Administrator for the National Center for Statistics and Analysis, and the Associate Administrator for Vehicle...
ERIC Educational Resources Information Center
Hobden, Sally
2014-01-01
Information on the HIV/AIDS epidemic in Southern Africa is often interpreted through a veil of secrecy and shame and, I argue, with flawed understanding of basic statistics. This research determined the levels of statistical literacy evident in 316 future Mathematical Literacy teachers' explanations of the median in the context of HIV/AIDS…
Introduction to Statistics. Learning Packages in the Policy Sciences Series, PS-26. Revised Edition.
ERIC Educational Resources Information Center
Policy Studies Associates, Croton-on-Hudson, NY.
The primary objective of this booklet is to introduce students to basic statistical skills that are useful in the analysis of public policy data. A few, selected statistical methods are presented, and theory is not emphasized. Chapter 1 provides instruction for using tables, bar graphs, bar graphs with grouped data, trend lines, pie diagrams,…
Räsänen, Okko; Kakouros, Sofoklis; Soderstrom, Melanie
2018-06-06
The exaggerated intonation and special rhythmic properties of infant-directed speech (IDS) have been hypothesized to attract infants' attention to the speech stream. However, there has been little work actually connecting the properties of IDS to models of attentional processing or perceptual learning. A number of such attention models suggest that surprising or novel perceptual inputs attract attention, where novelty can be operationalized as the statistical (un)predictability of the stimulus in the given context. Since prosodic patterns such as F0 contours are accessible to young infants who are also known to be adept statistical learners, the present paper investigates a hypothesis that F0 contours in IDS are less predictable than those in adult-directed speech (ADS), given previous exposure to both speaking styles, thereby potentially tapping into basic attentional mechanisms of the listeners in a similar manner that relative probabilities of other linguistic patterns are known to modulate attentional processing in infants and adults. Computational modeling analyses with naturalistic IDS and ADS speech from matched speakers and contexts show that IDS intonation has lower overall temporal predictability even when the F0 contours of both speaking styles are normalized to have equal means and variances. A closer analysis reveals that there is a tendency of IDS intonation to be less predictable at the end of short utterances, whereas ADS exhibits more stable average predictability patterns across the full extent of the utterances. The difference between IDS and ADS persists even when the proportion of IDS and ADS exposure is varied substantially, simulating different relative amounts of IDS heard in different family and cultural environments. Exposure to IDS is also found to be more efficient for predicting ADS intonation contours in new utterances than exposure to the equal amount of ADS speech. 
This indicates that the more variable prosodic contours of IDS also generalize to ADS, and may therefore enhance prosodic learning in infancy. Overall, the study suggests that one reason behind infant preference for IDS could be its higher information value at the prosodic level, as measured by the amount of surprisal in the F0 contours. This provides the first formal link between the properties of IDS and the models of attentional processing and statistical learning in the brain. However, this finding does not rule out the possibility that other differences between IDS and ADS also play a role. Copyright © 2018 Elsevier B.V. All rights reserved.
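As a toy illustration of the predictability measure discussed above (a minimal bigram-surprisal sketch under invented, quantized F0 contours, not the authors' actual model):

```python
import math
from collections import Counter

def bigram_surprisal(train, test, alphabet_size, alpha=1.0):
    """Mean surprisal (bits) of `test` under add-alpha-smoothed
    bigram statistics estimated from `train`."""
    pair_counts = Counter(zip(train, train[1:]))
    ctx_counts = Counter(train[:-1])
    total = 0.0
    for ctx, sym in zip(test, test[1:]):
        p = (pair_counts[(ctx, sym)] + alpha) / (ctx_counts[ctx] + alpha * alphabet_size)
        total += -math.log2(p)
    return total / (len(test) - 1)

# Toy quantized contours: a repetitive "ADS-like" contour is more
# predictable, given prior exposure, than an erratic "IDS-like" one.
ads = [0, 1, 2, 1] * 25
ids_ = [0, 3, 1, 4, 2, 0, 4, 1, 3, 2] * 10
assert bigram_surprisal(ads, ads, 5) < bigram_surprisal(ads, ids_, 5)
```

Higher mean surprisal here plays the role of lower temporal predictability in the study's sense.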
NASA Astrophysics Data System (ADS)
Vrac, Mathieu
2018-06-01
Climate simulations often suffer from statistical biases with respect to observations or reanalyses. It is therefore common to correct (or adjust) those simulations before using them as inputs into impact models. However, most bias correction (BC) methods are univariate and so do not account for the statistical dependences linking the different locations and/or physical variables of interest. In addition, they are often deterministic, and stochasticity is frequently needed to investigate climate uncertainty and to add constrained randomness to climate simulations that do not possess a realistic variability. This study presents a multivariate method of rank resampling for distributions and dependences (R2D2) bias correction allowing one to adjust not only the univariate distributions but also their inter-variable and inter-site dependence structures. Moreover, the proposed R2D2 method provides some stochasticity since it can generate as many multivariate corrected outputs as the number of statistical dimensions (i.e., number of grid cells × number of climate variables) of the simulations to be corrected. It is based on an assumption of stability in time of the dependence structure - making it possible to deal with a high number of statistical dimensions - that lets the climate model drive the temporal properties and their changes in time. R2D2 is applied on temperature and precipitation reanalysis time series with respect to high-resolution reference data over the southeast of France (1506 grid cells). Bivariate, 1506-dimensional and 3012-dimensional versions of R2D2 are tested over a historical period and compared to a univariate BC. How the different BC methods behave in a climate change context is also illustrated with an application to regional climate simulations over the 2071-2100 period.
The results indicate that the 1d-BC basically reproduces the climate model multivariate properties, 2d-R2D2 is only satisfying in the inter-variable context, 1506d-R2D2 strongly improves inter-site properties and 3012d-R2D2 is able to account for both. Applications of the proposed R2D2 method to various climate datasets are relevant for many impact studies. The perspectives of improvements are numerous, such as introducing stochasticity in the dependence itself, questioning its stability assumption, and accounting for temporal properties adjustment while including more physics in the adjustment procedures.
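The univariate quantile-mapping baseline that methods such as R2D2 extend can be sketched in a few lines; this is a generic 1d-BC illustration with synthetic data, not the R2D2 algorithm itself:

```python
import numpy as np

def quantile_map(model, obs):
    """Univariate empirical quantile mapping: replace each model value
    by the observed value of the same rank (a common 1d-BC baseline)."""
    ranks = np.argsort(np.argsort(model))   # rank of each model value
    return np.sort(obs)[ranks]              # observed value of matching rank

rng = np.random.default_rng(0)
obs = rng.gamma(2.0, 3.0, size=500)        # "observed" precipitation-like data
model = rng.normal(5.0, 1.0, size=500)     # biased simulation
corrected = quantile_map(model, obs)

# The corrected series inherits the observed distribution...
assert abs(np.mean(corrected) - np.mean(obs)) < 1e-9
# ...while keeping the model's temporal ordering (rank correlation = 1).
assert np.all(np.argsort(corrected) == np.argsort(model))
```

R2D2's contribution, beyond such a per-variable correction, is the rank resampling that also restores inter-site and inter-variable dependence.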
Environmental statistics and optimal regulation.
Sivak, David A; Thomson, Matt
2014-09-01
Any organism is embedded in an environment that changes over time. The timescale for and statistics of environmental change, the precision with which the organism can detect its environment, and the costs and benefits of particular protein expression levels all will affect the suitability of different strategies--such as constitutive expression or graded response--for regulating protein levels in response to environmental inputs. We propose a general framework, here specifically applied to the enzymatic regulation of metabolism in response to changing concentrations of a basic nutrient, to predict the optimal regulatory strategy given the statistics of fluctuations in the environment and in the measurement apparatus, and the costs associated with enzyme production. We use this framework to address three fundamental questions: (i) when a cell should prefer thresholding to a graded response; (ii) when there is a fitness advantage to implementing a Bayesian decision rule; and (iii) when retaining memory of the past provides a selective advantage. We specifically find that: (i) relative convexity of enzyme expression cost and benefit influences the fitness of thresholding or graded responses; (ii) intermediate levels of measurement uncertainty call for a sophisticated Bayesian decision rule; and (iii) in dynamic contexts, intermediate levels of uncertainty call for retaining memory of the past. Statistical properties of the environment, such as variability and correlation times, set optimal biochemical parameters, such as thresholds and decay rates in signaling pathways. Our framework provides a theoretical basis for interpreting molecular signal processing algorithms and a classification scheme that organizes known regulatory strategies and may help conceptualize heretofore unknown ones.
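The thresholding-versus-graded contrast can be sketched minimally, assuming a two-state Gaussian observation model with invented parameters (this is an illustration, not the paper's full cost-benefit framework):

```python
import math

def posterior_high(y, mu_lo=0.0, mu_hi=1.0, sigma=0.5, prior_hi=0.5):
    """Posterior probability that the nutrient level is 'high' given a
    noisy measurement y, under a two-state Gaussian observation model."""
    like_hi = math.exp(-(y - mu_hi) ** 2 / (2 * sigma ** 2))
    like_lo = math.exp(-(y - mu_lo) ** 2 / (2 * sigma ** 2))
    return prior_hi * like_hi / (prior_hi * like_hi + (1 - prior_hi) * like_lo)

# Graded (Bayesian) response: expression proportional to the posterior.
# Threshold response: full expression once the posterior passes 1/2.
graded = lambda y: posterior_high(y)
threshold = lambda y: 1.0 if posterior_high(y) > 0.5 else 0.0

assert graded(1.0) > graded(0.0)                  # more signal, more expression
assert threshold(0.2) == 0.0 and threshold(0.8) == 1.0
```

Which of the two rules is fitter depends, as the abstract notes, on the convexity of the expression cost and benefit and on the measurement noise sigma.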
Liu, Lu; Wei, Jianrong; Zhang, Huishu; Xin, Jianhong; Huang, Jiping
2013-01-01
Because classical music has greatly affected our life and culture in its long history, it has attracted extensive attention from researchers to understand the laws behind it. Based on statistical physics, here we use a different method to investigate classical music, namely, by analyzing cumulative distribution functions (CDFs) and autocorrelation functions of pitch fluctuations in compositions. We analyze 1,876 compositions of five representative classical music composers across 164 years from Bach, to Mozart, to Beethoven, to Mendelssohn, and to Chopin. We report that the biggest pitch fluctuations of a composer gradually increase as time evolves from Bach's time to Mendelssohn/Chopin's time. In particular, for the compositions of a composer, the positive and negative tails of a CDF of pitch fluctuations are distributed not only in power laws (with the scale-free property), but also in symmetry (namely, the probability of a treble following a bass and that of a bass following a treble are basically the same for each composer). The power-law exponent decreases as time elapses. Further, we also calculate the autocorrelation function of the pitch fluctuation. The autocorrelation function shows a power-law distribution for each composer. Especially, the power-law exponents vary with the composers, indicating their different levels of long-range correlation of notes. This work not only suggests a way to understand and develop music from a viewpoint of statistical physics, but also enriches the realm of traditional statistical physics by analyzing music.
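The tail exponent of a power-law CDF like those described above can be estimated, for example, with the Hill estimator; the sketch below uses synthetic Pareto data rather than actual pitch fluctuations:

```python
import math, random

def hill_exponent(samples, k=100):
    """Hill-type estimator of the tail exponent alpha, using the largest
    k order statistics, for P(X > x) ~ x^-alpha."""
    tail = sorted(samples)[-k:]
    x_k = tail[0]                       # threshold: k-th largest value
    return k / sum(math.log(x / x_k) for x in tail)

random.seed(1)
alpha = 2.0                             # true exponent of the synthetic data
data = [random.paretovariate(alpha) for _ in range(20000)]
est = hill_exponent(data, k=500)
assert abs(est - alpha) < 0.3           # recovers the exponent approximately
```

Applied to empirical pitch-fluctuation magnitudes per composer, such an estimate would track the decrease in exponent over time that the study reports.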
75 FR 33203 - Funding Formula for Grants to States
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-11
... as Social Security numbers, birth dates, and medical data. Docket: To read or download submissions or... Local Area Unemployment Statistics (LAUS), both of which are compiled by DOL's Bureau of Labor Statistics. Specifies how each State's basic JVSG allocation is calculated. Identifies the procedures...
Statistical Considerations for Establishing CBTE Cut-Off Scores.
ERIC Educational Resources Information Center
Trzasko, Joseph A.
This report gives the basic definition and purpose of competency-based teacher education (CBTE) cut-off scores. It describes the basic characteristics of CBTE as a yes-no dichotomous decision regarding the presence of a specific ability or knowledge, which necessitates the establishment of a cut-off point to designate competency vs. incompetency on…
Adult Basic Education. Program Summary.
ERIC Educational Resources Information Center
Office of Education (DHEW), Washington, DC.
A brief description is given of the federal Adult Basic Education program, under the Adult Education Act of 1966, at the national and state levels (including Puerto Rico, Guam, American Samoa, and the Virgin Islands) as provided by state education agencies. Statistics for fiscal years 1965 and 1966, and estimates for fiscal year 1967, indicate…
Action Research of Computer-Assisted-Remediation of Basic Research Concepts.
ERIC Educational Resources Information Center
Packard, Abbot L.; And Others
This study investigated the possibility of creating a computer-assisted remediation program to assist students having difficulties in basic college research and statistics courses. A team approach involving instructors and students drove the research into and creation of the computer program. The effect of student use was reviewed by looking at…
Introduction to Probability, Part 1 - Basic Concepts. Student Text. Revised Edition.
ERIC Educational Resources Information Center
Blakeslee, David W.; And Others
This book is designed to introduce the reader to some fundamental ideas about probability. The mathematical theory of probability plays an increasingly important role in science, government, industry, business, and economics. An understanding of the basic concepts of probability is essential for the study of statistical methods that are widely…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-20
... DEPARTMENT OF HEALTH AND HUMAN SERVICES Food and Drug Administration [Docket No. FDA-2012-D-0419... who conduct studies using active controls and have a basic understanding of statistical principles... clinical investigators who conduct studies using active controls and have a basic understanding of...
Force per cross-sectional area from molecules to muscles: a general property of biological motors
Meyer-Vernet, Nicole
2016-01-01
We propose to formally extend the notion of specific tension, i.e. force per cross-sectional area—classically used for muscles, to quantify forces in molecular motors exerting various biological functions. In doing so, we review and compare the maximum tensions exerted by about 265 biological motors operated by about 150 species of different taxonomic groups. The motors considered range from single molecules and motile appendages of microorganisms to whole muscles of large animals. We show that specific tensions exerted by molecular and non-molecular motors follow similar statistical distributions, with, in particular, similar medians and (logarithmic) means. Over the 10^19 mass (M) range of the cell or body from which the motors are extracted, their specific tensions vary as M^α with α not significantly different from zero. The typical specific tension found in most motors is about 200 kPa, which generalizes to individual molecular motors and microorganisms a classical property of macroscopic muscles. We propose a basic order-of-magnitude interpretation of this result. PMID:27493785
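The quantity itself is elementary; a sketch with illustrative (not measured) numbers shows how a motor's force and cross-section combine to land near the ~200 kPa scale reported above:

```python
import math

def specific_tension(force_newtons, area_m2):
    """Force per cross-sectional area, in pascals."""
    return force_newtons / area_m2

# Illustrative numbers only: a fiber of 50 um diameter producing
# 0.4 mN of force sits near the ~200 kPa scale typical of motors.
radius = 25e-6                               # m
area = math.pi * radius ** 2                 # ~1.96e-9 m^2
tension = specific_tension(0.4e-3, area)     # ~2e5 Pa
assert 1e5 < tension < 3e5
```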
A dissipative random velocity field for fully developed fluid turbulence
NASA Astrophysics Data System (ADS)
Chevillard, Laurent; Pereira, Rodrigo; Garban, Christophe
2016-11-01
We investigate the statistical properties, based on numerical simulations and analytical calculations, of a recently proposed stochastic model for the velocity field of an incompressible, homogeneous, isotropic and fully developed turbulent flow. A key step in the construction of this model is the introduction of some aspects of the vorticity stretching mechanism that governs the dynamics of fluid particles along their trajectories. A further phenomenological step aimed at including the long-range-correlated nature of turbulence makes this model depend on a single free parameter that can be estimated from experimental measurements. We confirm the realism of the model regarding the geometry of the velocity gradient tensor, the power-law behaviour of the moments of velocity increments, including the intermittent corrections, and the existence of energy transfers across scales. We quantify the dependence of these basic properties of turbulent flows on the free parameter and derive analytically the spectrum of exponents of the structure functions in a simplified non-dissipative case. A perturbative expansion shows that energy transfers indeed take place, justifying the dissipative nature of this random field.
Evolution-Based Functional Decomposition of Proteins
Rivoire, Olivier; Reynolds, Kimberly A.; Ranganathan, Rama
2016-01-01
The essential biological properties of proteins—folding, biochemical activities, and the capacity to adapt—arise from the global pattern of interactions between amino acid residues. The statistical coupling analysis (SCA) is an approach to defining this pattern that involves the study of amino acid coevolution in an ensemble of sequences comprising a protein family. This approach indicates a functional architecture within proteins in which the basic units are coupled networks of amino acids termed sectors. This evolution-based decomposition has potential for new understandings of the structural basis for protein function. To facilitate its usage, we present here the principles and practice of the SCA and introduce new methods for sector analysis in a python-based software package (pySCA). We show that the pattern of amino acid interactions within sectors is linked to the divergence of functional lineages in a multiple sequence alignment—a model for how sector properties might be differentially tuned in members of a protein family. This work provides new tools for studying proteins and for generally testing the concept of sectors as the principal units of function and adaptive variation. PMID:27254668
Olea, Ricardo A.; Luppens, James A.
2015-01-01
Coal is a chemically complex commodity that often contains most of the natural elements in the periodic table. Coal constituents are conventionally grouped into four components (proximate analysis): fixed carbon, ash, inherent moisture, and volatile matter. These four parts, customarily measured as weight losses and expressed as percentages, share all properties and statistical challenges of compositional data. Consequently, adequate modeling should be done in terms of a logratio transformation, a requirement that is commonly overlooked by modelers. The transformation of choice is the isometric logratio transformation because of its geometrical and statistical advantages. The modeling is done through a series of realizations prepared by applying sequential simulation for the purpose of displaying the parts in maps incorporating uncertainty. The approach makes realistic assumptions and the results honor the data and basic considerations, such as percentages between 0 and 100, all four parts adding to 100% at any location in the study area, and a style of spatial fluctuation in the realizations equal to that of the data. The realizations are used to prepare different results, including probability distributions across a deposit, E-type maps displaying average properties, and probability maps summarizing joint fluctuations of several parts. Application of these maps to a lignite bed clearly delineates the deposit boundary, reveals a channel cutting across, and shows that the most favorable coal quality is to the north and deteriorates toward the southeast.
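The isometric logratio transformation mentioned above can be sketched as follows, using one standard sequential-binary-partition basis and invented proximate-analysis percentages:

```python
import math

def ilr(parts):
    """Isometric logratio transform of a positive composition
    (one standard orthonormal-basis form)."""
    total = sum(parts)
    x = [p / total for p in parts]           # closure: normalize to proportions
    out = []
    for i in range(1, len(x)):
        gmean = math.exp(sum(math.log(v) for v in x[:i]) / i)
        out.append(math.sqrt(i / (i + 1)) * math.log(gmean / x[i]))
    return out

# A hypothetical proximate analysis in percent:
# fixed carbon, ash, inherent moisture, volatile matter.
coal = [45.0, 10.0, 15.0, 30.0]
z = ilr(coal)
assert len(z) == 3          # D parts -> D-1 unconstrained coordinates
# Scale invariance: percentages and fractions give the same coordinates.
zf = ilr([c / 100 for c in coal])
assert all(abs(a - b) < 1e-12 for a, b in zip(z, zf))
```

Modeling in the ilr coordinates and back-transforming is what guarantees percentages between 0 and 100 that add to 100% at every location.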
NASA Astrophysics Data System (ADS)
Stoppe, N.; Horn, R.
2017-01-01
A basic understanding of soil behavior on the meso- and macroscale (i.e. soil aggregates and bulk soil) requires knowledge of the processes at the microscale (i.e. the particle scale); rheological investigations of natural soils are therefore receiving growing attention. In the present research, homogenized and sieved (< 2 mm) samples from marshland soils of the riparian zone of the River Elbe (North Germany) were analyzed with a modular compact rheometer MCR 300 (Anton Paar, Ostfildern, Germany) with a profiled parallel-plate measuring system. Amplitude sweep tests (AST) with controlled shear deformation were conducted to investigate the viscoelastic properties of the studied soils under oscillatory stress. The gradual depletion of microstructural stiffness during AST can be characterized not only by the well-known rheological parameters G′, G″ and tan δ but also by the dimensionless area parameter integral z, which quantifies the elasticity of the microstructure. To identify the physicochemical parameters that influence microstructural stiffness, statistical tests were used, taking the combined effects of these parameters into account. Although the influence of the individual factors varies depending on soil texture, the physicochemical features significantly affecting soil microstructure were identified. Based on the determined statistical relationships between rheological and physicochemical parameters, pedotransfer functions (PTF) have been developed which allow a mathematical estimation of the rheological target value integral z. Stabilizing factors are soil organic matter, the concentration of Ca2+, and the content of CaCO3 and pedogenic iron oxides, whereas the concentration of Na+ and the water content are structurally unfavorable factors.
Combining statistical inference and decisions in ecology.
Williams, Perry J; Hooten, Mevin B
2016-09-01
Statistical decision theory (SDT) is a sub-field of decision theory that formally incorporates statistical investigation into a decision-theoretic framework to account for uncertainties in a decision problem. SDT provides a unifying analysis of three types of information: statistical results from a data set, knowledge of the consequences of potential choices (i.e., loss), and prior beliefs about a system. SDT links the theoretical development of a large body of statistical methods, including point estimation, hypothesis testing, and confidence interval estimation. The theory and application of SDT have mainly been developed and published in the fields of mathematics, statistics, operations research, and other decision sciences, but have had limited exposure in ecology. Thus, we provide an introduction to SDT for ecologists and describe its utility for linking the conventionally separate tasks of statistical investigation and decision making in a single framework. We describe the basic framework of both Bayesian and frequentist SDT, its traditional use in statistics, and discuss its application to decision problems that occur in ecology. We demonstrate SDT with two types of decisions: Bayesian point estimation and an applied management problem of selecting a prescribed fire rotation for managing a grassland bird species. Central to SDT, and decision theory in general, are loss functions. Thus, we also provide basic guidance and references for constructing loss functions for an SDT problem. © 2016 by the Ecological Society of America.
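The central role of the loss function can be sketched with a simulated posterior sample: the Bayes point estimate is the posterior mean under squared-error loss and the posterior median under absolute-error loss (a generic illustration, not the grassland-bird application):

```python
import random

random.seed(2)
# A skewed "posterior" sample (Gaussian plus an occasional shift).
posterior = sorted(random.gauss(3.0, 1.0) + (1.5 if random.random() < 0.3 else 0.0)
                   for _ in range(5001))

def expected_loss(estimate, loss):
    """Monte Carlo estimate of posterior expected loss."""
    return sum(loss(estimate, theta) for theta in posterior) / len(posterior)

mean = sum(posterior) / len(posterior)
median = posterior[len(posterior) // 2]

sq = lambda a, t: (a - t) ** 2      # squared-error loss
ab = lambda a, t: abs(a - t)        # absolute-error loss

# Each estimator is optimal for its own loss.
assert expected_loss(mean, sq) <= expected_loss(median, sq)
assert expected_loss(median, ab) <= expected_loss(mean, ab)
```

Choosing the loss function is thus the step that links the statistical investigation to the decision actually being made.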
Li, Qiuping; Lin, Yi; Hu, Caiping; Xu, Yinghua; Zhou, Huiya; Yang, Liping; Xu, Yongyong
2016-12-01
The Hospital Anxiety and Depression Scale (HADS) is one of the most frequently used self-report measures in cancer practice. The evidence for construct validity of the HADS, however, remains inconclusive. The objective of this study is to evaluate the psychometric properties of the Chinese version of the HADS (C-HADS) in terms of construct validity, internal consistency reliability, and concurrent validity in dyads of Chinese cancer patients and their family caregivers. This was a cross-sectional study, conducted in multiple centers: one hospital in each of the seven different administrative regions in China from October 2014 to May 2015. A total of 641 dyads, consisting of cancer patients and family caregivers, completed a survey assessing their demographic and background information, anxiety and depression using the C-HADS, and quality of life (QOL) using the Chinese version of the SF-12. Data analysis methods included descriptive statistics, confirmatory factor analysis (CFA), and Pearson correlations. The two-factor and one-factor models offered the best and an adequate fit to the data in cancer patients and family caregivers, respectively. The comparison of the two-factor and single-factor models supports the assumed two-factor construct of the C-HADS. The overall scale and the two subscales of the C-HADS in both cancer patients and family caregivers had good internal consistency and acceptable concurrent validity. The Chinese version of the HADS may be a reliable and valid screening tool, as indicated by its original two-factor structure. This finding supports the assumed two-factor construct of the HADS. Copyright © 2016 Elsevier Ltd. All rights reserved.
Peers versus professional training of basic life support in Syria: a randomized controlled trial.
Abbas, Fatima; Sawaf, Bisher; Hanafi, Ibrahem; Hajeer, Mohammad Younis; Zakaria, Mhd Ismael; Abbas, Wafaa; Alabdeh, Fadi; Ibrahim, Nazir
2018-06-18
Peer training has been identified as a useful tool for delivering undergraduate training in basic life support (BLS), which is fundamental as an initial response in cases of emergency. This study aimed to (1) evaluate the efficacy of the peer-led model in basic life support training among medical students in their first three years of study, compared to professional-led training, and (2) assess the efficacy of the course program and students' satisfaction with peer-led training. A randomized controlled trial with blinded assessors was conducted on 72 medical students from the pre-clinical years (1st to 3rd years in Syria) at Syrian Private University. Students were randomly assigned to a peer-led or to a professional-led training group for a one-day course of basic life support skills. Sixty-four students who underwent checklist-based assessment using an objective structured clinical examination (OSCE) design (practical assessment of BLS skills) and answered a BLS knowledge checkpoint questionnaire were included in the analysis. There was no statistically significant difference between the two groups in delivering BLS skills to medical students in practical (P = 0.850) and BLS knowledge questionnaire outcomes (P = 0.900). Both groups showed statistically significant improvement from pre- to post-course assessment in both practical skills and theoretical knowledge (P < 0.001). Students were satisfied with the peer model of training. Peer-led training of basic life support for medical students was beneficial, providing education that was as effective as training conducted by professionals. This method is applicable and desirable especially in resource-poor countries and in crisis situations.
An adaptive approach to the dynamic allocation of buffer storage. M.S. Thesis
NASA Technical Reports Server (NTRS)
Crooke, S. C.
1970-01-01
Several strategies for the dynamic allocation of buffer storage are simulated and compared. The basic algorithms investigated, using actual statistics observed in the Univac 1108 EXEC 8 System, include the buddy method and the first-fit method. Modifications are made to the basic methods in an effort to improve and to measure allocation performance. A simulation model of an adaptive strategy is developed which permits interchanging the two different methods, the buddy and the first-fit methods with some modifications. Using an adaptive strategy, each method may be employed in the statistical environment in which its performance is superior to the other method.
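A first-fit allocator of the kind compared above can be sketched over a simple free list of (address, size) holes; the data here are invented and the buddy-method bookkeeping is omitted:

```python
def first_fit(free_blocks, request):
    """First-fit allocation: scan the free list in address order and
    carve the request out of the first block that is large enough.
    Returns (start_address, updated_free_list) or (None, free_blocks)."""
    for i, (start, size) in enumerate(free_blocks):
        if size >= request:
            updated = list(free_blocks)
            if size == request:
                del updated[i]                          # exact fit: drop the hole
            else:
                updated[i] = (start + request, size - request)
            return start, updated
    return None, free_blocks

free = [(0, 8), (16, 4), (32, 32)]       # (address, size) holes
addr, free = first_fit(free, 10)         # skips the 8- and 4-word holes
assert addr == 32 and (42, 22) in free
```

An adaptive strategy of the kind simulated in the thesis would switch between this policy and a buddy allocator depending on which performs better under the observed request statistics.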
Kim, Jeong Yun; Hwang, Tae Gyu; Woo, Sung Wun; Lee, Jae Moon; Namgoong, Jin Woong; Yuk, Sim Bum; Chung, Sei-Won; Kim, Jae Pil
2017-04-06
A simple and easy solubility enhancement of basic dyes was performed with bulky and symmetric weakly coordinating anions (WCAs). The WCAs decreased the ionic character of the dyes by broadening the partial charge distribution and causing a screening effect on the ionic bonding. This new modification with WCAs has the advantage of having no influence on the optical properties of the dyes. The solubilities of unmodified and modified dyes were tested in several organic solvents. X-ray powder diffraction patterns of the dyes were measured. Color films were prepared with the dyes and their color loci were analyzed to evaluate the optical properties. After modification with WCAs, commercial basic dyes showed solubility sufficient for various applications while preserving their superior optical properties.
Ultrasound Dopplerography of abdomen pathology using statistical computer programs
NASA Astrophysics Data System (ADS)
Dmitrieva, Irina V.; Arakelian, Sergei M.; Wapota, Alberto R. W.
1998-04-01
Modern ultrasound Dopplerography offers great possibilities for investigating hemodynamic changes at all stages of abdominal pathology. Much research has been devoted to the use of noninvasive methods in practical medicine, and ultrasound Dopplerography is now one of the basic ones. We investigated 250 patients aged 30 to 77 years, including 149 men and 101 women. The primary diagnosis of all patients was ischaemic pancreatitis. Secondary diagnoses included ischaemic heart disease, hypertension, atherosclerosis, diabetes, and vascular disease of the extremities. We examined the abdominal aorta and its branches: the arteria mesenterica superior (AMS), truncus coeliacus (TC), arteria hepatica communis (AHC), and arteria lienalis (AL). For the investigation we used the following equipment: ACUSON 128 XP/10c, BIOMEDIC, GENERAL ELECTRIC (USA, Japan). We analyzed the following components of hemodynamic change in the abdominal vessels: pulsation index, resistance index, systole-diastole ratio, and blood-flow velocity. The statistical software comprised 'basic statistics' and 'analytic' programs. In conclusion, we determined that all hemodynamic components of the abdominal vessels showed considerably greater changes in abdominal ischaemia than in the normal situation. Using computer programs to determine the degree of hemodynamic change, we can recommend an individual plan of diagnosis and treatment.
Resilience Among Students at the Basic Enlisted Submarine School
2016-12-01
reported resilience. The Hayes' macro in the Statistical Package for the Social Sciences (SPSS) was used to uncover factors relevant to mediation analysis. Findings suggest that the encouragement of…
A Computational Approach to Investigate Properties of Estimators
ERIC Educational Resources Information Center
Caudle, Kyle A.; Ruth, David M.
2013-01-01
Teaching undergraduates the basic properties of an estimator can be difficult. Most definitions are easy enough to comprehend, but difficulties often lie in gaining a "good feel" for these properties and why one property might be more desired as compared to another property. Simulations which involve visualization of these properties can…
Electrical properties of epoxies used in hybrid microelectronics
NASA Technical Reports Server (NTRS)
Stout, C. W.
1976-01-01
The electrical properties and basic characteristics of the structure of conductive epoxies were studied. The results of the experimental work performed to measure the electrical properties of epoxies are presented.
A Simple Statistical Thermodynamics Experiment
ERIC Educational Resources Information Center
LoPresto, Michael C.
2010-01-01
Comparing the predicted and actual rolls of combinations of both two and three dice can help to introduce many of the basic concepts of statistical thermodynamics, including multiplicity, probability, microstates, and macrostates, and demonstrate that entropy is indeed a measure of randomness, that disordered states (those of higher entropy) are…
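The two- and three-dice multiplicities behind this exercise can be enumerated directly (macrostate = the sum, microstate = an ordered roll):

```python
from collections import Counter
from itertools import product

def multiplicities(num_dice, sides=6):
    """Number of microstates (ordered rolls) for each macrostate (sum)."""
    return Counter(sum(roll) for roll in product(range(1, sides + 1),
                                                 repeat=num_dice))

two = multiplicities(2)
three = multiplicities(3)
assert two[7] == 6                           # 7: the most probable two-dice sum
assert three[10] == three[11] == 27          # 10 and 11 tie for three dice
```

Macrostates with more microstates are the "disordered" high-entropy ones, which is why the middle sums dominate the rolls.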
76 FR 41756 - Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-15
... materials and supplies used in production. The economic census will produce basic statistics by kind of business on number of establishments, sales, payroll, employment, inventories, and operating expenses. It also will yield a variety of subject statistics, including sales by product line; sales by class of...
ERIC Educational Resources Information Center
Swinson, John V.
2000-01-01
Intellectual property is a term that covers a number of different rights. Considers such issues as the basic forms of intellectual property, who owns the intellectual property created by a teacher, who owns intellectual property created by students, and the use of materials downloaded from the Internet. (Author/LM)
Vetter, Thomas R
2017-11-01
Descriptive statistics are specific methods basically used to calculate, describe, and summarize collected research data in a logical, meaningful, and efficient way. Descriptive statistics are reported numerically in the manuscript text and/or in its tables, or graphically in its figures. This basic statistical tutorial discusses a series of fundamental concepts about descriptive statistics and their reporting. The mean, median, and mode are 3 measures of the center or central tendency of a set of data. In addition to a measure of its central tendency (mean, median, or mode), another important characteristic of a research data set is its variability or dispersion (ie, spread). In simplest terms, variability is how much the individual recorded scores or observed values differ from one another. The range, standard deviation, and interquartile range are 3 measures of variability or dispersion. The standard deviation is typically reported for a mean, and the interquartile range for a median. Testing for statistical significance, along with calculating the observed treatment effect (or the strength of the association between an exposure and an outcome), and generating a corresponding confidence interval are 3 tools commonly used by researchers (and their collaborating biostatistician or epidemiologist) to validly make inferences and more generalized conclusions from their collected data and descriptive statistics. A number of journals, including Anesthesia & Analgesia, strongly encourage or require the reporting of pertinent confidence intervals. A confidence interval can be calculated for virtually any variable or outcome measure in an experimental, quasi-experimental, or observational research study design. Generally speaking, in a clinical trial, the confidence interval is the range of values within which the true treatment effect in the population likely resides. 
In an observational study, the confidence interval is the range of values within which the true strength of the association between the exposure and the outcome (eg, the risk ratio or odds ratio) in the population likely resides. There are many possible ways to graphically display or illustrate different types of data. While there is often latitude as to the choice of format, ultimately, the simplest and most comprehensible format is preferred. Common examples include a histogram, bar chart, line chart or line graph, pie chart, scatterplot, and box-and-whisker plot. Valid and reliable descriptive statistics can answer basic yet important questions about a research data set, namely: "Who, What, Why, When, Where, How, How Much?"
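A normal-approximation confidence interval for a mean, of the kind the tutorial describes, can be sketched as follows (invented data; a t-quantile would be more appropriate at this small sample size):

```python
import math
import statistics

def mean_ci(data, z=1.96):
    """Sample mean with an approximate 95% confidence interval
    (normal-based, using the standard error of the mean)."""
    m = statistics.mean(data)
    se = statistics.stdev(data) / math.sqrt(len(data))
    return m, (m - z * se, m + z * se)

data = [4.1, 5.0, 3.8, 4.6, 5.2, 4.4, 4.9, 4.3, 4.7, 5.1]
m, (lo, hi) = mean_ci(data)
assert lo < m < hi
# For roughly symmetric data the mean and median nearly coincide.
assert abs(m - statistics.median(data)) < 1.0
```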
ON THE DYNAMICAL DERIVATION OF EQUILIBRIUM STATISTICAL MECHANICS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Prigogine, I.; Balescu, R.; Henin, F.
1960-12-01
Work on nonequilibrium statistical mechanics, which allows an extension of the kinetic proof to all results of equilibrium statistical mechanics involving a finite number of degrees of freedom, is summarized. As an introduction to the general N-body problem, the scattering theory in classical mechanics is considered. The general N-body problem is considered for the case of classical mechanics, quantum mechanics with Boltzmann statistics, and quantum mechanics including quantum statistics. Six basic diagrams, which describe the elementary processes of the dynamics of correlations, were obtained. (M.C.G.)
`New insight into statistical hydrology' preface to the special issue
NASA Astrophysics Data System (ADS)
Kochanek, Krzysztof
2018-04-01
Statistical methods are still the basic tool for investigating random, extreme events occurring in the hydrosphere. On 21-22 September 2017, the international Statistical Hydrology (StaHy) 2017 workshop took place in Warsaw (Poland) under the auspices of the International Association of Hydrological Sciences. The authors of the presentations proposed to publish their research results in the Special Issue of Acta Geophysica, `New Insight into Statistical Hydrology'. Five papers were selected for publication, touching on the most crucial issues of statistical methodology in hydrology.
41 CFR 301-73.106 - What are the basic services that should be covered by a TMS?
Code of Federal Regulations, 2010 CFR
2010-07-01
..., confirmation of reservations, etc.). (b) Provide basic management information, such as— (1) Number of... 41 Public Contracts and Property Management 4 2010-07-01 2010-07-01 false What are the basic... Management Federal Travel Regulation System TEMPORARY DUTY (TDY) TRAVEL ALLOWANCES AGENCY RESPONSIBILITIES 73...
41 CFR 102-73.10 - What is the basic real estate acquisition policy?
Code of Federal Regulations, 2013 CFR
2013-07-01
... ESTATE ACQUISITION General Provisions § 102-73.10 What is the basic real estate acquisition policy? When... real estate and related services in an efficient and cost effective manner. ... 41 Public Contracts and Property Management 3 2013-07-01 2013-07-01 false What is the basic real...
41 CFR 102-73.10 - What is the basic real estate acquisition policy?
Code of Federal Regulations, 2012 CFR
2012-01-01
... ESTATE ACQUISITION General Provisions § 102-73.10 What is the basic real estate acquisition policy? When... real estate and related services in an efficient and cost effective manner. ... 41 Public Contracts and Property Management 3 2012-01-01 2012-01-01 false What is the basic real...
41 CFR 102-73.10 - What is the basic real estate acquisition policy?
Code of Federal Regulations, 2014 CFR
2014-01-01
... ESTATE ACQUISITION General Provisions § 102-73.10 What is the basic real estate acquisition policy? When... real estate and related services in an efficient and cost effective manner. ... 41 Public Contracts and Property Management 3 2014-01-01 2014-01-01 false What is the basic real...
41 CFR 102-73.10 - What is the basic real estate acquisition policy?
Code of Federal Regulations, 2010 CFR
2010-07-01
... ESTATE ACQUISITION General Provisions § 102-73.10 What is the basic real estate acquisition policy? When... real estate and related services in an efficient and cost effective manner. ... 41 Public Contracts and Property Management 3 2010-07-01 2010-07-01 false What is the basic real...
41 CFR 102-73.10 - What is the basic real estate acquisition policy?
Code of Federal Regulations, 2011 CFR
2011-01-01
... ESTATE ACQUISITION General Provisions § 102-73.10 What is the basic real estate acquisition policy? When... real estate and related services in an efficient and cost effective manner. ... 41 Public Contracts and Property Management 3 2011-01-01 2011-01-01 false What is the basic real...
An Independent Evaluation of the Technical Features of the Basic Reading Inventory
ERIC Educational Resources Information Center
Bieber, Gregg; Hulac, David M.; Schweinle, William
2015-01-01
The present study investigated some psychometric properties of the Basic Reading Inventory (BRI), a widely used informal reading inventory. The BRI and Dynamic Indicators of Basic Early Literacy Skills (DIBELS) probes were administered to 149 third, fourth, and fifth graders. Test--retest and alternate forms reliability analyses indicated adequate…
Image Statistics and the Representation of Material Properties in the Visual Cortex
Baumgartner, Elisabeth; Gegenfurtner, Karl R.
2016-01-01
We explored perceived material properties (roughness, texturedness, and hardness) with a novel approach that compares perception, image statistics and brain activation, as measured with fMRI. We initially asked participants to rate 84 material images with respect to the above mentioned properties, and then scanned 15 of the participants with fMRI while they viewed the material images. The images were analyzed with a set of image statistics capturing their spatial frequency and texture properties. Linear classifiers were then applied to the image statistics as well as the voxel patterns of visually responsive voxels and early visual areas to discriminate between images with high and low perceptual ratings. Roughness and texturedness could be classified above chance level based on image statistics. Roughness and texturedness could also be classified based on the brain activation patterns in visual cortex, whereas hardness could not. Importantly, the agreement in classification based on image statistics and brain activation was also above chance level. Our results show that information about visual material properties is to a large degree contained in low-level image statistics, and that these image statistics are also partially reflected in brain activity patterns induced by the perception of material images. PMID:27582714
Atmospheric Effects on InSAR Measurements and Their Mitigation
Ding, Xiao-li; Li, Zhi-wei; Zhu, Jian-jun; Feng, Guang-cai; Long, Jiang-ping
2008-01-01
Interferometric Synthetic Aperture Radar (InSAR) is a powerful technology for observing the Earth surface, especially for mapping the Earth's topography and deformations. InSAR measurements are however often significantly affected by the atmosphere as the radar signals propagate through the atmosphere whose state varies both in space and in time. Great efforts have been made in recent years to better understand the properties of the atmospheric effects and to develop methods for mitigating the effects. This paper provides a systematic review of the work carried out in this area. The basic principles of atmospheric effects on repeat-pass InSAR are first introduced. The studies on the properties of the atmospheric effects, including the magnitudes of the effects determined in the various parts of the world, the spectra of the atmospheric effects, the isotropic properties and the statistical distributions of the effects, are then discussed. The various methods developed for mitigating the atmospheric effects are then reviewed, including the methods that are based on PSInSAR processing, the methods that are based on interferogram modeling, and those that are based on external data such as GPS observations, ground meteorological data, and satellite data including those from the MODIS and MERIS. Two examples that use MODIS and MERIS data respectively to calibrate atmospheric effects on InSAR are also given. PMID:27873822
A mini-review on econophysics: Comparative study of Chinese and western financial markets
NASA Astrophysics Data System (ADS)
Zheng, Bo; Jiang, Xiong-Fei; Ni, Peng-Yun
2014-07-01
We present a review of our recent research in econophysics, and focus on the comparative study of Chinese and western financial markets. By virtue of concepts and methods in statistical physics, we investigate the time correlations and spatial structure of financial markets based on empirical high-frequency data. We discover that the Chinese stock market shares common basic properties with the western stock markets, such as the fat-tail probability distribution of price returns, the long-range auto-correlation of volatilities, and the persistence probability of volatilities, while it exhibits very different higher-order time correlations of price returns and volatilities, spatial correlations of individual stock prices, and large-fluctuation dynamic behaviors. Furthermore, multi-agent-based models are developed to simulate the microscopic interaction and dynamic evolution of the stock markets.
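The long-range auto-correlation of volatilities mentioned in this abstract is commonly measured as the sample autocorrelation of absolute returns; a minimal sketch on synthetic data (Gaussian, so it lacks the fat tails and persistent volatility of real markets; all names and values are illustrative):

```python
import random

random.seed(0)
# Toy price-return series; empirical returns would show fat tails and
# long-range volatility correlations that this Gaussian toy lacks.
returns = [random.gauss(0.0, 1.0) for _ in range(1000)]
vol = [abs(r) for r in returns]  # absolute returns as a volatility proxy

def autocorr(xs, lag):
    """Sample autocorrelation of a series at the given lag."""
    m = sum(xs) / len(xs)
    d = [x - m for x in xs]
    denom = sum(x * x for x in d)
    num = sum(d[i] * d[i + lag] for i in range(len(d) - lag))
    return num / denom

lag1 = autocorr(vol, 1)  # near zero for i.i.d. data
```

For empirical volatility series, this autocorrelation typically remains positive over many lags, which is the long-range persistence the abstract refers to.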
The pdf approach to turbulent flow
NASA Technical Reports Server (NTRS)
Kollmann, W.
1990-01-01
This paper provides a detailed discussion of the theory and application of probability density function (pdf) methods, which provide a complete statistical description of turbulent flow fields at a single point or a finite number of points. The basic laws governing the flow of Newtonian fluids are set up in the Eulerian and the Lagrangian frame, and the exact and linear equations for the characteristic functionals in those frames are discussed. Pdf equations in both frames are derived as Fourier transforms of the equations of the characteristic functions. Possible formulations for the nonclosed terms in the pdf equation are discussed, their properties are assessed, and closure modes for the molecular-transport and the fluctuating pressure-gradient terms are reviewed. The application of pdf methods to turbulent combustion flows, supersonic flows, and the interaction of turbulence with shock waves is discussed.
Přibil, Jiří; Přibilová, Anna; Frollo, Ivan
2018-04-05
This article compares open-air and whole-body magnetic resonance imaging (MRI) equipment working with a weak magnetic field as regards the methods of its generation, the spectral properties of mechanical vibration and acoustic noise produced by gradient coils during the scanning process, and the measured noise intensity. These devices are used for non-invasive MRI reconstruction of the human vocal tract during phonation with simultaneous speech recording. In this case, the vibration and noise have a negative influence on the quality of the speech signal. Two basic measurement experiments were performed in this work: mapping sound pressure levels in the MRI device vicinity and picking up vibration and noise signals in the MRI scanning area. The spectral characteristics of these signals are then analyzed statistically and compared visually and numerically.
Introductory physics going soft
NASA Astrophysics Data System (ADS)
Langbeheim, Elon; Livne, Shelly; Safran, Samuel A.; Yerushalmi, Edit
2012-01-01
We describe an elective course on soft matter at the level of introductory physics. Soft matter physics serves as a context that motivates the presentation of basic ideas in statistical thermodynamics and their applications. It also is an example of a contemporary field that is interdisciplinary and touches on chemistry, biology, and physics. We outline a curriculum that uses the lattice gas model as a quantitative and visual tool, initially to introduce entropy, and later to facilitate the calculation of interactions. We demonstrate how free energy minimization can be used to teach students to understand the properties of soft matter systems such as the phases of fluid mixtures, wetting of interfaces, self-assembly of surfactants, and polymers. We discuss several suggested activities in the form of inquiry projects which allow students to apply the concepts they have learned to experimental systems.
The Social Profile of Students in Basic General Education in Ecuador: A Data Analysis
ERIC Educational Resources Information Center
Buri, Olga Elizabeth Minchala; Stefos, Efstathios
2017-01-01
The objective of this study is to examine the social profile of students who are enrolled in Basic General Education in Ecuador. Both descriptive and multidimensional statistical analyses were carried out based on the data provided by the National Survey of Employment, Unemployment and Underemployment in 2015. The descriptive analysis shows the…
Improving Attendance and Punctuality of FE Basic Skill Students through an Innovative Scheme
ERIC Educational Resources Information Center
Ade-Ojo, Gordon O.
2005-01-01
This paper reports the findings of a study set up to establish the impact of a particular scheme on the attendance and punctuality performance of a group of Basic Skills learners against the backdrop of various theoretical postulations on managing undesirable behavior. Data collected on learners' performance was subjected to statistical analysis…
ERIC Educational Resources Information Center
Applied Management Sciences, Inc., Silver Spring, MD.
The amount of misreporting of Veterans Administration (VA) benefits was assessed, along with the impact of misreporting on the Basic Educational Opportunity Grant (BEOG) program. Accurate financial information is needed to determine appropriate awards. The analysis revealed: over 97% of VA beneficiaries misreported benefits; the total net loss to…
ERIC Educational Resources Information Center
Yingxiu, Yang
2006-01-01
Using statistical data on the implementation of China's educational expenditure published by the state, this paper studies the Gini coefficient of the budgeted public educational expenditure per student in order to examine the degree of concentration of educational expenditure in China's basic education and analyze its balanced…
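The Gini coefficient used in this kind of analysis can be computed from per-student expenditure figures with a short sketch (the data are hypothetical; the closed form uses sorted values with 1-based ranks):

```python
def gini(values):
    """Gini coefficient via the closed form on sorted values:
    G = 2 * sum(i * x_i) / (n * sum(x)) - (n + 1) / n, 1-based ranks i."""
    xs = sorted(values)
    n = len(xs)
    total = sum(xs)
    weighted = sum((i + 1) * x for i, x in enumerate(xs))
    return (2 * weighted) / (n * total) - (n + 1) / n

# Hypothetical per-student expenditure across five regions:
# perfectly equal spending gives G = 0; concentration pushes G toward 1.
equal = gini([100, 100, 100, 100, 100])
unequal = gini([0, 0, 0, 0, 500])
```

A Gini coefficient near 0 would indicate balanced expenditure across regions; values approaching 1 indicate strong concentration.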
Tradeoffs between hydraulic and mechanical stress responses of mature Norway spruce trunk wood.
Rosner, Sabine; Klein, Andrea; Müller, Ulrich; Karlsson, Bo
2008-08-01
We tested the effects of growth characteristics and basic density on hydraulic and mechanical properties of mature Norway spruce (Picea abies (L.) Karst.) wood from six 24-year-old clones, grown on two sites in southern Sweden differing in water availability. Hydraulic parameters assessed were specific hydraulic conductivity at full saturation (ks100) and vulnerability to cavitation (Psi50), mechanical parameters included bending strength (sigma b), modulus of elasticity (MOE), compression strength (sigma a) and Young's modulus (E). Basic density, diameter at breast height, tree height, and hydraulic and mechanical parameters varied considerably among clones. Clonal means of hydraulic and mechanical properties were strongly related to basic density and to growth parameters across sites, especially to diameter at breast height. Compared with stem wood of slower growing clones, stem wood of rapidly growing clones had significantly lower basic density, lower sigma b, MOE, sigma a and E, was more vulnerable to cavitation, but had higher ks100. Basic density was negatively correlated to Psi50 and ks100. We therefore found a tradeoff between Psi50 and ks100. Clones with high basic density had significantly lower hydraulic vulnerability, but also lower hydraulic conductivity at full saturation and thus less rapid growth than clones with low basic density. This tradeoff involved a negative relationship between Psi50 and sigma b as well as MOE, and between ks100 and sigma b, MOE and sigma a. Basic density and Psi50 showed no site-specific differences, but tree height, diameter at breast height, ks100 and mechanical strength and stiffness were significantly lower at the drier site. Basic density had no influence on the site-dependent differences in hydraulic and mechanical properties, but was strongly negatively related to diameter at breast height. 
Selecting for growth may thus lead not only to a reduction in mechanical strength and stiffness but also to a reduction in hydraulic safety.
Ernest J. Gebhart
1980-01-01
Other members of this panel are going to reveal the basic statistics about the coal strip mining industry in Ohio, so I will confine my remarks to the revegetation of the spoil banks. So that it doesn't appear that Ohio confined its tree planting efforts to spoil banks alone, I will rely on a few statistics.
Idaho State University Statistical Portrait, Academic Year 1998-1999.
ERIC Educational Resources Information Center
Idaho State Univ., Pocatello. Office of Institutional Research.
This report provides basic statistical data for Idaho State University, and includes both point-of-time data as well as trend data. The information is divided into sections emphasizing students, programs, faculty and staff, finances, and physical facilities. Student data includes enrollment, geographical distribution, student/faculty ratios,…
Statistical Report. Fiscal Year 1995: September 1, 1994 - August 31, 1995.
ERIC Educational Resources Information Center
Texas Higher Education Coordinating Board, Austin.
This report provides statistical data on Texas public and independent higher education institutions for fiscal year 1995. An introductory section provides basic information on Texas higher education institutions, while nine major sections cover: (1) student enrollment, including 1990-94 headcount data; headcount by classification, ethnic origin,…
Statistical Report. Fiscal Year 1994: September 1, 1993 - August 31, 1994.
ERIC Educational Resources Information Center
Texas Higher Education Coordinating Board, Austin.
This report provides statistical data on Texas public and independent higher education institutions for fiscal year 1994. An introductory section provides basic information on Texas higher education institutions, while nine major sections cover: (1) student enrollment, including 1989-93 headcount data; headcount by classification, ethnic origin,…
29 CFR 1904.42 - Requests from the Bureau of Labor Statistics for data.
Code of Federal Regulations, 2011 CFR
2011-07-01
... ADMINISTRATION, DEPARTMENT OF LABOR RECORDING AND REPORTING OCCUPATIONAL INJURIES AND ILLNESSES Reporting Fatality, Injury and Illness Information to the Government § 1904.42 Requests from the Bureau of Labor Statistics for data. (a) Basic requirement. If you receive a Survey of Occupational Injuries and Illnesses...
29 CFR 1904.42 - Requests from the Bureau of Labor Statistics for data.
Code of Federal Regulations, 2013 CFR
2013-07-01
... ADMINISTRATION, DEPARTMENT OF LABOR RECORDING AND REPORTING OCCUPATIONAL INJURIES AND ILLNESSES Reporting Fatality, Injury and Illness Information to the Government § 1904.42 Requests from the Bureau of Labor Statistics for data. (a) Basic requirement. If you receive a Survey of Occupational Injuries and Illnesses...
29 CFR 1904.42 - Requests from the Bureau of Labor Statistics for data.
Code of Federal Regulations, 2014 CFR
2014-07-01
... ADMINISTRATION, DEPARTMENT OF LABOR RECORDING AND REPORTING OCCUPATIONAL INJURIES AND ILLNESSES Reporting Fatality, Injury and Illness Information to the Government § 1904.42 Requests from the Bureau of Labor Statistics for data. (a) Basic requirement. If you receive a Survey of Occupational Injuries and Illnesses...
29 CFR 1904.42 - Requests from the Bureau of Labor Statistics for data.
Code of Federal Regulations, 2012 CFR
2012-07-01
... ADMINISTRATION, DEPARTMENT OF LABOR RECORDING AND REPORTING OCCUPATIONAL INJURIES AND ILLNESSES Reporting Fatality, Injury and Illness Information to the Government § 1904.42 Requests from the Bureau of Labor Statistics for data. (a) Basic requirement. If you receive a Survey of Occupational Injuries and Illnesses...
Theoretical Frameworks for Math Fact Fluency
ERIC Educational Resources Information Center
Arnold, Katherine
2012-01-01
Recent education statistics indicate persistent low math scores for our nation's students. This drop in math proficiency includes deficits in basic number sense and automaticity of math facts. The decrease has been recorded across all grade levels with the elementary levels showing the greatest loss (National Center for Education Statistics,…
Basic Statistical Concepts and Methods for Earth Scientists
Olea, Ricardo A.
2008-01-01
INTRODUCTION Statistics is the science of collecting, analyzing, interpreting, modeling, and displaying masses of numerical data primarily for the characterization and understanding of incompletely known systems. Over the years, these objectives have led to a fair amount of analytical work to achieve, substantiate, and guide descriptions and inferences.
NASA Astrophysics Data System (ADS)
Saputra, K. V. I.; Cahyadi, L.; Sembiring, U. A.
2018-01-01
In this paper, we assess our traditional elementary statistics education and introduce elementary statistics with simulation-based inference. To assess our statistics class, we adapt the well-known CAOS (Comprehensive Assessment of Outcomes in Statistics) test, which serves as an external measure of students' basic statistical literacy and is generally accepted as a measure of statistical literacy. We also introduce a new teaching method for the elementary statistics class. Unlike the traditional elementary statistics course, we introduce a simulation-based inference method for conducting hypothesis testing. The literature has shown that this new teaching method works very well in increasing students' understanding of statistics.
Sánchez-Rodríguez, Dolores; Marco, Ester; Ronquillo-Moreno, Natalia; Maciel-Bravo, Liev; Gonzales-Carhuancho, Abel; Duran, Xavier; Guillén-Solà, Anna; Vázquez-Ibar, Olga; Escalada, Ferran; Muniesa, Josep M
2018-01-25
The aim of this study was to assess the prevalence of malnutrition by applying the ASPEN/AND definition and the ESPEN consensus definition in a postacute-care population, and secondly, to determine the metrological properties of the set of six clinical characteristics that constitute the ASPEN/AND basic diagnosis, compared to the ESPEN consensus, based mostly on objective anthropometric measurements. Prospective study of 84 consecutive deconditioned older inpatients (85.4 ± 6.2; 59.5% women) admitted for rehabilitation in postacute care. ASPEN/AND diagnosis of malnutrition was considered in presence of at least two of the following: low energy intake, fluid accumulation, diminished handgrip strength, and loss of weight, muscle mass, or subcutaneous fat. Sensitivity, specificity, positive and negative predictive values, accuracy, likelihood ratios, and kappa statistics were calculated for ASPEN/AND criteria and compared with ESPEN consensus. The prevalence of malnutrition by ASPEN/AND criteria was 63.1% and by ESPEN consensus, 20.2%; both diagnoses were associated with significantly longer length of stay, but the ESPEN definition was significantly associated with poorer functional outcomes after the rehabilitation program. Compared to ESPEN consensus, ASPEN/AND diagnosis showed fair validity (sensitivity = 94.1%; specificity = 44.8%); kappa statistic was 2.217. Applying the ASPEN/AND definition obtained a higher prevalence of malnutrition in a postacute-care population than was identified by the ESPEN definition. ASPEN/AND criteria had fair validity and agreement compared with the ESPEN definition. A simple, evidence-based, unified malnutrition definition might improve geriatric care. Copyright © 2018 Elsevier Ltd and European Society for Clinical Nutrition and Metabolism. All rights reserved.
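The validity and agreement measures reported above (sensitivity, specificity, predictive values, Cohen's kappa) all derive from a 2×2 table comparing the index definition against the reference definition; a minimal sketch with hypothetical counts, not the study's data:

```python
def diagnostic_agreement(tp, fp, fn, tn):
    """Sensitivity, specificity, and Cohen's kappa from a 2x2 table
    comparing an index diagnosis against a reference diagnosis."""
    n = tp + fp + fn + tn
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    p_observed = (tp + tn) / n
    # Chance agreement computed from the marginal totals of both raters.
    p_chance = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n ** 2
    kappa = (p_observed - p_chance) / (1 - p_chance)
    return sensitivity, specificity, kappa

# Hypothetical counts for illustration only.
sens, spec, kappa = diagnostic_agreement(tp=40, fp=10, fn=10, tn=40)
```

Note that Cohen's kappa is bounded between -1 and 1, with values around 0.2 conventionally read as "fair" agreement.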
Waxman, S R; Lynch, E B; Casey, K L; Baer, L
1997-11-01
Basic level categories are a rich source of inductive inference for children and adults. These 3 experiments examine how preschool-age children partition their inductively rich basic level categories to form subordinate level categories and whether these have inductive potential. Children were taught a novel property about an individual member of a familiar basic level category (e.g., a collie). Then, children's extensions of that property to other objects from the same subordinate (e.g., other collies), basic (e.g., other dogs), and superordinate (e.g., other animals) level categories were examined. The results suggest (a) that contrastive information promotes the emergence of subordinate categories as a basis of inductive inference and (b) that newly established subordinate categories can retain their inductive potential in subsequent reasoning over a week's time.
Vermeeren, Günter; Joseph, Wout; Martens, Luc
2013-04-01
Assessing the whole-body absorption in a human in a realistic environment requires a statistical approach covering all possible exposure situations. This article describes the development of a statistical multi-path exposure method for heterogeneous realistic human body models. The method is applied for the 6-year-old Virtual Family boy (VFB) exposed to the GSM downlink at 950 MHz. It is shown that the whole-body SAR does not differ significantly over the different environments at an operating frequency of 950 MHz. Furthermore, the whole-body SAR in the VFB for multi-path exposure exceeds the whole-body SAR for worst-case single-incident plane wave exposure by 3.6%. Moreover, the ICNIRP reference levels are not conservative with respect to the basic restrictions in 0.3% of the exposure samples for the VFB at the GSM downlink of 950 MHz. The homogeneous spheroid with the dielectric properties of the head suggested by the IEC underestimates the absorption compared to realistic human body models. Moreover, the variation in the whole-body SAR for realistic human body models is larger than for homogeneous spheroid models. This is mainly due to the heterogeneity of the tissues and the irregular shape of the realistic human body model compared to homogeneous spheroid human body models. Copyright © 2012 Wiley Periodicals, Inc.
[Research on basic questions of intellectual property rights of acupuncture and moxibustion].
Dong, Guo-Feng; Wu, Xiao-Dong; Han, Yan-Jing; Meng, Hong; Wang, Xin
2011-12-01
Along with the modernization and internationalization of acupuncture-moxibustion (acu-moxibustion), the issue of intellectual property rights has become increasingly prominent. In the present paper, the authors explain the basic issues of acu-moxibustion intellectual property in terms of the concept, scope, subject, object, contents, and means of acquisition of such rights. Clarifying these questions will help us inherit and carry forward the existing achievements of acu-moxibustion and continually bring forth new ideas and further improvements in clinical application, so as to serve the people's health in a better way.
Rebuilding Government Legitimacy in Post-conflict Societies: Case Studies of Nepal and Afghanistan
2015-09-09
administered via the verbal scales due to reduced time spent explaining the visual show cards. Statistical results corresponded with observations from...a three-step strategy for dealing with item non-response. First, basic descriptive statistics are calculated to determine the extent of item...descriptive statistics for all items in the survey), however this section of the report highlights just some of the findings. Thus, the results
Biostatistical and medical statistics graduate education
2014-01-01
The development of graduate education in biostatistics and medical statistics is discussed in the context of training within a medical center setting. The need for medical researchers to employ a wide variety of statistical designs in clinical, genetic, basic science and translational settings justifies the ongoing integration of biostatistical training into medical center educational settings and informs its content. The integration of large-data issues is a challenge. PMID:24472088
The Cryogenic Properties of Several Aluminum-Beryllium Alloys and a Beryllium Oxide Material
NASA Technical Reports Server (NTRS)
Gamwell, Wayne R.; McGill, Preston B.
2003-01-01
Performance-related mechanical properties for two aluminum-beryllium (Al-Be) alloys and one beryllium-oxide (BeO) material were developed at cryogenic temperatures. Basic mechanical properties (i.e., ultimate tensile strength, yield strength, percent elongation, and elastic modulus) were obtained for the aluminum-beryllium alloy AlBeMet162 at cryogenic [-195.5 °C (-320 °F) and -252.8 °C (-423 °F)] temperatures. Basic mechanical properties for the BeO material were obtained at cryogenic [-252.8 °C (-423 °F)] temperatures. Fracture properties were obtained for the investment cast alloy Beralcast 363 at cryogenic [-252.8 °C (-423 °F)] temperatures. The AlBeMet162 material was extruded, the BeO material was consolidated by hot isostatic pressing (HIP), and the Beralcast 363 material was investment cast.
Music Tune Restoration Based on a Mother Wavelet Construction
NASA Astrophysics Data System (ADS)
Fadeev, A. S.; Konovalov, V. I.; Butakova, T. I.; Sobetsky, A. V.
2017-01-01
We propose using a mother wavelet function obtained from a local part of the analyzed music signal. Requirements for the constructed function are proposed, and the implementation technique and its properties are described. The suggested approach allows construction of mother wavelet families with specified identifying properties. Consequently, this makes it possible to identify the basic signal variations of complex music signals, including local time-frequency characteristics of the basic one.
Chen, Shi-Yi; Deng, Feilong; Huang, Ying; Li, Cao; Liu, Linhai; Jia, Xianbo; Lai, Song-Jia
2016-01-01
Although various computer tools have been elaborately developed to calculate a series of statistics in molecular population genetics for both small- and large-scale DNA data, there is no efficient and easy-to-use toolkit available yet for exclusively focusing on the steps of mathematical calculation. Here, we present PopSc, a bioinformatic toolkit for calculating 45 basic statistics in molecular population genetics, which could be categorized into three classes, including (i) genetic diversity of DNA sequences, (ii) statistical tests for neutral evolution, and (iii) measures of genetic differentiation among populations. In contrast to the existing computer tools, PopSc was designed to directly accept the intermediate metadata, such as allele frequencies, rather than the raw DNA sequences or genotyping results. PopSc is first implemented as the web-based calculator with user-friendly interface, which greatly facilitates the teaching of population genetics in class and also promotes the convenient and straightforward calculation of statistics in research. Additionally, we also provide the Python library and R package of PopSc, which can be flexibly integrated into other advanced bioinformatic packages of population genetics analysis.
ERIC Educational Resources Information Center
Umar, Yunusa
2014-01-01
A simple and effective hands-on classroom activity designed to illustrate basic polymer concepts is presented. In this activity, students build primary structures of homopolymers and different arrangements of monomers in copolymer using paper clips as monomers. The activity supports formation of a basic understanding of polymer structures,…
41 CFR 102-85.25 - What is the basic principle governing OAs?
Code of Federal Regulations, 2010 CFR
2010-07-01
... principle governing OAs? 102-85.25 Section 102-85.25 Public Contracts and Property Management Federal... POLICY FOR OCCUPANCY IN GSA SPACE Pricing Policy-General § 102-85.25 What is the basic principle governing OAs? The basic principle governing OAs is to adopt the private sector practice of capturing in a...
Contract Award on Initial Proposals
1988-09-30
Contents include: 2. Competition in Contracting Act; 3. Federal Property and Administrative Services Act; B. Basic Rules for Award Without Discussions Before CICA; C. Basic Rules for Award Without Discussions After Passage of CICA; D. Award... controlled by statute. This chapter will explore those statutes and their antecedents. The basic rules for awarding contracts without discussions
Mass Uncertainty and Application For Space Systems
NASA Technical Reports Server (NTRS)
Beech, Geoffrey
2013-01-01
Expected development maturity under contract (spec) should correlate with the Project/Program Approved MGA Depletion Schedule in the Mass Properties Control Plan. If the specification is an NTE, MGA is inclusive of Actual MGA (A5 & A6). If the specification is not an NTE (e.g., nominal), then MGA values are reduced by A5 values and A5 is representative of the remaining uncertainty. Basic Mass = engineering estimate based on design and construction principles with no embedded margin. MGA Mass = Basic Mass * assessed % from the approved MGA schedule. Predicted Mass = Basic + MGA. Aggregate MGA % = (Aggregate Predicted - Aggregate Basic) / Aggregate Basic.
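The mass roll-up arithmetic in the abstract can be sketched in a few lines; the subsystem names, basic masses, and assessed MGA fractions below are invented for illustration and are not from the source.

```python
# Sketch of the MGA roll-up described above. Subsystem names, basic masses
# (kg), and assessed MGA fractions are invented for illustration.
def predicted_mass(basic_mass, mga_fraction):
    """Predicted Mass = Basic Mass + MGA Mass, where MGA Mass = Basic * assessed %."""
    return basic_mass + basic_mass * mga_fraction

subsystems = {
    "structure": (120.0, 0.10),   # (Basic Mass, assessed MGA fraction)
    "avionics": (35.0, 0.25),
}
agg_basic = sum(basic for basic, _ in subsystems.values())
agg_predicted = sum(predicted_mass(b, f) for b, f in subsystems.values())
# Aggregate MGA % = (Aggregate Predicted - Aggregate Basic) / Aggregate Basic
agg_mga_fraction = (agg_predicted - agg_basic) / agg_basic
print(round(agg_mga_fraction, 4))
```

Note that the aggregate MGA fraction is a mass-weighted blend of the per-subsystem fractions, not their simple average.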
Views of medical students: what, when and how do they want statistics taught?
Fielding, S; Poobalan, A; Prescott, G J; Marais, D; Aucott, L
2015-11-01
A key skill for a practising clinician is being able to do research, understand the statistical analyses and interpret results in the medical literature. Basic statistics has become essential within medical education, but when, what and in which format to teach it is uncertain. To inform curriculum design/development we undertook a quantitative survey of fifth year medical students and followed them up with a series of focus groups to obtain their opinions as to what statistics teaching they want, when and how. A total of 145 students undertook the survey and five focus groups were held with between 3 and 9 participants each. Previous statistical training varied; students recognised that their knowledge was inadequate and were keen to see additional training implemented. Students were aware of the importance of statistics to their future careers, but apprehensive about learning it. Face-to-face teaching supported by online resources was popular. Focus groups indicated the need for statistical training early in the degree and highlighted students' lack of confidence and inconsistencies in support. The study found that the students see the importance of statistics training in the medical curriculum but that timing and mode of delivery are key. The findings have informed the design of a new course to be implemented in the third undergraduate year. Teaching will be based around published studies, aiming to equip students with the basics required, with additional resources available through a virtual learning environment. © The Author(s) 2015.
Design of order statistics filters using feedforward neural networks
NASA Astrophysics Data System (ADS)
Maslennikova, Yu. S.; Bochkarev, V. V.
2016-08-01
In recent years, significant progress has been made in the development of nonlinear data processing techniques. Such techniques are widely used in digital data filtering and image enhancement. Many of the most effective nonlinear filters are based on order statistics. The widely used median filter is the best known order statistic filter. A generalized form of these filters can be presented based on Lloyd's statistics. Filters based on order statistics have excellent robustness properties in the presence of impulsive noise. In this paper, we present a special approach for the synthesis of order statistics filters using artificial neural networks. Optimal Lloyd's statistics are used for selecting the initial weights of the neural network. The adaptive properties of neural networks provide opportunities to optimize order statistics filters for data with an asymmetric distribution function. Different examples demonstrate the properties and performance of the presented approach.
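As a minimal illustration of the order-statistic (L-)filter family the abstract builds on, the sketch below forms a weighted sum of the sorted values in each sliding window; the window size, weights, and data are invented, and the neural-network synthesis itself is not reproduced here.

```python
import numpy as np

def l_filter(signal, weights):
    """Order-statistic (L-)filter: a weighted sum of the sorted values in
    each sliding window. A one-hot weight on the middle order statistic
    reduces this to the standard median filter."""
    k = len(weights)
    w = np.asarray(weights, dtype=float)
    pad = k // 2
    padded = np.pad(signal, pad, mode="edge")
    out = np.empty(len(signal))
    for i in range(len(signal)):
        window = np.sort(padded[i:i + k])  # order statistics of the window
        out[i] = window @ w
    return out

noisy = np.array([1.0, 1.0, 9.0, 1.0, 1.0])   # impulsive outlier at index 2
median_weights = [0.0, 1.0, 0.0]              # middle order statistic only
print(l_filter(noisy, median_weights))         # outlier suppressed
```

Robustness to the impulse comes from sorting: the outlier lands at the end of each sorted window, where the median weights ignore it.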
NASA Astrophysics Data System (ADS)
Eisenbach, Markus
The Locally Self-consistent Multiple Scattering (LSMS) code solves the first principles Density Functional theory Kohn-Sham equation for a wide range of materials with a special focus on metals, alloys and metallic nano-structures. It has traditionally exhibited near perfect scalability on massively parallel high performance computer architectures. We present our efforts to exploit GPUs to accelerate the LSMS code to enable first principles calculations of O(100,000) atoms and statistical physics sampling of finite temperature properties. Using the Cray XK7 system Titan at the Oak Ridge Leadership Computing Facility we achieve a sustained performance of 14.5 PFlop/s and a speedup of 8.6 compared to the CPU-only code. This work has been sponsored by the U.S. Department of Energy, Office of Science, Basic Energy Sciences, Material Sciences and Engineering Division and by the Office of Advanced Scientific Computing. This work used resources of the Oak Ridge Leadership Computing Facility, which is supported by the Office of Science of the U.S. Department of Energy under Contract No. DE-AC05-00OR22725.
NASA Astrophysics Data System (ADS)
Mugnes, J.-M.; Robert, C.
2015-11-01
Spectral analysis is a powerful tool to investigate stellar properties and it has been widely used for decades now. However, the methods considered to perform this kind of analysis are mostly based on iteration among a few diagnostic lines to determine the stellar parameters. While these methods are often simple and fast, they can lead to errors and large uncertainties due to the required assumptions. Here, we present a method based on Bayesian statistics to find simultaneously the best combination of effective temperature, surface gravity, projected rotational velocity, and microturbulence velocity, using all the available spectral lines. Different tests are discussed to demonstrate the strength of our method, which we apply to 54 mid-resolution spectra of field and cluster B stars obtained at the Observatoire du Mont-Mégantic. We compare our results with those found in the literature. Differences are seen which are well explained by the different methods used. We conclude that the B-star microturbulence velocities are often underestimated. We also confirm the trend that B stars in clusters are on average faster rotators than field B stars.
The role of shape complexity in the detection of closed contours.
Wilder, John; Feldman, Jacob; Singh, Manish
2016-09-01
The detection of contours in noise has been extensively studied, but the detection of closed contours, such as the boundaries of whole objects, has received relatively little attention. Closed contours pose substantial challenges not present in the simple (open) case, because they form the outlines of whole shapes and thus take on a range of potentially important configural properties. In this paper we consider the detection of closed contours in noise as a probabilistic decision problem. Previous work on open contours suggests that contour complexity, quantified as the negative log probability (Description Length, DL) of the contour under a suitably chosen statistical model, impairs contour detectability; more complex (statistically surprising) contours are harder to detect. In this study we extended this result to closed contours, developing a suitable probabilistic model of whole shapes that gives rise to several distinct though interrelated measures of shape complexity. We asked subjects to detect either natural shapes (Exp. 1) or experimentally manipulated shapes (Exp. 2) embedded in noise fields. We found systematic effects of global shape complexity on detection performance, demonstrating how aspects of global shape and form influence the basic process of object detection. Copyright © 2015 Elsevier Ltd. All rights reserved.
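The complexity measure the abstract uses (Description Length as the negative log probability of a contour under a statistical model) can be illustrated with a deliberately simplified stand-in: a zero-mean Gaussian over turning angles. The Gaussian model and the sigma value are assumptions for illustration, not the paper's actual shape model.

```python
import math

def description_length(turn_angles, sigma=0.3):
    """Description Length (in nats) = negative log probability of the
    contour's turning angles under a zero-mean Gaussian model; smoother
    contours (angles near 0) get a shorter description length. The model
    and sigma are illustrative stand-ins for the paper's shape model."""
    dl = 0.0
    for angle in turn_angles:
        log_p = (-0.5 * (angle / sigma) ** 2
                 - math.log(sigma * math.sqrt(2.0 * math.pi)))
        dl -= log_p
    return dl

smooth = [0.05, -0.02, 0.04, 0.01]   # nearly straight contour
jagged = [0.9, -1.1, 1.2, -0.8]      # statistically surprising contour
# The jagged (more complex) contour gets a longer description length,
# matching the finding that complex contours are harder to detect.
```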
NASA Astrophysics Data System (ADS)
Zhdankin, Vladimir; Uzdensky, Dmitri A.; Werner, Gregory R.; Begelman, Mitchell C.
2018-02-01
We describe results from particle-in-cell simulations of driven turbulence in collisionless, magnetized, relativistic pair plasma. This physical regime provides a simple setting for investigating the basic properties of kinetic turbulence and is relevant for high-energy astrophysical systems such as pulsar wind nebulae and astrophysical jets. In this paper, we investigate the statistics of turbulent fluctuations in simulations on lattices of up to 1024³ cells and containing up to 2 × 10¹¹ particles. Due to the absence of a cooling mechanism in our simulations, turbulent energy dissipation reduces the magnetization parameter to order unity within a few dynamical times, causing turbulent motions to become sub-relativistic. In the developed stage, our results agree with predictions from magnetohydrodynamic turbulence phenomenology at inertial-range scales, including a power-law magnetic energy spectrum with index near -5/3, scale-dependent anisotropy of fluctuations described by critical balance, lognormal distributions for particle density and internal energy density (related by a 4/3 adiabatic index, as predicted for an ultra-relativistic ideal gas), and the presence of intermittency. We also present possible signatures of a kinetic cascade by measuring power-law spectra for the magnetic, electric and density fluctuations at sub-Larmor scales.
Generalization of symmetric α-stable Lévy distributions for q >1
NASA Astrophysics Data System (ADS)
Umarov, Sabir; Tsallis, Constantino; Gell-Mann, Murray; Steinberg, Stanly
2010-03-01
The α-stable distributions introduced by Lévy play an important role in probabilistic theoretical studies and their various applications, e.g., in statistical physics, life sciences, and economics. In the present paper we study sequences of long-range dependent random variables whose distributions have asymptotic power-law decay, and which are called (q,α)-stable distributions. These sequences are generalizations of independent and identically distributed α-stable distributions and have not been previously studied. Long-range dependent (q,α)-stable distributions might arise in the description of anomalous processes in nonextensive statistical mechanics, cell biology, and finance. The parameter q controls dependence. If q = 1, then they are classical independent and identically distributed α-stable Lévy distributions. In the present paper we establish basic properties of (q,α)-stable distributions and generalize the result of Umarov et al. [Milan J. Math. 76, 307 (2008)], where the particular case α = 2, q ∈ [1,3) was considered, to the whole range of stability and nonextensivity parameters α ∈ (0,2] and q ∈ [1,3), respectively. We also discuss possible further extensions of the results that we obtain and formulate some conjectures.
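For readers who want to experiment with the classical q = 1 case mentioned above, symmetric α-stable variates can be drawn with the standard Chambers-Mallows-Stuck construction. This sketch is not from the paper and covers only the symmetric case with α ≠ 1.

```python
import numpy as np

def symmetric_alpha_stable(alpha, size, rng):
    """Chambers-Mallows-Stuck sampler for symmetric alpha-stable variates
    (the classical q = 1 case above); valid for 0 < alpha <= 2, alpha != 1.
    For alpha = 2 this reduces to a Gaussian with variance 2."""
    v = rng.uniform(-np.pi / 2, np.pi / 2, size)  # uniform phase
    w = rng.exponential(1.0, size)                # unit-mean exponential
    return (np.sin(alpha * v) / np.cos(v) ** (1.0 / alpha)
            * (np.cos((1.0 - alpha) * v) / w) ** ((1.0 - alpha) / alpha))

rng = np.random.default_rng(0)
samples = symmetric_alpha_stable(1.5, 10_000, rng)  # heavy-tailed draws
```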
The Impact of Natural Hazards such as Turbulent Wind Gusts on the Wind Energy Conversion Process
NASA Astrophysics Data System (ADS)
Wächter, M.; Hölling, M.; Milan, P.; Morales, A.; Peinke, J.
2012-12-01
Wind turbines operate in the atmospheric boundary layer, where they are exposed to wind gusts and other types of natural hazards. As the response time of wind turbines is typically in the range of seconds, they are affected by the small scale intermittent properties of the turbulent wind. We show evidence that basic features which are known for small-scale homogeneous isotropic turbulence, and in particular the well-known intermittency problem, have an important impact on the wind energy conversion process. Intermittent statistics include high probabilities of extreme events which can be related to wind gusts and other types of natural hazards. As a summarizing result we find that atmospheric turbulence imposes its intermittent features on the complete wind energy conversion process. Intermittent turbulence features are not only present in atmospheric wind, but are also dominant in the loads on the turbine, i.e. rotor torque and thrust, and in the electrical power output signal. We conclude that profound knowledge of turbulent statistics and the application of suitable numerical as well as experimental methods are necessary to grasp these unique features and quantify their effects on all stages of wind energy conversion.
Assessment of stigma associated with tuberculosis in Mexico.
Moya, E M; Biswas, A; Chávez Baray, S M; Martínez, O; Lomeli, B
2014-12-21
Stigma is a major barrier to health care access and impacts the quality of life of individuals affected by tuberculosis (TB). Assessing TB stigma is essential to addressing health disparities. However, no such instrument was available in Mexico at the time of our study. This study examined the adaptability of the TB and human immunodeficiency virus (HIV) stigma scales previously used in Thailand. The original scale, developed in English, was linguistically adapted to Spanish and administered to 217 individuals affected by TB in five states in Mexico. The TB-HIV stigma subscales were designed to assess individual and community perspectives. Additional data collected included general information and socio-demographics. Assessment of psychometric properties included basic statistical tests, evaluation of Cronbach's alpha, and factor analysis. We found no statistically significant differences in stigma scores by location, age, marital status or education. Factor analysis did not create any new factors. Internal consistency reliability coefficients were satisfactory (Cronbach α = 0.876-0.912). The use of the stigma scales has implications for 1) health improvements, 2) research on stigma and health disparities, and 3) TB and HIV stigma interventions. Further research is needed to examine transferability among larger and randomly selected Spanish-speaking populations.
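Cronbach's alpha, used above to assess internal consistency, can be computed directly from an item-score matrix with the standard formula; the respondent scores below are hypothetical, not study data.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix:
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_variances / total_variance)

# Hypothetical scores of 5 respondents on 3 scale items (not study data).
scores = [[3, 4, 3], [2, 2, 3], [4, 4, 4], [1, 2, 1], [3, 3, 4]]
print(round(cronbach_alpha(scores), 3))
```

Values in the 0.876-0.912 range reported above indicate that the items covary strongly relative to their individual variances.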
Mattfeldt, Torsten
2011-04-01
Computer-intensive methods may be defined as data analytical procedures involving a huge number of highly repetitive computations. We mention resampling methods with replacement (bootstrap methods), resampling methods without replacement (randomization tests) and simulation methods. The resampling methods are based on simple and robust principles and are largely free from distributional assumptions. Bootstrap methods may be used to compute confidence intervals for a scalar model parameter and for summary statistics from replicated planar point patterns, and for significance tests. For some simple models of planar point processes, point patterns can be simulated by elementary Monte Carlo methods. The simulation of models with more complex interaction properties usually requires more advanced computing methods. In this context, we mention simulation of Gibbs processes with Markov chain Monte Carlo methods using the Metropolis-Hastings algorithm. An alternative to simulations on the basis of a parametric model consists of stochastic reconstruction methods. The basic ideas behind the methods are briefly reviewed and illustrated by simple worked examples in order to encourage novices in the field to use computer-intensive methods. © 2010 The Authors Journal of Microscopy © 2010 Royal Microscopical Society.
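A minimal percentile-bootstrap confidence interval, in the spirit of the resampling methods reviewed above; the data values and the choice of statistic (the mean) are invented for illustration.

```python
import random

def bootstrap_ci(data, stat, n_boot=2000, alpha=0.05, seed=42):
    """Percentile bootstrap: resample with replacement, recompute the
    statistic for each resample, and take the alpha/2 and 1 - alpha/2
    quantiles of the replicates as the confidence interval."""
    rng = random.Random(seed)
    reps = sorted(stat([rng.choice(data) for _ in data]) for _ in range(n_boot))
    return reps[int(n_boot * alpha / 2)], reps[int(n_boot * (1 - alpha / 2))]

# Invented measurements; any scalar statistic works here.
data = [4.1, 5.0, 3.8, 6.2, 5.5, 4.7, 5.9, 4.4]
mean = lambda xs: sum(xs) / len(xs)
low, high = bootstrap_ci(data, mean)
```

As the review notes, the method is largely free from distributional assumptions: only the observed sample is resampled.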
NASA Technical Reports Server (NTRS)
Thomas, J. M.; Hawk, J. D.
1975-01-01
A generalized concept for cost-effective structural design is introduced. It is assumed that decisions affecting the cost effectiveness of aerospace structures fall into three basic categories: design, verification, and operation. Within these basic categories, certain decisions concerning items such as design configuration, safety factors, testing methods, and operational constraints are to be made. All or some of the variables affecting these decisions may be treated probabilistically. Bayesian statistical decision theory is used as the tool for determining the cost optimum decisions. A special case of the general problem is derived herein, and some very useful parametric curves are developed and applied to several sample structures.
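The expected-cost logic behind the Bayesian decision analysis described above can be sketched in a few lines; the candidate decisions, failure probabilities, and costs below are made up for illustration and are not the paper's parametric curves.

```python
# Expected-cost comparison in the spirit of Bayesian statistical decision
# theory: choose the design decision with the lowest expected total cost.
# Decision names, probabilities, and costs are invented for illustration.
decisions = {
    # name: (upfront cost, probability of failure, cost if failure occurs)
    "safety_factor_1.25": (100.0, 0.05, 2000.0),
    "safety_factor_1.40": (130.0, 0.01, 2000.0),
}

def expected_cost(upfront, p_fail, cost_fail):
    """Expected total cost = certain upfront cost + probability-weighted
    cost of failure."""
    return upfront + p_fail * cost_fail

best = min(decisions, key=lambda name: expected_cost(*decisions[name]))
print(best)  # here the heavier design wins: 130 + 20 < 100 + 100
```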
On a Quantum Model of Brain Activities
NASA Astrophysics Data System (ADS)
Fichtner, K.-H.; Fichtner, L.; Freudenberg, W.; Ohya, M.
2010-01-01
One of the main activities of the brain is the recognition of signals. A first attempt to explain the process of recognition in terms of quantum statistics was given in [6]. Subsequently, details of the mathematical model were presented in a (still incomplete) series of papers (cf. [7, 2, 5, 10]). In the present note we want to give a general view of the principal ideas of this approach. We will introduce the basic spaces and justify the choice of spaces and operations. Further, we bring the model face to face with basic postulates any statistical model of the recognition process should fulfill. These postulates are in accordance with the opinion widely accepted in psychology and neurology.
Freeman, Jenny V; Collier, Steve; Staniforth, David; Smith, Kevin J
2008-01-01
Background Statistics is relevant to students and practitioners in medicine and health sciences and is increasingly taught as part of the medical curriculum. However, it is common for students to dislike and under-perform in statistics. We sought to address these issues by redesigning the way that statistics is taught. Methods The project brought together a statistician, clinician and educational experts to re-conceptualize the syllabus, and focused on developing different methods of delivery. New teaching materials, including videos, animations and contextualized workbooks were designed and produced, placing greater emphasis on applying statistics and interpreting data. Results Two cohorts of students were evaluated, one with old style and one with new style teaching. Both were similar with respect to age, gender and previous level of statistics. Students who were taught using the new approach could better define the key concepts of p-value and confidence interval (p < 0.001 for both). They were more likely to regard statistics as integral to medical practice (p = 0.03), and to expect to use it in their medical career (p = 0.003). There was no significant difference in the numbers who thought that statistics was essential to understand the literature (p = 0.28) and those who felt comfortable with the basics of statistics (p = 0.06). More than half the students in both cohorts felt that they were comfortable with the basics of medical statistics. Conclusion Using a variety of media, and placing emphasis on interpretation can help make teaching, learning and understanding of statistics more people-centred and relevant, resulting in better outcomes for students. PMID:18452599
ERIC Educational Resources Information Center
Cunningham, Phyllis M.
Intending to explore the interaction effects of self-esteem level and perceived program utility on the retention and cognitive achievement of adult basic education students, a self-esteem instrument, to be administered verbally, was constructed with content relevant items developed from and tested on a working class, undereducated, black, adult…
ERIC Educational Resources Information Center
Tighe, Elizabeth L.; Schatschneider, Christopher
2016-01-01
The purpose of this study was to investigate the joint and unique contributions of morphological awareness and vocabulary knowledge at five reading comprehension levels in adult basic education (ABE) students. We introduce the statistical technique of multiple quantile regression, which enabled us to assess the predictive utility of morphological…
Correlation of basic TL, OSL and IRSL properties of ten K-feldspar samples of various origins
NASA Astrophysics Data System (ADS)
Sfampa, I. K.; Polymeris, G. S.; Pagonis, V.; Theodosoglou, E.; Tsirliganis, N. C.; Kitis, G.
2015-09-01
Feldspars stand among the most widely used minerals in dosimetric methods of dating using thermoluminescence (TL), optically stimulated luminescence (OSL) and infrared stimulated luminescence (IRSL). Having very good dosimetric properties, they can in principle contribute to the dating of every site of archaeological and geological interest. The present work studies basic properties of ten naturally occurring K-feldspar samples belonging to three feldspar species, namely sanidine, orthoclase and microcline. The basic properties studied are (a) the influence of blue light and infrared stimulation on the thermoluminescence glow-curves, (b) the growth of OSL, IRSL, residual TL and TL-loss as a function of OSL and IRSL bleaching time and (c) the correlation between the OSL and IRSL signals and the energy levels responsible for the TL glow-curve. All experimental data were fitted using analytical expressions derived from a recently developed tunneling recombination model. The results show that the analytical expressions provide excellent fits to all experimental results, thus verifying the tunneling recombination mechanism in these materials and providing valuable information about the concentrations of luminescence centers.
Basic functional trade-offs in cognition: An integrative framework.
Del Giudice, Marco; Crespi, Bernard J
2018-06-14
Trade-offs between advantageous but conflicting properties (e.g., speed vs. accuracy) are ubiquitous in cognition, but the relevant literature is conceptually fragmented, scattered across disciplines, and has not been organized in a coherent framework. This paper takes an initial step toward a general theory of cognitive trade-offs by examining four key properties of goal-directed systems: performance, efficiency, robustness, and flexibility. These properties define a number of basic functional trade-offs that can be used to map the abstract "design space" of natural and artificial cognitive systems. Basic functional trade-offs provide a shared vocabulary to describe a variety of specific trade-offs including speed vs. accuracy, generalist vs. specialist, exploration vs. exploitation, and many others. By linking specific features of cognitive functioning to general properties such as robustness and efficiency, it becomes possible to harness some powerful insights from systems engineering and systems biology to suggest useful generalizations, point to under-explored but potentially important trade-offs, and prompt novel hypotheses and connections between disparate areas of research. Copyright © 2018 Elsevier B.V. All rights reserved.
The AAPM/RSNA physics tutorial for residents. Basic physics of MR imaging: an introduction.
Hendrick, R E
1994-07-01
This article provides an introduction to the basic physical principles of magnetic resonance (MR) imaging. Essential basic concepts such as nuclear magnetism, tissue magnetization, precession, excitation, and tissue relaxation properties are presented. Hydrogen spin density and tissue relaxation times T1, T2, and T2* are explained. The basic elements of a planar MR pulse sequence are described: section selection during tissue excitation, phase encoding, and frequency encoding during signal measurement.
Summary Statistics of CPB-Qualified Public Radio Stations: Fiscal Year 1971.
ERIC Educational Resources Information Center
Lee, S. Young; Pedone, Ronald J.
Basic statistics on finance, employment, and broadcast and production activities of 103 Corporation for Public Broadcasting (CPB)--qualified radio stations in the United States and Puerto Rico for Fiscal Year 1971 are collected. The first section of the report deals with total funds, income, direct operating costs, capital expenditures, and other…
Using Statistics to Lie, Distort, and Abuse Data
ERIC Educational Resources Information Center
Bintz, William; Moore, Sara; Adams, Cheryll; Pierce, Rebecca
2009-01-01
Statistics is a branch of mathematics that involves organization, presentation, and interpretation of data, both quantitative and qualitative. Data do not lie, but people do. On the surface, quantitative data are basically inanimate objects, nothing more than lifeless and meaningless symbols that appear on a page, calculator, computer, or in one's…
What Software to Use in the Teaching of Mathematical Subjects?
ERIC Educational Resources Information Center
Berežný, Štefan
2015-01-01
We can consider two basic views when using mathematical software in the teaching of mathematical subjects. First: How to learn to use specific software for specific tasks, e.g., the software Statistica for the subjects of applied statistics, probability and mathematical statistics, or financial mathematics. Second: How to learn to use the…
Intrex Subject/Title Inverted-File Characteristics.
ERIC Educational Resources Information Center
Uemura, Syunsuke
The characteristics of the Intrex subject/title inverted file are analyzed. Basic statistics of the inverted file are presented including various distributions of the index words and terms from which the file was derived, and statistics on stems, the file growth process, and redundancy measurements. A study of stems both with extremely high and…
ERIC Educational Resources Information Center
Ramseyer, Gary C.; Tcheng, Tse-Kia
The present study was directed at determining the extent to which the Type I Error rate is affected by violations in the basic assumptions of the q statistic. Monte Carlo methods were employed, and a variety of departures from the assumptions were examined. (Author)
ERIC Educational Resources Information Center
Dexter, Franklin; Masursky, Danielle; Wachtel, Ruth E.; Nussmeier, Nancy A.
2010-01-01
Operating room (OR) management differs from clinical anesthesia in that statistical literacy is needed daily to make good decisions. Two of the authors teach a course in operations research for surgical services to anesthesiologists, anesthesia residents, OR nursing directors, hospital administration students, and analysts to provide them with the…
Statistics and Data Interpretation for Social Work
ERIC Educational Resources Information Center
Rosenthal, James A.
2011-01-01
Written by a social worker for social work students, this is a nuts and bolts guide to statistics that presents complex calculations and concepts in clear, easy-to-understand language. It includes numerous examples, data sets, and issues that students will encounter in social work practice. The first section introduces basic concepts and terms to…
Using Excel in Teacher Education for Sustainability
ERIC Educational Resources Information Center
Aydin, Serhat
2016-01-01
In this study, the feasibility of using Excel software in teaching whole Basic Statistics Course and its influence on the attitudes of pre-service science teachers towards statistics were investigated. One hundred and two pre-service science teachers in their second year participated in the study. The data were collected from the prospective…
Basic Math Skills and Performance in an Introductory Statistics Course
ERIC Educational Resources Information Center
Johnson, Marianne; Kuennen, Eric
2006-01-01
We identify the student characteristics most associated with success in an introductory business statistics class, placing special focus on the relationship between student math skills and course performance, as measured by student grade in the course. To determine which math skills are important for student success, we examine (1) whether the…
An Online Course of Business Statistics: The Proportion of Successful Students
ERIC Educational Resources Information Center
Pena-Sanchez, Rolando
2009-01-01
This article describes the students' academic progress in an online course of business statistics through interactive software assignments and diverse educational homework, which helps these students to build their own e-learning through basic competences; i.e. interpreting results and solving problems. Cross-tables were built for the categorical…
Football fever: goal distributions and non-Gaussian statistics
NASA Astrophysics Data System (ADS)
Bittner, E.; Nußbaumer, A.; Janke, W.; Weigel, M.
2009-02-01
Analyzing football score data with statistical techniques, we investigate how the not purely random, but highly co-operative nature of the game is reflected in averaged properties such as the probability distributions of scored goals for the home and away teams. As it turns out, especially the tails of the distributions are not well described by the Poissonian or binomial model resulting from the assumption of uncorrelated random events. Instead, a good effective description of the data is provided by less basic distributions such as the negative binomial one or the probability densities of extreme value statistics. To understand this behavior from a microscopic point of view, however, no waiting time problem or extremal process need be invoked. Instead, modifying the Bernoulli random process underlying the Poissonian model to include a simple component of self-affirmation seems to describe the data surprisingly well and allows us to understand the observed deviation from Gaussian statistics. The phenomenological distributions used before can be understood as special cases within this framework. We analyzed historical football score data from many leagues in Europe as well as from international tournaments, including data from all past tournaments of the “FIFA World Cup” series, and found the proposed models to be applicable rather universally. In particular, here we analyze the results of the German women’s premier football league and consider the two separate German men’s premier leagues in the East and West during the Cold War era as well as the unified league after 1990 to see how scoring in football and the component of self-affirmation depend on cultural and political circumstances.
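The self-affirmation modification of the Bernoulli scoring process described above can be simulated directly; the per-step scoring probability and the size of the self-affirmation boost below are illustrative guesses, not the paper's fitted values.

```python
import random

def simulate_goals(n_matches, n_steps=90, p0=0.02, boost=0.015, seed=1):
    """Bernoulli scoring process with self-affirmation: each goal already
    scored raises the per-step scoring probability. Setting boost = 0
    recovers the uncorrelated binomial/Poissonian baseline. Parameter
    values are illustrative, not fitted to real score data."""
    rng = random.Random(seed)
    totals = []
    for _ in range(n_matches):
        goals = 0
        for _ in range(n_steps):
            if rng.random() < p0 + boost * goals:
                goals += 1
        totals.append(goals)
    return totals

goals = simulate_goals(20_000)
mean = sum(goals) / len(goals)
var = sum((g - mean) ** 2 for g in goals) / len(goals)
# Self-affirmation produces overdispersion (variance > mean), i.e. fatter
# tails than the uncorrelated Poisson/binomial model predicts.
```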
Environmental Statistics and Optimal Regulation
2014-01-01
Any organism is embedded in an environment that changes over time. The timescale for and statistics of environmental change, the precision with which the organism can detect its environment, and the costs and benefits of particular protein expression levels all will affect the suitability of different strategies–such as constitutive expression or graded response–for regulating protein levels in response to environmental inputs. We propose a general framework–here specifically applied to the enzymatic regulation of metabolism in response to changing concentrations of a basic nutrient–to predict the optimal regulatory strategy given the statistics of fluctuations in the environment and measurement apparatus, respectively, and the costs associated with enzyme production. We use this framework to address three fundamental questions: (i) when a cell should prefer thresholding to a graded response; (ii) when there is a fitness advantage to implementing a Bayesian decision rule; and (iii) when retaining memory of the past provides a selective advantage. We specifically find that: (i) relative convexity of enzyme expression cost and benefit influences the fitness of thresholding or graded responses; (ii) intermediate levels of measurement uncertainty call for a sophisticated Bayesian decision rule; and (iii) in dynamic contexts, intermediate levels of uncertainty call for retaining memory of the past. Statistical properties of the environment, such as variability and correlation times, set optimal biochemical parameters, such as thresholds and decay rates in signaling pathways. Our framework provides a theoretical basis for interpreting molecular signal processing algorithms and a classification scheme that organizes known regulatory strategies and may help conceptualize heretofore unknown ones. PMID:25254493
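A minimal stand-in for the Bayesian decision rule discussed above: the posterior probability that the nutrient level is "high" given one noisy measurement, used either as a graded response or thresholded to an all-or-none response. The Gaussian means, noise level, and prior are assumptions for illustration, not the paper's model.

```python
import math

def posterior_high(x, mu_low=1.0, mu_high=3.0, sigma=1.0, prior_high=0.5):
    """P(environment = high | one Gaussian measurement x). The two means,
    the noise level, and the prior are made-up illustration values."""
    def likelihood(mu):
        return math.exp(-0.5 * ((x - mu) / sigma) ** 2)
    numerator = prior_high * likelihood(mu_high)
    return numerator / (numerator + (1.0 - prior_high) * likelihood(mu_low))

def expression_level(x, threshold=None):
    """Graded response: expression tracks the posterior. Thresholding:
    all-or-none expression around a posterior cutoff."""
    p = posterior_high(x)
    if threshold is None:
        return p                       # graded response
    return 1.0 if p >= threshold else 0.0  # thresholded response
```

In the paper's terms, the choice between passing `threshold=None` and a cutoff is exactly the graded-vs-thresholding decision, and measurement noise (sigma) shifts which choice is optimal.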
Refractories for high alkali environments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rau, A.W.; Cloer, F.
1996-12-31
Information on refractories for high alkali environments is outlined. Information is presented on: product gallery; alkali attack; chemical reactions; basic layout of alkali cup test; criteria for rating alkali cup test samples; and basic layout of physical properties test.
Statistical properties of several models of fractional random point processes
NASA Astrophysics Data System (ADS)
Bendjaballah, C.
2011-08-01
Statistical properties of several models of fractional random point processes have been analyzed from the counting and time interval statistics points of view. Based on the criterion of the reduced variance, it is seen that such processes exhibit nonclassical properties. The conditions for these processes to be treated as conditional Poisson processes are examined. Numerical simulations illustrate part of the theoretical calculations.
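The reduced-variance criterion mentioned above compares the count variance to the count mean (the Fano factor). The sketch below, using a synthetic and approximately Poissonian counting record, illustrates the benchmark against which sub-Poissonian (nonclassical) behavior is judged:

```python
import random

def fano_factor(counts):
    """Reduced-variance criterion: Var(N)/E[N] < 1 indicates
    sub-Poissonian (nonclassical) counting statistics."""
    mean = sum(counts) / len(counts)
    var = sum((c - mean) ** 2 for c in counts) / len(counts)
    return var / mean

rng = random.Random(0)
# Reference record: near-Poissonian counts from thinning a fine Bernoulli grid.
poisson_like = [sum(rng.random() < 0.02 for _ in range(500)) for _ in range(2000)]
print(round(fano_factor(poisson_like), 2))  # close to 1
```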
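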
Health Literacy Impact on National Healthcare Utilization and Expenditure.
Rasu, Rafia S; Bawa, Walter Agbor; Suminski, Richard; Snella, Kathleen; Warady, Bradley
2015-08-17
Health literacy presents an enormous challenge in the delivery of effective healthcare and quality outcomes. We evaluated the impact of low health literacy (LHL) on healthcare utilization and healthcare expenditure. The database analysis used the Medical Expenditure Panel Survey (MEPS) from 2005-2008, which provides nationally representative estimates of healthcare utilization and expenditure. Health literacy scores (HLSs) were calculated based on a validated, predictive model and scored according to the National Assessment of Adult Literacy (NAAL). HLSs ranged from 0-500. Health literacy levels (HLLs) were categorized into 2 groups: below basic or basic (HLS <226) and above basic (HLS ≥226). Healthcare utilization was expressed as physician, nonphysician, or emergency room (ER) visits and healthcare spending. Expenditures were adjusted to 2010 rates using the Consumer Price Index (CPI). A P value of 0.05 or less was the criterion for statistical significance in all analyses. Multivariate regression models assessed the impact of the predicted HLLs on outpatient healthcare utilization and expenditures. All analyses were performed with SAS and STATA® 11.0 statistical software. The study evaluated 22 599 samples representing 503 374 648 weighted individuals nationally from 2005-2008. The cohort had an average age of 49 years and included more females (57%). Caucasians were the predominant racial/ethnic group (83%), and 37% of the cohort were from the South region of the United States of America. The proportion of the cohort with basic or below basic health literacy was 22.4%. Annual predicted values of physician visits, nonphysician visits, and ER visits were 6.6, 4.8, and 0.2, respectively, for basic or below basic compared to 4.4, 2.6, and 0.1 for above basic. Predicted values of office and ER visit expenditures were $1284 and $151, respectively, for basic or below basic and $719 and $100 for above basic (P < .05).
The extrapolated national estimates show that the annual prescription costs alone for adults with LHL, possibly associated with basic and below basic health literacy, could potentially reach about $172 billion. Health literacy is inversely associated with healthcare utilization and expenditure. Individuals with below basic or basic HLLs have greater healthcare utilization and expenditures, spending more on prescriptions, compared to individuals with above basic HLLs. Public health strategies promoting appropriate education among individuals with LHL may help to improve health outcomes and reduce unnecessary healthcare visits and costs. © 2015 by Kerman University of Medical Sciences.
Somaraj, Vinej; Shenoy, Rekha P; Panchmal, Ganesh Shenoy; Jodalli, Praveen S; Sonde, Laxminarayan; Karkal, Ravichandra
2017-01-01
This cross-sectional study aimed to assess the knowledge, attitude and anxiety pertaining to basic life support (BLS) and medical emergencies among interns in dental colleges of Mangalore city, Karnataka, India. The study subjects comprised interns who volunteered from the four dental colleges. The knowledge and attitude of interns were assessed using a 30-item questionnaire based on the Basic Life Support Manual of the American Heart Association, and their anxiety pertaining to BLS and medical emergencies was assessed using the State-Trait Anxiety Inventory (STAI) questionnaire. The chi-square test was performed in SPSS 21.0 (IBM Statistics, 2012) to determine statistically significant differences (P < 0.05) between assessed knowledge and anxiety. Out of 183 interns, 39.89% had below-average knowledge. A total of 123 (67.21%) reported unavailability of professional training. The majority (180, 98.36%) felt an urgent need for training in basic life support procedures. Assessment of stress showed a total of 27.1% of participants to be above the high-stress level. The comparison of assessed knowledge and stress was found to be insignificant (P = 0.983). There was an evident lack of knowledge pertaining to the management of medical emergencies among the interns. As oral health care providers moving out into society, a focus should be placed on training dental interns in Basic Life Support procedures.
Dunn, Thomas M; Dalton, Alice; Dorfman, Todd; Dunn, William W
2004-01-01
To be a first step in determining whether emergency medical technician (EMT)-Basics are capable of using a protocol that allows for selective immobilization of the cervical spine. Such protocols are coming into use at an advanced life support level and could be beneficial when used by basic life support providers. A convenience sample of participants (n=95) from 11 emergency medical services agencies and one college class participated in the study. All participants evaluated six patients in written scenarios and decided which should be placed into spinal precautions according to a selective spinal immobilization protocol. Systems without an existing selective spinal immobilization protocol received a one-hour continuing education lecture regarding the topic. College students received a similar lecture written so laypersons could understand the protocol. All participants showed proficiency when applying a selective immobilization protocol to patients in paper-based scenarios. Furthermore, EMT-Basics performed at the same level as paramedics when following the protocol. Statistical analysis revealed no significant differences between EMT-Basics and paramedics. A follow-up group of college students (added to have a non-EMS comparison group) also performed as well as paramedics when making decisions to use spinal precautions. Differences between college students and paramedics were also statistically insignificant. The results suggest that EMT-Basics are as accurate as paramedics when making decisions regarding selective immobilization of the cervical spine during paper-based scenarios. That laypersons are also proficient when using the protocol could indicate that it is extremely simple to follow. This study is a first step toward the necessary additional studies evaluating the efficacy of EMT-Basics using selective immobilization as a regular practice.
ERIC Educational Resources Information Center
Grover, Anita; Lam, Tai Ning; Hunt, C. Anthony
2008-01-01
We present a simulation tool to aid the study of basic pharmacology principles. By taking advantage of the properties of agent-based modeling, the tool facilitates taking a mechanistic approach to learning basic concepts, in contrast to the traditional empirical methods. Pharmacodynamics is a particular aspect of pharmacology that can benefit from…
Schedl, Markus
2017-01-01
Recently, the LFM-1b dataset has been proposed to foster research and evaluation in music retrieval and music recommender systems, Schedl (Proceedings of the ACM International Conference on Multimedia Retrieval (ICMR). New York, 2016). It contains more than one billion music listening events created by more than 120,000 users of Last.fm. Each listening event is characterized by artist, album, and track name, and further includes a timestamp. Basic demographic information and a selection of more elaborate listener-specific descriptors are also included for the anonymized users. In this article, we reveal information about LFM-1b's acquisition and content, and we compare it to existing datasets. We furthermore provide an extensive statistical analysis of the dataset, including basic properties of the item sets, demographic coverage, distribution of listening events (e.g., over artists and users), and aspects related to music preference and consumption behavior (e.g., temporal features and mainstreaminess of listeners). Exploiting country information of users and genre tags of artists, we also create taste profiles for populations and determine similar and dissimilar countries in terms of their populations' music preferences. Finally, we illustrate the dataset's usage in a simple artist recommendation task, whose results are intended to serve as a baseline against which more elaborate techniques can be assessed.
Harmonic analysis of electric locomotive and traction power system based on wavelet singular entropy
NASA Astrophysics Data System (ADS)
Dun, Xiaohong
2018-05-01
With the rapid development of high-speed railway and heavy-haul transport, the locomotive and traction power system has become the main harmonic source of China's power grid. In response to this phenomenon, the system's power quality issues need timely monitoring, assessment and governance. Wavelet singular entropy is an organic combination of wavelet transform, singular value decomposition and information entropy theory, combining the unique advantages of the three in signal processing: wavelet transform provides local time-frequency characteristics, singular value decomposition explores the basic modal characteristics of the data, and information entropy quantifies the feature data. Based on the theory of singular value decomposition, the wavelet coefficient matrix obtained after the wavelet transform is decomposed into a series of singular values that reflect the basic characteristics of the original coefficient matrix. The statistical properties of information entropy are then used to analyze the uncertainty of the singular value set, so as to give a definite measurement of the complexity of the original signal. Wavelet singular entropy therefore has good application prospects in fault detection, classification and protection. A MATLAB simulation shows that wavelet singular entropy is effective for harmonic analysis of the locomotive and traction power system.
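The pipeline described above (wavelet coefficient matrix → SVD → Shannon entropy of the normalized singular values) can be sketched as follows. The coefficient matrices here are synthetic stand-ins; a real analysis would use the coefficient matrix produced by a wavelet transform of the measured current signal:

```python
import numpy as np

def singular_entropy(coeff_matrix):
    """Wavelet singular entropy sketch: SVD of a coefficient matrix,
    then Shannon entropy of the normalized singular-value spectrum."""
    s = np.linalg.svd(coeff_matrix, compute_uv=False)
    p = s / s.sum()
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

# Toy coefficient matrices (stand-ins for real wavelet-transform output):
t = np.linspace(0, 1, 256, endpoint=False)
clean = np.vstack([(k + 1) * np.sin(2 * np.pi * 50 * t) for k in range(4)])  # rank 1
noisy = clean + 0.8 * np.random.default_rng(0).standard_normal(clean.shape)
print(singular_entropy(clean) < singular_entropy(noisy))  # True: noise spreads the spectrum
```

A simple, regular signal concentrates energy in few singular values (low entropy), while distortion and noise spread it out (high entropy), which is what makes the quantity useful as a complexity measure.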
Kim, Kiyeon; Omori, Ryosuke; Ito, Kimihito
2017-12-01
The estimation of the basic reproduction number is essential to understanding epidemic dynamics, and time series data of infected individuals are usually used for the estimation. However, such data are not always available. Methods to estimate the basic reproduction number using genealogies constructed from nucleotide sequences of pathogens have previously been proposed. Here, we propose a new method to estimate epidemiological parameters of outbreaks using the time series change of Tajima's D statistic on the nucleotide sequences of pathogens. To relate the time evolution of Tajima's D to the number of infected individuals, we constructed a parsimonious mathematical model describing both the transmission process of pathogens among hosts and the evolutionary process of the pathogens. As a case study, we applied this method to field data of nucleotide sequences of pandemic influenza A (H1N1) 2009 viruses collected in Argentina. The Tajima's D-based method estimated the basic reproduction number to be 1.55, with a 95% highest posterior density (HPD) between 1.31 and 2.05, and the date of the epidemic peak to be 10th July, with a 95% HPD between 22nd June and 9th August. The estimated basic reproduction number was consistent with the estimate from the birth-death skyline plot and the estimate using the time series of the number of infected individuals. These results suggest that Tajima's D statistic on nucleotide sequences of pathogens could be useful for estimating epidemiological parameters of outbreaks. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
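The ingredients of Tajima's D can be computed from an aligned sample of sequences as sketched below, on a toy alignment rather than the Argentine H1N1 data. The full statistic additionally divides theta_pi - theta_W by an estimate of its standard deviation, which is omitted here:

```python
from itertools import combinations

def tajima_components(seqs):
    """Tajima's D ingredients (sketch): theta_pi = mean pairwise differences,
    theta_W = S / a1 with S segregating sites and a1 = sum 1/i for i < n.
    The full D also normalizes theta_pi - theta_W by its estimated SD."""
    n = len(seqs)
    pairs = list(combinations(seqs, 2))
    theta_pi = sum(sum(x != y for x, y in zip(a, b)) for a, b in pairs) / len(pairs)
    S = sum(len(set(col)) > 1 for col in zip(*seqs))  # segregating sites
    a1 = sum(1 / i for i in range(1, n))
    return theta_pi, S / a1

# Toy alignment of four 8-nucleotide sequences with three segregating sites
seqs = ["ACGTACGT", "ACGTACGA", "ACCTACGT", "ACGTTCGT"]
theta_pi, theta_w = tajima_components(seqs)
print(theta_pi, theta_w)
```

The sign of theta_pi - theta_W is what carries the demographic signal the abstract exploits: an excess of rare variants (expanding epidemic) drives it negative.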
ERIC Educational Resources Information Center
Boiteau, Denise; Stansfield, David
This document describes mathematical programs on the basic concepts of algebra produced by Louisiana Public Broadcasting. Programs included are: (1) "Inverse Operations"; (2) "The Order of Operations"; (3) "Basic Properties" (addition and multiplication of numbers and variables); (4) "The Positive and Negative…
The Government Giveth and the Government Taketh Away: Federal Tax Law and Fund Raising.
ERIC Educational Resources Information Center
Holzman, Donald J.
1982-01-01
Tax laws' incentives and disincentives for charitable giving are outlined. Basics of charitable giving, partial property interests, gifts of future interest in tangible property, undivided interest gifts, ordinary income property, capital gain property, bargain sales, remainder interest gifts, estate tax, and valuation overstatement are discussed…
Factors of soil diversity in the Batumi delta (Georgia)
NASA Astrophysics Data System (ADS)
Turgut, Bülent; Ateş, Merve
2017-01-01
The aim of this study was to determine certain basic properties of soils in the Batumi delta (southwestern Georgia), to determine the relationships among the studied properties, and to identify differences with regard to these properties between different sampling sites in the delta that were selected based on the delta morphology. In this context, a total of 125 soil samples were collected from five different sampling sites, and the clay, silt and sand content of the samples were determined along with their mean weight diameter (MWD) values, aggregate stability (AS) values, amount of water retained under -33 (FC) and -1500 kPa (WP) pressure and organic matter (OM) content. Correlation analysis indicated that clay content and OM were positively correlated with MWD, and OM was positively correlated with AS. However, the sand content was found to be negatively correlated with MWD. In addition, clay, silt and OM content were positively correlated with FC and WP. Variance analysis determined statistically significant differences between the sampling sites with respect to all of the evaluated properties. The active delta section of the study area was characterized by high sand content, while the lower delta plain was characterized by high OM and AS values, and the upper delta plain was characterized by high MWD values, high FC and WP moisture content levels and high clay and silt content. In conclusion, it was demonstrated that the examined properties were significantly affected by the different morphological positions and usages of these different areas. These results may help with the management of agricultural lands in the Batumi delta, which has never been studied before.
ERIC Educational Resources Information Center
Reason, Paul L.; Tankard, George G., Jr.
This handbook serves as a basic guide to property accounting for local and state school systems in the U.S. Information and guidelines are presented regarding--(1) classification of property accounts, (2) definitions of property accounts, (3) measures of school property, (4) supplies and equipment, (5) individual property records, and (6) summary…
Emergence of good conduct, scaling and zipf laws in human behavioral sequences in an online world.
Thurner, Stefan; Szell, Michael; Sinatra, Roberta
2012-01-01
We study behavioral action sequences of players in a massive multiplayer online game. In their virtual life players use eight basic actions which allow them to interact with each other. These actions are communication, trade, establishing or breaking friendships and enmities, attack, and punishment. We measure the probabilities for these actions conditional on previously taken and received actions and find a dramatic increase of negative behavior immediately after receiving negative actions. Similarly, positive behavior is intensified by receiving positive actions. We observe a tendency towards antipersistence in communication sequences. Classifying actions as positive (good) and negative (bad) allows us to define binary 'world lines' of the lives of individuals. Positive and negative actions are persistent and occur in clusters, indicated by large scaling exponents α ~ 0.87 of the mean square displacement of the world lines. For all eight action types we find strong signs of high levels of repetitiveness, especially for negative actions. We partition behavioral sequences into segments of length n (behavioral 'words' and 'motifs') and study their statistical properties. We find two approximate power laws in the word ranking distribution, one with an exponent of κ ~ -1 for ranks up to 100, and another with a lower exponent for higher ranks. The Shannon n-tuple redundancy yields large values and increases with word length, further underscoring the non-trivial statistical properties of behavioral sequences. On the collective, societal level the time series of particular actions per day can be understood by a simple mean-reverting log-normal model.
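The Shannon n-tuple redundancy used above can be sketched for binary action sequences as follows; the two sequences are synthetic stand-ins for the behavioral 'world lines':

```python
import math
import random
from collections import Counter

def ntuple_redundancy(sequence, n, alphabet_size=2):
    """Shannon n-tuple redundancy (sketch): 1 - H_n / (n * log2 |A|),
    where H_n is the entropy of the empirical n-gram distribution."""
    grams = [tuple(sequence[i:i + n]) for i in range(len(sequence) - n + 1)]
    counts = Counter(grams)
    total = len(grams)
    h = -sum((c / total) * math.log2(c / total) for c in counts.values())
    return 1 - h / (n * math.log2(alphabet_size))

repetitive = [0, 1] * 500  # highly structured, repetitive 'world line'
rng = random.Random(0)
noisy = [rng.randint(0, 1) for _ in range(1000)]  # memoryless reference
print(ntuple_redundancy(repetitive, 3) > ntuple_redundancy(noisy, 3))  # True
```

High redundancy means the sequence is far from a memoryless source, which is the sense in which the abstract calls the behavioral statistics non-trivial.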
Near Earth Asteroid Characterization for Threat Assessment
NASA Technical Reports Server (NTRS)
Dotson, Jessie; Mathias, Donovan; Wheeler, Lorien; Wooden, Diane; Bryson, Kathryn; Ostrowski, Daniel
2017-01-01
Physical characteristics of NEAs are an essential input to modeling behavior during atmospheric entry and to assessing the risk of impact, but determining these properties requires a non-trivial investment of time and resources. The characteristics relevant to these models include size, density, strength and ablation coefficient. Some of these characteristics cannot be directly measured, but rather must be inferred from related measurements of asteroids and/or meteorites. Furthermore, for the majority of NEAs only the basic measurements exist, so properties must often be inferred from statistics of the population of more completely characterized objects. The Asteroid Threat Assessment Project at NASA Ames Research Center has developed a probabilistic asteroid impact risk (PAIR) model in order to assess the risk of asteroid impact. Our PAIR model and its use to develop probability distributions of impact risk are discussed in other contributions to PDC 2017 (e.g., Mathias et al.). Here we utilize PAIR to investigate which NEA characteristics are important for assessing the impact threat by examining how changes in these characteristics alter the damage predicted by PAIR. We will also provide an assessment of the current state of knowledge of the NEA characteristics of importance for asteroid threat assessment. The relative importance of different properties as identified using PAIR will be combined with our assessment of the current state of knowledge to identify potential high-impact investigations. In addition, we will discuss an ongoing effort to collate the existing measurements of NEA properties of interest to the planetary defense community into a readily accessible database.
ERIC Educational Resources Information Center
Rubinson, Laura E.
2010-01-01
More than one third of American children cannot read at a basic level by fourth grade (Lee, Grigg, & Donahue, 2007) and those numbers are even higher for African American, Hispanic and poor White students (Boorman et al., 2007). These are alarming statistics given that the ability to read is the most basic and fundamental skill for academic…
ERIC Educational Resources Information Center
Chukwu, Leo C.; Eze, Thecla A. Y.; Agada, Fidelia Chinyelugo
2016-01-01
The study examined the availability of instructional materials at the basic education level in Enugu Education Zone of Enugu State, Nigeria. One research question and one hypothesis guided the study. The research question was answered using mean and grand mean ratings, while the hypothesis was tested using t-test statistics at 0.05 level of…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mertens, Susanne
In this contribution the current status and future perspectives of the Karlsruhe Tritium Neutrino (KATRIN) Experiment are presented. The prime goal of this single β-decay experiment is to probe the absolute neutrino mass scale with a sensitivity of 200 meV (90% CL). We discuss first results of the recent main spectrometer commissioning measurements, successfully verifying the spectrometer’s basic vacuum, transmission and background properties. We also discuss the prospects of making use of the KATRIN tritium source to search for sterile neutrinos in the multi-keV mass range, a classical candidate for Warm Dark Matter. Due to the very high source luminosity, a statistical sensitivity down to active-sterile mixing angles of sin² θ < 1 · 10⁻⁷ (90% CL) could be reached.
Local quantum thermal susceptibility
De Pasquale, Antonella; Rossini, Davide; Fazio, Rosario; Giovannetti, Vittorio
2016-01-01
Thermodynamics relies on the possibility to describe systems composed of a large number of constituents in terms of a few macroscopic variables. Its foundations are rooted in the paradigm of statistical mechanics, where thermal properties originate from averaging procedures which smooth out local details. While undoubtedly successful, elegant and formally correct, this approach carries an operational problem, namely determining the precision at which such variables are inferred when technical/practical limitations restrict our capabilities to local probing. Here we introduce the local quantum thermal susceptibility, a quantifier of the best achievable accuracy for temperature estimation via local measurements. Our method relies on basic concepts of quantum estimation theory, providing an operative strategy to address the local thermal response of arbitrary quantum systems at equilibrium. At low temperatures, it highlights the local distinguishability of the ground state from the excited sub-manifolds, thus providing a method to locate quantum phase transitions. PMID:27681458
Web based aphasia test using service oriented architecture (SOA)
NASA Astrophysics Data System (ADS)
Voos, J. A.; Vigliecca, N. S.; Gonzalez, E. A.
2007-11-01
Based on an aphasia test for Spanish speakers that analyzes the patient's basic resources of verbal communication, web-enabled software was developed to automate its execution. A clinical database was designed as a complement, in order to evaluate the antecedents (risk factors, pharmacological and medical backgrounds, neurological or psychiatric symptoms, anatomical and physiological characteristics of brain injury, etc.) which are necessary to carry out a multi-factor statistical analysis in different samples of patients. The automated test was developed following a service oriented architecture and implemented in a web site that contains a test suite, which would allow both integrating the aphasia test with other neuropsychological instruments and increasing the available site information for scientific research. The test design, the database and the study of its psychometric properties (validity, reliability and objectivity) were made in conjunction with neuropsychological researchers, who participated actively in the software design, based on feedback from the patients and other subjects of investigation.
Zhang, Ruixun; Brennan, Thomas J.; Lo, Andrew W.
2014-01-01
Risk aversion is one of the most basic assumptions of economic behavior, but few studies have addressed the question of where risk preferences come from and why they differ from one individual to the next. Here, we propose an evolutionary explanation for the origin of risk aversion. In the context of a simple binary-choice model, we show that risk aversion emerges by natural selection if reproductive risk is systematic (i.e., correlated across individuals in a given generation). In contrast, risk neutrality emerges if reproductive risk is idiosyncratic (i.e., uncorrelated across each given generation). More generally, our framework implies that the degree of risk aversion is determined by the stochastic nature of reproductive rates, and we show that different statistical properties lead to different utility functions. The simplicity and generality of our model suggest that these implications are primitive and cut across species, physiology, and genetic origins. PMID:25453072
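The core distinction between systematic and idiosyncratic reproductive risk can be sketched in the binary-choice spirit of the abstract. The offspring numbers below are illustrative, chosen so that the risky option has the higher arithmetic mean but the lower geometric mean:

```python
import math
import random

def long_run_growth(offspring_high, offspring_low, systematic, T=2000, seed=0):
    """Log growth rate per generation of a lineage whose offspring number is
    high or low with equal probability. Systematic risk: one draw shared by the
    whole generation, so growth compounds at the geometric mean. Idiosyncratic
    risk: independent draws average out, so growth follows the arithmetic mean."""
    rng = random.Random(seed)
    if systematic:
        log_g = 0.0
        for _ in range(T):
            f = offspring_high if rng.random() < 0.5 else offspring_low
            log_g += math.log(f)
        return log_g / T
    # idiosyncratic: law of large numbers across individuals in a generation
    return math.log((offspring_high + offspring_low) / 2)

# Risky bet: arithmetic mean 1.65, geometric mean ~0.95. Safe bet: 1.5 for sure.
print(long_run_growth(3.0, 0.3, systematic=True) < math.log(1.5))   # True: risk aversion pays
print(long_run_growth(3.0, 0.3, systematic=False) > math.log(1.5))  # True: risk neutrality pays
```

Under systematic risk the safe choice dominates (risk aversion is selected for), while under idiosyncratic risk the higher-mean risky choice wins, matching the dichotomy in the abstract.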
Li, Chi-Lin; Lu, Chia-Jung
2009-08-15
Linear solvation energy relationships (LSERs) have been recognized as a useful model for investigating the chemical forces behind the partition coefficients between vapor molecules and absorbents. This study is the first to determine the solvation properties of monolayer-protected gold nanoclusters (MPCs) with different surface ligands. The ratios of partition coefficient to MPC density (K/rho) of 18 volatile organic compounds (VOCs) for four different MPCs, obtained through quartz crystal microbalance (QCM) experiments, were used for the LSER model calculations. LSER modeling results indicate that all MPC surfaces showed a statistically significant (p<0.05) preference for hydrogen-bond acidic molecules. Through dipole-dipole attraction, 4-methoxythiophenol-capped MPCs can also interact with polar organics (s=1.04). Showing a unique preference for the hydrogen-bond basicity of vapors (b=1.11), 2-benzothiazolethiol-capped MPCs provide evidence of an intra-molecular proton-shift mechanism on the surface of nano-gold.
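An LSER is a multiple linear regression of the form log K = c + eE + sS + aA + bB + lL over solute descriptors, and fitting it is a least-squares problem. The sketch below uses synthetic descriptor values and synthetic coefficients (only s = 1.04 and b = 1.11 echo values quoted in the abstract); none of it is data from the study:

```python
import numpy as np

# Hypothetical solvation descriptors (E, S, A, B, L) for eight made-up VOCs
descriptors = np.array([
    [0.60, 0.52, 0.00, 0.48, 2.79],
    [0.80, 0.90, 0.26, 0.33, 3.90],
    [0.21, 0.42, 0.37, 0.48, 2.60],
    [0.00, 0.00, 0.00, 0.00, 1.50],
    [0.61, 1.11, 0.00, 0.44, 3.94],
    [0.72, 0.65, 0.57, 0.32, 3.50],
    [0.82, 0.59, 0.00, 0.10, 3.78],
    [0.40, 0.95, 0.44, 0.60, 3.10],
])
true_coeffs = np.array([0.2, 1.04, 1.8, 1.11, 0.5])  # e, s, a, b, l (s and b echo the abstract)
intercept = -0.9
log_k = descriptors @ true_coeffs + intercept  # noise-free synthetic responses

# Fit log K = c + e*E + s*S + a*A + b*B + l*L by ordinary least squares
X = np.hstack([np.ones((len(descriptors), 1)), descriptors])
coeffs, *_ = np.linalg.lstsq(X, log_k, rcond=None)
print(np.round(coeffs, 2))  # recovers [c, e, s, a, b, l]
```

In a real application the fitted coefficients (e, s, a, b, l) are what characterize the absorbent's interaction chemistry, e.g. a large a indicating preference for hydrogen-bond acidic vapors.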
Basic statistics with Microsoft Excel: a review.
Divisi, Duilio; Di Leonardo, Gabriella; Zaccagna, Gino; Crisci, Roberto
2017-06-01
The scientific world is enriched daily with new knowledge, due to new technologies and continuous discoveries. Mathematical functions explain statistical concepts, particularly those of mean, median and mode, along with those of frequency and frequency distribution associated with histograms and graphical representations, determining elaborative processes on the basis of spreadsheet operations. The aim of this study is to highlight the mathematical basis of the statistical models that govern the operation of spreadsheets in Microsoft Excel.
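The spreadsheet functions the review covers (=AVERAGE, =MEDIAN, =MODE) have direct counterparts in Python's standard statistics module, which can serve as a quick cross-check of spreadsheet results:

```python
import statistics

data = [2, 3, 3, 5, 7, 10]

# Spreadsheet counterparts: =AVERAGE(data), =MEDIAN(data), =MODE(data)
print(statistics.mean(data))    # arithmetic mean
print(statistics.median(data))  # middle value (average of the two middle values here)
print(statistics.mode(data))    # most frequent value
```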
A. C. C. Fact Book: A Statistical Profile of Allegany Community College and the Community It Serves.
ERIC Educational Resources Information Center
Andersen, Roger C.
This document is intended to be an authoritative compilation of frequently referenced basic facts concerning Allegany Community College (ACC) in Maryland. It is a statistical profile of ACC and the community it serves, divided into six sections: enrollment, students, faculty, community, support services, and general college related information.…
Basic Mathematics Test Predicts Statistics Achievement and Overall First Year Academic Success
ERIC Educational Resources Information Center
Fonteyne, Lot; De Fruyt, Filip; Dewulf, Nele; Duyck, Wouter; Erauw, Kris; Goeminne, Katy; Lammertyn, Jan; Marchant, Thierry; Moerkerke, Beatrijs; Oosterlinck, Tom; Rosseel, Yves
2015-01-01
In the psychology and educational science programs at Ghent University, only 36.1% of the new incoming students in 2011 and 2012 passed all exams. Despite availability of information, many students underestimate the scientific character of social science programs. Statistics courses are a major obstacle in this matter. Not all enrolling students…
ERIC Educational Resources Information Center
Schweizer, Karl; Steinwascher, Merle; Moosbrugger, Helfried; Reiss, Siegbert
2011-01-01
The development of research methodology competency is a major aim of the psychology curriculum at universities. Usually, three courses concentrating on basic statistics, advanced statistics and experimental methods, respectively, serve the achievement of this aim. However, this traditional curriculum-based course structure gives rise to the…
ERIC Educational Resources Information Center
Maric, Marija; Wiers, Reinout W.; Prins, Pier J. M.
2012-01-01
Despite guidelines and repeated calls from the literature, statistical mediation analysis in youth treatment outcome research is rare. Even more concerning is that many studies that "have" reported mediation analyses do not fulfill basic requirements for mediation analysis, providing inconclusive data and clinical implications. As a result, after…
Statistical estimators for monitoring spotted owls in Oregon and Washington in 1987.
Timothy A. Max; Ray A. Souter; Kathleen A. O'Halloran
1990-01-01
Spotted owls (Strix occidentalis) were monitored on 11 National Forests in the Pacific Northwest Region of the USDA Forest Service between March and August of 1987. The basic intent of monitoring was to provide estimates of occupancy and reproduction rates for pairs of spotted owls. This paper documents the technical details of the statistical...
Statistical techniques for sampling and monitoring natural resources
Hans T. Schreuder; Richard Ernst; Hugo Ramirez-Maldonado
2004-01-01
We present the statistical theory of inventory and monitoring from a probabilistic point of view. We start with the basics and show the interrelationships between designs and estimators illustrating the methods with a small artificial population as well as with a mapped realistic population. For such applications, useful open source software is given in Appendix 4....
Peer-Assisted Learning in Research Methods and Statistics
ERIC Educational Resources Information Center
Stone, Anna; Meade, Claire; Watling, Rosamond
2012-01-01
Feedback from students on a Level 1 Research Methods and Statistics module, studied as a core part of a BSc Psychology programme, highlighted demand for additional tutorials to help them to understand basic concepts. Students in their final year of study commonly request work experience to enhance their employability. All students on the Level 1…
Adult Basic and Secondary Education Program Statistics. Fiscal Year 1976.
ERIC Educational Resources Information Center
Cain, Sylvester H.; Whalen, Barbara A.
Reports submitted to the National Center for Education Statistics provided data for this compilation and tabulation of data on adult participants in U.S. educational programs in fiscal year 1976. In the summary section introducing the charts, it is noted that adult education programs funded under P.L. 91-230 served over 1.6 million persons--an…
ERIC Educational Resources Information Center
Goodman, Leroy V., Ed.
This is the third edition of the Education Almanac, an assemblage of statistics, facts, commentary, and basic background information about the conduct of schools in the United States. Features of this variegated volume include an introductory section on "Education's Newsiest Developments," followed by some vital educational statistics, a set of…
Theory of Financial Risk and Derivative Pricing
NASA Astrophysics Data System (ADS)
Bouchaud, Jean-Philippe; Potters, Marc
2009-01-01
Foreword; Preface; 1. Probability theory: basic notions; 2. Maximum and addition of random variables; 3. Continuous time limit, Ito calculus and path integrals; 4. Analysis of empirical data; 5. Financial products and financial markets; 6. Statistics of real prices: basic results; 7. Non-linear correlations and volatility fluctuations; 8. Skewness and price-volatility correlations; 9. Cross-correlations; 10. Risk measures; 11. Extreme correlations and variety; 12. Optimal portfolios; 13. Futures and options: fundamental concepts; 14. Options: hedging and residual risk; 15. Options: the role of drift and correlations; 16. Options: the Black and Scholes model; 17. Options: some more specific problems; 18. Options: minimum variance Monte-Carlo; 19. The yield curve; 20. Simple mechanisms for anomalous price statistics; Index of most important symbols; Index.
Theory of Financial Risk and Derivative Pricing - 2nd Edition
NASA Astrophysics Data System (ADS)
Bouchaud, Jean-Philippe; Potters, Marc
2003-12-01
Foreword; Preface; 1. Probability theory: basic notions; 2. Maximum and addition of random variables; 3. Continuous time limit, Ito calculus and path integrals; 4. Analysis of empirical data; 5. Financial products and financial markets; 6. Statistics of real prices: basic results; 7. Non-linear correlations and volatility fluctuations; 8. Skewness and price-volatility correlations; 9. Cross-correlations; 10. Risk measures; 11. Extreme correlations and variety; 12. Optimal portfolios; 13. Futures and options: fundamental concepts; 14. Options: hedging and residual risk; 15. Options: the role of drift and correlations; 16. Options: the Black and Scholes model; 17. Options: some more specific problems; 18. Options: minimum variance Monte-Carlo; 19. The yield curve; 20. Simple mechanisms for anomalous price statistics; Index of most important symbols; Index.
NASA Astrophysics Data System (ADS)
Rubtsov, Vladimir; Kapralov, Sergey; Chalyk, Iuri; Ulianova, Onega; Ulyanov, Sergey
2013-02-01
Statistical properties of laser speckles formed in the skin and in the mucosa of the colon have been analyzed and compared. It is demonstrated that the first- and second-order statistics of "skin" speckles and "mucosal" speckles are quite different, and that speckles formed in mucosa are not Gaussian. The layered structure of the colon mucosa leads to the formation of doubly scattered ("speckled") biospeckles. First- and second-order statistics of such speckled speckles are reviewed in this paper, and the statistical properties of Fresnel and Fraunhofer doubly scattered and cascade speckles are described. Non-Gaussian statistics of biospeckles may lead to strong localization of coherent light intensity in human tissue during laser surgery; a way of suppressing these highly localized non-Gaussian speckles is suggested.
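For reference, the fully developed (Gaussian) speckle that the "speckled speckles" above deviate from can be sketched as a sum of many random phasors; its intensity contrast (standard deviation over mean) tends to 1. This is a generic illustration, not the authors' model, and all parameters are made up:

```python
import cmath
import math
import random

def speckle_intensity(n_phasors, rng):
    """Sum n random unit phasors and return the resulting intensity |A|^2."""
    a = sum(cmath.exp(1j * rng.uniform(0.0, 2.0 * math.pi)) for _ in range(n_phasors))
    return abs(a) ** 2

def speckle_contrast(n_samples=4000, n_phasors=100, seed=1):
    """Contrast C = std(I) / mean(I); C -> 1 for fully developed Gaussian speckle."""
    rng = random.Random(seed)
    ints = [speckle_intensity(n_phasors, rng) for _ in range(n_samples)]
    mean = sum(ints) / len(ints)
    var = sum((i - mean) ** 2 for i in ints) / len(ints)
    return math.sqrt(var) / mean

if __name__ == "__main__":
    print(round(speckle_contrast(), 2))
```

A contrast measurably below (or above) 1 is one simple first-order signature of the non-Gaussian statistics the abstract describes.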
CORSSA: Community Online Resource for Statistical Seismicity Analysis
NASA Astrophysics Data System (ADS)
Zechar, J. D.; Hardebeck, J. L.; Michael, A. J.; Naylor, M.; Steacy, S.; Wiemer, S.; Zhuang, J.
2011-12-01
Statistical seismology is critical to the understanding of seismicity, the evaluation of proposed earthquake prediction and forecasting methods, and the assessment of seismic hazard. Unfortunately, despite its importance to seismology (especially to those aspects with great impact on public policy), statistical seismology is mostly ignored in the education of seismologists, and there is no central repository for the existing open-source software tools. To remedy these deficiencies, and with the broader goal to enhance the quality of statistical seismology research, we have begun building the Community Online Resource for Statistical Seismicity Analysis (CORSSA, www.corssa.org). We anticipate that the users of CORSSA will range from beginning graduate students to experienced researchers. More than 20 scientists from around the world met for a week in Zurich in May 2010 to kick-start the creation of CORSSA: the format and initial table of contents were defined; a governing structure was organized; and workshop participants began drafting articles. CORSSA materials are organized with respect to six themes, each of which will contain between four and eight articles. CORSSA now includes seven articles, with an additional six in draft form, along with forums for discussion, a glossary, and news about upcoming meetings, special issues, and recent papers. Each article is peer-reviewed and presents a balanced discussion, including illustrative examples and code snippets. Topics in the initial set of articles include: introductions to both CORSSA and statistical seismology; basic statistical tests and their role in seismology; understanding seismicity catalogs and their problems; basic techniques for modeling seismicity; and methods for testing earthquake predictability hypotheses. We have also begun curating a collection of statistical seismology software packages.
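A representative example of the "basic techniques for modeling seismicity" that CORSSA covers is maximum-likelihood estimation of the Gutenberg-Richter b-value (Aki's estimator). A minimal sketch on a synthetic catalog (not CORSSA code; the magnitudes and completeness threshold below are invented):

```python
import math
import random

def b_value_aki(mags, m_min):
    """Aki (1965) maximum-likelihood b-value for magnitudes at or above the
    completeness threshold m_min: b = log10(e) / (mean(M) - m_min)."""
    m = [x for x in mags if x >= m_min]
    return math.log10(math.e) / (sum(m) / len(m) - m_min)

# Synthetic catalog drawn from a Gutenberg-Richter law with b = 1.0 and
# completeness magnitude 2.0 (magnitude excesses are exponential).
rng = random.Random(7)
b_true = 1.0
beta = b_true * math.log(10.0)
catalog = [2.0 + rng.expovariate(beta) for _ in range(20000)]
print(round(b_value_aki(catalog, 2.0), 2))
```

With 20,000 synthetic events the estimator recovers b to within about one percent; for binned real catalogs a half-bin correction to m_min is normally added.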
Predicting Success in Psychological Statistics Courses.
Lester, David
2016-06-01
Many students perform poorly in courses on psychological statistics, and it is useful to be able to predict which students will have difficulties. In a study of 93 undergraduates enrolled in Statistical Methods (18 men, 75 women; M age = 22.0 years, SD = 5.1), performance was significantly associated with sex (female students performed better) and proficiency in algebra in a linear regression analysis. Anxiety about statistics was not associated with course performance, indicating that basic mathematical skills are the best correlate for performance in statistics courses and can usefully be used to stream students into classes by ability. © The Author(s) 2016.
Patino, Robert
2009-03-01
Clinical and basic scientists at academic medical and biomedical research institutions often form ideas that could have both monetary and human health benefits if developed and applied to improvement of human wellbeing. However, such ideas lose much of their potential value in both regards if they are disclosed in traditional knowledge-sharing forums such as abstracts, posters, and oral presentations at research meetings. Learning the basics about intellectual property protection and obtaining professional guidance in the management of intellectual property from a knowledgeable technology management professional or intellectual property attorney can avoid such losses yet pose a minimal burden of confidentiality on the investigator. Knowing how to successfully navigate the early stages of intellectual property protection can greatly increase the likelihood that discoveries and knowledge will become available for the public good without diminishing the important mandate of disseminating knowledge through traditional knowledge-sharing forums.
Speech enhancement using the modified phase-opponency model.
Deshmukh, Om D; Espy-Wilson, Carol Y; Carney, Laurel H
2007-06-01
In this paper we present a model called the Modified Phase-Opponency (MPO) model for single-channel speech enhancement when the speech is corrupted by additive noise. The MPO model is based on the auditory PO model, proposed for detection of tones in noise. The PO model includes a physiologically realistic mechanism for processing the information in neural discharge times and exploits the frequency-dependent phase properties of the tuned filters in the auditory periphery by using a cross-auditory-nerve-fiber coincidence detection for extracting temporal cues. The MPO model alters the components of the PO model such that the basic functionality of the PO model is maintained but the properties of the model can be analyzed and modified independently. The MPO-based speech enhancement scheme does not need to estimate the noise characteristics nor does it assume that the noise satisfies any statistical model. The MPO technique leads to the lowest value of the LPC-based objective measures and the highest value of the perceptual evaluation of speech quality measure compared to other methods when the speech signals are corrupted by fluctuating noise. Combining the MPO speech enhancement technique with our aperiodicity, periodicity, and pitch detector further improves its performance.
Identification of elastic, dielectric, and piezoelectric constants in piezoceramic disks.
Perez, Nicolas; Andrade, Marco A B; Buiochi, Flavio; Adamowski, Julio C
2010-12-01
Three-dimensional modeling of piezoelectric devices requires a precise knowledge of piezoelectric material parameters. The commonly used piezoelectric materials belong to the 6mm symmetry class, which have ten independent constants. In this work, a methodology to obtain precise material constants over a wide frequency band through finite element analysis of a piezoceramic disk is presented. Given an experimental electrical impedance curve and a first estimate for the piezoelectric material properties, the objective is to find the material properties that minimize the difference between the electrical impedance calculated by the finite element method and that obtained experimentally by an electrical impedance analyzer. The methodology consists of four basic steps: experimental measurement, identification of vibration modes and their sensitivity to material constants, a preliminary identification algorithm, and final refinement of the material constants using an optimization algorithm. The application of the methodology is exemplified using a hard lead zirconate titanate piezoceramic. The same methodology is applied to a soft piezoceramic. The errors in the identification of each parameter are statistically estimated in both cases, and are less than 0.6% for elastic constants, and less than 6.3% for dielectric and piezoelectric constants.
The ISO View of Star Forming Galaxies
NASA Astrophysics Data System (ADS)
Helou, George
1999-01-01
ISO studies of normal galaxies in the local Universe have revealed basic new properties whose significant implications for the star formation process and cosmology are only starting to be understood. This review will touch on the general results of a statistical nature, and provide a quick summary of the profusion of exciting results on individual objects. In the mid-infrared, PHT-S has established that the spectra of star forming galaxies between 6 and 13 microns are dominated by the Aromatic Features in Emission (AFE), and show little variation as a function of the heating intensity. The Carriers of the AFE (CAFE) are thus a universal component of dust with standard properties, and contribute between 10 and 25% of the total dust luminosity. In addition to AFE, the spectra show a low-level continuum detectable at wavelengths longer than 3.5 microns whose origin is still under investigation. The mid-infrared colors formed as the ratio of flux densities in the 6.75 micron and the 15 micron bands of ISO-CAM remain essentially constant and near unity for quiescent and mildly active galaxies. As dust heating increases further, the 15 micron flux increases steeply compared to the 6.75 micron flux, indicating that dust heated to 100 K
Dinis, L.; Petrov, D.; Parrondo, J. M. R.; Rica, R. A.
2016-01-01
The Carnot cycle imposes a fundamental upper limit to the efficiency of a macroscopic motor operating between two thermal baths1. However, this bound needs to be reinterpreted at microscopic scales, where molecular bio-motors2 and some artificial micro-engines3–5 operate. As described by stochastic thermodynamics6,7, energy transfers in microscopic systems are random and thermal fluctuations induce transient decreases of entropy, allowing for possible violations of the Carnot limit8. Here we report an experimental realization of a Carnot engine with a single optically trapped Brownian particle as the working substance. We present an exhaustive study of the energetics of the engine and analyse the fluctuations of the finite-time efficiency, showing that the Carnot bound can be surpassed for a small number of non-equilibrium cycles. As its macroscopic counterpart, the energetics of our Carnot device exhibits basic properties that one would expect to observe in any microscopic energy transducer operating with baths at different temperatures9–11. Our results characterize the sources of irreversibility in the engine and the statistical properties of the efficiency—an insight that could inspire new strategies in the design of efficient nano-motors. PMID:27330541
DOE Office of Scientific and Technical Information (OSTI.GOV)
Prima, Eka Cahya; Computational Material Design and Quantum Engineering Laboratory, Engineering Physics, Institut Teknologi Bandung; International Program on Science Education, Universitas Pendidikan Indonesia
2015-09-30
The aglycones of anthocyanidin dyes were previously reported to form carbinol pseudobase, cis-chalcone, and trans-chalcone forms at basic pH levels. Ground- and excited-state properties of the dyes were further characterized using density functional theory at the PCM(UFF)/B3LYP/6-31+G(d,p) level in basic solutions. However, to the best of our knowledge, a theoretical investigation of their potential as photosensitizers has never been reported before. In this paper, the theoretical photovoltaic properties of cells sensitized by these dyes have been successfully investigated, including the electron injections, the ground- and excited-state oxidation potentials, the estimated open circuit voltages, and the light harvesting efficiencies. The results prove that the electronic properties represented by the dyes' LUMO-HOMO levels affect the photovoltaic performance. Cis-chalcone is the best anthocyanidin aglycone dye, with an electron injection spontaneity of -1.208 eV, a theoretical open circuit voltage of 1.781 V, and a light harvesting efficiency of 56.55%, owing to its favorable HOMO-LUMO levels. Moreover, the ethanol solvent contributes slightly to better cell performance than the water solvent because of better stabilization of the oxidation potential in the ground state as well as in the excited state. These results are in good agreement with the known experimental report that the aglycones of anthocyanidin dyes in basic solvent are high-potential photosensitizers for dye-sensitized solar cells.
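Two of the quantities named above can be sketched with generic dye-sensitized-solar-cell bookkeeping. The light harvesting efficiency is conventionally estimated from the oscillator strength f of the main absorption band as LHE = 1 - 10^(-f), and the excited-state oxidation potential as the ground-state value minus the 0-0 transition energy. This is the standard textbook convention, not the paper's code, and the numbers in the test are illustrative:

```python
def light_harvesting_efficiency(f_osc):
    """LHE = 1 - 10**(-f): conventional estimate of the fraction of incident
    photons absorbed, from the oscillator strength f of the main band."""
    return 1.0 - 10.0 ** (-f_osc)

def excited_state_oxidation_potential(e_ox_ground_ev, e_00_ev):
    """E_ox* ~= E_ox - E(0-0): standard approximation for the excited-state
    oxidation potential from the ground-state value and the 0-0 energy (eV)."""
    return e_ox_ground_ev - e_00_ev
```

A more negative E_ox* relative to the semiconductor conduction band signals a more spontaneous electron injection, which is the sign convention behind the "-1.208 eV" figure quoted in the abstract.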
NASA Astrophysics Data System (ADS)
Wright, Robyn; Thornberg, Steven M.
SEDIDAT is a series of compiled IBM-BASIC (version 2.0) programs that direct the collection, statistical calculation, and graphic presentation of particle settling velocity and equivalent spherical diameter for samples analyzed using the settling tube technique. The programs follow a menu-driven format that is easily understood by students and scientists with little previous computer experience. Settling velocity is measured directly (cm/sec) and also converted into Chi units. Equivalent spherical diameter (reported in Phi units) is calculated using a modified Gibbs equation for different particle densities. Input parameters, such as water temperature, settling distance, particle density, run time, and Phi/Chi interval, are changed easily at operator discretion. Optional output to a dot-matrix printer includes a summary of moment and graphic statistical parameters, a tabulation of individual and cumulative weight percents, a listing of major distribution modes, and cumulative and histogram plots of raw time, settling velocity, Chi, and Phi data.
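The unit conventions mentioned in the abstract can be sketched as follows. Phi = -log2(d / 1 mm) is the standard grain-size scale; the Chi scale for settling velocity is assumed here to be the analogous -log2(w / 1 cm/s). Because the constants of the modified Gibbs equation are not given in the abstract, a Stokes'-law diameter is used below as a small-particle stand-in:

```python
import math

def phi_from_diameter(d_mm):
    """Grain size in Phi units: Phi = -log2(d / 1 mm)."""
    return -math.log2(d_mm)

def chi_from_velocity(w_cm_s):
    """Settling velocity in Chi units, assuming Chi = -log2(w / 1 cm/s)."""
    return -math.log2(w_cm_s)

def stokes_diameter_mm(w_cm_s, rho_s=2.65, rho_f=1.0, mu_poise=0.01, g=981.0):
    """Equivalent spherical diameter (mm) from Stokes' law in CGS units:
    w = g d^2 (rho_s - rho_f) / (18 mu). SEDIDAT itself uses a modified
    Gibbs equation; Stokes is only a small-particle stand-in."""
    d_cm = math.sqrt(18.0 * mu_poise * w_cm_s / (g * (rho_s - rho_f)))
    return 10.0 * d_cm

print(phi_from_diameter(0.25))  # 0.25 mm sand -> 2.0 Phi
```

The Phi/Chi interval input mentioned above is simply the bin width used when tabulating these logarithmic units.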
[Comment on] Statistical discrimination
NASA Astrophysics Data System (ADS)
Chinn, Douglas
In the December 8, 1981, issue of Eos, a news item reported the conclusion of a National Research Council study that sexual discrimination against women with Ph.D.'s exists in the field of geophysics. Basically, the item reported that even when allowances are made for motherhood the percentage of female Ph.D.'s holding high university and corporate positions is significantly lower than the percentage of male Ph.D.'s holding the same types of positions. The sexual discrimination conclusion, based only on these statistics, assumes that there are no basic psychological differences between men and women that might cause different populations in the employment group studied. Therefore, the reasoning goes, after taking into account possible effects from differences related to anatomy, such as women stopping their careers in order to bear and raise children, the statistical distributions of positions held by male and female Ph.D.'s ought to be very similar to one another. Any significant differences between the distributions must be caused primarily by sexual discrimination.
Development and analysis of a meteorological database, Argonne National Laboratory, Illinois
Over, Thomas M.; Price, Thomas H.; Ishii, Audrey L.
2010-01-01
A database of hourly values of air temperature, dewpoint temperature, wind speed, and solar radiation from January 1, 1948, to September 30, 2003, primarily using data collected at the Argonne National Laboratory station, was developed for use in continuous-time hydrologic modeling in northeastern Illinois. Missing and apparently erroneous data values were replaced with adjusted values from nearby stations used as 'backup'. Temporal variations in the statistical properties of the data resulting from changes in measurement and data-storage methodologies were adjusted to match the statistical properties resulting from the data-collection procedures that have been in place since January 1, 1989. The adjustments were computed based on the regressions between the primary data series from Argonne National Laboratory and the backup series using data obtained during common periods; the statistical properties of the regressions were used to assign estimated standard errors to values that were adjusted or filled from other series. Each hourly value was assigned a corresponding data-source flag that indicates the source of the value and its transformations. An analysis of the data-source flags indicates that all the series in the database except dewpoint have a similar fraction of Argonne National Laboratory data, with about 89 percent for the entire period, about 86 percent from 1949 through 1988, and about 98 percent from 1989 through 2003. The dewpoint series, for which observations at Argonne National Laboratory did not begin until 1958, has only about 71 percent Argonne National Laboratory data for the entire period, about 63 percent from 1948 through 1988, and about 93 percent from 1989 through 2003, indicating a lower reliability of the dewpoint sensor. 
A basic statistical analysis of the filled and adjusted data series in the database, and a series of potential evapotranspiration computed from them using the computer program LXPET (Lamoreux Potential Evapotranspiration) also was carried out. This analysis indicates annual cycles in solar radiation and potential evapotranspiration that follow the annual cycle of extraterrestrial solar radiation, whereas temperature and dewpoint annual cycles are lagged by about 1 month relative to the solar cycle. The annual cycle of wind has a late summer minimum, and spring and fall maximums. At the annual time scale, the filled and adjusted data series and computed potential evapotranspiration have significant serial correlation and possibly have significant temporal trends. The inter-annual fluctuations of temperature and dewpoint are weakest, whereas those of wind and potential evapotranspiration are strongest.
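The backup-station adjustment described above (regress the primary series on the backup series over a common period, fill gaps with predicted values, and attach the regression's standard error to each filled value) can be sketched as follows. This is an illustration of the approach, not the study's code:

```python
import math

def fit_line(x, y):
    """Ordinary least squares y = a + b*x, returning (a, b, se), where se is
    the standard error of estimate used to flag filled-value reliability."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b = sxy / sxx
    a = my - b * mx
    resid = [yi - (a + b * xi) for xi, yi in zip(x, y)]
    se = math.sqrt(sum(r * r for r in resid) / (n - 2))
    return a, b, se

def fill_missing(primary, backup):
    """Replace None entries in the primary series with regression-adjusted
    backup values; each output is (value, estimated standard error)."""
    pairs = [(bk, pr) for pr, bk in zip(primary, backup) if pr is not None]
    a, b, se = fit_line([p[0] for p in pairs], [p[1] for p in pairs])
    out = []
    for pr, bk in zip(primary, backup):
        out.append((pr, 0.0) if pr is not None else (a + b * bk, se))
    return out
```

A production version would also carry the data-source flag per value, as the database does.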
36 CFR 902.12 - Maintenance of statistics; annual report to Congress.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 36 Parks, Forests, and Public Property 3 2010-07-01 2010-07-01 false Maintenance of statistics; annual report to Congress. 902.12 Section 902.12 Parks, Forests, and Public Property PENNSYLVANIA AVENUE DEVELOPMENT CORPORATION FREEDOM OF INFORMATION ACT General Administration § 902.12 Maintenance of statistics...
On the quantum Landau collision operator and electron collisions in dense plasmas
DOE Office of Scientific and Technical Information (OSTI.GOV)
Daligault, Jérôme, E-mail: daligaul@lanl.gov
2016-03-15
The quantum Landau collision operator, which extends the widely used Landau/Fokker-Planck collision operator to include quantum statistical effects, is discussed. The quantum extension can serve as a reference model for including electron collisions in non-equilibrium dense plasmas, in which the quantum nature of electrons cannot be neglected. In this paper, the properties of the Landau collision operator that have been useful in traditional plasma kinetic theory and plasma transport theory are extended to the quantum case. We outline basic properties in connection with the conservation laws, the H-theorem, and the global and local equilibrium distributions. We discuss the Fokker-Planck form of the operator in terms of three potentials that extend the usual two Rosenbluth potentials. We establish practical closed-form expressions for these potentials under local thermal equilibrium conditions in terms of Fermi-Dirac and Bose-Einstein integrals. We study the properties of the linearized quantum Landau operator, and extend two popular approximations used in plasma physics to include collisions in kinetic simulations. We apply the quantum Landau operator to the classic test-particle problem to illustrate the physical effects embodied in the quantum extension. We present useful closed-form expressions for the electron-ion momentum and energy transfer rates. Throughout the paper, similarities and differences between the quantum and classical Landau collision operators are emphasized.
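For context, the classical operator that the paper extends can be written, up to species-dependent constants collected in Gamma, in the standard textbook Fokker-Planck form built from the two Rosenbluth potentials (a standard result quoted here for orientation, not taken from the paper):

```latex
% Classical Landau operator in Fokker-Planck form with Rosenbluth potentials.
\begin{aligned}
C[f] &= -\frac{\partial}{\partial v_i}\!\left(\Gamma\,\frac{\partial H}{\partial v_i}\,f\right)
        + \frac{1}{2}\,\frac{\partial^2}{\partial v_i\,\partial v_j}
          \!\left(\Gamma\,\frac{\partial^2 G}{\partial v_i\,\partial v_j}\,f\right),\\
H(\mathbf{v}) &= \int \frac{f(\mathbf{v}')}{\lvert\mathbf{v}-\mathbf{v}'\rvert}\,\mathrm{d}^3v',
\qquad
G(\mathbf{v}) = \int f(\mathbf{v}')\,\lvert\mathbf{v}-\mathbf{v}'\rvert\,\mathrm{d}^3v'.
\end{aligned}
```

The paper's quantum extension replaces this two-potential structure with three potentials whose local-equilibrium forms involve Fermi-Dirac and Bose-Einstein integrals.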
New contributions to granite characterization by ultrasonic testing.
Cerrillo, C; Jiménez, A; Rufo, M; Paniagua, J; Pachón, F T
2014-01-01
Ultrasound evaluation permits the state of rocks to be determined quickly and cheaply, satisfying the demands faced by today's producers of ornamental stone, such as environmental sustainability, durability and safety of use. The basic objective of the present work is to analyse and develop the usefulness of ultrasound testing in estimating the physico-mechanical properties of granite. Various parameters related to the fast Fourier transform (FFT) and to attenuation, which, unlike the ultrasonic pulse velocity, had not previously been considered in work on this topic, have been extracted from some of the studies conducted. The experimental study was carried out on cubic specimens of 30 cm edges using longitudinal and shear wave transducers and equipment which extended the normally used natural resonance frequency range up to 500 kHz. Additionally, a validation study of the laboratory data has been conducted and some methodological improvements have been implemented. The main contribution of the work is the analysis of linear statistical correlations between the aforementioned new ultrasound parameters and physico-mechanical properties of the granites that had not previously been studied, i.e., resistance to salt crystallization and breaking load for anchors. Being properties that directly affect the durability and safety of use of granites, these correlations consolidate ultrasonics as a nondestructive method well suited to this type of material. Copyright © 2013 Elsevier B.V. All rights reserved.
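The linear statistical correlations mentioned above are, in this kind of study, typically Pearson product-moment correlations between an ultrasound parameter and a mechanical property. A minimal sketch with hypothetical specimen data (the values below are invented for illustration):

```python
import math

def pearson_r(x, y):
    """Pearson linear correlation coefficient between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

# Hypothetical specimens: ultrasonic pulse velocity (m/s) vs. breaking load (kN).
upv = [3800.0, 4100.0, 4400.0, 4600.0, 5000.0, 5300.0]
load = [18.0, 21.5, 24.0, 25.5, 29.0, 31.0]
print(round(pearson_r(upv, load), 3))
```

An |r| close to 1 between a nondestructive ultrasound parameter and a destructive-test property is what justifies substituting the former for the latter in quality control.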
Calculations of critical misfit and thickness: An overview
NASA Technical Reports Server (NTRS)
Vandermerwe, Jan H.; Jesser, W. A.
1988-01-01
This overview stresses the equilibrium/nonequilibrium nature of the physical properties, as well as the basic properties of the models, used to calculate critical misfit and critical thickness in epitaxy.
1987-08-01
HVAC duct hanger system over an extensive frequency range. The finite element, component mode synthesis, and statistical energy analysis methods are...800-5,000 Hz) analysis was conducted with Statistical Energy Analysis (SEA) coupled with a closed-form harmonic beam analysis program. These...resonances may be obtained by using a finer frequency increment. Statistical Energy Analysis The basic assumption used in SEA analysis is that within each band
ERIC Educational Resources Information Center
Taylor, Marjorie; And Others
Anodizing, Inc., Teamsters Local 162, and Mt. Hood Community College (Oregon) developed a workplace literacy program for workers at Anodizing. These workers did not have the basic skill competencies to benefit from company training efforts in statistical process control and quality assurance and were not able to advance to lead and supervisory…
ERIC Educational Resources Information Center
Vizenor, Gerald
Opportunities Unlimited is a State-wide program to provide adult basic education (ABE) and training for Indians on Minnesota reservations and in Indian communities. An administrative center in Bemidji serves communities on the Red Lake, White Earth, and Leech Lake Reservations, and a Duluth center provides ABE and training for communities on the…
A quantitative comparison of corrective and perfective maintenance
NASA Technical Reports Server (NTRS)
Henry, Joel; Cain, James
1994-01-01
This paper presents a quantitative comparison of corrective and perfective software maintenance activities. The comparison utilizes basic data collected throughout the maintenance process. The data collected are extensive and allow the impact of both types of maintenance to be quantitatively evaluated and compared. Basic statistical techniques test relationships between and among process and product data. The results show interesting similarities and important differences in both process and product characteristics.
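A typical "basic statistical technique" for comparing process data between the two maintenance types is a two-sample test such as Welch's t, which does not assume equal variances. A sketch with invented effort data (not the paper's dataset):

```python
import math

def welch_t(a, b):
    """Welch's t statistic and degrees of freedom for two independent samples
    with possibly unequal variances."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    t = (ma - mb) / math.sqrt(va / na + vb / nb)
    df = (va / na + vb / nb) ** 2 / (
        (va / na) ** 2 / (na - 1) + (vb / nb) ** 2 / (nb - 1))
    return t, df

# Hypothetical effort (hours) per change request for the two maintenance types.
corrective = [4.0, 6.5, 5.0, 7.0, 5.5, 6.0]
perfective = [9.0, 12.0, 10.5, 14.0, 11.0, 13.5]
t, df = welch_t(corrective, perfective)
```

The t statistic would then be compared against the t distribution with df degrees of freedom to decide whether the difference in effort is significant.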
ERIC Educational Resources Information Center
Joireman, Jeff; Abbott, Martin L.
This report examines the overlap between student test results on the Iowa Test of Basic Skills (ITBS) and the Washington Assessment of Student Learning (WASL). The two tests were compared and contrasted in terms of content and measurement philosophy, and analyses studied the statistical relationship between the ITBS and the WASL. The ITBS assesses…
Selection vector filter framework
NASA Astrophysics Data System (ADS)
Lukac, Rastislav; Plataniotis, Konstantinos N.; Smolka, Bogdan; Venetsanopoulos, Anastasios N.
2003-10-01
We provide a unified framework of nonlinear vector techniques that output the lowest-ranked vector. The proposed framework constitutes a generalized filter class for multichannel signal processing. A new class of nonlinear selection filters is based on robust order-statistic theory and the minimization of a weighted distance function to the other input samples. The proposed method can be designed to perform a variety of filtering operations, including previously developed techniques such as the vector median, basic vector directional filter, directional distance filter, weighted vector median filters, and weighted directional filters. A wide range of filtering operations is guaranteed by the filter structure, which has two independent weight vectors for the angular and distance domains of the vector space. To adapt the filter parameters to varying signal and noise statistics, we also provide generalized optimization algorithms that take advantage of weighted median filters and of the relationship between the standard median filter and the vector median filter. Thus, we can deal with both statistical and deterministic aspects of the filter design process. It is shown that the proposed method has the required properties: the capability of modelling the underlying system in the application at hand, robustness with respect to errors in the model of the underlying system, the availability of a training procedure, and the simplicity of filter representation, analysis, design, and implementation. Simulation studies also indicate that the new filters are computationally attractive and perform excellently in environments corrupted by bit errors and impulsive noise.
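The defining computation of such a selection filter, the vector median, can be sketched directly from its definition: output the input vector that minimizes the aggregated distance to all other vectors in the window. A minimal sketch of the unweighted L2 case only (the paper's framework additionally weights both the distance and the angular domains):

```python
def vector_median(window):
    """Return the input vector minimizing the sum of Euclidean distances to
    all other vectors in the window (basic vector median filter output)."""
    def dist(p, q):
        return sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5
    return min(window, key=lambda v: sum(dist(v, u) for u in window))

# A 3-pixel RGB window with one impulse-corrupted sample: the filter selects
# one of the original inputs (a selection filter never invents new colors).
window = [(10, 12, 11), (255, 0, 255), (11, 13, 10)]
print(vector_median(window))  # -> (10, 12, 11)
```

Because the output is always one of the inputs, impulses are rejected without blurring edges, which is the core appeal of this filter class.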
Cooper, Emily A.; Norcia, Anthony M.
2015-01-01
The nervous system has evolved in an environment with structure and predictability. One of the ubiquitous principles of sensory systems is the creation of circuits that capitalize on this predictability. Previous work has identified predictable non-uniformities in the distributions of basic visual features in natural images that are relevant to the encoding tasks of the visual system. Here, we report that the well-established statistical distributions of visual features -- such as visual contrast, spatial scale, and depth -- differ between bright and dark image components. Following this analysis, we go on to trace how these differences in natural images translate into different patterns of cortical input that arise from the separate bright (ON) and dark (OFF) pathways originating in the retina. We use models of these early visual pathways to transform natural images into statistical patterns of cortical input. The models include the receptive fields and non-linear response properties of the magnocellular (M) and parvocellular (P) pathways, with their ON and OFF pathway divisions. The results indicate that there are regularities in visual cortical input beyond those that have previously been appreciated from the direct analysis of natural images. In particular, several dark/bright asymmetries provide a potential account for recently discovered asymmetries in how the brain processes visual features, such as violations of classic energy-type models. On the basis of our analysis, we expect that the dark/bright dichotomy in natural images plays a key role in the generation of both cortical and perceptual asymmetries. PMID:26020624
Fundamentals in Biostatistics for Research in Pediatric Dentistry: Part I - Basic Concepts.
Garrocho-Rangel, J A; Ruiz-Rodríguez, M S; Pozos-Guillén, A J
The purpose of this report was to provide the reader with some basic concepts needed to better understand the significance and reliability of the results of any article on Pediatric Dentistry. Currently, Pediatric Dentists need the best evidence available in the literature on which to base their diagnoses and treatment decisions for children's oral care. A basic understanding of Biostatistics plays an important role throughout the Evidence-Based Dentistry (EBD) process. This report describes the fundamentals of Biostatistics and introduces the basic concepts used in statistics, such as summary measures, estimation, hypothesis testing, effect size, level of significance, p value, and confidence intervals, for Pediatric Dentists interested in reading or designing original clinical or epidemiological studies.
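As a concrete illustration of one concept listed above, a 95% confidence interval for a sample mean can be computed with the normal approximation; the data values below are invented for the example:

```python
# Illustrative 95% confidence interval for a sample mean using the
# normal approximation (1.96 standard errors); sample values invented.
import statistics

sample = [2.1, 2.4, 1.9, 2.8, 2.3, 2.5, 2.0, 2.6]
mean = statistics.mean(sample)
sem = statistics.stdev(sample) / len(sample) ** 0.5   # standard error of the mean
low, high = mean - 1.96 * sem, mean + 1.96 * sem
print(f"mean = {mean:.3f}, 95% CI = ({low:.3f}, {high:.3f})")
```

For small samples a t-distribution multiplier rather than 1.96 would be the more careful choice; the sketch only shows the structure of the calculation.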
Computer programs for computing particle-size statistics of fluvial sediments
Stevens, H.H.; Hubbell, D.W.
1986-01-01
Two versions of computer programs for entering data and computing particle-size statistics of fluvial sediments are presented. The FORTRAN 77 versions are for use on the Prime computer, and the BASIC versions are for use on microcomputers. The size-statistics programs compute Inman, Trask, and Folk statistical parameters from phi values and sizes determined for 10 specified percent-finer values from the input size and percent-finer data. The programs also determine the percentages of gravel, sand, silt, and clay, and the Meyer-Peter effective diameter. Documentation and listings for both versions of the programs are included. (Author's abstract)
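The Inman parameters mentioned above are simple functions of phi percentiles; the sketch below uses the standard textbook definitions (phi median, phi deviation measure, phi skewness) and is a hypothetical re-implementation, not the published FORTRAN 77 or BASIC code:

```python
# Hypothetical re-implementation of the Inman size statistics from
# phi percentiles (standard definitions; not the published programs).
def inman_statistics(phi):
    """phi maps percent-finer values (16, 50, 84) to phi sizes."""
    median = phi[50]                       # Md_phi, the phi median
    sorting = (phi[84] - phi[16]) / 2      # sigma_phi, phi deviation
    skewness = (phi[16] + phi[84] - 2 * phi[50]) / (phi[84] - phi[16])
    return median, sorting, skewness

print(inman_statistics({16: 1.0, 50: 2.0, 84: 3.5}))  # → (2.0, 1.25, 0.2)
```

The Trask and Folk parameters follow the same pattern with different percentile combinations (e.g. quartiles for Trask), which is why the programs take 10 percent-finer values as input.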
ERIC Educational Resources Information Center
Gadway, Charles J.; Wilson, H.A.
This document provides statistical data on the 1974 and 1975 Mini-Assessment of Functional Literacy, which was designed to determine the extent of functional literacy among seventeen year olds in America. Also presented are data from comparable test items from the 1971 assessment. Three standards are presented, to allow different methods of…
ERIC Educational Resources Information Center
Novak, Elena; Johnson, Tristan E.; Tenenbaum, Gershon; Shute, Valerie J.
2016-01-01
The study explored instructional benefits of a storyline gaming characteristic (GC) on learning effectiveness, efficiency, and engagement with the use of an online instructional simulation for graduate students in an introductory statistics course. A storyline is a game-design element that connects scenes with the educational content. In order to…
ERIC Educational Resources Information Center
Waesche, Jessica S. Brown; Schatschneider, Christopher; Maner, Jon K.; Ahmed, Yusra; Wagner, Richard K.
2011-01-01
Rates of agreement among alternative definitions of reading disability and their 1- and 2-year stabilities were examined using a new measure of agreement, the affected-status agreement statistic. Participants were 288,114 first through third grade students. Reading measures were "Dynamic Indicators of Basic Early Literacy Skills" Oral…
ERIC Educational Resources Information Center
Biehler, Rolf; Frischemeier, Daniel; Podworny, Susanne
2017-01-01
Connecting data and chance is fundamental in statistics curricula. The use of software like TinkerPlots can bridge both worlds because the TinkerPlots Sampler supports learners in expressive modeling. We conducted a study with elementary preservice teachers with a basic university education in statistics. They were asked to set up and evaluate…
ERIC Educational Resources Information Center
Averitt, Sallie D.
This instructor guide, which was developed for use in a manufacturing firm's advanced technical preparation program, contains the materials required to present a learning module that is designed to prepare trainees for the program's statistical process control module by improving their basic math skills in working with line graphs and teaching…
ERIC Educational Resources Information Center
Novak, Elena
2012-01-01
The study explored instructional benefits of a storyline gaming characteristic (GC) on learning effectiveness, efficiency, and engagement with the use of an online instructional simulation for graduate students in an introductory statistics course. In addition, the study focused on examining the effects of a storyline GC on specific learning…
A statistical mechanics approach to autopoietic immune networks
NASA Astrophysics Data System (ADS)
Barra, Adriano; Agliari, Elena
2010-07-01
In this work we aim to bridge theoretical immunology and disordered statistical mechanics. We introduce a model for the behavior of B-cells which naturally merges the clonal selection theory and the autopoietic network theory as a whole. From the analysis of its features we recover several basic phenomena such as low-dose tolerance, dynamical memory of antigens and self/non-self discrimination.
Mohammed A. Kalkhan; Robin M. Reich; Raymond L. Czaplewski
1996-01-01
A Monte Carlo simulation was used to evaluate the statistical properties of measures of association and the Kappa statistic under double sampling with replacement. Three error matrices were used, representing three levels of classification accuracy of Landsat TM data consisting of four forest cover types in North Carolina. The overall accuracy of the five indices ranged from 0.35...
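The Kappa statistic evaluated in the study corrects the observed agreement of an error matrix for agreement expected by chance; a minimal sketch, with an invented 2x2 matrix rather than the North Carolina data:

```python
# Minimal sketch of the Kappa statistic of agreement computed from a
# square error (confusion) matrix; the 2x2 matrix is invented, not
# drawn from the Landsat TM forest cover data.
def kappa(matrix):
    n = sum(sum(row) for row in matrix)
    observed = sum(matrix[i][i] for i in range(len(matrix))) / n
    expected = sum(
        sum(matrix[i]) * sum(row[i] for row in matrix)
        for i in range(len(matrix))
    ) / n ** 2
    return (observed - expected) / (1 - expected)

print(round(kappa([[40, 10], [5, 45]]), 3))  # → 0.7
```

Here the observed agreement is 0.85 but chance agreement is 0.50, so Kappa reports the 0.70 agreement achieved beyond chance.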
Thermodynamic and Kinetic Properties of the Electrochemical Cell.
ERIC Educational Resources Information Center
Smith, Donald E.
1983-01-01
Describes basic characteristics of the electrochemical cell. Also describes basic principles of electrochemical procedures and use of these concepts to explain use of the term "primarily" in discussions of methods primarily responsive to equilibrium cell potential, bulk ohmic resistance, and the Faradaic impedance. (JN)
NASA Astrophysics Data System (ADS)
Hendikawati, P.; Arifudin, R.; Zahid, M. Z.
2018-03-01
This study aims to design an Android Statistics Data Analysis application that can be accessed through mobile devices, making statistical tools easier for users to reach. The application covers various topics in basic statistics along with a parametric statistical data analysis module. The output of the system is a parametric statistical data analysis that can be used by students, lecturers, and other users who need the results of statistical calculations quickly and in an easily understood form. The Android application is developed in the Java programming language; the server side uses PHP with the CodeIgniter framework, and the database is MySQL. The system development methodology is the Waterfall model, with stages of analysis, design, coding, testing, implementation, and system maintenance. This statistical data analysis application is expected to support statistics lectures and to make statistical analysis on mobile devices easier for students to understand.
Properties of the ion-ion hybrid resonator in fusion plasmas
DOE Office of Scientific and Technical Information (OSTI.GOV)
Morales, George J.
2015-10-06
The project developed theoretical and numerical descriptions of the properties of ion-ion hybrid Alfvén resonators that are expected to arise in the operation of a fusion reactor. The methodology and theoretical concepts were successfully compared to observations made in basic experiments in the LAPD device at UCLA. An assessment was made of the excitation of resonator modes by energetic alpha particles for burning plasma conditions expected in the ITER device. The broader impacts included the generation of basic insight useful to magnetic fusion and space science researchers, defining new avenues for exploration in basic laboratory experiments, establishing broader contacts between experimentalists and theoreticians, completion of a Ph.D. dissertation, and promotion of interest in science through community outreach events and classroom instruction.
Some Statistical Properties of Tonality, 1650-1900
ERIC Educational Resources Information Center
White, Christopher Wm.
2013-01-01
This dissertation investigates the statistical properties present within corpora of common practice music, involving a data set of more than 8,000 works spanning from 1650 to 1900, and focusing specifically on the properties of the chord progressions contained therein. In the first chapter, methodologies concerning corpus analysis are presented…
Fundamentals of biomechanics in tissue engineering of bone.
Athanasiou, K A; Zhu, C; Lanctot, D R; Agrawal, C M; Wang, X
2000-08-01
The objective of this review is to provide basic information pertaining to biomechanical aspects of bone as they relate to tissue engineering. The review is written for the general tissue engineering reader, who may not have a biomechanical engineering background. To this end, biomechanical characteristics and properties of normal and repair cortical and cancellous bone are presented. Also, this chapter intends to describe basic structure-function relationships of these two types of bone. Special emphasis is placed on salient classical and modern testing methods, with both material and structural properties described.
NASA Astrophysics Data System (ADS)
Stošić, Dušan; Auroux, Aline
Basic principles of calorimetry coupled with other techniques are introduced. These methods are used in heterogeneous catalysis for the characterization of the acidic, basic, and redox properties of solid catalysts. Estimation of these features is achieved by monitoring the interaction of various probe molecules with the surface of such materials. An overview of gas-phase as well as liquid-phase techniques is given. Special attention is devoted to the coupled calorimetry-volumetry method. Furthermore, the influence of different experimental parameters on the results of these techniques is discussed, since they are known to significantly affect the evaluation of the catalytic properties of the investigated materials.
Theoretical approaches to the steady-state statistical physics of interacting dissipative units
NASA Astrophysics Data System (ADS)
Bertin, Eric
2017-02-01
The aim of this review is to provide a concise overview of some of the generic approaches that have been developed to deal with the statistical description of large systems of interacting dissipative ‘units’. The latter notion includes, e.g. inelastic grains, active or self-propelled particles, bubbles in a foam, low-dimensional dynamical systems like driven oscillators, or even spatially extended modes like Fourier modes of the velocity field in a fluid. We first review methods based on the statistical properties of a single unit, starting with elementary mean-field approximations, either static or dynamic, that describe a unit embedded in a ‘self-consistent’ environment. We then discuss how this basic mean-field approach can be extended to account for spatial dependences, in the form of space-dependent mean-field Fokker-Planck equations, for example. We also briefly review the use of kinetic theory in the framework of the Boltzmann equation, which is an appropriate description for dilute systems. We then turn to descriptions in terms of the full N-body distribution, starting from exact solutions of one-dimensional models, using a matrix-product ansatz method when correlations are present. Since exactly solvable models are scarce, we also present some approximation methods which can be used to determine the N-body distribution in a large system of dissipative units. These methods include the Edwards approach for dense granular matter and the approximate treatment of multiparticle Langevin equations with colored noise, which models systems of self-propelled particles. Throughout this review, emphasis is put on methodological aspects of the statistical modeling and on formal similarities between different physical problems, rather than on the specific behavior of a given system.
Facts about Congenital Heart Defects
du Prel, Jean-Baptist; Röhrig, Bernd; Blettner, Maria
2009-02-01
In the era of evidence-based medicine, one of the most important skills a physician needs is the ability to analyze scientific literature critically. This is necessary to keep medical knowledge up to date and to ensure optimal patient care. The aim of this paper is to present an accessible introduction to the critical appraisal of scientific articles. Using a selection of international literature, the reader is introduced to the principles of critical reading of scientific articles in medicine. For the sake of conciseness, detailed description of statistical methods is omitted. Widely accepted principles for critically appraising scientific articles are outlined, along with basic knowledge of study design, the structure of an article, the role of its different sections and statistical presentations, and sources of error and limitation. The reader does not require extensive methodological knowledge. Differences between research areas such as epidemiology, clinical research, and basic research are outlined as far as necessary for the critical appraisal of scientific articles. Further useful references are presented. Basic methodological knowledge is required to select and interpret scientific articles correctly.
Southeastern Community College Annual Progress Report, December 1995.
ERIC Educational Resources Information Center
Gardner, R. Gene
Presenting information on the status of Southeastern Community College (SCC), in Iowa, this annual progress report highlights basic institutional data, financial information, and improvements and planned changes of the college as of 1995. Part 1 presents basic data on SCC, including facility locations, assessed property valuation, district…
Consequences of common data analysis inaccuracies in CNS trauma injury basic research.
Burke, Darlene A; Whittemore, Scott R; Magnuson, David S K
2013-05-15
The development of successful treatments for humans after traumatic brain or spinal cord injuries (TBI and SCI, respectively) requires animal research. This effort can be hampered when promising experimental results cannot be replicated because of incorrect data analysis procedures. To identify and hopefully avoid these errors in future studies, the articles in the seven journals with the highest number of basic-science central nervous system TBI and SCI animal research studies published in 2010 (N=125 articles) were reviewed for their data analysis procedures. After identifying the most common statistical errors, the implications of those findings were demonstrated by reanalyzing previously published data from our laboratories using the identified inappropriate statistical procedures and then comparing the two sets of results. Overall, 70% of the articles contained at least one type of inappropriate statistical procedure. The highest percentage involved incorrect post hoc t-tests (56.4%), followed by inappropriate parametric statistics (analysis of variance and t-test; 37.6%). Repeated-measures analysis was inappropriately missing in 52.0% of all articles and, among those with behavioral assessments, 58% were analyzed incorrectly. Reanalysis of our published data using the most common inappropriate statistical procedures resulted in a 14.1% average increase in significant effects compared with the original results. Specifically, an increase of 15.5% occurred with independent t-tests and 11.1% after incorrect post hoc t-tests. Utilizing proper statistical procedures allows more definitive conclusions, facilitates the replicability of research results, and enables more accurate translation of those results to the clinic.
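One way to see why uncorrected post hoc t-tests inflate the count of "significant" effects: the family-wise error rate for k independent tests at alpha = 0.05 grows as 1 - (1 - alpha)^k. A minimal illustration, not drawn from the reviewed articles:

```python
# Family-wise error rate for k independent uncorrected tests at
# alpha = 0.05: running several post hoc t-tests without correction
# rapidly inflates the chance of at least one false positive.
alpha = 0.05
for k in (1, 3, 6, 10):
    fwer = 1 - (1 - alpha) ** k
    print(f"{k:2d} tests: P(at least one false positive) = {fwer:.2f}")
```

With ten uncorrected comparisons the chance of at least one spurious finding is roughly 40%, which is why corrected post hoc procedures (or an omnibus ANOVA followed by appropriate contrasts) are required.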
Interpretation of correlations in clinical research.
Hung, Man; Bounsanga, Jerry; Voss, Maren Wright
2017-11-01
Critically analyzing research is a key skill in evidence-based practice and requires knowledge of research methods, results interpretation, and applications, all of which rely on a foundation based in statistics. Evidence-based practice makes high demands on trained medical professionals to interpret an ever-expanding array of research evidence. As clinical training emphasizes medical care rather than statistics, it is useful to review the basics of statistical methods and what they mean for interpreting clinical studies. We reviewed the basic concepts of correlational associations, violations of normality, unobserved variable bias, sample size, and alpha inflation. The foundations of causal inference were discussed and sound statistical analyses were examined. We discuss four ways in which correlational analysis is misused, including causal inference overreach, over-reliance on significance, alpha inflation, and sample size bias. Recent published studies in the medical field provide evidence of causal assertion overreach drawn from correlational findings. The findings present a primer on the assumptions and nature of correlational methods of analysis and urge clinicians to exercise appropriate caution as they critically analyze the evidence before them and evaluate evidence that supports practice. Critically analyzing new evidence requires statistical knowledge in addition to clinical knowledge. Studies can overstate relationships, expressing causal assertions when only correlational evidence is available. Failure to account for the effect of sample size in the analyses tends to overstate the importance of predictive variables. It is important not to overemphasize the statistical significance without consideration of effect size and whether differences could be considered clinically meaningful.
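The sample-size point above can be made concrete: the t statistic for a Pearson correlation is t = r * sqrt((n - 2) / (1 - r^2)), so with a large n even a negligible r clears the significance cutoff. A minimal sketch with invented numbers:

```python
# t statistic for a Pearson correlation coefficient r at sample size
# n; with n large, a tiny correlation is "statistically significant"
# despite a trivial effect size (numbers invented for illustration).
import math

def t_for_r(r, n):
    return r * math.sqrt((n - 2) / (1 - r ** 2))

print(t_for_r(0.05, 10_000))           # ≈ 5.0, far beyond the ~1.96 cutoff
print(f"r^2 = {0.05 ** 2:.4f}")        # only 0.25% of variance explained
```

This is exactly the over-reliance-on-significance problem: the p value shrinks with n, while the clinically relevant quantity (the effect size r or r^2) does not change.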
Building the Community Online Resource for Statistical Seismicity Analysis (CORSSA)
NASA Astrophysics Data System (ADS)
Michael, A. J.; Wiemer, S.; Zechar, J. D.; Hardebeck, J. L.; Naylor, M.; Zhuang, J.; Steacy, S.; Corssa Executive Committee
2010-12-01
Statistical seismology is critical to the understanding of seismicity, the testing of proposed earthquake prediction and forecasting methods, and the assessment of seismic hazard. Unfortunately, despite its importance to seismology - especially to those aspects with great impact on public policy - statistical seismology is mostly ignored in the education of seismologists, and there is no central repository for the existing open-source software tools. To remedy these deficiencies, and with the broader goal to enhance the quality of statistical seismology research, we have begun building the Community Online Resource for Statistical Seismicity Analysis (CORSSA). CORSSA is a web-based educational platform that is authoritative, up-to-date, prominent, and user-friendly. We anticipate that the users of CORSSA will range from beginning graduate students to experienced researchers. More than 20 scientists from around the world met for a week in Zurich in May 2010 to kick-start the creation of CORSSA: the format and initial table of contents were defined; a governing structure was organized; and workshop participants began drafting articles. CORSSA materials are organized with respect to six themes, each containing between four and eight articles. The CORSSA web page, www.corssa.org, officially unveiled on September 6, 2010, debuts with an initial set of approximately 10 to 15 articles available online for viewing and commenting, with additional articles to be added over the coming months. Each article will be peer-reviewed and will present a balanced discussion, including illustrative examples and code snippets. Topics in the initial set of articles will include: introductions to both CORSSA and statistical seismology; basic statistical tests and their role in seismology; understanding seismicity catalogs and their problems; basic techniques for modeling seismicity; and methods for testing earthquake predictability hypotheses. A special article will compare and review available statistical seismology software packages.
ERIC Educational Resources Information Center
Papaphotis, Georgios; Tsaparlis, Georgios
2008-01-01
Part 1 of the findings are presented of a quantitative study (n = 125) on basic quantum chemical concepts taught in the twelfth grade (age 17-18 years) in Greece. A paper-and-pencil test of fourteen questions was used. The study compared performance in five questions that tested recall of knowledge or application of algorithmic procedures (type-A…
Processing-Microstructure-Property Relationships for Cold Spray Powder Deposition of Al-Cu Alloys
2015-06-01
Thesis by Jeremy D. Leazer, June 2015; Thesis Advisor: Sarath K... The thesis establishes basic microstructure-mechanical property relationships for cold spray deposited Al-Cu alloy coatings and characterizes the microstructure and dynamic mechanical behavior of the deposited materials.
Variational Bayesian Parameter Estimation Techniques for the General Linear Model
Starke, Ludger; Ostwald, Dirk
2017-01-01
Variational Bayes (VB), variational maximum likelihood (VML), restricted maximum likelihood (ReML), and maximum likelihood (ML) are cornerstone parametric statistical estimation techniques in the analysis of functional neuroimaging data. However, the theoretical underpinnings of these model parameter estimation techniques are rarely covered in introductory statistical texts. Because of the widespread practical use of VB, VML, ReML, and ML in the neuroimaging community, we reasoned that a theoretical treatment of their relationships and their application in a basic modeling scenario may be helpful for both neuroimaging novices and practitioners alike. In this technical study, we thus revisit the conceptual and formal underpinnings of VB, VML, ReML, and ML and provide a detailed account of their mathematical relationships and implementational details. We further apply VB, VML, ReML, and ML to the general linear model (GLM) with non-spherical error covariance as commonly encountered in the first-level analysis of fMRI data. To this end, we explicitly derive the corresponding free energy objective functions and ensuing iterative algorithms. Finally, in the applied part of our study, we evaluate the parameter and model recovery properties of VB, VML, ReML, and ML, first in an exemplary setting and then in the analysis of experimental fMRI data acquired from a single participant under visual stimulation. PMID:28966572
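As a toy illustration of the simplest case discussed above: under spherical Gaussian noise, the ML estimate for the GLM y = X beta + e reduces to ordinary least squares, beta_hat = (X'X)^{-1} X'y. The two-parameter example below is a minimal sketch, not the estimation machinery used for fMRI data:

```python
# Toy ML estimation for the GLM with spherical noise: a two-column
# design matrix (intercept + regressor) admits the closed-form OLS
# solution, written out here in pure Python.
def ols_2param(x, y):
    """Fit y = intercept + slope * x by ordinary least squares."""
    n = len(x)
    sx, sy = sum(x), sum(y)
    sxx = sum(v * v for v in x)
    sxy = sum(a * b for a, b in zip(x, y))
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    return intercept, slope

x = [0.0, 1.0, 2.0, 3.0]
y = [1.0, 3.0, 5.0, 7.0]     # exactly y = 1 + 2x, so no residual noise
print(ols_2param(x, y))      # → (1.0, 2.0)
```

VB, VML, and ReML become necessary precisely when this spherical-noise assumption fails, e.g. with the serially correlated errors of fMRI time series, where the error covariance itself must be estimated.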
Kamath, Padmaja; Fernandez, Alberto; Giralt, Francesc; Rallo, Robert
2015-01-01
Nanoparticles are likely to interact in real-case application scenarios with mixtures of proteins and biomolecules that will absorb onto their surface forming the so-called protein corona. Information related to the composition of the protein corona and net cell association was collected from literature for a library of surface-modified gold and silver nanoparticles. For each protein in the corona, sequence information was extracted and used to calculate physicochemical properties and statistical descriptors. Data cleaning and preprocessing techniques including statistical analysis and feature selection methods were applied to remove highly correlated, redundant and non-significant features. A weighting technique was applied to construct specific signatures that represent the corona composition for each nanoparticle. Using this basic set of protein descriptors, a new Protein Corona Structure-Activity Relationship (PCSAR) that relates net cell association with the physicochemical descriptors of the proteins that form the corona was developed and validated. The features that resulted from the feature selection were in line with already published literature, and the computational model constructed on these features had a good accuracy (R(2)LOO=0.76 and R(2)LMO(25%)=0.72) and stability, with the advantage that the fingerprints based on physicochemical descriptors were independent of the specific proteins that form the corona.
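The correlated-feature removal step mentioned above can be sketched as follows; `drop_correlated`, the 0.95 threshold, and the toy feature table are hypothetical illustrations, not the authors' pipeline:

```python
# Sketch of filtering highly correlated descriptors: keep a feature
# only if its absolute Pearson correlation with every already-kept
# feature stays below a threshold (toy data, invented names).
def pearson(a, b):
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = sum((x - ma) ** 2 for x in a) ** 0.5
    sb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (sa * sb)

def drop_correlated(features, threshold=0.95):
    kept = []
    for name, values in features.items():
        if all(abs(pearson(values, features[k])) < threshold for k in kept):
            kept.append(name)
    return kept

features = {
    "mass":   [1.0, 2.0, 3.0, 4.0],
    "mass2x": [2.0, 4.0, 6.0, 8.0],   # redundant rescaling of "mass"
    "charge": [0.1, -0.4, 0.3, 0.0],
}
print(drop_correlated(features))  # → ['mass', 'charge']
```

Removing such redundant descriptors before model fitting reduces overfitting risk and makes the resulting PCSAR fingerprints easier to interpret.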
Bello, Jibril Oyekunle
2013-11-14
Nigeria is one of the top three countries in Africa in terms of science research output, and Nigerian urologists' biomedical research output contributes to this. Each year, urologists in Nigeria gather to present their recent research at the conference of the Nigerian Association of Urological Surgeons (NAUS). These abstracts are not vetted as thoroughly as full-length manuscripts published in peer-reviewed journals, but the information they disseminate may affect the clinical practice of attendees. This study aims to describe the characteristics of abstracts presented at the annual conferences of NAUS, the quality of the abstracts as determined by the subsequent publication of full-length manuscripts in peer-reviewed indexed journals, and the factors that influence such successful publication. Abstracts presented at the 2007 to 2010 NAUS conferences were identified through conference abstract books. Using a strict search protocol, publication in peer-reviewed journals was determined. The abstracts' characteristics were analyzed and their quality judged by the subsequent successful publication of full-length manuscripts. Statistical analysis was performed using SPSS 16.0 software to determine factors predictive of successful publication. Only 75 abstracts were presented at the NAUS 2007 to 2010 conferences; a quarter (24%) of the presented abstracts were subsequently published as full-length manuscripts. Median time to publication was 15 months (range 2-40 months). Manuscripts whose result data were analyzed with statistics beyond basic frequencies and averages were more likely to be published than those with basic or no statistics. The quality of the abstracts, and thus subsequent publication success, is influenced by the use of such 'beyond basic' statistics in the analysis of the result data presented. There is a need for improvement in the quality of urological research from Nigeria.
Properties of the Equatorial Magnetotail Flanks ˜50-200 RE Downtail
NASA Astrophysics Data System (ADS)
Artemyev, A. V.; Angelopoulos, V.; Runov, A.; Wang, C.-P.; Zelenyi, L. M.
2017-12-01
In space, thin boundaries separating plasmas with different properties serve as a free energy source for various plasma instabilities and determine the global dynamics of large-scale systems. In planetary magnetopauses and shock waves, classical examples of such boundaries, the magnetic field makes a significant contribution to the pressure balance and plasma dynamics. The configuration and properties of such boundaries have been well investigated and modeled. However, much less is known about boundaries that form between demagnetized plasmas, where the magnetic field is not important for pressure balance. The most accessible example of such a plasma boundary is the equatorial boundary layer of the Earth's distant magnetotail. Rather limited measurements since its first encounter in the late 1970s by the International Sun-Earth Explorer-3 spacecraft have revealed the basic properties of this boundary, but its statistical properties and structure have not been studied to date. In this study, we use the Geotail and Acceleration, Reconnection, Turbulence and Electrodynamics of the Moon's Interaction with the Sun (ARTEMIS) missions to investigate the equatorial boundary layer from lunar orbit (˜55 Earth radii, RE, downtail) to as far downtail as ˜200 RE. Although the magnetic field has almost no effect on the structure of the boundary layer, the layer separates well the hot, rarefied plasma sheet from the dense, cold magnetosheath plasma. We suggest that the most important role in plasma separation is played by polarization electric fields, which modify the efficiency of magnetosheath ion penetration into the plasma sheet. We also show that the total energies (bulk flow plus thermal) of plasma sheet ions and magnetosheath ions are very similar; that is, magnetosheath ion thermalization (e.g., via ion scattering by magnetic field fluctuations) is sufficient to produce hot plasma sheet ions without any additional acceleration.
Separate and Simultaneous Adjustment of Light Qualities in a Real Scene
Pont, Sylvia C.; Heynderick, Ingrid
2017-01-01
Humans are able to estimate light field properties in a scene in that they have expectations of the objects’ appearance inside it. Previously, we probed such expectations in a real scene by asking whether a “probe object” fitted a real scene with regard to its lighting. But how well are observers able to interactively adjust the light properties on a “probe object” to its surrounding real scene? Image ambiguities can result in perceptual interactions between light properties. Such interactions formed a major problem for the “readability” of the illumination direction and diffuseness on a matte smooth spherical probe. We found that light direction and diffuseness judgments using a rough sphere as probe were slightly more accurate than when using a smooth sphere, due to the three-dimensional (3D) texture. We here extended the previous work by testing independent and simultaneous (i.e., the light field properties separated one by one or blended together) adjustments of light intensity, direction, and diffuseness using a rough probe. Independently inferred light intensities were close to the veridical values, and the simultaneously inferred light intensity interacted somewhat with the light direction and diffuseness. The independently inferred light directions showed no statistical difference with the simultaneously inferred directions. The light diffuseness inferences correlated with but contracted around medium veridical values. In summary, observers were able to adjust the basic light properties through both independent and simultaneous adjustments. The light intensity, direction, and diffuseness are well “readable” from our rough probe. Our method allows “tuning the light” (adjustment of its spatial distribution) in interfaces for lighting design or perception research. PMID:28203350
Zheng, Jie; Harris, Marcelline R; Masci, Anna Maria; Lin, Yu; Hero, Alfred; Smith, Barry; He, Yongqun
2016-09-14
Statistics play a critical role in biological and clinical research. However, most reports of scientific results in the published literature make it difficult for the reader to reproduce the statistical analyses performed in achieving those results because they provide inadequate documentation of the statistical tests and algorithms applied. The Ontology of Biological and Clinical Statistics (OBCS) is put forward here as a step towards solving this problem. The terms in OBCS, including 'data collection', 'data transformation in statistics', 'data visualization', 'statistical data analysis', and 'drawing a conclusion based on data', cover the major types of statistical processes used in basic biological research and clinical outcome studies. OBCS is aligned with the Basic Formal Ontology (BFO) and extends the Ontology of Biomedical Investigations (OBI), an OBO (Open Biological and Biomedical Ontologies) Foundry ontology supported by over 20 research communities. Currently, OBCS comprises 878 terms, representing 20 BFO classes, 403 OBI classes, 229 OBCS-specific classes, and 122 classes imported from ten other OBO ontologies. We discuss two examples illustrating how the ontology is being applied. In the first (biological) use case, we describe how OBCS was applied to represent the high-throughput microarray data analysis of immunological transcriptional profiles in human subjects vaccinated with an influenza vaccine. In the second (clinical outcomes) use case, we applied OBCS to represent the processing of electronic health care data to determine the associations between hospital staffing levels and patient mortality. Our case studies were designed to show how OBCS can be used for the consistent representation of statistical analysis pipelines under two different research paradigms. Other ongoing projects using OBCS for statistical data processing are also discussed. The OBCS source code and documentation are available at: https://github.com/obcs/obcs.
The Ontology of Biological and Clinical Statistics (OBCS) is a community-based open source ontology in the domain of biological and clinical statistics. OBCS is a timely ontology that represents statistics-related terms and their relations in a rigorous fashion, facilitates standard data analysis and integration, and supports reproducible biological and clinical research.
Comparison of holographic setups used in heat and mass transfer measurement
NASA Astrophysics Data System (ADS)
Doleček, R.; Psota, P.; Lédl, V.; Vít, T.; Kopecký, V.
2014-03-01
The authors of this paper have dealt with the measurement of heat and mass transfer for several years and have frequently used several techniques for measuring refractive index distributions based on holographic interferometry. Some of the well-known techniques have been modified and some new ones developed. Each technique can be applied with success to a different type of measurement, and each has a set of properties that makes it unique. We decided to survey a few different basic techniques and describe their properties in this paper, with the aim of helping the reader select the proper one for their measurement. The list of techniques and their properties is not comprehensive but should serve as a basic orientation in the field.
[Discussion on several basic issues of acupuncture-moxibustion science].
Wang, Guangjun
2016-10-12
Nine basic issues in acupuncture-moxibustion science are discussed in this paper. The author believes these include the universal property of acupoints, the placebo effect of acupuncture and moxibustion, the continuous transmission of acupuncture information, factors affecting the effects such as growth and acquired shape and properties, the evidence for classifying acupoint functions, the compatibility of acupoints, the change of the functional state of acupoints, and deqi. The universal property of acupoints concerns whether acupoint positions are identical across different ethnic groups. The continuous transmission of acupuncture information concerns whether the delivery, which mainly manifests as diffusion, remains active in particular regions and situations. The evidence for classifying acupoint functions concerns whether a universal biological basis exists.
ERIC Educational Resources Information Center
National Science Foundation, Washington, DC. Div. of Science Resources Studies.
Detailed statistical tables on federal funds for research and development (R&D) activities are provided in this document. Tables are organized into the following sections: research, development, and R&D plant; R&D--agency, character of work, and performer; total research--agency, performer, and field of science; basic research--agency,…
ERIC Educational Resources Information Center
National Science Foundation, Washington, DC. Div. of Science Resources Studies.
Detailed statistical tables showing the funding levels of 92 federal agencies for research and development (R&D) are provided in this document. These tables are organized into the following sections: research, development, and R&D plant; R&D agency, character of work, and performer; total basic and applied research--agency,…
WASP (Write a Scientific Paper) using Excel -5: Quartiles and standard deviation.
Grech, Victor
2018-03-01
The almost inevitable descriptive statistics exercise that is undergone once data collection is complete, prior to inferential statistics, requires the acquisition of basic descriptors, which may include the standard deviation and quartiles. This paper provides pointers as to how to do this in Microsoft Excel™ and explains the relationship between the two. Copyright © 2018 Elsevier B.V. All rights reserved.
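As a rough illustration of the descriptors this paper covers, the same quantities can be computed outside Excel; a minimal Python sketch with made-up data, using only the standard `statistics` module:

```python
import statistics

data = [2, 4, 4, 4, 5, 5, 7, 9]  # hypothetical sample

# Sample standard deviation (the analogue of Excel's STDEV.S).
sd = statistics.stdev(data)

# Quartiles: quantiles() with n=4 returns the three cut points Q1, Q2, Q3.
q1, q2, q3 = statistics.quantiles(data, n=4)

# Interquartile range, a spread measure comparable to the standard deviation.
iqr = q3 - q1
```

For roughly normal data the two spread measures are linked (IQR ≈ 1.35 standard deviations), which is the kind of relationship the paper explains.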
The maximum entropy production principle: two basic questions.
Martyushev, Leonid M
2010-05-12
The overwhelming majority of applications of maximum entropy production to ecological and environmental systems are based on thermodynamics and statistical physics. Here, we briefly discuss the maximum entropy production principle and raise two questions: (i) can this principle be used as the basis for non-equilibrium thermodynamics and statistical mechanics, and (ii) is it possible to 'prove' the principle? We adduce one more proof, which is the most concise available today.
ERIC Educational Resources Information Center
National Science Foundation, Washington, DC. Div. of Science Resources Studies.
Detailed statistical tables on federal funds for research and development (R&D) are provided in this document. Tables are organized into the following sections: research, development, and R&D plant; R&D--agency, character of work, and performer; total research--agency, performer, and field of science; basic research--agency, performer,…
Current state of the art for statistical modeling of species distributions [Chapter 16
Troy M. Hegel; Samuel A. Cushman; Jeffrey Evans; Falk Huettmann
2010-01-01
Over the past decade the number of statistical modelling tools available to ecologists to model species' distributions has increased at a rapid pace (e.g. Elith et al. 2006; Austin 2007), as have the number of species distribution models (SDM) published in the literature (e.g. Scott et al. 2002). Ten years ago, basic logistic regression (Hosmer and Lemeshow 2000)...
ERIC Educational Resources Information Center
Rahim, Syed A.
Based in part on a list developed by the United Nations Educational, Scientific, and Cultural Organization (UNESCO) for use in Afghanistan, this document presents a comprehensive checklist of items of statistical and descriptive data required for planning a national communication system. It is noted that such a system provides the vital…
Some Basic Techniques in Bioimpedance Research
NASA Astrophysics Data System (ADS)
Martinsen, Ørjan G.
2004-09-01
Any physiological or anatomical change in a biological material will also change its electrical properties. Hence, bioimpedance measurements can be used for the diagnosis or classification of tissue. Applications are numerous within medicine, biology, cosmetics, the food industry, sports, etc., and different basic approaches to the development of bioimpedance techniques are discussed in this paper.
49 CFR 24.102 - Basic acquisition policies.
Code of Federal Regulations, 2011 CFR
2011-10-01
... owner in writing. (h) Coercive action. The Agency shall not advance the time of condemnation, or defer... the owner in writing of the Agency's interest in acquiring the real property and the basic protections... appraised, except as provided in § 24.102 (c)(2), and the owner, or the owner's designated representative...
7 CFR 1780.94 - Minimum bond specifications.
Code of Federal Regulations, 2010 CFR
2010-01-01
... by the Government. The Agency address for registration purposes will be that of the Finance Office... from the sale of basic chattel or real estate security, refund of unused loan funds, cash proceeds of property insurance and similar actions which reduce the value of basic security. At the option of the...
Investigating Complexity Using Excel and Visual Basic.
ERIC Educational Resources Information Center
Zetie, K. P.
2001-01-01
Shows how some of the simple ideas in complexity can be investigated using a spreadsheet and a macro written in Visual Basic. Shows how the sandpile model of Bak, Tang, and Wiesenfeld can be simulated and animated. The model produces results that cannot easily be predicted from its properties. (Author/MM)
13 CFR 120.880 - Basic eligibility requirements.
Code of Federal Regulations, 2011 CFR
2011-01-01
.... 120.880 Section 120.880 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION BUSINESS LOANS Development Company Loan Program (504) Loan-Making Policies Specific to 504 Loans § 120.880 Basic eligibility... for a 504 loan, a small business must: (a) Use the Project Property (except that an Eligible Passive...
SIGPI. Fault Tree Cut Set System Performance
DOE Office of Scientific and Technical Information (OSTI.GOV)
Patenaude, C.J.
1992-01-13
SIGPI computes the probabilistic performance of complex systems by combining cut set or other binary product data with probability information on each basic event. SIGPI is designed to work with either coherent systems, where the system fails when certain combinations of components fail, or noncoherent systems, where at least one cut set occurs only if at least one component of the system is operating properly. The program can handle conditionally independent components, dependent components, or a combination of component types and has been used to evaluate responses to environmental threats and seismic events. The three data types that can be input are cut set data in disjoint normal form, basic component probabilities for independent basic components, and mean and covariance data for statistically dependent basic components.
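The combination step that such a code performs on cut sets in disjoint normal form can be illustrated with a toy calculation. The events and probabilities below are hypothetical, and the sketch covers only the simplest case of independent basic components:

```python
# Hypothetical basic events with independent failure probabilities.
p = {"A": 0.01, "B": 0.02, "C": 0.05}

# Cut sets already in disjoint normal form: the system fails if
# (A and B) or (A and not-B and C).  Because the product terms are
# mutually exclusive, their probabilities simply add.
cut_sets = [
    [("A", True), ("B", True)],
    [("A", True), ("B", False), ("C", True)],
]

def product_prob(term):
    """Probability of one disjoint product term, assuming independence."""
    prob = 1.0
    for event, occurs in term:
        prob *= p[event] if occurs else (1.0 - p[event])
    return prob

system_failure = sum(product_prob(t) for t in cut_sets)
```

Dependent components, as the abstract notes, require mean and covariance data instead of this simple product rule.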
Probability sampling in legal cases: Kansas cellphone users
NASA Astrophysics Data System (ADS)
Kadane, Joseph B.
2012-10-01
Probability sampling is a standard statistical technique. This article introduces the basic ideas of probability sampling, and shows in detail how probability sampling was used in a particular legal case.
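The basic idea of probability sampling can be sketched in a few lines; the frame and attribute below are hypothetical, not the Kansas cellphone data from the case:

```python
import random

random.seed(0)

# Hypothetical frame of 10,000 accounts, 30% of which have some attribute.
population = [1] * 3000 + [0] * 7000
random.shuffle(population)

# Simple random sample without replacement: every unit has a known,
# equal probability of selection, which is what makes the estimate defensible.
n = 400
sample = random.sample(population, n)

# Estimated proportion and its approximate standard error
# (ignoring the small finite-population correction).
p_hat = sum(sample) / n
se = (p_hat * (1 - p_hat) / n) ** 0.5
```

The known selection probabilities are what allow an uncertainty statement like `p_hat ± 2 * se` to be presented in a legal setting.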
Understanding your cancer prognosis
... about: Treatment Palliative care Personal matters such as finances Knowing what to expect may make it easier ... treatment. www.cancer.net/navigating-cancer-care/cancer-basics/understanding-statistics-used-guide-prognosis-and-evaluate-treatment . ...
NASA Technical Reports Server (NTRS)
Wallace, G. R.; Weathers, G. D.; Graf, E. R.
1973-01-01
The statistics of filtered pseudorandom digital sequences called hybrid-sum sequences, formed from the modulo-two sum of several maximum-length sequences, are analyzed. The results indicate that a relation exists between the statistics of the filtered sequence and the characteristic polynomials of the component maximum length sequences. An analysis procedure is developed for identifying a large group of sequences with good statistical properties for applications requiring the generation of analog pseudorandom noise. By use of the analysis approach, the filtering process is approximated by the convolution of the sequence with a sum of unit step functions. A parameter reflecting the overall statistical properties of filtered pseudorandom sequences is derived. This parameter is called the statistical quality factor. A computer algorithm to calculate the statistical quality factor for the filtered sequences is presented, and the results for two examples of sequence combinations are included. The analysis reveals that the statistics of the signals generated with the hybrid-sum generator are potentially superior to the statistics of signals generated with maximum-length generators. Furthermore, fewer calculations are required to evaluate the statistics of a large group of hybrid-sum generators than are required to evaluate the statistics of the same size group of approximately equivalent maximum-length sequences.
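A minimal sketch of the building blocks described here: two maximum-length sequences combined by a modulo-two sum. The generators below assume standard Fibonacci LFSRs with the primitive polynomials x⁴+x+1 and x⁴+x³+1, chosen for brevity; they are not the specific generators analyzed in the report:

```python
def lfsr(taps, state, length):
    """Generate bits from a Fibonacci LFSR with the given feedback taps."""
    out = []
    for _ in range(length):
        out.append(state[-1])          # output the last stage
        fb = 0
        for t in taps:                 # feedback = XOR of tapped stages
            fb ^= state[t - 1]
        state = [fb] + state[:-1]      # shift the feedback bit in
    return out

# Two maximum-length generators with different characteristic polynomials,
# each of period 2^4 - 1 = 15.
a = lfsr([4, 1], [1, 0, 0, 0], 15)   # x^4 + x + 1
b = lfsr([4, 3], [1, 0, 0, 0], 15)   # x^4 + x^3 + 1

# Hybrid-sum sequence: the modulo-two (XOR) sum of the component sequences.
hybrid = [x ^ y for x, y in zip(a, b)]
```

Each component sequence is balanced over its period (eight ones, seven zeros), one of the statistical regularities the analysis in the paper builds on.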
Physical properties of forest soils
Charles H. Perry; Michael C. Amacher
2007-01-01
Why Are Physical Properties of the Soil Important? The soil quality indicator, when combined with other data collected by the FIA program, can indicate the current rates of soil erosion, the extent and intensity of soil compaction, and some basic physical properties of the forest floor and the top 20 cm of soil. In this report, two particular physical properties of the...
Variety and volatility in financial markets
NASA Astrophysics Data System (ADS)
Lillo, Fabrizio; Mantegna, Rosario N.
2000-11-01
We study the price dynamics of stocks traded in a financial market by considering the statistical properties of both a single time series and an ensemble of stocks traded simultaneously. We use the n stocks traded on the New York Stock Exchange to form a statistical ensemble of daily stock returns. For each trading day of our database, we study the ensemble return distribution. We find that a typical ensemble return distribution exists in most of the trading days with the exception of crash and rally days and of the days following these extreme events. We analyze each ensemble return distribution by extracting its first two central moments. We observe that these moments fluctuate in time and are stochastic processes, themselves. We characterize the statistical properties of ensemble return distribution central moments by investigating their probability density functions and temporal correlation properties. In general, time-averaged and portfolio-averaged price returns have different statistical properties. We infer from these differences information about the relative strength of correlation between stocks and between different trading days. Last, we compare our empirical results with those predicted by the single-index model and we conclude that this simple model cannot explain the statistical properties of the second moment of the ensemble return distribution.
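The ensemble approach can be sketched as follows. The returns below are simulated Gaussian placeholders, not NYSE data; the point is only the mechanics of extracting the first two central moments of each trading day's ensemble distribution:

```python
import random
import statistics

random.seed(1)

# Hypothetical ensemble: daily returns of 50 stocks over 5 trading days.
n_stocks, n_days = 50, 5
returns = [
    [random.gauss(0.0, 0.02) for _ in range(n_stocks)]
    for _ in range(n_days)
]

# First two central moments of each day's ensemble return distribution:
# the cross-sectional mean and variance across stocks on that day.
ensemble_mean = [statistics.fmean(day) for day in returns]
ensemble_var = [statistics.pvariance(day) for day in returns]
```

Tracking `ensemble_mean` and `ensemble_var` over many days yields the two stochastic processes whose correlation properties the paper characterizes.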
NASA Astrophysics Data System (ADS)
Kang, Pilsang; Koo, Changhoi; Roh, Hokyu
2017-11-01
Since simple linear regression theory was established at the beginning of the 1900s, it has been used in a variety of fields. Unfortunately, it cannot be used directly for calibration. In practical calibrations, the observed measurements (the inputs) are subject to errors, and hence they vary, thus violating the assumption that the inputs are fixed. Therefore, in the case of calibration, the regression line fitted using the method of least squares is not consistent with the statistical properties of simple linear regression as already established based on this assumption. To resolve this problem, "classical regression" and "inverse regression" have been proposed. However, they do not completely resolve the problem. As a fundamental solution, we introduce "reversed inverse regression" along with a new methodology for deriving its statistical properties. In this study, the statistical properties of this regression are derived using the "error propagation rule" and the "method of simultaneous error equations" and are compared with those of the existing regression approaches. The accuracy of the statistical properties thus derived is investigated in a simulation study. We conclude that the newly proposed regression and methodology constitute the complete regression approach for univariate linear calibrations.
Dexter, Franklin; O'Neill, Liam; Xin, Lei; Ledolter, Johannes
2008-12-01
We use resampling of data to explore the basic statistical properties of super-efficient data envelopment analysis (DEA) when used as a benchmarking tool by the manager of a single decision-making unit. Our focus is the gaps in the outputs (i.e., slacks adjusted for upward bias), as they reveal which outputs can be increased. The numerical experiments show that the estimates of the gaps fail to exhibit asymptotic consistency, a property expected for standard statistical inference. Specifically, increased sample sizes were not always associated with more accurate forecasts of the output gaps. The baseline DEA's gaps equaled the mode of the jackknife and the mode of resampling with/without replacement from any subset of the population; usually, the baseline DEA's gaps also equaled the median. The quartile deviations of gaps were close to zero when few decision-making units were excluded from the sample and the study unit happened to have few other units contributing to its benchmark. The results for the quartile deviations can be explained in terms of the effective combinations of decision-making units that contribute to the DEA solution. The jackknife can provide all the combinations contributing to the quartile deviation and only needs to be performed for those units that are part of the benchmark set. These results show that there is a strong rationale for examining DEA results with a sensitivity analysis that excludes one benchmark hospital at a time. This analysis enhances the quality of decision support using DEA estimates for the potential of a decision-making unit to grow one or more of its outputs.
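The leave-one-out idea behind the recommended sensitivity analysis can be sketched generically. The outputs and the stand-in statistic below are hypothetical; an actual DEA benchmark involves solving a linear program per unit:

```python
# Hypothetical outputs of five decision-making units.
outputs = [12.0, 15.0, 9.0, 20.0, 14.0]

def statistic(values):
    """Stand-in for a benchmark computation (here simply the maximum)."""
    return max(values)

# Jackknife: recompute the statistic with each unit excluded in turn.
leave_one_out = [
    statistic(outputs[:i] + outputs[i + 1:]) for i in range(len(outputs))
]
```

The spread of `leave_one_out` shows how sensitive the benchmark is to any single unit, the same question the paper answers by excluding one benchmark hospital at a time.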
Statistical properties of the radiation from SASE FEL operating in the linear regime
NASA Astrophysics Data System (ADS)
Saldin, E. L.; Schneidmiller, E. A.; Yurkov, M. V.
1998-02-01
The paper presents a comprehensive analysis of the statistical properties of the radiation from a self-amplified spontaneous emission (SASE) free electron laser operating in the linear regime. The investigation has been performed in a one-dimensional approximation, assuming the electron pulse length to be much larger than the coherence length of the radiation. The following statistical properties of the SASE FEL radiation have been studied: field correlations, the distribution of the radiation energy after a monochromator installed at the FEL amplifier exit, and the photoelectric counting statistics of SASE FEL radiation. It is shown that the radiation from a SASE FEL operating in the linear regime possesses all the features of completely chaotic polarized radiation.
A review of the different techniques for solid surface acid-base characterization.
Sun, Chenhang; Berg, John C
2003-09-18
In this work, various techniques for solid surface acid-base (AB) characterization are reviewed. Different techniques employ different scales to rank acid-base properties. Based on the results from the literature and the authors' own investigations of mineral oxides, these scales are compared. The comparison shows that the Isoelectric Point (IEP), the most commonly used AB scale, is not a description of the absolute basicity or acidity of a surface, but a description of their relative strength. That is, a high-IEP surface shows more basic functionality compared with its acidic functionality, whereas a low-IEP surface shows less basic functionality compared with its acidic functionality. The choice of technique and scale for AB characterization depends on the specific application. For cases in which the overall AB property is of interest, IEP (by electrokinetic titration) and H(0,max) (by indicator dye adsorption) are appropriate. For cases in which the absolute AB property is of interest, such as in the study of adhesion, it is more pertinent to use the chemical shift (by XPS) and the heat of adsorption of probe gases (by calorimetry or IGC).
Dong, Anjie; Hou, Guoling; Sun, Duoxian
2003-10-15
Amphoteric polyurethane (APU) samples used in this paper were composed of hydrophobic soft segments and pendent -COOH and -CH(2)N(CH(3))(2) groups on the hard segments, which present the properties of both amphoteric polyelectrolytes and amphiphilic block copolymers. APU macromolecules can self-assemble into micelles in acidic and basic aqueous media by hydrophobic/hydrophilic interaction. The self-assembly behavior of APU in acidic and basic media was studied by transmission electron microscopy and light scattering methods. The spherical and hollow micelles of APU were observed respectively in acidic and basic aqueous media. The results indicate that the size and size distribution of APU self-assembly micelles largely depend on the ratio of -COOH to -CH(2)N(CH(3))(2) groups, density of ionizable groups, concentration of APU, and types of acid and base in the media.
Khan, Zia Ullah; Bubnova, Olga; Jafari, Mohammad Javad; Brooke, Robert; Liu, Xianjie; Gabrielsson, Roger; Ederth, Thomas; Evans, Drew R; Andreasen, Jens W; Fahlman, Mats; Crispin, Xavier
2015-10-28
PEDOT-Tos is one of the conducting polymers that displays the most promising thermoelectric properties. Until now, it has been extremely difficult to control all the synthesis parameters and the morphology governing the thermoelectric properties. To improve our understanding of this material, we study the variation in the thermoelectric properties under a simple acido-basic treatment. The emphasis of this study is to elucidate the chemical changes induced by acid (HCl) or base (NaOH) treatment in PEDOT-Tos thin films using various spectroscopic and structural techniques. We could identify changes in the nanoscale morphology due to anion exchange between tosylate and Cl- or OH-. But we identified that changing the pH leads to a tuning of the oxidation level of the polymer, which can explain the changes in thermoelectric properties. Hence, a simple acid-base treatment makes it possible to find the optimum power factor in PEDOT-Tos thin films.
Powerlaw: a Python package for analysis of heavy-tailed distributions.
Alstott, Jeff; Bullmore, Ed; Plenz, Dietmar
2014-01-01
Power laws are theoretically interesting probability distributions that are also frequently used to describe empirical data. In recent years, effective statistical methods for fitting power laws have been developed, but appropriate use of these techniques requires significant programming and statistical insight. In order to greatly decrease the barriers to using good statistical methods for fitting power law distributions, we developed the powerlaw Python package. This software package provides easy commands for basic fitting and statistical analysis of distributions. Notably, it also seeks to support a variety of user needs by being exhaustive in the options available to the user. The source code is publicly available and easily extensible.
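To sketch the kind of fit the package automates, here is the standard continuous maximum-likelihood exponent estimate (the MLE popularized by Clauset, Shalizi and Newman, which powerlaw implements) written in plain Python on simulated data, rather than via the package itself:

```python
import math
import random

random.seed(2)

# Simulate a continuous power law with exponent alpha = 2.5 above x_min = 1,
# by inverse-transform sampling: x = x_min * (1 - u) ** (-1 / (alpha - 1)).
alpha, x_min, n = 2.5, 1.0, 5000
data = [
    x_min * (1.0 - random.random()) ** (-1.0 / (alpha - 1.0))
    for _ in range(n)
]

# Maximum-likelihood estimate of the exponent for known x_min.
alpha_hat = 1.0 + n / sum(math.log(x / x_min) for x in data)
```

The package adds what this sketch omits: estimating `x_min` itself, goodness-of-fit testing, and likelihood-ratio comparisons against alternative heavy-tailed distributions.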
Properties of lightweight cement-based composites containing waste polypropylene
NASA Astrophysics Data System (ADS)
Záleská, Martina; Pavlíková, Milena; Pavlík, Zbyšek
2016-07-01
Improvement of buildings' thermal stability represents an increasingly important trend in the construction industry. This work aims to study the possible use of two types of waste polypropylene (PP) for the development of lightweight cement-based composites with an enhanced thermal insulation function. Crushed PP waste originating from PP tube production is used for the partial replacement of silica sand at 10, 20, 30, 40 and 50 mass%, and a reference mixture without plastic waste is studied as well. First, the basic physical and thermal properties of granular PP random copolymer (PPR) and glass fiber reinforced PP (PPGF) aggregate are studied. For the developed composite mixtures, basic physical, mechanical, heat transport and storage properties are assessed. The obtained results show that the composites with incorporated PP aggregate exhibit improved thermal insulation properties and acceptable mechanical resistance. These new composite materials with an enhanced thermal insulation function are found to be promising materials for building subsoil or floor structures.
Statistics and Discoveries at the LHC (1/4)
Cowan, Glen
2018-02-09
The lectures will give an introduction to statistics as applied in particle physics and will provide all the necessary basics for data analysis at the LHC. Special emphasis will be placed on the problems and questions that arise when searching for new phenomena, including p-values, discovery significance, limit setting procedures, and the treatment of small signals in the presence of large backgrounds. Specific issues that will be addressed include the advantages and drawbacks of different statistical test procedures (cut-based, likelihood-ratio, etc.), the look-elsewhere effect and the treatment of systematic uncertainties.
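As a small illustration of the p-value and discovery-significance material these lectures cover, here is a counting-experiment example using the well-known asymptotic significance formula of Cowan, Cranmer, Gross and Vittells; the event counts are hypothetical:

```python
import math

# Hypothetical counting experiment: n events observed,
# b expected from background alone.
n, b = 130, 100

# Approximate discovery significance for a counting experiment:
# Z = sqrt(2 * (n * ln(n/b) - (n - b))).
Z = math.sqrt(2.0 * (n * math.log(n / b) - (n - b)))

# Corresponding one-sided p-value under the background-only hypothesis.
p_value = 0.5 * math.erfc(Z / math.sqrt(2.0))
```

A significance of Z ≈ 5 (p ≈ 2.9e-7) is the conventional discovery threshold discussed in the lectures; the counts above fall well short of it.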
Statistics and Discoveries at the LHC (3/4)
Cowan, Glen
2018-02-19
The lectures will give an introduction to statistics as applied in particle physics and will provide all the necessary basics for data analysis at the LHC. Special emphasis will be placed on the problems and questions that arise when searching for new phenomena, including p-values, discovery significance, limit setting procedures, and the treatment of small signals in the presence of large backgrounds. Specific issues that will be addressed include the advantages and drawbacks of different statistical test procedures (cut-based, likelihood-ratio, etc.), the look-elsewhere effect and the treatment of systematic uncertainties.
Statistics and Discoveries at the LHC (4/4)
Cowan, Glen
2018-05-22
The lectures will give an introduction to statistics as applied in particle physics and will provide all the necessary basics for data analysis at the LHC. Special emphasis will be placed on the problems and questions that arise when searching for new phenomena, including p-values, discovery significance, limit setting procedures, and the treatment of small signals in the presence of large backgrounds. Specific issues that will be addressed include the advantages and drawbacks of different statistical test procedures (cut-based, likelihood-ratio, etc.), the look-elsewhere effect and the treatment of systematic uncertainties.
Statistics and Discoveries at the LHC (2/4)
Cowan, Glen
2018-04-26
The lectures will give an introduction to statistics as applied in particle physics and will provide all the necessary basics for data analysis at the LHC. Special emphasis will be placed on the problems and questions that arise when searching for new phenomena, including p-values, discovery significance, limit setting procedures, and the treatment of small signals in the presence of large backgrounds. Specific issues that will be addressed include the advantages and drawbacks of different statistical test procedures (cut-based, likelihood-ratio, etc.), the look-elsewhere effect and the treatment of systematic uncertainties.
Understanding quantitative research: part 1.
Hoe, Juanita; Hoare, Zoë
This article, which is the first in a two-part series, provides an introduction to understanding quantitative research, basic statistics and terminology used in research articles. Critical appraisal of research articles is essential to ensure that nurses remain up to date with evidence-based practice to provide consistent and high-quality nursing care. This article focuses on developing critical appraisal skills and understanding the use and implications of different quantitative approaches to research. Part two of this article will focus on explaining common statistical terms and the presentation of statistical data in quantitative research.
NASA Technical Reports Server (NTRS)
Chiang, T.; Tessarzik, J. M.; Badgley, R. H.
1972-01-01
The primary aim of this investigation was verification of basic methods which are to be used in cataloging elastomer dynamic properties (stiffness and damping) in terms of viscoelastic model constants. These constants may then be used to predict dynamic properties for general elastomer shapes and operating conditions, thereby permitting optimum application of elastomers as energy absorption and/or energy storage devices in the control of vibrations in a broad variety of applications. The efforts reported involved: (1) literature search; (2) the design, fabrication and use of a test rig for obtaining elastomer dynamic test data over a wide range of frequencies, amplitudes, and preloads; and (3) the reduction of the test data, by means of a selected three-element elastomer model and specialized curve fitting techniques, to material properties. Material constants thus obtained have been used to calculate stiffness and damping for comparison with measured test data. These comparisons are excellent for a number of test conditions and only fair to poor for others. The results confirm the validity of the basic approach of the overall program and the mechanics of the cataloging procedure, and at the same time suggest areas in which refinements should be made.
Dexter, Franklin; Shafer, Steven L
2017-03-01
Considerable attention has been drawn to poor reproducibility in the biomedical literature. One explanation is inadequate reporting of statistical methods by authors and inadequate assessment of statistical reporting and methods during peer review. In this narrative review, we examine scientific studies of several well-publicized efforts to improve statistical reporting. We also review several retrospective assessments of the impact of these efforts. These studies show that instructions to authors and statistical checklists are not sufficient; no findings suggested that either improves the quality of statistical methods and reporting. Second, even basic statistics, such as power analyses, are frequently missing or incorrectly performed. Third, statistical review is needed for all papers that involve data analysis. A consistent finding in the studies was that nonstatistical reviewers (eg, "scientific reviewers") and journal editors generally poorly assess statistical quality. We finish by discussing our experience with statistical review at Anesthesia & Analgesia from 2006 to 2016.
Lü, Yiran; Hao, Shuxin; Zhang, Guoqing; Liu, Jie; Liu, Yue; Xu, Dongqun
2018-01-01
The aim was to implement an online statistical analysis function in the information system for air pollution and health impact monitoring, and to obtain data analysis results in real time. Descriptive statistical methods, time-series analysis, and multivariate regression analysis were implemented online using SQL and visual tools on top of database software. The system generates basic statistical tables and summary tables of air pollution exposure and health impact data online; generates trend charts for each part of the data online, with interactive connection to the database; and generates export sheets that can be loaded directly into R, SAS, and SPSS. The information system for air pollution and health impact monitoring thus implements the statistical analysis function online and can provide real-time analysis results to its users.
ERIC Educational Resources Information Center
McMurtry, John
1997-01-01
Criticizes some of the basic principles expounded in John Locke's "Second Treatise on Government." Argues that Locke's ideas on private property, capital investment, and social good are inherently contradictory. Asserts that the market theory of property inevitably leads to endemic economic exploitation and oppression. (MJP)
Grasping the Concept of Personal Property
ERIC Educational Resources Information Center
Constable, Merryn D.; Kritikos, Ada; Bayliss, Andrew P.
2011-01-01
The concept of property is integral to personal and societal development, yet understanding of the cognitive basis of ownership is limited. Objects are the most basic form of property, so our physical interactions with owned objects may elucidate nuanced aspects of ownership. We gave participants a coffee mug to decorate, use and keep. The…
NASA Astrophysics Data System (ADS)
Fu, Jun; Liu, Zhihong; Liu, Jie
2018-01-01
Asphalt Emulsion-Cement Concrete (AECC) is currently considered a typical semi-flexible material. One of its disadvantages is brittle fracture and a lack of ductility. This study aims at improving the basic mechanical properties of AECC using fibers and different aggregate sizes. The AECC mix design is introduced, and different fiber contents and aggregate sizes are studied. The results show that smaller aggregate sizes, as well as fiber addition, improve the Young's modulus and compressive strength. The modulus-to-compressive-strength ratio of fiber-reinforced AECC is always below 500.
Stress corrosion cracking of titanium alloys
NASA Technical Reports Server (NTRS)
Statler, G. R.; Spretnak, J. W.; Beck, F. H.; Fontana, M. G.
1974-01-01
The effect of hydrogen on the properties of metals, including titanium and its alloys, was investigated. The basic theories of stress corrosion of titanium alloys are reviewed, along with the literature concerned with the effect of absorbed hydrogen on the mechanical properties of metals. Finally, the basic modes of metal fracture and their importance to this study are considered. The experimental work was designed to determine the effects of hydrogen concentration on the critical strain at which plastic instability along pure shear directions occurs. The materials used were the titanium alloys Ti-8Al-1Mo-1V and Ti-5Al-2.5Sn.
Katapultos: Teaching Basic Statistics with Ballistics.
ERIC Educational Resources Information Center
Fitzgerald, Mike
2001-01-01
Describes the use of catapults as a way to increase math, science, and technology correlations within the classroom. Includes detailed instructions, a list of materials for building a catapult, and print and Internet resources. (JOW)
77 FR 61791 - System of Records; Presidential Management Fellows Program
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-11
... program personnel for the following reasons: a. To determine basic program eligibility and to evaluate... descriptive statistics and analytical studies in support of the function for which the records are collected...
Using basic statistics on the individual patient's own numeric data.
Hart, John
2012-12-01
This theoretical report gives an example of how the coefficient of variation (CV) and quartile analysis (QA) for outlier assessment might be used to analyze numeric data for an individual patient in practice. A patient was examined over 8 visits using infrared instrumentation to measure mastoid fossa temperature differential (MFTD) readings. The CV and QA were applied to the readings. The participant also completed the Short Form-12 health perception survey on each visit, and these findings were correlated with the CV to determine whether the CV had outcomes support (clinical significance). An outlier MFTD reading was observed on the eighth visit according to QA, coinciding with the largest CV value for the MFTDs. Correlations between the Short Form-12 and the CV were low to negligible, positive, and statistically nonsignificant. This case provides an example of how basic statistical analyses could be applied to numeric data in chiropractic practice for an individual patient. This might add objectivity to analyzing an individual patient's data in practice, particularly when the clinical significance of a numeric finding is unknown.
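As a rough illustration of the approach described in this report, the sketch below computes the CV and flags outliers with a standard 1.5×IQR quartile rule. The readings and helper names are hypothetical, not the author's code or data:

```python
import statistics

def coefficient_of_variation(values):
    """CV as a percentage: sample SD divided by the mean, times 100."""
    return statistics.stdev(values) / statistics.mean(values) * 100

def iqr_outliers(values):
    """Quartile analysis: flag values outside [Q1 - 1.5*IQR, Q3 + 1.5*IQR]."""
    q1, _, q3 = statistics.quantiles(values, n=4)  # default 'exclusive' method
    iqr = q3 - q1
    lo, hi = q1 - 1.5 * iqr, q3 + 1.5 * iqr
    return [v for v in values if v < lo or v > hi]

# Hypothetical MFTD-style readings (degrees C) over eight visits
readings = [0.3, 0.4, 0.2, 0.5, 0.3, 0.4, 0.3, 1.6]
cv = coefficient_of_variation(readings)   # about 91% for these values
outliers = iqr_outliers(readings)         # the eighth reading is flagged
```

With these made-up values the eighth-visit reading falls above Q3 + 1.5×IQR and is flagged, paralleling the structure of the case report's finding.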
Basic biostatistics for post-graduate students
Dakhale, Ganesh N.; Hiware, Sachin K.; Shinde, Abhijit T.; Mahatme, Mohini S.
2012-01-01
Statistical methods are important for drawing valid conclusions from the data obtained. This article provides background information on fundamental methods and techniques in biostatistics for the use of postgraduate students. The main focus is on types of data, measures of central tendency and variation, and the basic tests useful for analyzing different types of observations. A few topics, such as the normal distribution, sample size calculation, level of significance, the null hypothesis, indices of variability, and the different tests, are explained in detail with suitable examples. Using these guidelines, we are confident that postgraduate students will be able to classify the distribution of data and apply the proper test. Information is also given on various free software programs and websites useful for statistical calculations. Thus, postgraduate students will benefit whether they opt for academia or industry. PMID:23087501
Calculation of streamflow statistics for Ontario and the Great Lakes states
Piggott, Andrew R.; Neff, Brian P.
2005-01-01
Basic, flow-duration, and n-day frequency statistics were calculated for 779 current and historical streamflow gages in Ontario and 3,157 streamflow gages in the Great Lakes states with length-of-record daily mean streamflow data ending on December 31, 2000 and September 30, 2001, respectively. The statistics were determined using the U.S. Geological Survey’s SWSTAT and IOWDM, ANNIE, and LIBANNE software and Linux shell and PERL programming that enabled the mass processing of the data and calculation of the statistics. Verification exercises were performed to assess the accuracy of the processing and calculations. The statistics and descriptions, longitudes and latitudes, and drainage areas for each of the streamflow gages are summarized in ASCII text files and ESRI shapefiles.
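A flow-duration statistic Q_p is the streamflow equaled or exceeded p percent of the time. The statistics in this report were produced with the USGS SWSTAT software; the snippet below is only a minimal sketch of the flow-duration idea on made-up daily flows, not a reimplementation of that software:

```python
def flow_duration(flows, percents=(10, 50, 90)):
    """Q_p: the flow equaled or exceeded p percent of the time, read off
    the descending-ranked flows with linear interpolation."""
    ranked = sorted(flows, reverse=True)
    n = len(ranked)
    out = {}
    for p in percents:
        pos = p / 100 * (n - 1)          # position on the exceedance curve
        i = int(pos)
        frac = pos - i
        nxt = ranked[min(i + 1, n - 1)]
        out[f"Q{p}"] = ranked[i] + frac * (nxt - ranked[i])
    return out

# Hypothetical daily mean flows (m^3/s)
flows = [12.0, 8.0, 30.0, 5.0, 5.0, 3.0, 18.0, 9.0, 4.0, 2.0]
stats = flow_duration(flows)  # Q50 here is 6.5: half the days exceed it
```

High-exceedance statistics such as Q90 characterize low flow, while low-exceedance statistics such as Q10 characterize high flow.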
Eisner, Emily; Drake, Richard; Lobban, Fiona; Bucci, Sandra; Emsley, Richard; Barrowclough, Christine
2018-02-01
Early signs interventions show promise but could be further developed. A recent review suggested that 'basic symptoms' should be added to conventional early signs to improve relapse prediction. This study builds on preliminary evidence that basic symptoms predict relapse and aimed to: 1. examine which phenomena participants report prior to relapse and how they describe them; 2. determine the best way of identifying pre-relapse basic symptoms; 3. assess current practice by comparing self- and casenote-reported pre-relapse experiences. Participants with non-affective psychosis were recruited from UK mental health services. In-depth interviews (n=23), verbal checklists of basic symptoms (n=23) and casenote extracts (n=208) were analysed using directed content analysis and non-parametric statistical tests. Three-quarters of interviewees reported basic symptoms and all reported conventional early signs and 'other' pre-relapse experiences. Interviewees provided rich descriptions of basic symptoms. Verbal checklist interviews asking specifically about basic symptoms identified these experiences more readily than open questions during in-depth interviews. Only 5% of casenotes recorded basic symptoms; interviewees were 16 times more likely to report basic symptoms than their casenotes did. The majority of interviewees self-reported pre-relapse basic symptoms when asked specifically about these experiences but very few casenotes reported these symptoms. Basic symptoms may be potent predictors of relapse that clinicians miss. A self-report measure would aid monitoring of basic symptoms in routine clinical practice and would facilitate a prospective investigation comparing basic symptoms and conventional early signs as predictors of relapse. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
Background Information and User’s Guide for MIL-F-9490
1975-01-01
requirements, although different analysis results will apply to each requirement. Basic differences between the two reliability requirements are: MIL-F-8785B...provides the rationale for establishing such limits. The specific risk analysis comprises the same data which formed the average risk analysis, except...statistical analysis will be based on statistical data taken using limited exposure times of components and equipment. The exposure times and resulting
Transparency, Accountability, and Engagement: A Recipe for Building Trust in Policing
2017-06-01
Toward Community-orientated Policing: Potential, Basic Requirements, and Threshold Questions," Crime and Delinquency 33 (1987): 6–30. 49 More, Current...States," in Sourcebook of Criminal Justice Statistics Online, accessed June 4, 2017, http://www.albany.edu/sourcebook/csv/t2332011.csv. 89 Gary...to-date crime statistics, and empowered them to think creatively to develop individualized plans to address crime trends and conditions. His focus
Computing Mass Properties From AutoCAD
NASA Technical Reports Server (NTRS)
Jones, A.
1990-01-01
Mass properties of structures computed from data in drawings. AutoCAD to Mass Properties (ACTOMP) computer program developed to facilitate quick calculations of mass properties of structures containing many simple elements in such complex configurations as trusses or sheet-metal containers. Mathematically modeled in AutoCAD or compatible computer-aided design (CAD) system in minutes by use of three-dimensional elements. Written in Microsoft Quick-Basic (Version 2.0).
Uniqueness of Zinc as a Bioelement: Principles and Applications in Bioinorganic Chemistry--III.
ERIC Educational Resources Information Center
Ochiai, Ei-Ichiro
1988-01-01
Attempts to delineate certain basic principles and applications of bioinorganic chemistry to oxidation-reduction reactions. Examines why zinc(II) is so uniquely suited to enzymatic reactions of the acid-base type. Suggests the answer may lie in the natural abundance and the basic physicochemical properties of zinc(II). (MVL)
Lee, Fu-Jung; Wu, Chih-Cheng; Peng, Shih-Yen; Fan, Kuo-Tung
2007-09-01
Many anesthesiologists in medical centers (MC) or anesthesiologist-training hospitals (ATH) are accustomed to presenting their research data as poster abstracts at the annual meetings of the Taiwan Society of Anesthesiologists (TSA) to represent their academic output over a designated period. However, an orphaned P value that does not mention the specific statistical test used has frequently been found in these articles. This article explores the difference between MC/ATH and non-MC/non-ATH abstracts in naming the statistical test alongside the P value in three recent consecutive TSA annual meetings. We collected the proceedings handbooks of the TSA annual meetings over a 3-year period (2003 to 2005) and analyzed the hospital characteristics of the first institute byliner of each poster abstract. Data were analyzed with Fisher's exact test, and statistical significance was assumed if P < 0.05. Included were 101 poster abstracts with byliners from 20 hospitals. Only 2 of the 20 hospitals were accredited as non-ATH and 4 as non-MC. There were 64 (63%) abstracts without a specified statistical test after the P value, and no significant difference was found between categories (P = 0.47 for ATH vs. non-ATH and P = 0.07 for MC vs. non-MC). The basic concept of reporting the P value with its specified statistical test was not applied comprehensively in poster abstracts of the annual conferences. We suggest that the anesthesia administrators and senior anesthesiologists at ATH or MC, and the members of the committee responsible for academic affairs in the TSA, should pay attention to this problem and work together to improve basic statistics in poster presentations.
Development of polytoxicomania in function of defence from psychoticism.
Nenadović, Milutin M; Sapić, Rosa
2011-01-01
Polytoxicomania in youth subpopulations has been growing steadily in recent decades, and this trend is pan-continental. Psychoticism is a psychological construct that assumes special basic dimensions of disintegration of personality and cognitive functions. Psychoticism may, in general, underlie pathological functioning in youth and influence the patterns of thought, feeling and action that cause dysfunction. The aim of this study was to determine the distribution of basic dimensions of psychoticism in young people's commitment to abusing psychoactive substances (PAS) in order to reduce disturbing intrapsychic experiences or the manifestation of psychotic symptoms. For the purpose of this study, two groups of respondents were formed, balanced by age, gender and structure of family of origin (at least one parent alive). The study applied the DELTA-9 instrument for assessment of cognitive disintegration to establish and operationalize psychoticism. The obtained results were statistically analyzed. From the parameters of descriptive statistics, the arithmetic mean was calculated with measures of dispersion. Cross-tabular analysis of the tested variables was performed, as well as tests of statistical significance with Pearson's chi-square test and analysis of variance. Age structure and gender were approximately equally represented in the polytoxicomaniac group and the control group; testing did not confirm a statistically significant difference (p > 0.5). Statistical methodology established that polytoxicomaniacs differed significantly from the control group on most variables of psychoticism; testing confirmed a high statistical significance of the differences in the variables of psychoticism (p < 0.001 to p < 0.01). A statistically significant representation of the dimension of psychoticism in the polytoxicomaniac group was established. The presence of factors concerning common executive dysfunction was emphasized.
Multi-parametric centrality method for graph network models
NASA Astrophysics Data System (ADS)
Ivanov, Sergei Evgenievich; Gorlushkina, Natalia Nikolaevna; Ivanova, Lubov Nikolaevna
2018-04-01
Graph network models are investigated to determine the centrality, weights and significance of vertices. For centrality analysis, a typical method applies any one property of the graph vertices. In graph theory, centrality is commonly analyzed in terms of degree, closeness, betweenness, radiality, eccentricity, PageRank, status, the Katz measure and the eigenvector measure. We propose a new multi-parametric centrality method, which combines a number of basic properties of a network member. A mathematical model of the multi-parametric centrality method is developed, and its results are compared with those of the standard centrality methods. To evaluate the results of the multi-parametric centrality method, a graph model with hundreds of vertices is analyzed. The comparative analysis shows the accuracy of the presented method, which simultaneously includes a number of basic properties of the vertices.
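The abstract does not give the paper's exact weighting scheme; the sketch below illustrates the general idea of a multi-parametric centrality as a weighted combination of two single-parameter measures (degree and closeness), with the weights and the small example graph purely hypothetical:

```python
from collections import deque

def degree_centrality(adj):
    """Normalized degree: neighbors / (n - 1)."""
    n = len(adj)
    return {v: len(nbrs) / (n - 1) for v, nbrs in adj.items()}

def closeness_centrality(adj):
    """(n - 1) / (sum of BFS distances to all reachable vertices)."""
    n = len(adj)
    scores = {}
    for src in adj:
        dist = {src: 0}
        q = deque([src])
        while q:
            u = q.popleft()
            for w in adj[u]:
                if w not in dist:
                    dist[w] = dist[u] + 1
                    q.append(w)
        scores[src] = (n - 1) / sum(dist.values()) if len(dist) > 1 else 0.0
    return scores

def multi_parametric(adj, weights=(0.5, 0.5)):
    """Combine several single-property centralities into one score."""
    deg, clo = degree_centrality(adj), closeness_centrality(adj)
    wd, wc = weights
    return {v: wd * deg[v] + wc * clo[v] for v in adj}

# Small hypothetical undirected graph (star with a pendant path)
adj = {
    "a": ["b", "c", "d"],
    "b": ["a"],
    "c": ["a"],
    "d": ["a", "e"],
    "e": ["d"],
}
scores = multi_parametric(adj)
top = max(scores, key=scores.get)  # the hub "a" ranks first
```

Further properties (betweenness, eccentricity, PageRank, etc.) could be folded into the same weighted sum in the same way.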
Prediction of plasma properties in mercury ion thrusters
NASA Technical Reports Server (NTRS)
Longhurst, G. R.
1978-01-01
A simplified theoretical model was developed which obtains, to first order, the plasma properties in the discharge chamber of a mercury ion thruster from basic thruster design and controllable operating parameters. The basic operation and design of ion thrusters are discussed, and the important processes which influence the plasma properties are described in terms of the design and control parameters. The conservation equations for mass, charge and energy were applied to the ion production region, defined as the region of the discharge chamber having as its outer boundary the surface of revolution of the innermost field line to intersect the anode. Mass conservation and the equations describing the various processes involved in mass addition to and removal from the ion production region are satisfied by a Maxwellian electron density spatial distribution in that region.
NASA Technical Reports Server (NTRS)
Kulkarni, S. V.; Mclaughlin, P. V., Jr.
1978-01-01
An engineering approach is proposed for predicting unnotched/notched laminate fatigue behavior from basic lamina fatigue data. The fatigue analysis procedure was used to determine the laminate property (strength/stiffness) degradation as a function of fatigue cycles in uniaxial tension and in plane shear. These properties were then introduced into the failure model for a notched laminate to obtain damage growth, residual strength, and failure mode. The approach is thus essentially a combination of the cumulative damage accumulation (akin to the Miner-Palmgren hypothesis and its derivatives) and the damage growth rate (similar to the fracture mechanics approach) philosophies. An analysis/experiment correlation appears to confirm the basic postulates of material wearout and the predictability of laminate fatigue properties from lamina fatigue data.
Descriptive and inferential statistical methods used in burns research.
Al-Benna, Sammy; Al-Ajam, Yazan; Way, Benjamin; Steinstraesser, Lars
2010-05-01
Burns research articles utilise a variety of descriptive and inferential methods to present and analyse data. The aim of this study was to determine the descriptive methods (e.g. mean, median, SD, range, etc.) and survey the use of inferential methods (statistical tests) used in articles in the journal Burns. This study defined its population as all original articles published in the journal Burns in 2007. Letters to the editor, brief reports, reviews, and case reports were excluded. Study characteristics, use of descriptive statistics and the number and types of statistical methods employed were evaluated. Of the 51 articles analysed, 11 (22%) were randomised controlled trials, 18 (35%) were cohort studies, 11 (22%) were case control studies and 11 (22%) were case series. The study design and objectives were defined in all articles. All articles made use of continuous and descriptive data. Inferential statistics were used in 49 (96%) articles. Data dispersion was calculated by standard deviation in 30 (59%). Standard error of the mean was quoted in 19 (37%). The statistical software product was named in 33 (65%). Of the 49 articles that used inferential statistics, the tests were named in 47 (96%). The 6 most common tests used (Student's t-test (53%), analysis of variance/co-variance (33%), chi-square test (27%), Wilcoxon & Mann-Whitney tests (22%), Fisher's exact test (12%)) accounted for the majority (72%) of statistical methods employed. A specified significance level was named in 43 (88%) and the exact significance levels were reported in 28 (57%). Descriptive analysis and basic statistical techniques account for most of the statistical tests reported. This information should prove useful in deciding which tests should be emphasised in educating burn care professionals. These results highlight the need for burn care professionals to have a sound understanding of basic statistics, which is crucial in interpreting and reporting data.
Advice should be sought from professionals in the fields of biostatistics and epidemiology when using more advanced statistical techniques. Copyright 2009 Elsevier Ltd and ISBI. All rights reserved.
The effects of multiple repairs on Inconel 718 weld mechanical properties
NASA Technical Reports Server (NTRS)
Russell, C. K.; Nunes, A. C., Jr.; Moore, D.
1991-01-01
Inconel 718 weldments were repaired 3, 6, 9, and 13 times using the gas tungsten arc welding process. The welded panels were machined into mechanical test specimens, postweld heat treated, and nondestructively tested. Tensile properties and high cycle fatigue life were evaluated and the results compared to unrepaired weld properties. Mechanical property data were analyzed using the statistical methods of difference in means for tensile properties and difference in log means and Weibull analysis for high cycle fatigue properties. Statistical analysis performed on the data did not show a significant decrease in tensile or high cycle fatigue properties due to the repeated repairs. Some degradation was observed in all properties, however, it was minimal.
The bile acid-sensitive ion channel (BASIC) is activated by alterations of its membrane environment.
Schmidt, Axel; Lenzig, Pia; Oslender-Bujotzek, Adrienne; Kusch, Jana; Lucas, Susana Dias; Gründer, Stefan; Wiemuth, Dominik
2014-01-01
The bile acid-sensitive ion channel (BASIC) is a member of the DEG/ENaC family of ion channels. Channels of this family are characterized by a common structure, their physiological functions and modes of activation, however, are diverse. Rat BASIC is expressed in brain, liver and intestinal tract and activated by bile acids. The physiological function of BASIC and its mechanism of bile acid activation remain a puzzle. Here we addressed the question whether amphiphilic bile acids activate BASIC by directly binding to the channel or indirectly by altering the properties of the surrounding membrane. We show that membrane-active substances other than bile acids also affect the activity of BASIC and that activation by bile acids and other membrane-active substances is non-additive, suggesting that BASIC is sensitive for changes in its membrane environment. Furthermore based on results from chimeras between BASIC and ASIC1a, we show that the extracellular and the transmembrane domains are important for membrane sensitivity.
Nonlinear wave chaos: statistics of second harmonic fields.
Zhou, Min; Ott, Edward; Antonsen, Thomas M; Anlage, Steven M
2017-10-01
Concepts from the field of wave chaos have been shown to successfully predict the statistical properties of linear electromagnetic fields in electrically large enclosures. The Random Coupling Model (RCM) describes these properties by incorporating both universal features described by Random Matrix Theory and the system-specific features of particular system realizations. In an effort to extend this approach to the nonlinear domain, we add an active nonlinear frequency-doubling circuit to an otherwise linear wave chaotic system, and we measure the statistical properties of the resulting second harmonic fields. We develop an RCM-based model of this system as two linear chaotic cavities coupled by means of a nonlinear transfer function. The harmonic field strengths are predicted to be the product of two statistical quantities and the nonlinearity characteristics. Statistical results from measurement-based calculation, RCM-based simulation, and direct experimental measurements are compared and show good agreement over many decades of power.
Pressley, Joanna; Troyer, Todd W
2011-05-01
The leaky integrate-and-fire (LIF) is the simplest neuron model that captures the essential properties of neuronal signaling. Yet common intuitions are inadequate to explain basic properties of LIF responses to sinusoidal modulations of the input. Here we examine responses to low and moderate frequency modulations of both the mean and variance of the input current and quantify how these responses depend on baseline parameters. Across parameters, responses to modulations in the mean current are low pass, approaching zero in the limit of high frequencies. For very low baseline firing rates, the response cutoff frequency matches that expected from membrane integration. However, the cutoff shows a rapid, supralinear increase with firing rate, with a steeper increase in the case of lower noise. For modulations of the input variance, the gain at high frequency remains finite. Here, we show that the low-frequency responses depend strongly on baseline parameters and derive an analytic condition specifying the parameters at which responses switch from being dominated by low versus high frequencies. Additionally, we show that the resonant responses for variance modulations have properties not expected for common oscillatory resonances: they peak at frequencies higher than the baseline firing rate and persist when oscillatory spiking is disrupted by high noise. Finally, the responses to mean and variance modulations are shown to have a complementary dependence on baseline parameters at higher frequencies, resulting in responses to modulations of Poisson input rates that are independent of baseline input statistics.
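A minimal LIF simulation conveys the setup described in this abstract. The sketch below uses Euler integration and drives the neuron with a sinusoidally modulated mean current; all parameter values are illustrative, not those of the study:

```python
import math

def simulate_lif(mean_i, mod_amp=0.0, mod_freq=5.0, t_max=2.0, dt=1e-4,
                 tau=0.02, v_th=1.0, v_reset=0.0):
    """Euler integration of tau * dV/dt = -V + I(t); spike and reset at v_th.

    I(t) = mean_i + mod_amp * sin(2*pi*mod_freq*t) models modulation of
    the mean input current (hypothetical parameters throughout).
    """
    v, spikes = 0.0, []
    n_steps = int(round(t_max / dt))
    for step in range(n_steps):
        t = step * dt
        i_t = mean_i + mod_amp * math.sin(2 * math.pi * mod_freq * t)
        v += dt * (-v + i_t) / tau
        if v >= v_th:
            spikes.append(t)
            v = v_reset
    return spikes

baseline = simulate_lif(1.5)                 # constant suprathreshold drive
modulated = simulate_lif(1.5, mod_amp=0.3)   # sinusoidal mean modulation
```

Comparing spike timing across modulation frequencies (and adding a noise term to model variance modulations) is the kind of experiment the abstract's analysis quantifies.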
NASA Astrophysics Data System (ADS)
Kiss, I.; Alexa, V.; Serban, S.; Rackov, M.; Čavić, M.
2018-01-01
Cast hypereutectoid steel (usually named Adamite) is a roll-manufacturing material whose mechanical and chemical properties and carbon [C] content stand between those of steel and iron, along with alloying elements such as nickel [Ni], chromium [Cr], molybdenum [Mo] and/or others. Adamite rolls are basically alloy-steel rolls (a kind of high-carbon steel) with hardness ranging from 40 to 55 degrees Shore C and a carbon [C] percentage ranging from 1.35% to 2% (usually between 1.2 and 2.3%), the extra carbon [C] and the special alloying elements giving extra wear resistance and strength. The Adamite roll's most prominent feature is the small variation in hardness of the working surface, together with good abrasion resistance and bite performance. This paper reviews key aspects of roll material properties and presents an analysis of the influence of chemical composition on the mechanical properties (hardness) of cast hypereutectoid steel (Adamite) rolls. Using multiple regression analysis (double and triple regression equations), mathematical correlations between the rolls' chemical composition and the obtained hardness are presented, together with several results and evidence obtained from actual experiments. Thus, variation boundaries for the chemical composition of cast hypereutectoid steel rolls that yield the proper hardness values are revealed. The software Matlab was used for the multiple regression equations, correlation coefficients and graphical representations.
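The paper's regression equations were fitted in Matlab; as a sketch of the same idea, the following fits a linear model hardness = b0 + b1·%C + b2·%Cr by ordinary least squares via the normal equations. The composition-hardness data are fabricated for illustration only (generated exactly from the stated coefficients, so the fit recovers them):

```python
def fit_linear(xs, ys):
    """Least-squares fit y = b0 + b1*x1 + ... using the normal equations,
    solved by Gaussian elimination with partial pivoting."""
    rows = [[1.0] + list(x) for x in xs]           # prepend intercept column
    k = len(rows[0])
    # Build X^T X and X^T y
    ata = [[sum(r[i] * r[j] for r in rows) for j in range(k)] for i in range(k)]
    aty = [sum(r[i] * y for r, y in zip(rows, ys)) for i in range(k)]
    # Forward elimination
    for col in range(k):
        piv = max(range(col, k), key=lambda r: abs(ata[r][col]))
        ata[col], ata[piv] = ata[piv], ata[col]
        aty[col], aty[piv] = aty[piv], aty[col]
        for r in range(col + 1, k):
            f = ata[r][col] / ata[col][col]
            for c in range(col, k):
                ata[r][c] -= f * ata[col][c]
            aty[r] -= f * aty[col]
    # Back substitution
    b = [0.0] * k
    for r in range(k - 1, -1, -1):
        b[r] = (aty[r] - sum(ata[r][c] * b[c] for c in range(r + 1, k))) / ata[r][r]
    return b

# Fabricated (%C, %Cr) -> Shore C hardness data, built from 20 + 15*C + 5*Cr
data_x = [(1.4, 1.0), (1.6, 1.2), (1.8, 0.8), (2.0, 1.5), (1.5, 1.1), (1.9, 0.9)]
data_y = [20 + 15 * c + 5 * cr for c, cr in data_x]
b0, b1, b2 = fit_linear(data_x, data_y)
```

A "triple regression equation" in the paper's sense would simply add a third composition variable (e.g. %Mo) as another column.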
Hand-waving and interpretive dance: an introductory course on tensor networks
NASA Astrophysics Data System (ADS)
Bridgeman, Jacob C.; Chubb, Christopher T.
2017-06-01
The curse of dimensionality associated with the Hilbert space of spin systems provides a significant obstruction to the study of condensed matter systems. Tensor networks have proven an important tool in attempting to overcome this difficulty in both the numerical and analytic regimes. These notes form the basis for a seven lecture course, introducing the basics of a range of common tensor networks and algorithms. In particular, we cover: introductory tensor network notation, applications to quantum information, basic properties of matrix product states, a classification of quantum phases using tensor networks, algorithms for finding matrix product states, basic properties of projected entangled pair states, and multiscale entanglement renormalisation ansatz states. The lectures are intended to be generally accessible, although the relevance of many of the examples may be lost on students without a background in many-body physics/quantum information. For each lecture, several problems are given, with worked solutions in an ancillary file.
NASA Technical Reports Server (NTRS)
Hamrock, B. J.; Dowson, D.
1981-01-01
Lubricants, usually Newtonian fluids, are assumed to experience laminar flow. The basic equations used to describe the flow are the Navier-Stokes equations of motion. The study of hydrodynamic lubrication is, from a mathematical standpoint, the application of a reduced form of these Navier-Stokes equations in association with the continuity equation. The Reynolds equation can also be derived from first principles, provided of course that the same basic assumptions are adopted in each case. Both methods are used in deriving the Reynolds equation, and the assumptions inherent in reducing the Navier-Stokes equations are specified. Because the Reynolds equation contains viscosity and density terms, and these properties depend on temperature and pressure, it is often necessary to couple the Reynolds equation with the energy equation. The lubricant properties and the energy equation are presented. Film thickness, a parameter of the Reynolds equation, is a function of the elastic behavior of the bearing surface. The governing elasticity equation is therefore presented.
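The reduced form of the Navier-Stokes equations referred to above is the Reynolds equation. A commonly quoted two-dimensional form, for film thickness h, pressure p, viscosity η, density ρ, and a surface sliding with velocity u in the x direction (given here as a standard textbook statement rather than the exact form used in this text), is:

```latex
\frac{\partial}{\partial x}\!\left(\frac{\rho h^{3}}{12\,\eta}\frac{\partial p}{\partial x}\right)
+ \frac{\partial}{\partial y}\!\left(\frac{\rho h^{3}}{12\,\eta}\frac{\partial p}{\partial y}\right)
= \frac{u}{2}\frac{\partial (\rho h)}{\partial x}
+ \frac{\partial (\rho h)}{\partial t}
```

The pressure-flow (Poiseuille) terms appear on the left; the wedge and squeeze-film source terms, through which the film thickness drives pressure generation, appear on the right.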
UNSODA UNSATURATED SOIL HYDRAULIC DATABASE USER'S MANUAL VERSION 1.0
This report contains general documentation and serves as a user manual of the UNSODA program. UNSODA is a database of unsaturated soil hydraulic properties (water retention, hydraulic conductivity, and soil water diffusivity), basic soil properties (particle-size distribution, b...
Optical Parametric Amplification of Single Photon: Statistical Properties and Quantum Interference
NASA Astrophysics Data System (ADS)
Xu, Xue-Xiang; Yuan, Hong-Chun
2014-05-01
By using the phase-space method, we theoretically investigate the quantum statistical properties and quantum interference of optical parametric amplification of a single photon. The statistical properties, such as the Wigner function (WF), average photon number, photon number distribution, and parity, are derived analytically for the fields at the two output ports. The results indicate that the output fields are multiphoton states rather than single-photon states, owing to the gain of the optical parametric amplifier (OPA). In addition, the phase sensitivity is examined using a parity-measurement detection scheme.
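One claim in this abstract, that amplification turns a single photon into a multiphoton field, can be sketched with the textbook input-output relation for a phase-insensitive parametric amplifier with a vacuum idler, ⟨n_out⟩ = G⟨n_in⟩ + (G − 1). The function name and gain values below are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch: mean signal photon number after phase-insensitive
# parametric amplification with gain G and a vacuum idler input.
# <n_out> = G * <n_in> + (G - 1); a single photon gives 2G - 1.
def mean_output_photons(n_in, gain):
    return gain * n_in + (gain - 1.0)

for G in (1.0, 2.0, 5.0):
    print(G, mean_output_photons(1, G))  # 1, 3, 9: multiphoton for G > 1
```

At unit gain the single photon passes unchanged; for any G > 1 the spontaneous-emission term (G − 1) guarantees a multiphoton output, consistent with the abstract.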
Generalized statistical mechanics approaches to earthquakes and tectonics.
Vallianatos, Filippos; Papadakis, Giorgos; Michas, Georgios
2016-12-01
Despite the extreme complexity that characterizes the mechanism of the earthquake generation process, simple empirical scaling relations apply to the collective properties of earthquakes and faults in a variety of tectonic environments and scales. The physical characterization of those properties and the scaling relations that describe them attract wide scientific interest and are incorporated in the probabilistic forecasting of seismicity at local, regional and planetary scales. Considerable progress has been made in the analysis of the statistical mechanics of earthquakes, which, based on the principle of entropy, can provide a physical rationale for the macroscopic properties frequently observed. The scale-invariant properties, the (multi)fractal structures and the long-range interactions that have been found to characterize fault and earthquake populations have recently led to the consideration of non-extensive statistical mechanics (NESM) as a consistent statistical mechanics framework for the description of seismicity. The consistency between NESM and observations has been demonstrated in a series of publications on seismicity, faulting, rock physics and other fields of geosciences. The aim of this review is to present in a concise manner the fundamental macroscopic properties of earthquakes and faulting and how these can be derived by using the notions of statistical mechanics and NESM, providing further insights into earthquake physics and fault growth processes.
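The central object of the NESM framework mentioned here is the Tsallis q-exponential, which replaces the Boltzmann exponential and recovers it in the limit q → 1. A minimal sketch (the q values are illustrative, not fitted to any seismicity catalog):

```python
import numpy as np

# Minimal sketch of the q-exponential used in non-extensive statistical
# mechanics (NESM): exp_q(x) = [1 + (1 - q) x]^(1/(1-q)) where the
# bracket is positive, and 0 otherwise; q -> 1 recovers exp(x).
def q_exp(x, q):
    if np.isclose(q, 1.0):
        return np.exp(x)
    base = 1.0 + (1.0 - q) * x
    return np.where(base > 0.0, base ** (1.0 / (1.0 - q)), 0.0)

x = -1.0
print(float(q_exp(x, 1.0)))  # ordinary exponential, e^-1
print(float(q_exp(x, 1.5)))  # q > 1: heavier (power-law) tail
```

For q > 1 the q-exponential decays as a power law rather than exponentially, which is why it is used to model the long tails of earthquake magnitude and inter-event-time distributions.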
Instantaneous polarization statistic property of EM waves incident on time-varying reentry plasma
NASA Astrophysics Data System (ADS)
Bai, Bowen; Liu, Yanming; Li, Xiaoping; Yao, Bo; Shi, Lei
2018-06-01
An analytical method is proposed in this paper to study the effect of a time-varying reentry plasma sheath on the instantaneous polarization statistical properties of electromagnetic (EM) waves. Based on the disturbance properties of the hypersonic flow, a spatial-temporal model of the time-varying reentry plasma sheath is established. An analytical technique referred to as the transmission line analogy is developed to calculate the instantaneous transmission coefficient of EM wave propagation in time-varying plasma. The instantaneous polarization statistical theory of EM wave propagation in the time-varying plasma sheath is then developed. Taking an S-band telemetry right-hand circularly polarized wave as an example, the effects of incident angle and plasma parameters, including electron density and collision frequency, on the wave's polarization statistics are studied systematically. Statistical results indicate that the polarization properties deteriorate as the collision frequency decreases and the electron density and incident angle increase. Moreover, at certain critical combinations of electron density, collision frequency, and incident angle, the transmitted wave contains both right-hand and left-hand polarization modes, and the polarization mode can reverse. These results could provide useful information for adaptive polarization receiving during a spacecraft's reentry communication.
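The transmission-line-analogy idea can be illustrated for the simplest case: normal incidence on a single homogeneous plasma slab described by the Drude permittivity, with the slab's characteristic (transfer) matrix giving the transmission coefficient. This is a generic thin-layer sketch, not the paper's spatial-temporal model; all numerical parameters below are assumptions.

```python
import numpy as np

# Illustrative sketch of the transmission-line (characteristic-matrix)
# approach for one homogeneous plasma slab at normal incidence, with
# the Drude permittivity eps = 1 - wp^2 / (w * (w + i*nu)) under the
# e^{-i w t} convention. Parameters are assumptions, not paper values.
def slab_transmission(freq_hz, thickness_m, n_e, nu):
    e, m_e, eps0, c = 1.602e-19, 9.109e-31, 8.854e-12, 2.998e8
    w = 2.0 * np.pi * freq_hz
    wp2 = n_e * e**2 / (m_e * eps0)          # plasma frequency squared
    eps = 1.0 - wp2 / (w * (w + 1j * nu))    # Drude permittivity
    n = np.sqrt(eps + 0j)                    # complex refractive index
    delta = n * w / c * thickness_m          # phase thickness of slab
    # Characteristic matrix (impedances normalised to vacuum).
    m00, m01 = np.cos(delta), 1j * np.sin(delta) / n
    m10, m11 = 1j * n * np.sin(delta), np.cos(delta)
    return 2.0 / (m00 + m01 + m10 + m11)     # vacuum on both sides

# Assumed S-band example: 5 cm slab, overdense reentry-like density.
t = slab_transmission(2.3e9, 0.05, n_e=1e17, nu=1e9)
print(abs(t))  # strong attenuation, consistent with reentry blackout
```

With the electron density set to zero the slab reduces to vacuum and |t| = 1; the overdense case above attenuates strongly, which is the blackout regime that motivates the paper's polarization analysis.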
Tripathy, Ashis; Pramanik, Sumit; Cho, Jongman; Santhosh, Jayasree; Osman, Noor Azuan Abu
2014-01-01
The humidity sensing characteristics of different sensing materials are important properties for monitoring products and processes in a wide range of industrial sectors, research and development laboratories, and daily life. The primary aim of this study is to compare the sensing characteristics of different materials, including impedance or resistance, capacitance, hysteresis, recovery and response times, and stability with respect to relative humidity, frequency, and temperature. Various materials used for sensing relative humidity, including ceramics, semiconductors, and polymers, are reviewed. Correlations between the electrical characteristics of differently doped sensor materials are noted as the most distinctive feature of each material. The electrical properties of the sensor materials are found to change significantly with morphology, doping concentration, and film thickness on the substrate. Various applications and open research directions are also pointed out. We have extensively reviewed nearly all the main kinds of relative humidity sensors and how their electrical characteristics vary with doping concentration, film thickness, and the underlying sensing material. Based on statistical tests, zinc oxide-based sensing material appears best suited for humidity sensor design, since it shows extremely low hysteresis loss, minimal response and recovery times, and excellent stability. PMID:25256110