ERIC Educational Resources Information Center
Haas, Stephanie W.; Pattuelli, Maria Cristina; Brown, Ron T.
2003-01-01
Describes the Statistical Interactive Glossary (SIG), an enhanced glossary of statistical terms supported by the GovStat ontology of statistical concepts. Presents a conceptual framework whose components articulate different aspects of a term's basic explanation that can be manipulated to produce a variety of presentations. The overarching…
ERIC Educational Resources Information Center
Orton, Larry
2009-01-01
This document outlines the definitions and the typology now used by Statistics Canada's Centre for Education Statistics to identify, classify and delineate the universities, colleges and other providers of postsecondary and adult education in Canada for which basic enrollments, graduates, professors and finance statistics are produced. These new…
76 FR 41756 - Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-15
... materials and supplies used in production. The economic census will produce basic statistics by kind of business on number of establishments, sales, payroll, employment, inventories, and operating expenses. It also will yield a variety of subject statistics, including sales by product line; sales by class of...
77 FR 55475 - Agency Information Collection Activities: Proposed Collection; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-10
... collected will be analyzed to produce estimates and basic descriptive statistics on the quantity and type of... mode of data collection by event types, and conduct correlations, cross tabulations of responses and...
Internet starter kit update 1997
DOT National Transportation Integrated Search
1997-01-01
The Bureau of Transportation Statistics (BTS) established an Internet site in 1995, and also produced an Internet Starter Kit not only to assist transportation professionals in accessing the new Internet site but also to give them a basic overview of...
ERIC Educational Resources Information Center
Keller, Rosanne
One of a series of instructional materials produced by the Literacy Council of Alaska, this booklet provides information about motorcycle safety. Using a simplified vocabulary and shorter sentences, it offers statistics concerning motorcycle accidents; information on how to choose the proper machine; basic information about the operation of the…
75 FR 26942 - Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-13
... Academic Libraries Survey (ALS) provides the basic data needed to produce descriptive statistics for... Education Sciences Type of Review: Reinstatement. Title: Academic Libraries Survey (ALS): 2010-2012... separate biennial survey. The data are collected on the web and consist of information about library...
Reinventing Biostatistics Education for Basic Scientists
Weissgerber, Tracey L.; Garovic, Vesna D.; Milin-Lazovic, Jelena S.; Winham, Stacey J.; Obradovic, Zoran; Trzeciakowski, Jerome P.; Milic, Natasa M.
2016-01-01
Numerous studies demonstrating that statistical errors are common in basic science publications have led to calls to improve statistical training for basic scientists. In this article, we sought to evaluate statistical requirements for PhD training and to identify opportunities for improving biostatistics education in the basic sciences. We provide recommendations for improving statistics training for basic biomedical scientists, including: 1. Encouraging departments to require statistics training, 2. Tailoring coursework to the students’ fields of research, and 3. Developing tools and strategies to promote education and dissemination of statistical knowledge. We also provide a list of statistical considerations that should be addressed in statistics education for basic scientists. PMID:27058055
Space Shuttle Missions Summary
NASA Technical Reports Server (NTRS)
Bennett, Floyd V.; Legler, Robert D.
2011-01-01
This document has been produced and updated over a 21-year period. It is intended to be a handy reference document, basically one page per flight, and care has been exercised to make it as error-free as possible. This document is basically "as flown" data and has been compiled from many sources including flight logs, flight rules, flight anomaly logs, mod flight descent summary, post-flight analysis of MPS propellants, FDRD, FRD, SODB, and the MER shuttle flight data and inflight anomaly list. Orbit distance traveled is taken from the PAO mission statistics.
World Hunger Crisis Kit. Hope for the Hungry.
ERIC Educational Resources Information Center
Woito, Robert, Ed.
This booklet introduces the problem of world hunger and provides information, facts, and perspectives about the crisis. Section one presents the reader with the basic facts of the hunger crisis through a self-survey, a statistical study of the developed Oil Producing Export Countries (OPEC), and a one-page indication of what one would have to give…
The U.S. Oats Industry. Agricultural Economic Report Number 573.
ERIC Educational Resources Information Center
Hoffman, Linwood A.; Livezey, Janet
This report describes the United States oats industry from producers to consumers and provides a single source of economic and statistical information on oats. Background information on oats is provided first. The report then examines the basic factors of supply, demand, and price to determine what caused the decline in the importance of oats and…
Evaluation of wind field statistics near and inside clouds using a coherent Doppler lidar
NASA Astrophysics Data System (ADS)
Lottman, Brian Todd
1998-09-01
This work proposes advanced techniques for measuring the spatial wind field statistics near and inside clouds using a vertically pointing solid-state coherent Doppler lidar on a fixed ground-based platform. The coherent Doppler lidar is an ideal instrument for high spatial and temporal resolution velocity estimates. The basic parameters of lidar are discussed, including a complete statistical description of the Doppler lidar signal. This description is extended to cases with simple functional forms for aerosol backscatter and velocity. An estimate for the mean velocity over a sensing volume is produced by estimating the mean spectra. There are many traditional spectral estimators, which are useful for conditions with slowly varying velocity and backscatter. A new class of estimators, termed novel estimators, is introduced that produces reliable velocity estimates for conditions with large variations in aerosol backscatter and velocity with range, such as cloud conditions. Performance of traditional and novel estimators is computed for a variety of deterministic atmospheric conditions using computer-simulated data. Wind field statistics are produced from actual data for a cloud deck and for multi-layer clouds. Unique results include detection of possible spectral signatures for rain, estimates for the structure function inside a cloud deck, reliable velocity estimation techniques near and inside thin clouds, and estimates for simple wind field statistics between cloud layers.
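The mean-velocity-from-mean-spectra step described above can be sketched in a few lines. The following is an illustrative peak-spectral-density estimate of radial velocity from a simulated heterodyne return; the wavelength, sample rate, and noise level are assumed values, not the instrument's.

```python
# Hedged sketch: estimate Doppler velocity from the peak of the power
# spectrum of a simulated coherent lidar return (illustrative parameters).
import numpy as np

lam = 2.05e-6            # assumed lidar wavelength, m
fs = 50e6                # assumed sample rate, Hz
v_true = 5.0             # radial velocity to recover, m/s
f_d = 2 * v_true / lam   # Doppler shift for a monostatic lidar

rng = np.random.default_rng(0)
n = 256
t = np.arange(n) / fs
sig = np.exp(2j * np.pi * f_d * t) \
    + 0.1 * (rng.standard_normal(n) + 1j * rng.standard_normal(n))

spec = np.abs(np.fft.fft(sig)) ** 2        # periodogram of the return
freqs = np.fft.fftfreq(n, 1 / fs)
f_hat = freqs[np.argmax(spec)]             # peak estimator of mean frequency
v_hat = f_hat * lam / 2                    # back to velocity
print(round(v_hat, 2))
```

The estimate is quantized to the spectral bin width (about 0.2 m/s here); the traditional and novel estimators in the thesis refine exactly this step for range-varying backscatter.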
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tessore, Nicolas; Metcalf, R. Benton; Winther, Hans A.
A number of alternatives to general relativity exhibit gravitational screening in the non-linear regime of structure formation. We describe a set of algorithms that can produce weak lensing maps of large scale structure in such theories and can be used to generate mock surveys for cosmological analysis. By analysing a few basic statistics we indicate how these alternatives can be distinguished from general relativity with future weak lensing surveys.
Pulpwood production in the Northeast 1969
James T. Bones; David R. Dickson
1970-01-01
This annual report is based on a canvass of all pulpmills in the Northeast that use wood-either from roundwood or plant residues-as a basic raw material for a variety of products. Mills that use pulpwood as a raw material in producing insulation board and hardboard were also included in the canvass. The production-from-roundwood statistics reported in this bulletin are...
Pulpwood production in the Northeast 1970
James T. Bones; David R. Dickson
1971-01-01
This annual report is based on a canvass of all pulpmills in the Northeast that use wood-either roundwood or plant residues-as a basic raw material for a variety of products. Mills that use pulpwood as a raw material in producing insulation board and hardboard were also included in the canvass. The statistics for production from roundwood reported in this bulletin are...
ERIC Educational Resources Information Center
Guthrie, Gerry D.
The objective of this study was to provide the library community with basic statistical data from on-line activity in the Ohio State University Libraries' Circulation System. Over 1.6 million archive records in the circulation system for 1972 were investigated to produce subject reports of circulation activity, activity reports by collection…
ERIC Educational Resources Information Center
Noser, Thomas C.; Tanner, John R.; Shah, Situl
2008-01-01
The purpose of this study was to measure the comprehension of basic mathematical skills of students enrolled in statistics classes at a large regional university, to determine if the scores earned on a basic math skills test are useful in forecasting student performance in these statistics classes, and to determine if students' basic math…
LaBudde, Robert A; Harnly, James M
2012-01-01
A qualitative botanical identification method (BIM) is an analytical procedure that returns a binary result (1 = Identified, 0 = Not Identified). A BIM may be used by a buyer, manufacturer, or regulator to determine whether a botanical material being tested is the same as the target (desired) material, or whether it contains excessive nontarget (undesirable) material. The report describes the development and validation of studies for a BIM based on the proportion of replicates identified, or probability of identification (POI), as the basic observed statistic. The statistical procedures proposed for data analysis follow closely those of the probability of detection, and harmonize the statistical concepts and parameters between quantitative and qualitative method validation. Use of POI statistics also harmonizes statistical concepts for botanical, microbiological, toxin, and other analyte identification methods that produce binary results. The POI statistical model provides a tool for graphical representation of response curves for qualitative methods, reporting of descriptive statistics, and application of performance requirements. Single collaborator and multicollaborative study examples are given.
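The basic observed statistic described above, the probability of identification (POI), is just the proportion of replicates returning 1. A minimal sketch, assuming illustrative replicate data and a standard Wilson score interval (the function names are mine, not from the published BIM protocol):

```python
# Hedged sketch: POI as the proportion of binary replicate results
# (1 = Identified, 0 = Not Identified), with an approximate 95% interval.

def poi(results):
    """Proportion of replicates identified."""
    return sum(results) / len(results)

def wilson_interval(k, n, z=1.96):
    """Approximate 95% Wilson score interval for a binomial proportion."""
    p = k / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = z * ((p * (1 - p) / n + z**2 / (4 * n**2)) ** 0.5) / denom
    return centre - half, centre + half

replicates = [1, 1, 1, 0, 1, 1, 1, 1, 0, 1]   # 8 of 10 identified
print(poi(replicates))                        # 0.8
lo, hi = wilson_interval(sum(replicates), len(replicates))
print(round(lo, 3), round(hi, 3))
```

Plotting POI against target concentration gives the response curves the report describes, in direct analogy with probability-of-detection curves.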
Freeman, Jenny V; Collier, Steve; Staniforth, David; Smith, Kevin J
2008-01-01
Background: Statistics is relevant to students and practitioners in medicine and health sciences and is increasingly taught as part of the medical curriculum. However, it is common for students to dislike and under-perform in statistics. We sought to address these issues by redesigning the way that statistics is taught. Methods: The project brought together a statistician, clinician and educational experts to re-conceptualize the syllabus, and focused on developing different methods of delivery. New teaching materials, including videos, animations and contextualized workbooks were designed and produced, placing greater emphasis on applying statistics and interpreting data. Results: Two cohorts of students were evaluated, one with old style and one with new style teaching. Both were similar with respect to age, gender and previous level of statistics. Students who were taught using the new approach could better define the key concepts of p-value and confidence interval (p < 0.001 for both). They were more likely to regard statistics as integral to medical practice (p = 0.03), and to expect to use it in their medical career (p = 0.003). There was no significant difference in the numbers who thought that statistics was essential to understand the literature (p = 0.28) and those who felt comfortable with the basics of statistics (p = 0.06). More than half the students in both cohorts felt that they were comfortable with the basics of medical statistics. Conclusion: Using a variety of media, and placing emphasis on interpretation can help make teaching, learning and understanding of statistics more people-centred and relevant, resulting in better outcomes for students. PMID:18452599
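The two key concepts the students were tested on, the p-value and the confidence interval, lend themselves to a short simulation. The following sketch (parameters are illustrative, not from the study) shows the defining property of a 95% confidence interval: over repeated samples, the procedure captures the true mean about 95% of the time.

```python
# Hedged sketch: coverage of a 95% t-based confidence interval for a mean,
# estimated by simulation (illustrative parameters).
import random
import statistics

random.seed(0)
true_mean, n, reps, covered = 10.0, 30, 2000, 0
for _ in range(reps):
    sample = [random.gauss(true_mean, 2.0) for _ in range(n)]
    m = statistics.mean(sample)
    se = statistics.stdev(sample) / n ** 0.5
    lo, hi = m - 2.045 * se, m + 2.045 * se   # t critical value for 29 df
    covered += lo <= true_mean <= hi
print(covered / reps)   # close to 0.95
```

The p-value admits the mirror-image simulation: under a true null, p-values are uniform, so about 5% of tests fall below 0.05.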
Adaptive variable-length coding for efficient compression of spacecraft television data.
NASA Technical Reports Server (NTRS)
Rice, R. F.; Plaunt, J. R.
1971-01-01
An adaptive variable length coding system is presented. Although developed primarily for the proposed Grand Tour missions, many features of this system clearly indicate a much wider applicability. Using sample to sample prediction, the coding system produces output rates within 0.25 bit/picture element (pixel) of the one-dimensional difference entropy for entropy values ranging from 0 to 8 bit/pixel. This is accomplished without the necessity of storing any code words. Performance improvements of 0.5 bit/pixel can be simply achieved by utilizing previous line correlation. A Basic Compressor, using concatenated codes, adapts to rapid changes in source statistics by automatically selecting one of three codes to use for each block of 21 pixels. The system adapts to less frequent, but more dramatic, changes in source statistics by adjusting the mode in which the Basic Compressor operates on a line-to-line basis. Furthermore, the compression system is independent of the quantization requirements of the pulse-code modulation system.
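The code-selection idea described above can be sketched with a modern Golomb-Rice-style coder: map signed prediction residuals to non-negative integers, then, per block, pick the code parameter that minimizes the encoded length. This is an illustrative sketch in the spirit of the Basic Compressor, not the flight implementation; the block size and selection rule are assumptions.

```python
# Hedged sketch: Golomb-Rice coding of first-difference residuals with
# per-block selection of the cheapest code parameter k (illustrative).

def zigzag(d):
    """Map signed residual to non-negative integer: 0,-1,1,-2,2 -> 0,1,2,3,4."""
    return 2 * d if d >= 0 else -2 * d - 1

def rice_encode(value, k):
    """Unary-coded quotient, then k-bit binary remainder."""
    q, r = value >> k, value & ((1 << k) - 1)
    return "1" * q + "0" + format(r, f"0{k}b") if k else "1" * q + "0"

def encode_block(samples, ks=(0, 1, 2)):
    """Encode sample-to-sample residuals, choosing the cheapest k per block."""
    residuals = [zigzag(b - a) for a, b in zip(samples, samples[1:])]
    best = min(ks, key=lambda k: sum(len(rice_encode(v, k)) for v in residuals))
    return best, "".join(rice_encode(v, best) for v in residuals)

k, bits = encode_block([5, 5, 6, 4])
print(k, bits)
```

Small residuals (slowly varying data) favor small k; busier blocks favor larger k, which is the adaptation mechanism the abstract describes at the 21-pixel-block level.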
The use of quizStar application for online examination in basic physics course
NASA Astrophysics Data System (ADS)
Kustijono, R.; Budiningarti, H.
2018-03-01
The purpose of the study is to produce an online Basic Physics exam system using the QuizStar application. This is research and development following the ADDIE model, whose steps are: 1) analysis; 2) design; 3) development; 4) implementation; 5) evaluation. The system's feasibility is reviewed in terms of its validity, practicality, and effectiveness. The subjects of the research are 60 Physics Department students of Universitas Negeri Surabaya. The data analysis used is descriptive statistics; the validity, practicality, and effectiveness scores are measured using a Likert scale. The system is considered feasible if the total score across all aspects is ≥ 61%. The results for the online test system developed with QuizStar are: 1) it is conceptually feasible to use; 2) the system can be implemented in the Basic Physics assessment process, and the existing constraints can be overcome; 3) students' responses to system usage are in the good category. The results lead to the conclusion that the QuizStar application is suitable for use as an online Basic Physics exam system.
Statistical regularities of art images and natural scenes: spectra, sparseness and nonlinearities.
Graham, Daniel J; Field, David J
2007-01-01
Paintings are the product of a process that begins with ordinary vision in the natural world and ends with manipulation of pigments on canvas. Because artists must produce images that can be seen by a visual system that is thought to take advantage of statistical regularities in natural scenes, artists are likely to replicate many of these regularities in their painted art. We have tested this notion by computing basic statistical properties and modeled cell response properties for a large set of digitized paintings and natural scenes. We find that both representational and non-representational (abstract) paintings from our sample (124 images) show basic similarities to a sample of natural scenes in terms of their spatial frequency amplitude spectra, but the paintings and natural scenes show significantly different mean amplitude spectrum slopes. We also find that the intensity distributions of paintings show a lower skewness and sparseness than natural scenes. We account for this by considering the range of luminances found in the environment compared to the range available in the medium of paint. A painting's range is limited by the reflective properties of its materials. We argue that artists do not simply scale the intensity range down but use a compressive nonlinearity. In our studies, modeled retinal and cortical filter responses to the images were less sparse for the paintings than for the natural scenes. But when a compressive nonlinearity was applied to the images, both the paintings' sparseness and the modeled responses to the paintings showed the same or greater sparseness compared to the natural scenes. This suggests that artists achieve some degree of nonlinear compression in their paintings. Because paintings have captivated humans for millennia, finding basic statistical regularities in paintings' spatial structure could grant insights into the range of spatial patterns that humans find compelling.
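The central statistic in the comparison above, the slope of the spatial-frequency amplitude spectrum, can be estimated with a short script. This is an illustrative sketch (the radial binning and fitting range are my assumptions, not the paper's procedure), verified here on synthetic 1/f noise, for which the slope should be near the natural-scene value of about -1.

```python
# Hedged sketch: log-log slope of the radially averaged amplitude spectrum.
import numpy as np

def amplitude_slope(img):
    """Fit log amplitude vs log frequency; natural scenes give slopes near -1."""
    n = img.shape[0]
    amp = np.abs(np.fft.fftshift(np.fft.fft2(img - img.mean())))
    fy, fx = np.indices((n, n)) - n // 2
    rad = np.hypot(fx, fy).astype(int)
    # radial average of amplitude over integer-frequency annuli
    radial = np.bincount(rad.ravel(), weights=amp.ravel()) / np.bincount(rad.ravel())
    f = np.arange(1, n // 2)                      # skip the DC bin
    slope, _ = np.polyfit(np.log(f), np.log(radial[1:n // 2]), 1)
    return slope

# synthetic test image: white noise shaped to a 1/f amplitude spectrum
rng = np.random.default_rng(0)
n = 128
spectrum = np.fft.fft2(rng.standard_normal((n, n)))
fy, fx = np.indices((n, n)) - n // 2
f_grid = np.fft.ifftshift(np.maximum(np.hypot(fx, fy), 1))
img = np.fft.ifft2(spectrum / f_grid).real
print(round(amplitude_slope(img), 2))
```

Running the same function on digitized paintings versus photographs of natural scenes is, in essence, the first comparison the study reports.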
Statistical methods of estimating mining costs
Long, K.R.
2011-01-01
Until it was defunded in 1995, the U.S. Bureau of Mines maintained a Cost Estimating System (CES) for prefeasibility-type economic evaluations of mineral deposits and estimating costs at producing and non-producing mines. This system had a significant role in mineral resource assessments to estimate costs of developing and operating known mineral deposits and predicted undiscovered deposits. For legal reasons, the U.S. Geological Survey cannot update and maintain CES. Instead, statistical tools are under development to estimate mining costs from basic properties of mineral deposits such as tonnage, grade, mineralogy, depth, strip ratio, distance from infrastructure, rock strength, and work index. The first step was to reestimate "Taylor's Rule" which relates operating rate to available ore tonnage. The second step was to estimate statistical models of capital and operating costs for open pit porphyry copper mines with flotation concentrators. For a sample of 27 proposed porphyry copper projects, capital costs can be estimated from three variables: mineral processing rate, strip ratio, and distance from nearest railroad before mine construction began. Of all the variables tested, operating costs were found to be significantly correlated only with strip ratio.
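Taylor's Rule, which the study re-estimated, has a well-known textbook form: mine life in years is roughly 0.2 times the fourth root of the ore tonnage. The sketch below uses those classic constants for illustration; they are not the USGS re-fit described in the abstract.

```python
# Hedged sketch: the textbook form of Taylor's Rule relating ore tonnage
# to mine life, and the operating rate it implies (classic constants,
# not the re-estimated ones).
from math import ceil

def taylor_life_years(tonnage):
    """Taylor's Rule: life (years) ~ 0.2 * tonnage**0.25, tonnage in tonnes."""
    return 0.2 * tonnage ** 0.25

def operating_rate_tpd(tonnage, days_per_year=350):
    """Implied milling rate in tonnes per day over the mine life."""
    return tonnage / (taylor_life_years(tonnage) * days_per_year)

print(round(taylor_life_years(1e8)))      # 20 (years, for a 100 Mt deposit)
print(ceil(operating_rate_tpd(1e8)))
```

The power-law shape is the point: doubling tonnage raises the optimal operating rate far less than proportionally, which is why the rule survives re-estimation with new data.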
Operation and performance of the OSSE instrument
NASA Technical Reports Server (NTRS)
Cameron, R. A.; Kurfess, J. D.; Johnson, W. N.; Kinzer, R. L.; Kroeger, R. A.; Leising, M. D.; Murphy, R. J.; Share, G. H.; Strickman, M. S.; Grove, J. E.
1992-01-01
The Oriented Scintillation Spectrometer Experiment (OSSE) on the Arthur Holly Compton Gamma Ray Observatory is described. An overview of the operation and control of the instrument is given, together with a discussion of typical observing strategies used with OSSE and basic data types produced by the instrument. Some performance measures for the instrument are presented that were obtained from pre-launch and in-flight data. These include observing statistics, continuum and line sensitivity, and detector effective area and gain stability.
A mechanism producing power law etc. distributions
NASA Astrophysics Data System (ADS)
Li, Heling; Shen, Hongjun; Yang, Bin
2017-07-01
Power-law distributions play an increasingly important role in the study of complex systems. Motivated by the insolvability of complex systems, the idea of incomplete statistics is utilized and expanded: three different exponential factors are introduced into the equations for the normalization condition, the statistical average, and the Shannon entropy. Probability distribution functions of exponential form, of power-law form, and of the product form of a power function and an exponential function are then derived from the Shannon entropy and the maximum entropy principle. This shows that the maximum entropy principle can fully replace the equal-probability hypothesis. Because power-law distributions and distributions of the product form cannot be derived from the equal-probability hypothesis but can be derived with the aid of the maximum entropy principle, it can also be concluded that the maximum entropy principle is a more basic principle: it embodies broader concepts and reveals the laws governing the motion of objects more fundamentally. At the same time, this principle reveals the intrinsic links between Nature and the different objects of human society, and the principles with which they all comply.
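The derivations summarized above follow a standard maximum-entropy pattern, sketched here in my own notation (not the paper's): maximizing the Shannon entropy under different moment constraints yields each of the three families mentioned.

```latex
% Maximize S = -\int p(x)\,\ln p(x)\,dx subject to normalization plus:
%   (a) a fixed mean            -> exponential form
%   (b) a fixed logarithmic mean -> power-law form
%   (c) both constraints         -> product form
\begin{align}
\langle x \rangle \ \text{fixed:} \quad & p(x) \propto e^{-\beta x} \\
\langle \ln x \rangle \ \text{fixed:} \quad & p(x) \propto x^{-\lambda} \\
\text{both fixed:} \quad & p(x) \propto x^{-\lambda}\, e^{-\beta x}
\end{align}
```

The equal-probability hypothesis corresponds to maximizing entropy with no moment constraint at all, which is why it cannot produce the power-law or product-form cases.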
Teaching Basic Probability in Undergraduate Statistics or Management Science Courses
ERIC Educational Resources Information Center
Naidu, Jaideep T.; Sanford, John F.
2017-01-01
Standard textbooks in core Statistics and Management Science classes present various examples to introduce basic probability concepts to undergraduate business students. These include tossing of a coin, throwing a die, and examples of that nature. While these are good examples to introduce basic probability, we use improvised versions of Russian…
Jelacic, Srdjan; Bowdle, Andrew; Togashi, Kei; VonHomeyer, Peter
2013-08-01
The authors evaluated the educational benefits of using a first-generation HeartWorks simulator to teach senior anesthesiology residents basic echocardiography skills. Prospective observational study. A single academic medical center (teaching hospital). Thirty-seven senior (fourth-year) anesthesiology residents participated in this study. Groups of 3 senior anesthesiology residents participated in a single 3-hour tutorial in the simulation laboratory in the authors' institution during their cardiothoracic anesthesiology rotation. A cardiothoracic anesthesiology faculty member demonstrated the use of the transesophageal echocardiography (TEE) simulator and instructed the residents on obtaining standard TEE views of normal anatomy. Prior to the laboratory session, the residents took an online multiple-choice pretest with 25 questions related to safety, probe manipulation, clinical application, and pathology, which was accompanied by echo images of normal cardiac anatomy and video clips of pathology. Three to four weeks after the TEE tutorial, the residents completed an online post-test and evaluation of the teaching session. There was a statistically significant increase in knowledge of normal echocardiographic anatomy (p = 0.04), with an average improvement in normal echocardiographic anatomy scores of 15%. Virtual reality TEE simulation technology was endorsed strongly by residents, produced a statistically significant improvement in knowledge of normal echocardiographic anatomy, and could be effective for teaching basic echocardiography to anesthesiology residents. Copyright © 2013 Elsevier Inc. All rights reserved.
Statistics for nuclear engineers and scientists. Part 1. Basic statistical inference
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beggs, W.J.
1981-02-01
This report is intended for the use of engineers and scientists working in the nuclear industry, especially at the Bettis Atomic Power Laboratory. It serves as the basis for several Bettis in-house statistics courses. The objectives of the report are to introduce the reader to the language and concepts of statistics and to provide a basic set of techniques to apply to problems of the collection and analysis of data. Part 1 covers subjects of basic inference. The subjects include: descriptive statistics; probability; simple inference for normally distributed populations, and for non-normal populations as well; comparison of two populations; the analysis of variance; quality control procedures; and linear regression analysis.
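One of the topics listed, comparison of two populations, reduces to a few lines of arithmetic. The sketch below computes Welch's t statistic and its approximate degrees of freedom on made-up data; it illustrates the technique, not an example from the report.

```python
# Hedged sketch: Welch's two-sample t statistic with Satterthwaite
# degrees of freedom (illustrative data).
from math import sqrt
from statistics import mean, variance

def welch_t(a, b):
    """Return (t, df) for an unequal-variance two-sample comparison."""
    va, vb = variance(a) / len(a), variance(b) / len(b)
    t = (mean(a) - mean(b)) / sqrt(va + vb)
    df = (va + vb) ** 2 / (va**2 / (len(a) - 1) + vb**2 / (len(b) - 1))
    return t, df

a = [5.1, 4.9, 5.3, 5.0, 5.2]
b = [4.6, 4.8, 4.5, 4.9, 4.7]
t, df = welch_t(a, b)
print(t, df)   # t = 4.0, df = 8.0 for these data
```

Comparing t against the t distribution with df degrees of freedom gives the p-value; here t = 4.0 on 8 degrees of freedom is significant at the usual 0.05 level.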
ERIC Educational Resources Information Center
Primi, Caterina; Donati, Maria Anna; Chiesi, Francesca
2016-01-01
Among the wide range of factors related to the acquisition of statistical knowledge, competence in basic mathematics, including basic probability, has received much attention. In this study, a mediation model was estimated to derive the total, direct, and indirect effects of mathematical competence on statistics achievement taking into account…
D-phenylalanine: a putative enkephalinase inhibitor studied in a primate acute pain model.
Halpern, L M; Dong, W K
1986-02-01
D-Phenylalanine, along with morphine, acetylsalicylic acid and zomepirac sodium were evaluated for their antinociceptive actions in monkeys (M. fascicularis) trained to autoregulate nociceptive stimulation using a discrete-trials, aversive-threshold paradigm. Morphine sulfate produced dose-related increases in aversive threshold which were reversible after administration of naloxone (12.5 or 25 micrograms/kg i.m.). D-Phenylalanine (500 mg/kg p.o.) produced a small increase in aversive threshold which was not statistically significant and not naloxone reversible. Acetylsalicylic acid (200 mg/kg p.o.) but not zomepirac sodium (200 mg/kg p.o.) in combination with D-phenylalanine (500 mg/kg) produced a small statistically significant increase in aversive threshold. Our results argue against the hypothesis that D-phenylalanine is responsible for increasing aversive thresholds via opiate receptor mechanisms involving increased activity of enkephalins at synaptic loci. Previous studies by others in rats and mice showed that D-phenylalanine and acetylsalicylic acid produced increases in nociceptive thresholds which were naloxone reversible. Our failure to find opiate receptor mediated analgesia in a primate model with demonstrated opiate receptor selectivity and sensitivity is discussed in terms of previous basic and clinical research indicating an analgesic role for D-phenylalanine. Possible species difference in drug action is discussed in terms of inhibition by D-phenylalanine of carboxypeptidase-like enkephalin processing enzymes as well as inhibition of carboxypeptidase-like enkephalin degrading enzymes.
How language production shapes language form and comprehension
MacDonald, Maryellen C.
2012-01-01
Language production processes can provide insight into how language comprehension works and language typology—why languages tend to have certain characteristics more often than others. Drawing on work in memory retrieval, motor planning, and serial order in action planning, the Production-Distribution-Comprehension (PDC) account links work in the fields of language production, typology, and comprehension: (1) faced with substantial computational burdens of planning and producing utterances, language producers implicitly follow three biases in utterance planning that promote word order choices that reduce these burdens, thereby improving production fluency. (2) These choices, repeated over many utterances and individuals, shape the distributions of utterance forms in language. The claim that language form stems in large degree from producers' attempts to mitigate utterance planning difficulty is contrasted with alternative accounts in which form is driven by language use more broadly, language acquisition processes, or producers' attempts to create language forms that are easily understood by comprehenders. (3) Language perceivers implicitly learn the statistical regularities in their linguistic input, and they use this prior experience to guide comprehension of subsequent language. In particular, they learn to predict the sequential structure of linguistic signals, based on the statistics of previously-encountered input. Thus, key aspects of comprehension behavior are tied to lexico-syntactic statistics in the language, which in turn derive from utterance planning biases promoting production of comparatively easy utterance forms over more difficult ones. This approach contrasts with classic theories in which comprehension behaviors are attributed to innate design features of the language comprehension system and associated working memory. The PDC instead links basic features of comprehension to a different source: production processes that shape language form. 
PMID:23637689
Seo, Seongho; Kim, Su Jin; Lee, Dong Soo; Lee, Jae Sung
2014-10-01
Tracer kinetic modeling in dynamic positron emission tomography (PET) has been widely used to investigate the characteristic distribution patterns or dysfunctions of neuroreceptors in brain diseases. Its practical goal has progressed from regional data quantification to parametric mapping that produces images of kinetic-model parameters by fully exploiting the spatiotemporal information in dynamic PET data. Graphical analysis (GA) is a major parametric mapping technique that is independent on any compartmental model configuration, robust to noise, and computationally efficient. In this paper, we provide an overview of recent advances in the parametric mapping of neuroreceptor binding based on GA methods. The associated basic concepts in tracer kinetic modeling are presented, including commonly-used compartment models and major parameters of interest. Technical details of GA approaches for reversible and irreversible radioligands are described, considering both plasma input and reference tissue input models. Their statistical properties are discussed in view of parametric imaging.
Interpretation of the results of statistical measurements. [search for basic probability model
NASA Technical Reports Server (NTRS)
Olshevskiy, V. V.
1973-01-01
For random processes, the calculated probability characteristic and the measured statistical estimate are used in a quality functional that defines the difference between the two functions. Based on the assumption that the statistical measurement procedure is organized so that the parameters of a selected model are optimized, it is shown that the interpretation of experimental research is a search for a basic probability model.
Simplified estimation of age-specific reference intervals for skewed data.
Wright, E M; Royston, P
1997-12-30
Age-specific reference intervals are commonly used in medical screening and clinical practice, where interest lies in the detection of extreme values. Many different statistical approaches have been published on this topic. The advantages of a parametric method are that it necessarily produces smooth centile curves, estimates the entire density, and yields an explicit formula for the centiles. The method proposed here is a simplified version of a recent approach proposed by Royston and Wright. Basic transformations of the data and multiple regression techniques are combined to model the mean, standard deviation and skewness. Using these simple tools, which are implemented in almost all statistical computer packages, age-specific reference intervals may be obtained. The scope of the method is illustrated by fitting models to several real data sets and assessing each model using goodness-of-fit techniques.
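The regression-based idea can be sketched in a few lines: model the mean as a function of age, model the standard deviation from the absolute residuals, and form centiles as mean ± z × SD. This is a simplified illustration on synthetic data; the published method also models skewness via a transformation, which is omitted here.

```python
# Hedged sketch: age-specific reference intervals from two regressions
# (mean on age, then SD on age via scaled absolute residuals).
import numpy as np

rng = np.random.default_rng(1)
age = rng.uniform(20, 70, 500)
y = 2.0 + 0.05 * age + rng.normal(0, 0.5 + 0.01 * age)   # SD grows with age

mean_coef = np.polyfit(age, y, 1)                # model the mean
resid = y - np.polyval(mean_coef, age)

# E|resid| = SD * sqrt(2/pi) for normal errors, so rescale before fitting
sd_coef = np.polyfit(age, np.abs(resid) * np.sqrt(np.pi / 2), 1)

def reference_interval(a, z=1.96):
    """95% age-specific reference interval at age a."""
    m, s = np.polyval(mean_coef, a), np.polyval(sd_coef, a)
    return m - z * s, m + z * s

print(tuple(round(v, 2) for v in reference_interval(40)))
```

Because both fitted components are smooth functions of age, the resulting centile curves are smooth by construction, which is the advantage of the parametric approach noted above.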
NASA Technical Reports Server (NTRS)
Lennington, R. K.; Malek, H.
1978-01-01
A clustering method, CLASSY, was developed, which alternates maximum likelihood iteration with a procedure for splitting, combining, and eliminating the resulting statistics. The method maximizes the fit of a mixture of normal distributions to the observed first through fourth central moments of the data and produces an estimate of the proportions, means, and covariances in this mixture. The mathematical model that is the basis for CLASSY and the actual operation of the algorithm are described. Data comparing the performances of CLASSY and ISOCLS on simulated and actual LACIE data are presented.
ERIC Educational Resources Information Center
Zetterqvist, Lena
2017-01-01
Researchers and teachers often recommend motivating exercises and use of mathematics or statistics software for the teaching of basic courses in probability and statistics. Our courses are given to large groups of engineering students at Lund Institute of Technology. We found that the mere existence of real-life data and technology in a course…
A basic introduction to statistics for the orthopaedic surgeon.
Bertrand, Catherine; Van Riet, Roger; Verstreken, Frederik; Michielsen, Jef
2012-02-01
Orthopaedic surgeons should review the orthopaedic literature in order to keep pace with the latest insights and practices. A good understanding of basic statistical principles is of crucial importance to the ability to read articles critically, to interpret results and to arrive at correct conclusions. This paper explains some of the key concepts in statistics, including hypothesis testing, Type I and Type II errors, testing of normality, sample size and p values.
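To make the hypothesis-testing vocabulary above concrete, here is a small simulation with invented blood-loss data for two surgical groups; the p-value comes from a permutation test, chosen for simplicity rather than taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(2)
# Hypothetical data: blood loss (mL) in two surgical groups.
group_a = rng.normal(300, 50, 40)
group_b = rng.normal(360, 50, 40)

# Permutation test of the null hypothesis "no difference in means".
# The p-value is the fraction of label shufflings that produce a mean
# difference at least as extreme as the observed one (two-sided).
observed = abs(group_a.mean() - group_b.mean())
pooled = np.concatenate([group_a, group_b])
count = 0
n_perm = 5000
for _ in range(n_perm):
    perm = rng.permutation(pooled)
    count += abs(perm[:40].mean() - perm[40:].mean()) >= observed
p_value = count / n_perm
# A Type I error would be rejecting the null when the groups truly do not
# differ; a Type II error is failing to reject when they do.
```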
Miller, C J; Aiken, S A; Metz, M J
2015-02-01
There can be a disconnect between the level of content covered in undergraduate coursework and the expectations of professional-level faculty of their incoming students. Some basic science faculty members may assume that students have a good knowledge base in the material and neglect to appropriately review, whilst others may spend too much class time reviewing basic material. It was hypothesised that the replacement of introductory didactic physiology lectures with interactive online modules could improve student preparedness prior to lectures. These modules would also allow faculty members to analyse incoming student abilities and save valuable face-to-face class time for alternative teaching strategies. Results indicated that the performance levels of incoming U.S. students were poor (57% average on a pre-test), and students often under-predicted their abilities (by 13% on average). Faculty expectations varied greatly between the different content areas and did not appear to correlate with the actual student performance. Three review modules were created which produced a statistically significant increase in post-test scores (46% increase, P < 0.0001, n = 114-115). The positive results of this study suggest a need to incorporate online review units in the basic science dental school courses and revise introductory material tailored to students' strengths and needs.
Using Data Mining to Teach Applied Statistics and Correlation
ERIC Educational Resources Information Center
Hartnett, Jessica L.
2016-01-01
This article describes two class activities that introduce the concept of data mining and very basic data mining analyses. Assessment data suggest that students learned some of the conceptual basics of data mining, understood some of the ethical concerns related to the practice, and were able to perform correlations via the Statistical Package for…
Simple Data Sets for Distinct Basic Summary Statistics
ERIC Educational Resources Information Center
Lesser, Lawrence M.
2011-01-01
It is important to avoid ambiguity with numbers because unfortunate choices of numbers can inadvertently make it possible for students to form misconceptions or make it difficult for teachers to tell if students obtained the right answer for the right reason. Therefore, it is important to make sure when introducing basic summary statistics that…
Graham, Daniel J; Field, David J
2008-01-01
Two recent studies suggest that natural scenes and paintings show similar statistical properties. But does the content or region of origin of an artwork affect its statistical properties? We addressed this question by having judges place paintings from a large, diverse collection into one of three subject-matter categories using a forced-choice paradigm. Basic statistics for images whose categorization was agreed upon by all judges showed no significant differences between those judged to be 'landscape' and 'portrait/still-life', but these two classes differed from paintings judged to be 'abstract'. All categories showed basic spatial statistical regularities similar to those typical of natural scenes. A test of the full painting collection (140 images) with respect to the works' place of origin (provenance) showed significant differences between Eastern works and Western ones, differences which we find are likely related to the materials and the choice of background color. Although artists deviate slightly from reproducing natural statistics in abstract art (compared to representational art), the great majority of human art likely shares basic statistical limitations. We argue that statistical regularities in art are rooted in the need to make art visible to the eye, not in the inherent aesthetic value of natural-scene statistics, and we suggest that variability in spatial statistics may be generally imposed by manufacture.
NASA Technical Reports Server (NTRS)
1984-01-01
Three mesoscale sounding data sets from the VISSR Atmospheric Sounder (VAS), produced using different retrieval techniques, were evaluated against corresponding ground truth rawinsonde data for 6-7 March 1982. Means, standard deviations, and RMS differences between the satellite and rawinsonde parameters were calculated over gridded fields in central Texas and Oklahoma. Large differences exist between each satellite data set and the ground truth data. Biases in the satellite temperature and moisture profiles seem extremely dependent upon the three-dimensional structure of the atmosphere and range from 1 deg to 3 deg C for temperature and 3 deg to 6 deg C for dewpoint temperature. Atmospheric gradients of basic and derived parameters determined from the VAS data sets produced an adequate representation of the mesoscale environment, but their magnitudes were often reduced by 30 to 50%.
Optimization Under Uncertainty for Electronics Cooling Design
NASA Astrophysics Data System (ADS)
Bodla, Karthik K.; Murthy, Jayathi Y.; Garimella, Suresh V.
Optimization under uncertainty is a powerful methodology used in design and optimization to produce robust, reliable designs. Such an optimization methodology, employed when the input quantities of interest are uncertain, produces output uncertainties, helping the designer choose input parameters that would result in satisfactory thermal solutions. Apart from providing basic statistical information such as the mean and standard deviation of the output quantities, auxiliary data from an uncertainty-based optimization, such as local and global sensitivities, help the designer decide which input parameter(s) the output quantity of interest is most sensitive to. This helps in the design of experiments based on the most sensitive input parameter(s). A further crucial output of such a methodology is the solution to the inverse problem - finding the allowable uncertainty range in the input parameter(s), given an acceptable uncertainty range in the output quantity of interest...
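The statistical outputs described above (mean, standard deviation, and sensitivities of an output under uncertain inputs) can be obtained by plain Monte Carlo sampling. The heat-sink model, parameter names, and distributions below are invented for illustration and are not from this work:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 20000

# Hypothetical uncertain inputs for a heat-sink model.
h = rng.normal(50.0, 5.0, n)        # convection coefficient, W/m^2-K (10% spread)
area = rng.normal(0.01, 0.0005, n)  # fin area, m^2 (5% spread)
q = 10.0                            # heat load, W (assumed known)

# Output quantity of interest: temperature rise dT = q / (h * A).
dT = q / (h * area)
mean, sd = dT.mean(), dT.std()

# Crude global sensitivity: correlation of each input with the output
# indicates which parameter dominates the output uncertainty.
sens_h = abs(np.corrcoef(h, dT)[0, 1])
sens_area = abs(np.corrcoef(area, dT)[0, 1])
```

Here the convection coefficient carries the larger relative uncertainty, so it should dominate the output spread, which is the kind of conclusion that guides design of experiments.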
Basic statistics (the fundamental concepts).
Lim, Eric
2014-12-01
An appreciation and understanding of statistics is important to all practising clinicians, not simply researchers, because mathematics is the foundation on which we base clinical decisions, usually with reference to benefit in relation to risk. Unless a clinician has a basic understanding of statistics, he or she will never be in a position to question healthcare management decisions that have been handed down from generation to generation, will not be able to conduct research effectively, and will not be able to evaluate the validity of published evidence (usually making an assumption that most published work is either all good or all bad). This article provides a brief introduction to basic statistical methods and illustrates their use in common clinical scenarios. In addition, pitfalls of incorrect usage are highlighted. However, it is not meant to be a substitute for formal training or consultation with a qualified and experienced medical statistician prior to starting any research project.
ERIC Educational Resources Information Center
Averitt, Sallie D.
This instructor guide, which was developed for use in a manufacturing firm's advanced technical preparation program, contains the materials required to present a learning module that is designed to prepare trainees for the program's statistical process control module by improving their basic math skills and instructing them in basic calculator…
From Research to Practice: Basic Mathematics Skills and Success in Introductory Statistics
ERIC Educational Resources Information Center
Lunsford, M. Leigh; Poplin, Phillip
2011-01-01
Based on previous research of Johnson and Kuennen (2006), we conducted a study to determine factors that would possibly predict student success in an introductory statistics course. Our results were similar to Johnson and Kuennen in that we found students' basic mathematical skills, as measured on a test created by Johnson and Kuennen, were a…
A statistical evaluation and comparison of VISSR Atmospheric Sounder (VAS) data
NASA Technical Reports Server (NTRS)
Jedlovec, G. J.
1984-01-01
In order to account for the temporal and spatial discrepancies between the VAS and rawinsonde soundings, the rawinsonde data were adjusted to a common hour of release where the new observation time corresponded to the satellite scan time. Both the satellite and rawinsonde observations of the basic atmospheric parameters (T, Td, and Z) were objectively analyzed to a uniform grid maintaining the same mesoscale structure in each data set. The performance of each retrieval algorithm in producing accurate and representative soundings was evaluated using statistical parameters such as the mean, standard deviation, and root mean square of the difference fields for each parameter and grid level. Horizontal structure was also qualitatively evaluated by examining atmospheric features on constant pressure surfaces. An analysis of the vertical structure of the atmosphere was also performed by looking at colocated and grid mean vertical profiles of both the satellite and rawinsonde data sets. Highlights of these results are presented.
Fault detection and diagnosis using neural network approaches
NASA Technical Reports Server (NTRS)
Kramer, Mark A.
1992-01-01
Neural networks can be used to detect and identify abnormalities in real-time process data. Two basic approaches can be used, the first based on training networks using data representing both normal and abnormal modes of process behavior, and the second based on statistical characterization of the normal mode only. Given data representative of process faults, radial basis function networks can effectively identify failures. This approach is often limited by the lack of fault data, but can be facilitated by process simulation. The second approach employs elliptical and radial basis function neural networks and other models to learn the statistical distributions of process observables under normal conditions. Analytical models of failure modes can then be applied in combination with the neural network models to identify faults. Special methods can be applied to compensate for sensor failures, to produce real-time estimation of missing or failed sensors based on the correlations codified in the neural network.
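A minimal stand-in for the second approach described above (statistical characterization of the normal mode only) is a single Gaussian ellipsoid with a Mahalanobis distance threshold; the paper's elliptical basis function networks generalize this to multiple local ellipsoids. All data below are simulated:

```python
import numpy as np

rng = np.random.default_rng(4)

# "Normal mode" training data: two correlated process observables.
cov = np.array([[1.0, 0.8], [0.8, 1.0]])
normal = rng.multivariate_normal([0.0, 0.0], cov, 1000)

# Characterize the normal mode statistically: fitted mean and precision
# (inverse covariance) define an ellipsoid of normal operation.
mu = normal.mean(axis=0)
prec = np.linalg.inv(np.cov(normal.T))

def mahalanobis2(x):
    """Squared Mahalanobis distance of an observation from the normal mode."""
    d = x - mu
    return float(d @ prec @ d)

# Flag a fault when the distance exceeds the chi-square 99th percentile
# for 2 degrees of freedom (about 9.21).
THRESH = 9.21
ok_point = np.array([0.5, 0.5])
fault_point = np.array([3.0, -3.0])  # violates the learned correlation
```

Note that the fault point has individually plausible sensor values; it is flagged because it breaks the correlation structure codified in the model, which is also what enables estimation of a failed sensor from the others.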
The Basic Course in Communication: A Performance Triad.
ERIC Educational Resources Information Center
Smith, V. A.
The key element to the survival of speech communication and its status in academe is the basic course, which tells the academic community what speech communication is and what it can produce in terms of observable student behavior. This basic course, upon which many communication departments depend, must produce students who are obviously trained…
Thompson, Cheryl Bagley
2009-01-01
This 13th article of the Basics of Research series is first in a short series on statistical analysis. These articles will discuss creating your statistical analysis plan, levels of measurement, descriptive statistics, probability theory, inferential statistics, and general considerations for interpretation of the results of a statistical analysis.
Wright, Nicholas J.D.; Alston, Gregory L.
2015-01-01
Objective. To design and assess a horizontally integrated biological sciences course sequence and to determine its effectiveness in imparting the foundational science knowledge necessary to successfully progress through the pharmacy school curriculum and produce competent pharmacy school graduates. Design. A 2-semester course sequence integrated principles from several basic science disciplines: biochemistry, molecular biology, cellular biology, anatomy, physiology, and pathophysiology. Each is a 5-credit course taught 5 days per week, with 50-minute class periods. Assessment. Achievement of outcomes was determined with course examinations, student lecture, and an annual skills mastery assessment. The North American Pharmacist Licensure Examination (NAPLEX) results were used as an indicator of competency to practice pharmacy. Conclusion. Students achieved course objectives and program level outcomes. The biological sciences integrated course sequence was successful in providing students with foundational basic science knowledge required to progress through the pharmacy program and to pass the NAPLEX. The percentage of the school’s students who passed the NAPLEX was not statistically different from the national percentage. PMID:26430276
Partsch, B; Partsch, H
2008-01-01
The aim of this study was to measure the interface pressure of a newly designed two-layer compression stocking (Mediven ulcer kit, Medi GmbH, Bayreuth, Germany) in different body positions and to compare the values with those obtained with another two-layer product. Interface pressure was measured on the distal medial leg in 16 legs of volunteers, with the basic layer alone and with the whole stocking kit, in the supine, sitting and standing positions for both stocking systems. The literature concerning ulcer-healing rates is reviewed. The Mediven ulcer kit produced statistically significantly higher pressure values than the other ulcer stocking, with a median resting value of 35.5 mmHg in the supine and 42.5 mmHg in the standing position. The pressure while standing comes close to values exerted by bandages. The basic layer alone applies a pressure of 20.5 mmHg. Specially designed compression stockings exerting sufficient interface pressure may be indicated in patients with small ulcers of short duration.
A Multidisciplinary Approach for Teaching Statistics and Probability
ERIC Educational Resources Information Center
Rao, C. Radhakrishna
1971-01-01
The author presents a syllabus for an introductory (first year after high school) course in statistics and probability and some methods of teaching statistical techniques. The description comes basically from the procedures used at the Indian Statistical Institute, Calcutta. (JG)
Applications of statistics to medical science (1) Fundamental concepts.
Watanabe, Hiroshi
2011-01-01
The conceptual framework of statistical tests and statistical inference is discussed, and the epidemiological background of statistics is briefly reviewed. This study is one of a series in which we survey the basics of statistics and the practical methods used in medical statistics. Arguments related to actual statistical analysis procedures will be made in subsequent papers.
ERIC Educational Resources Information Center
Ragasa, Carmelita Y.
2008-01-01
The objective of the study is to determine if there is a significant difference in the effects of the treatment and control groups on achievement as well as on attitude as measured by the posttest. A class of 38 sophomore college students in the basic statistics taught with the use of computer-assisted instruction and another class of 15 students…
Back to basics: an introduction to statistics.
Halfens, R J G; Meijers, J M M
2013-05-01
In the second in the series, Professor Ruud Halfens and Dr Judith Meijers give an overview of statistics, both descriptive and inferential. They describe the first principles of statistics, including some relevant inferential tests.
Reasoning strategies modulate gender differences in emotion processing.
Markovits, Henry; Trémolière, Bastien; Blanchette, Isabelle
2018-01-01
The dual strategy model of reasoning has proposed that people's reasoning can be understood as a combination of two different ways of processing information related to problem premises: a counterexample strategy that examines information for explicit potential counterexamples and a statistical strategy that uses associative access to generate a likelihood estimate of putative conclusions. Previous studies have examined this model in the context of basic conditional reasoning tasks. However, the information processing distinction that underlies the dual strategy model can be seen as a basic description of differences in reasoning (similar to that described by many general dual process models of reasoning). In two studies, we examine how these differences in reasoning strategy may relate to processing very different information; specifically, we focus on previously observed gender differences in processing negative emotions. Study 1 examined the intensity of emotional reactions to a film clip inducing primarily negative emotions. Study 2 examined the speed at which participants determine the emotional valence of sequences of negative images. In both studies, no gender differences were observed among participants using a counterexample strategy. Among participants using a statistical strategy, females produced significantly stronger emotional reactions than males (in Study 1) and were faster to recognize the valence of negative images than were males (in Study 2). Results show that the processing distinction underlying the dual strategy model of reasoning generalizes to the processing of emotions. Copyright © 2017 Elsevier B.V. All rights reserved.
Mehralian, Gholamhossein; Rajabzadeh Gatari, Ali; Morakabati, Mohadese; Vatanpour, Hossein
2012-01-01
The supply chain represents the critical link between the development of new products and the market in the pharmaceutical industry. Over the years, improvements made in supply chain operations have focused largely on ways to reduce cost and gain efficiencies of scale. In addition, powerful regulatory and market forces have provided new incentives for pharmaceutical firms to fundamentally rethink the way they produce and distribute products, and also to re-imagine the role of the supply chain in driving strategic growth, brand differentiation and economic value in the health continuum. The purpose of this paper is to formulate the basic factors involved in risk analysis of the pharmaceutical industry, and also to determine the effective factors involved in supplier selection and their priorities. This paper is based on the results of a literature review, acquisition of experts' opinions, statistical analysis, and the use of MADM models on data gathered from distributed questionnaires. The model consists of the following steps and components: first, the factors involved in supply chain risks are determined, and a framework is constructed based on them. According to the results of the statistical analysis and the MADM models, the risk factors are formulated. The paper determines the main components and influential factors involved in supply chain risks. Results showed that delivery risk can make an important contribution to mitigating the risk of the pharmaceutical industry.
Bayesian models: A statistical primer for ecologists
Hobbs, N. Thompson; Hooten, Mevin B.
2015-01-01
Bayesian modeling has become an indispensable tool for ecological research because it is uniquely suited to deal with complexity in a statistically coherent way. This textbook provides a comprehensive and accessible introduction to the latest Bayesian methods, in language ecologists can understand. Unlike other books on the subject, this one emphasizes the principles behind the computations, giving ecologists a big-picture understanding of how to implement this powerful statistical approach. Bayesian Models is an essential primer for non-statisticians. It begins with a definition of probability and develops a step-by-step sequence of connected ideas, including basic distribution theory, network diagrams, hierarchical models, Markov chain Monte Carlo, and inference from single and multiple models. This unique book places less emphasis on computer coding, favoring instead a concise presentation of the mathematical statistics needed to understand how and why Bayesian analysis works. It also explains how to write out properly formulated hierarchical Bayesian models and use them in computing, research papers, and proposals. This primer enables ecologists to understand the statistical principles behind Bayesian modeling and apply them to research, teaching, policy, and management. It presents the mathematical and statistical foundations of Bayesian modeling in language accessible to non-statisticians; covers basic distribution theory, network diagrams, hierarchical models, Markov chain Monte Carlo, and more; deemphasizes computer coding in favor of basic principles; and explains how to write out properly factored statistical expressions representing Bayesian models.
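As a taste of the Markov chain Monte Carlo material such a primer covers, here is a minimal random-walk Metropolis sampler for a binomial success probability under a flat prior. The data and tuning values are invented; with 7 successes in 20 trials, the exact posterior is Beta(8, 14):

```python
import numpy as np

rng = np.random.default_rng(7)

# Data: 7 successes in 20 trials; flat prior on the success probability.
k, n = 7, 20

def log_post(theta):
    """Log posterior up to a constant: binomial likelihood, flat prior."""
    if not 0 < theta < 1:
        return -np.inf
    return k * np.log(theta) + (n - k) * np.log(1 - theta)

# Random-walk Metropolis: propose a jitter, accept with probability
# min(1, posterior ratio); otherwise stay put.
theta, samples = 0.5, []
for _ in range(20000):
    prop = theta + rng.normal(0, 0.1)
    if np.log(rng.uniform()) < log_post(prop) - log_post(theta):
        theta = prop
    samples.append(theta)
post = np.array(samples[2000:])  # drop burn-in
```

The sample mean of `post` should approximate the exact posterior mean 8/22, which is the kind of check the book's emphasis on principles over coding encourages.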
Probability, statistics, and computational science.
Beerenwinkel, Niko; Siebourg, Juliane
2012-01-01
In this chapter, we review basic concepts from probability theory and computational statistics that are fundamental to evolutionary genomics. We provide a very basic introduction to statistical modeling and discuss general principles, including maximum likelihood and Bayesian inference. Markov chains, hidden Markov models, and Bayesian network models are introduced in more detail as they occur frequently and in many variations in genomics applications. In particular, we discuss efficient inference algorithms and methods for learning these models from partially observed data. Several simple examples are given throughout the text, some of which point to models that are discussed in more detail in subsequent chapters.
Methods of making metal oxide nanostructures and methods of controlling morphology of same
Wong, Stanislaus S; Hongjun, Zhou
2012-11-27
The present invention includes a method of producing a crystalline metal oxide nanostructure. The method comprises providing a metal salt solution and providing a basic solution; placing a porous membrane between the metal salt solution and the basic solution, wherein metal cations of the metal salt solution and hydroxide ions of the basic solution react, thereby producing a crystalline metal oxide nanostructure.
Fish: A New Computer Program for Friendly Introductory Statistics Help
ERIC Educational Resources Information Center
Brooks, Gordon P.; Raffle, Holly
2005-01-01
All introductory statistics students must master certain basic descriptive statistics, including means, standard deviations and correlations. Students must also gain insight into such complex concepts as the central limit theorem and standard error. This article introduces and describes the Friendly Introductory Statistics Help (FISH) computer…
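The central limit theorem and standard error ideas mentioned here are easy to demonstrate by simulation; the exponential population below is an arbitrary choice (any non-normal population with finite variance would do):

```python
import numpy as np

rng = np.random.default_rng(5)

# Draw many samples from a decidedly non-normal population
# (exponential with mean 1 and standard deviation 1).
population_sd, n, trials = 1.0, 50, 10000
sample_means = rng.exponential(1.0, (trials, n)).mean(axis=1)

# The central limit theorem says the sample means are approximately
# normal with standard deviation sigma / sqrt(n): the standard error.
theoretical_se = population_sd / np.sqrt(n)
empirical_se = sample_means.std()
```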
Reflexion on linear regression trip production modelling method for ensuring good model quality
NASA Astrophysics Data System (ADS)
Suprayitno, Hitapriya; Ratnasari, Vita
2017-11-01
Transport Modelling is important. For certain cases, the conventional model still has to be used, in which having a good trip production model is essential. A good model can only be obtained from a good sample. Two of the basic principles of good sampling are having a sample capable of representing the population characteristics and capable of producing an acceptable error at a certain confidence level. It seems that these principles are not yet well understood and used in trip production modelling. Therefore, it is necessary to investigate the trip production modelling practice in Indonesia and to formulate a better modelling method for ensuring model quality. The results of this research are as follows. Statistics provides a method to calculate the span of the prediction value at a certain confidence level for linear regression, called the Confidence Interval of Predicted Value. Common modelling practice uses R2 as the principal quality measure, while sampling practice varies and does not always conform to the sampling principles. An experiment indicates that a small sample can already give an excellent R2 value and that sample composition can significantly change the model. Hence, a good R2 value does not always mean good model quality. These findings lead to three basic ideas for ensuring good model quality: reformulating the quality measure, the calculation procedure, and the sampling method. The quality measure is defined as having a good R2 value and a good Confidence Interval of Predicted Value. The calculation procedure must incorporate statistical calculation methods and the appropriate statistical tests. A good sampling method must incorporate random, well-distributed stratified sampling with a certain minimum number of samples. These three ideas need to be further developed and tested.
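The quality measure proposed in the abstract, the Confidence Interval of Predicted Value, can be computed with standard regression formulas. The trip-production data below are invented, and the t multiplier is hard-coded for 28 degrees of freedom as an assumption:

```python
import numpy as np

rng = np.random.default_rng(6)

# Hypothetical trip-production data: trips per day vs. household size.
x = rng.uniform(1, 6, 30)
y = 2.0 + 1.5 * x + rng.normal(0, 1.0, 30)

# Ordinary least squares fit.
X = np.column_stack([np.ones_like(x), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
n, p = X.shape
resid = y - X @ beta
s2 = resid @ resid / (n - p)  # residual variance estimate

def prediction_interval(x0, t=2.05):  # t ~ t_{0.975, 28}
    """Confidence interval for a predicted value at x0."""
    v = np.array([1.0, x0])
    yhat = v @ beta
    # Variance of a *new* observation: fit variance plus noise variance.
    var = s2 * (1.0 + v @ np.linalg.inv(X.T @ X) @ v)
    half = t * np.sqrt(var)
    return yhat - half, yhat + half

lo, hi = prediction_interval(3.0)
```

A model can have a high R2 yet a uselessly wide interval here, which is the paper's point about R2 alone being an insufficient quality measure.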
Microcrystallography using single-bounce monocapillary optics
Gillilan, R. E.; Cook, M. J.; Cornaby, S. W.; Bilderback, D. H.
2010-01-01
X-ray microbeams have become increasingly valuable in protein crystallography. A number of synchrotron beamlines worldwide have adapted to handling smaller and more challenging samples by providing a combination of high-precision sample-positioning hardware, special visible-light optics for sample visualization, and small-diameter X-ray beams with low background scatter. Most commonly, X-ray microbeams with diameters ranging from 50 µm to 1 µm are produced by Kirkpatrick and Baez mirrors in combination with defining apertures and scatter guards. A simple alternative based on single-bounce glass monocapillary X-ray optics is presented. The basic capillary design considerations are discussed and a practical and robust implementation that capitalizes on existing beamline hardware is presented. A design for mounting the capillary is presented which eliminates parasitic scattering and reduces deformations of the optic to a degree suitable for use on next-generation X-ray sources. Comparison of diffraction data statistics for microcrystals using microbeam and conventional aperture-collimated beam shows that capillary-focused beam can deliver significant improvement. Statistics also confirm that the annular beam profile produced by the capillary optic does not impact data quality in an observable way. Examples are given of new structures recently solved using this technology. Single-bounce monocapillary optics can offer an attractive alternative for retrofitting existing beamlines for microcrystallography. PMID:20157276
CADDIS Volume 4. Data Analysis: Basic Principles & Issues
Use of inferential statistics in causal analysis, introduction to data independence and autocorrelation, methods for identifying and controlling for confounding variables, and references for the Basic Principles section of Data Analysis.
Are We Able to Pass the Mission of Statistics to Students?
ERIC Educational Resources Information Center
Hindls, Richard; Hronová, Stanislava
2015-01-01
The article illustrates our long-term experience in teaching statistics to non-statisticians, especially students of economics and the humanities. The article focuses on some problems of the basic course that can weaken interest in statistics or lead to the misuse of statistical methods.
Lee, Seul Gi; Shin, Yun Hee
2016-04-01
This study was done to verify the effects of self-directed feedback practice using smartphone videos on nursing students' basic nursing skills, confidence in performance and learning satisfaction. An experimental design with a post-test-only control group was used. Twenty-nine students were assigned to the experimental group and 29 to the control group. The experimental treatment consisted of exchanging feedback on deficiencies through smartphone-recorded videos of the nursing practice process, taken by peers during self-directed practice. Basic nursing skills scores were higher for all items in the experimental group compared to the control group, and the differences were statistically significant ["Measuring vital signs" (t=-2.10, p=.039); "Wearing protective equipment when entering and exiting the quarantine room and the management of waste materials" (t=-4.74, p<.001); "Gavage tube feeding" (t=-2.70, p=.009)]. Confidence in performance was higher in the experimental group than in the control group, but the difference was not statistically significant. However, after the complete practice, there was a statistically significant difference in overall performance confidence (t=-3.07, p=.003). Learning satisfaction was higher in the experimental group than in the control group, but the difference was not statistically significant (t=-1.67, p=.100). Results of this study indicate that self-directed feedback practice using smartphone videos can improve basic nursing skills. The significance is that it can help nursing students gain confidence in their nursing skills through improvement of basic nursing skills and performance of quality care, thus providing patients with safer care.
Provision of Pre-Primary Education as a Basic Right in Tanzania: Reflections from Policy Documents
ERIC Educational Resources Information Center
Mtahabwa, Lyabwene
2010-01-01
This study sought to assess provision of pre-primary education in Tanzania as a basic right through analyses of relevant policy documents. Documents which were published over the past decade were considered, including educational policies, action plans, national papers, the "Basic Education Statistics in Tanzania" documents, strategy…
A crash course on data analysis in asteroseismology
NASA Astrophysics Data System (ADS)
Appourchaux, Thierry
2014-02-01
In this course, I try to provide a few basics required for performing data analysis in asteroseismology. First, I address how one can properly treat time series: the sampling, the filtering effect, the use of the Fourier transform, and the associated statistics. Second, I address how one can apply statistics for decision making and for parameter estimation, either in a frequentist or a Bayesian framework. Last, I review how these basic principles have been applied (or not) in asteroseismology.
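The time-series basics listed (sampling, Fourier transform, and the associated statistics) can be illustrated with a toy periodogram; the cadence and oscillation frequency below are arbitrary choices, not values from the course:

```python
import numpy as np

rng = np.random.default_rng(8)

# Evenly sampled time series: one oscillation mode plus white noise.
dt, n = 60.0, 4096           # 60 s cadence (an illustrative choice)
t = np.arange(n) * dt
nu0 = 3.0e-3                 # true mode frequency, Hz
y = np.sin(2 * np.pi * nu0 * t) + rng.normal(0, 1.0, n)

# Power spectrum via the FFT. For pure white noise, each power bin
# follows a chi-square distribution with 2 degrees of freedom, the
# statistic underlying peak-detection decisions in asteroseismology.
freqs = np.fft.rfftfreq(n, dt)
power = np.abs(np.fft.rfft(y)) ** 2 / n

peak_freq = freqs[np.argmax(power[1:]) + 1]  # skip the zero-frequency bin
```

The frequency resolution is 1/(n*dt), so the detected peak should land within one bin of the injected frequency.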
On Ruch's Principle of Decreasing Mixing Distance in classical statistical physics
NASA Astrophysics Data System (ADS)
Busch, Paul; Quadt, Ralf
1990-10-01
Ruch's Principle of Decreasing Mixing Distance is reviewed as a statistical physical principle, and its basic support and geometric interpretation, the Ruch-Schranner-Seligman theorem, is generalized to be applicable to a large representative class of classical statistical systems.
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
2015-01-30
Ethanol is a widely-used, domestically-produced renewable fuel made from corn and other plant materials. More than 96% of gasoline sold in the United States contains ethanol. Learn more about this alternative fuel in the Ethanol Basics Fact Sheet, produced by the U.S. Department of Energy's Clean Cities program.
Center for Prostate Disease Research
Basic Aerospace Education Library
ERIC Educational Resources Information Center
Journal of Aerospace Education, 1975
1975-01-01
Lists the most significant resource items on aerospace education which are presently available. Includes source books, bibliographies, directories, encyclopedias, dictionaries, audiovisuals, curriculum/planning guides, aerospace statistics, aerospace education statistics and newsletters. (BR)
Multiple-solution problems in a statistics classroom: an example
NASA Astrophysics Data System (ADS)
Chu, Chi Wing; Chan, Kevin L. T.; Chan, Wai-Sum; Kwong, Koon-Shing
2017-11-01
The mathematics education literature shows that encouraging students to develop multiple solutions for given problems has a positive effect on students' understanding and creativity. In this paper, we present an example of multiple-solution problems in statistics involving a set of non-traditional dice. In particular, we consider the exact probability mass function for the sum of face values. Four different ways of solving the problem are discussed. The solutions span various basic concepts in different mathematical disciplines (sample space in probability theory, the probability generating function in statistics, integer partition in basic combinatorics, and the individual risk model in actuarial science) and thus promote upper undergraduate students' awareness of knowledge connections between their courses. All solutions of the example are implemented using the R statistical software package.
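One of the solution routes named above, the probability generating function, amounts to multiplying one polynomial per die and reading off coefficients. A minimal sketch (in Python rather than the paper's R), computing the exact pmf of the sum by enumeration; the Sicherman-style face sets are a classic illustrative example of non-traditional dice and are not necessarily the dice used in the paper.

```python
# Exact pmf of the sum of independent dice. Enumerating the sample
# space is equivalent to multiplying the dice's generating functions.
from collections import Counter
from fractions import Fraction
from itertools import product

def pmf_of_sum(*dice):
    """Exact pmf of the sum of independent dice given as face lists."""
    counts = Counter(sum(faces) for faces in product(*dice))
    total = sum(counts.values())
    return {s: Fraction(c, total) for s, c in sorted(counts.items())}

standard = [1, 2, 3, 4, 5, 6]
sicherman_a = [1, 2, 2, 3, 3, 4]   # classic non-traditional pair
sicherman_b = [1, 3, 4, 5, 6, 8]

# The Sicherman pair reproduces the standard pair's sum distribution.
assert pmf_of_sum(standard, standard) == pmf_of_sum(sicherman_a, sicherman_b)
```

Using exact fractions rather than floats keeps the result a true probability mass function, with coefficients matching the generating-function expansion term by term.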
Validation and Improvement of SRTM Performance over Rugged Terrain
NASA Technical Reports Server (NTRS)
Zebker, Howard A.
2004-01-01
We have previously reported work related to basic technique development in phase unwrapping and generation of digital elevation models (DEMs). In the final year of this work we applied these techniques to the improvement of DEMs produced by SRTM. In particular, we developed a rigorous mathematical algorithm and means to fill in missing data over rough terrain from other data sets. We illustrate this method by using a higher resolution, but globally less accurate, DEM produced by the TOPSAR airborne instrument over the Galapagos Islands to augment the SRTM data set in this area. We combine this data set with SRTM so that each set fills in holes left by the other imaging system. The infilling is done by first interpolating each data set using a prediction error filter that reproduces, within the interpolated region, the same statistical characterization exhibited by the entire data set. After this procedure is applied to each data set, the two are combined on a point-by-point basis with weights that reflect the accuracy of each data point in its original image. In areas that are better covered by SRTM, TOPSAR data are weighted down but still retain TOPSAR statistics; the reverse is true for regions better covered by TOPSAR. The resulting DEM passes statistical tests and appears plausible to the eye, but as this DEM is the best available for the region we cannot fully verify its accuracy. Spot checks with GPS points show that locally the technique results in a more comprehensive and accurate map than either data set alone.
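The point-by-point weighted combination described above can be sketched as follows. This is a minimal illustration with a simple inverse-variance weight; the actual SRTM/TOPSAR weighting scheme and the prediction-error-filter interpolation are not reproduced here.

```python
# Merge two height grids point by point, weighting each by its local
# accuracy (inverse variance). Grids here are flat lists of heights;
# the variances are illustrative per-point accuracy estimates.
def merge_dems(dem_a, dem_b, var_a, var_b):
    """Combine two co-registered height grids with inverse-variance weights."""
    merged = []
    for ha, hb, va, vb in zip(dem_a, dem_b, var_a, var_b):
        wa, wb = 1.0 / va, 1.0 / vb
        merged.append((wa * ha + wb * hb) / (wa + wb))
    return merged

# Where grid A is more accurate (small variance) its heights dominate,
# and vice versa, mirroring the SRTM/TOPSAR weighting described above.
heights = merge_dems([100.0, 200.0], [110.0, 190.0], [1.0, 100.0], [100.0, 1.0])
```

For uncorrelated errors, the inverse-variance weight is the choice that minimizes the variance of the combined estimate at each point.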
The Statistical Power of Planned Comparisons.
ERIC Educational Resources Information Center
Benton, Roberta L.
Basic principles underlying statistical power are examined; and issues pertaining to effect size, sample size, error variance, and significance level are highlighted via the use of specific hypothetical examples. Analysis of variance (ANOVA) and related methods remain popular, although other procedures sometimes have more statistical power against…
Contextualizing Embodied Resources in Global Food Trade
NASA Astrophysics Data System (ADS)
MacDonald, G. K.; Brauman, K. A.; Sun, S.; West, P. C.; Carlson, K. M.; Cassidy, E. S.; Gerber, J. S.; Ray, D. K.
2014-12-01
Trade in agricultural commodities has created increasingly complex linkages between resource use and food supplies across national borders. Understanding the degree to which food production and consumption relies on trade is vital to understanding how to sustainably meet growing food demands across scales. We use detailed bilateral trade statistics and data on agricultural management to examine the land use and water consumption embodied in agricultural trade, which we relate to basic nutritional indicators to show how trade contributes to food availability worldwide. Agricultural trade carries enough calories to provide >1.7 billion people a basic diet each year. We identify key commodities and producer-consumer relationships that disproportionately contribute to embodied resource use and flows of food nutrition at the global scale. For example, just 15 disproportionately large soybean trades comprised ~10% of the total harvested area embodied in export production. We conclude by framing these results in terms of the fraction of each country's food production and consumption that is linked to international trade. These findings help to characterize how countries allocate resources to domestic versus foreign food demand.
29 CFR 1904.42 - Requests from the Bureau of Labor Statistics for data.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 29 Labor 5 2010-07-01 2010-07-01 false Requests from the Bureau of Labor Statistics for data. 1904... Statistics for data. (a) Basic requirement. If you receive a Survey of Occupational Injuries and Illnesses Form from the Bureau of Labor Statistics (BLS), or a BLS designee, you must promptly complete the form...
78 FR 34101 - Agency Information Collection Activities: Proposed Collection; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-06
... and basic descriptive statistics on the quantity and type of consumer-reported patient safety events... conduct correlations, cross tabulations of responses and other statistical analysis. Estimated Annual...
ERIC Educational Resources Information Center
Williams, Immanuel James; Williams, Kelley Kim
2016-01-01
Understanding summary statistics and graphical techniques is a building block to comprehending concepts beyond basic statistics. It is known that motivated students perform better in school. Using examples that students find engaging allows them to understand the concepts at a deeper level.
Etching of moldavites under natural conditions
NASA Technical Reports Server (NTRS)
Knobloch, V.; Knoblochova, Z.; Urbanec, Z.
1983-01-01
The hypothesis is advanced that a part of the lechatelierites which originated by etching from a basic moldavite mass was broken off after deposition of the moldavite in the sedimentation layer. Those found close to the original moldavite were measured for statistical averaging of length. The average length of lechatelierite fibers per cubic mm of moldavite mass volume was determined by measurement under a microscope in toluene. The data were used to calculate the depth of the moldavite layer that had to be etched to produce the corresponding amount of lechatelierite fragments. The calculations from five "fields" of the moldavite surface, where layers of fixed lechatelierite fragments were preserved, produced values of 2.0, 3.1, 3.5, 3.9 and 4.5. Due to inadvertent loss of some fragments, the determined values are somewhat lower than those found in references. The difference may be explained by the fact that the depth of the layer is only that caused by etching after moldavite deposition.
Agundu, Prince Umor C
2003-01-01
Public health dispensaries in Nigeria in recent times have demonstrated the poise to boost corporate productivity in the new millennium and to drive the nation closer to concretising the lofty goal of health-for-all. This is very pronounced considering the face-lift given to the physical environment, the increase in recruitment and development of professionals, and the upward review of financial subventions. However, there is little or no emphasis on basic statistical appreciation/application, which enhances the decision-making ability of corporate executives. This study used responses from 120 senior public health officials in Nigeria and analyzed them with the chi-square statistical technique. The results established low statistical aptitude, inadequate statistical training programmes, and little or no emphasis on statistical literacy compared to computer literacy, amongst others. Consequently, it was recommended that these lapses be promptly addressed to enhance executive performance in the establishments. Basic statistical data presentation typologies have been articulated in this study to serve as first-aid instructions to the target group, as they represent the contributions of eminent scholars in this area of intellectualism.
Complexity transitions in global algorithms for sparse linear systems over finite fields
NASA Astrophysics Data System (ADS)
Braunstein, A.; Leone, M.; Ricci-Tersenghi, F.; Zecchina, R.
2002-09-01
We study the computational complexity of a very basic problem, namely that of finding solutions to a very large set of random linear equations in a finite Galois field modulo q. Using tools from statistical mechanics we are able to identify phase transitions in the structure of the solution space and to connect them to the changes in the performance of a global algorithm, namely Gaussian elimination. Crossing phase boundaries produces a dramatic increase in memory and CPU requirements necessary for the algorithms. In turn, this causes the saturation of the upper bounds for the running time. We illustrate the results on the specific problem of integer factorization, which is of central interest for deciphering messages encrypted with the RSA cryptosystem.
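The global algorithm studied above, Gaussian elimination over a finite field, can be sketched in a few lines. A minimal dense-matrix illustration for prime q (the paper's systems are sparse and random, and the field need not be prime in general); variable and function names are our own.

```python
# Gaussian elimination for A x = b over GF(q), q prime. Dense
# representation for clarity; sparse systems are where the memory and
# CPU blow-up discussed in the abstract becomes dramatic, because
# elimination fills in previously zero entries.
def solve_mod_q(A, b, q):
    """Solve A x = b (mod q); returns one solution or None if no pivot."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]  # augmented matrix
    for col in range(n):
        pivot = next((r for r in range(col, n) if M[r][col] % q), None)
        if pivot is None:
            return None  # singular or underdetermined in this sketch
        M[col], M[pivot] = M[pivot], M[col]
        inv = pow(M[col][col], q - 2, q)  # inverse via Fermat, q prime
        M[col] = [v * inv % q for v in M[col]]
        for r in range(n):
            if r != col and M[r][col]:
                f = M[r][col]
                M[r] = [(v - f * w) % q for v, w in zip(M[r], M[col])]
    return [M[i][n] for i in range(n)]

# 2x + 3y = 1 and x + y = 4 over GF(5).
x = solve_mod_q([[2, 3], [1, 1]], [1, 4], 5)
```

Full elimination is shown here; in practice, partial pivoting and sparse data structures change the constants but not the fill-in phenomenon the abstract connects to the phase transitions.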
ERIC Educational Resources Information Center
Shihua, Peng; Rihui, Tan
2009-01-01
Employing statistical analysis, this study has made a preliminary exploration of promoting the equitable development of basic education in underdeveloped counties through the case study of Cili county. The unequally developed basic education in the county has been made clear, the reasons for the inequitable education have been analyzed, and,…
The space of ultrametric phylogenetic trees.
Gavryushkin, Alex; Drummond, Alexei J
2016-08-21
The reliability of a phylogenetic inference method from genomic sequence data is ensured by its statistical consistency. Bayesian inference methods produce a sample of phylogenetic trees from the posterior distribution given sequence data. Hence the question of statistical consistency of such methods is equivalent to the consistency of the summary of the sample. More generally, statistical consistency is ensured by the tree space used to analyse the sample. In this paper, we consider two standard parameterisations of phylogenetic time-trees used in evolutionary models: inter-coalescent interval lengths and absolute times of divergence events. For each of these parameterisations we introduce a natural metric space on ultrametric phylogenetic trees. We compare the introduced spaces with existing models of tree space, formulate several formal requirements that a metric space on phylogenetic trees must possess in order to be a satisfactory space for statistical analysis, and justify them. We show that only a few known constructions of the space of phylogenetic trees satisfy these requirements. However, our results suggest that these basic requirements are not enough to distinguish between the two metric spaces we introduce, and that the choice between metric spaces requires additional properties to be considered. In particular, the summary tree minimising the squared distance to the trees in the sample might differ between parameterisations. This suggests that further fundamental insight is needed into the problem of statistical consistency of phylogenetic inference methods. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
Multilaboratory evaluation of methods for detecting enteric viruses in soils.
Hurst, C J; Schaub, S A; Sobsey, M D; Farrah, S R; Gerba, C P; Rose, J B; Goyal, S M; Larkin, E P; Sullivan, R; Tierney, J T
1991-01-01
Two candidate methods for the recovery and detection of viruses in soil were subjected to round robin comparative testing by members of the American Society for Testing and Materials D19:24:04:04 Subcommittee Task Group. Selection of the methods, designated "Berg" and "Goyal," was based on results of an initial screening which indicated that both met basic criteria considered essential by the task group. Both methods utilized beef extract solutions to achieve desorption and recovery of viruses from representative soils: a fine sand soil, an organic muck soil, a sandy loam soil, and a clay loam soil. One of the two methods, Goyal, also used a secondary concentration of resulting soil eluants via low-pH organic flocculation to achieve a smaller final assay volume. Evaluation of the two methods was simultaneously performed in replicate by nine different laboratories. Each of the produced samples was divided into portions, and these were respectively subjected to quantitative viral plaque assay by both the individual, termed independent, laboratory which had done the soil processing and a single common reference laboratory, using a single cell line and passage level. The Berg method seemed to produce slightly higher virus recovery values; however, the differences in virus assay titers for samples produced by the two methods were not statistically significant (P less than or equal to 0.05) for any one of the four soils. Despite this lack of a method effect, there was a statistically significant laboratory effect exhibited by assay titers from the independent versus reference laboratories for two of the soils, sandy loam and clay loam. PMID:1849712
Leal-Soto, Francisco; Carmona-Halty, Marcos; Ferrer-Urbina, Rodrigo
2016-01-01
Background: Traumatic experiences, such as natural disasters, produce multiple and serious impacts on people. Despite the traditional focus on negative consequences, in many cases there are also positive consequences, such as posttraumatic growth. Tedeschi and Calhoun proposed a model of posttraumatic growth that emphasizes the role of rumination after basic beliefs break down due to the occurrence of a traumatic experience. Method: A total of 238 volunteers affected by two major earthquakes and tsunami alerts in northern Chile on April 1 and 2, 2014, responded to an online survey measuring subjective severity, basic beliefs change, social sharing of emotion, rumination, posttraumatic stress, and posttraumatic growth. Results: Path analyses reveal that posttraumatic stress proceeds through a negative change in basic beliefs, intrusive rumination, and deliberated rumination, whereas posttraumatic growth is achieved directly from a positive change in basic beliefs and deliberated rumination. Discussion: The model is consistent with the empirical model obtained for Chilean people affected by the earthquake and tsunami of 27 February 2010, but it differs slightly, in a way more consistent with Tedeschi and Calhoun's theoretical model. Both models highlight the role of deliberated rumination in posttraumatic growth and the failure to progress from intrusive to deliberated rumination in posttraumatic stress, but the proposed model is more parsimonious and treats subjective severity as an antecedent to basic belief changes. These conclusions must be considered in light of the limitations imposed by the cross-sectional design and the correlational nature of the statistical analysis.
Highlights of the article:
The role of subjective severity, change of basic beliefs, social sharing of emotion, and rumination on posttraumatic stress and growth was modeled from responses of people affected by the April 1-2, 2014, northern Chilean earthquakes.
Posttraumatic stress goes through negative changes in basic beliefs, intrusive rumination, and deliberated rumination.
Posttraumatic growth is achieved from positive changes in basic beliefs and deliberated rumination.
Deliberated rumination and moving from intrusive to deliberated rumination appear as cornerstones in posttraumatic processing.
PMID:27900935
Educating the Educator: U.S. Government Statistical Sources for Geographic Research and Teaching.
ERIC Educational Resources Information Center
Fryman, James F.; Wilkinson, Patrick J.
Appropriate for college geography students and researchers, this paper briefly introduces basic federal statistical publications and corresponding finding aids. General references include "Statistical Abstract of the United States" and three complementary publications: "County and City Data Book," "State and Metropolitan Area Data Book," and…
Statistical Cost Estimation in Higher Education: Some Alternatives.
ERIC Educational Resources Information Center
Brinkman, Paul T.; Niwa, Shelley
Recent developments in econometrics that are relevant to the task of estimating costs in higher education are reviewed. The relative effectiveness of alternative statistical procedures for estimating costs are also tested. Statistical cost estimation involves three basic parts: a model, a data set, and an estimation procedure. Actual data are used…
ERIC Educational Resources Information Center
Norris, John M.
2015-01-01
Traditions of statistical significance testing in second language (L2) quantitative research are strongly entrenched in how researchers design studies, select analyses, and interpret results. However, statistical significance tests using "p" values are commonly misinterpreted by researchers, reviewers, readers, and others, leading to…
Ethical Statistics and Statistical Ethics: Making an Interdisciplinary Module
ERIC Educational Resources Information Center
Lesser, Lawrence M.; Nordenhaug, Erik
2004-01-01
This article describes an innovative curriculum module the first author created on the two-way exchange between statistics and applied ethics. The module, having no particular mathematical prerequisites beyond high school algebra, is part of an undergraduate interdisciplinary ethics course which begins with a 3-week introduction to basic applied…
ERIC Educational Resources Information Center
North, Delia; Gal, Iddo; Zewotir, Temesgen
2014-01-01
This paper aims to contribute to the emerging literature on capacity-building in statistics education by examining issues pertaining to the readiness of teachers in a developing country to teach basic statistical topics. The paper reflects on challenges and barriers to building statistics capacity at grass-roots level in a developing country,…
Impact of basic angle variations on the parallax zero point for a scanning astrometric satellite
NASA Astrophysics Data System (ADS)
Butkevich, Alexey G.; Klioner, Sergei A.; Lindegren, Lennart; Hobbs, David; van Leeuwen, Floor
2017-07-01
Context. Determination of absolute parallaxes by means of a scanning astrometric satellite such as Hipparcos or Gaia relies on the short-term stability of the so-called basic angle between the two viewing directions. Uncalibrated variations of the basic angle may produce systematic errors in the computed parallaxes. Aims: We examine the coupling between a global parallax shift and specific variations of the basic angle, namely those related to the satellite attitude with respect to the Sun. Methods: The changes in observables produced by small perturbations of the basic angle, attitude, and parallaxes were calculated analytically. We then looked for a combination of perturbations that had no net effect on the observables. Results: In the approximation of infinitely small fields of view, it is shown that certain perturbations of the basic angle are observationally indistinguishable from a global shift of the parallaxes. If these kinds of perturbations exist, they cannot be calibrated from the astrometric observations but will produce a global parallax bias. Numerical simulations of the astrometric solution, using both direct and iterative methods, confirm this theoretical result. For a given amplitude of the basic angle perturbation, the parallax bias is smaller for a larger basic angle and a larger solar aspect angle. In both these respects Gaia has a more favourable geometry than Hipparcos. In the case of Gaia, internal metrology is used to monitor basic angle variations. Additionally, Gaia has the advantage of detecting numerous quasars, which can be used to verify the parallax zero point.
County-by-County Financial and Staffing I-M-P-A-C-T. FY 1994-95 Basic Education Program.
ERIC Educational Resources Information Center
North Carolina State Dept. of Public Instruction, Raleigh.
This publication provides the basic statistics needed to illustrate the impact of North Carolina's Basic Education Program (BEP), an educational reform effort begun in 1985. Over 85% of the positions in the BEP are directly related to teaching and student-related activities. The new BEP programs result in smaller class sizes in kindergartens and…
NASA Astrophysics Data System (ADS)
Hartmann, Alexander K.; Weigt, Martin
2005-10-01
A concise, comprehensive introduction to the topic of statistical physics of combinatorial optimization, bringing together theoretical concepts and algorithms from computer science with analytical methods from physics. The result bridges the gap between statistical physics and combinatorial optimization, investigating problems taken from theoretical computing, such as the vertex-cover problem, with the concepts and methods of theoretical physics. The authors cover rapid developments and analytical methods that are both extremely complex and spread by word-of-mouth, providing all the necessary basics in required detail. Throughout, the algorithms are shown with examples and calculations, while the proofs are given in a way suitable for graduate students, post-docs, and researchers. Ideal for newcomers to this young, multidisciplinary field.
ERIC Educational Resources Information Center
Center for Education Statistics (ED/OERI), Washington, DC.
The Financial Statistics machine-readable data file (MRDF) is a subfile of the larger Higher Education General Information Survey (HEGIS). It contains basic financial statistics for over 3,000 institutions of higher education in the United States and its territories. The data are arranged sequentially by institution, with institutional…
The Greyhound Strike: Using a Labor Dispute to Teach Descriptive Statistics.
ERIC Educational Resources Information Center
Shatz, Mark A.
1985-01-01
A simulation exercise of a labor-management dispute is used to teach psychology students some of the basics of descriptive statistics. Using comparable data sets generated by the instructor, students work in small groups to develop a statistical presentation that supports their particular position in the dispute. (Author/RM)
Tzonev, Svilen
2018-01-01
Current commercially available digital PCR (dPCR) systems and assays are capable of detecting individual target molecules with considerable reliability. As tests are developed and validated for use on clinical samples, the need to understand and develop robust statistical analysis routines increases. This chapter covers the fundamental processes and limitations of detecting and reporting on single molecule detection. We cover the basics of quantification of targets and sources of imprecision. We describe the basic test concepts: sensitivity, specificity, limit of blank, limit of detection, and limit of quantification in the context of dPCR. We provide basic guidelines how to determine those, how to choose and interpret the operating point, and what factors may influence overall test performance in practice.
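The quantification fundamentals the chapter covers rest on Poisson statistics: targets distribute randomly over partitions, so the mean copies per partition follows from the fraction of positive partitions alone. A minimal sketch; the partition counts and volume below are illustrative, not tied to any specific dPCR instrument or assay.

```python
# Core dPCR quantification step: Poisson correction for partitions
# that contain more than one target molecule.
from math import log

def copies_per_partition(n_positive, n_total):
    """Poisson estimate of mean target copies per partition (lambda)."""
    p = n_positive / n_total
    if p >= 1.0:
        raise ValueError("all partitions positive: concentration off-scale")
    # P(partition negative) = exp(-lambda), hence lambda = -ln(1 - p).
    return -log(1.0 - p)

def concentration(n_positive, n_total, partition_volume_ul):
    """Estimated copies per microliter of partitioned reaction volume."""
    return copies_per_partition(n_positive, n_total) / partition_volume_ul

lam = copies_per_partition(5000, 20000)  # 25% positive partitions
```

Note that lam exceeds the raw positive fraction of 0.25, since some positive partitions hold multiple copies; this correction is also what drives the upper limit of quantification as p approaches 1.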
Nurses' foot care activities in home health care.
Stolt, Minna; Suhonen, Riitta; Puukka, Pauli; Viitanen, Matti; Voutilainen, Päivi; Leino-Kilpi, Helena
2013-01-01
This study described the basic foot care activities performed by nurses and the factors associated with these in the home care of older people. Data were collected from nurses (n=322) working in nine public home care agencies in Finland using the Nurses' Foot Care Activities Questionnaire (NFAQ). Data were analyzed statistically using descriptive statistics and multivariate linear models. Although some of the basic foot care activities nurses reported using were outdated, the majority of foot care activities were consistent with recommendations in the foot care literature. Longer working experience, referring patients with foot problems to a podiatrist and physiotherapist, and patient education in wart and nail care were associated with a high score for adequate foot care activities. Continuing education should focus on updating basic foot care activities and increasing the use of evidence-based foot care methods. Also, geriatric nursing research should focus on intervention research to improve the use of evidence-based basic foot care activities. Copyright © 2013 Mosby, Inc. All rights reserved.
α -induced reactions on 115In: Cross section measurements and statistical model analysis
NASA Astrophysics Data System (ADS)
Kiss, G. G.; Szücs, T.; Mohr, P.; Török, Zs.; Huszánk, R.; Gyürky, Gy.; Fülöp, Zs.
2018-05-01
Background: α-nucleus optical potentials are basic ingredients of statistical model calculations used in nucleosynthesis simulations. While the nucleon+nucleus optical potential is fairly well known, for the α+nucleus optical potential several different parameter sets exist, and large deviations, sometimes reaching an order of magnitude, are found between the cross section predictions calculated using different parameter sets. Purpose: A measurement of the radiative α-capture and the α-induced reaction cross sections on the nucleus 115In at low energies allows a stringent test of statistical model predictions. Since experimental data are scarce in this mass region, this measurement can be an important input to test the global applicability of α+nucleus optical model potentials and further ingredients of the statistical model. Methods: The reaction cross sections were measured by means of the activation method. The produced activities were determined by off-line detection of the γ rays and characteristic x rays emitted during the electron capture decay of the produced Sb isotopes. The 115In(α,γ)119Sb and 115In(α,n)118mSb reaction cross sections were measured between E_c.m. = 8.83 and 15.58 MeV, and the 115In(α,n)118gSb reaction was studied between E_c.m. = 11.10 and 15.58 MeV. The theoretical analysis was performed within the statistical model. Results: The simultaneous measurement of the (α,γ) and (α,n) cross sections allowed us to determine a best-fit combination of all parameters for the statistical model. The α+nucleus optical potential is identified as the most important input for the statistical model. The best fit is obtained for the new Atomki-V1 potential, and good reproduction of the experimental data is also achieved for the first version of the Demetriou potentials and the simple McFadden-Satchler potential. The nucleon optical potential, the γ-ray strength function, and the level density parametrization are also constrained by the data, although there is no unique best-fit combination. Conclusions: The best-fit calculations allow us to extrapolate the low-energy (α,γ) cross section of 115In to the astrophysical Gamow window with reasonable uncertainties. However, still further improvements of the α-nucleus potential are required for a global description of elastic (α,α) scattering and α-induced reactions in a wide range of masses and energies.
An Intercomparison of the Dynamical Cores of Global Atmospheric Circulation Models for Mars
NASA Technical Reports Server (NTRS)
Hollingsworth, Jeffery L.; Bridger, Alison F. C.; Haberle, Robert M.
1998-01-01
This is a Final Report for a Joint Research Interchange (JRI) between NASA Ames Research Center and San Jose State University, Department of Meteorology. The focus of this JRI has been to evaluate the dynamical 'cores' of two global atmospheric circulation models for Mars that are in operation at the NASA Ames Research Center. The two global circulation models in use are fundamentally different: one uses spherical harmonics in its horizontal representation of field variables; the other uses finite differences on a uniform longitude-latitude grid. Several simulations have been conducted to assess how the dynamical processors of each of these circulation models perform using identical 'simple physics' parameterizations. A variety of climate statistics (e.g., time-mean flows and eddy fields) have been compared for realistic solstitial mean basic states. Results of this research have demonstrated that the two Mars circulation models with completely different spatial representations and discretizations produce rather similar circulation statistics for first-order meteorological fields, suggestive of a tendency for convergence of numerical solutions. Second and higher-order fields can, however, vary significantly between the two models.
Statistical analysis of the ambiguities in the asteroid period determinations
NASA Astrophysics Data System (ADS)
Butkiewicz-Bąk, M.; Kwiatkowski, T.; Bartczak, P.; Dudziński, G.; Marciniak, A.
2017-09-01
Among asteroids there exist ambiguities in their rotation period determinations. These are due to incomplete coverage of the rotation, noise, and/or aliases resulting from gaps between separate lightcurves. To help remove such uncertainties, the basic characteristics of the lightcurves resulting from constraints imposed by asteroid shapes and observation geometries should be identified. We simulated light variations of asteroids whose shapes were modelled as Gaussian random spheres, with random orientations of spin vectors and phase angles changed every 5° from 0° to 65°. This produced 1.4 million lightcurves. For each simulated lightcurve, a Fourier analysis was made and the harmonic of the highest amplitude was recorded. From the statistical point of view, all lightcurves observed at phase angles α < 30°, with peak-to-peak amplitudes A > 0.2 mag, are bimodal. The second most frequently dominant harmonic is the first, followed closely by the third. For 1 per cent of lightcurves with amplitudes A < 0.1 mag and phase angles α < 40°, the fourth harmonic dominates.
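The per-lightcurve Fourier step can be sketched as follows: fit a low-order Fourier series to the phased lightcurve and record the harmonic of the highest amplitude. This is a minimal illustration, not the authors' pipeline; the lightcurve, maximum order, and sampling below are invented.

```python
import numpy as np

def dominant_harmonic(phase, mag, max_order=6):
    """Fit a Fourier series to a phased lightcurve and return the
    order of the harmonic with the largest amplitude."""
    # Design matrix: [1, cos(2*pi*k*phase), sin(2*pi*k*phase), ...]
    cols = [np.ones_like(phase)]
    for k in range(1, max_order + 1):
        cols.append(np.cos(2 * np.pi * k * phase))
        cols.append(np.sin(2 * np.pi * k * phase))
    A = np.column_stack(cols)
    coef, *_ = np.linalg.lstsq(A, mag, rcond=None)
    # Amplitude of harmonic k is sqrt(a_k^2 + b_k^2)
    amps = [np.hypot(coef[2 * k - 1], coef[2 * k]) for k in range(1, max_order + 1)]
    return int(np.argmax(amps)) + 1

# A bimodal (double-peaked) lightcurve: the second harmonic dominates
phase = np.linspace(0.0, 1.0, 200, endpoint=False)
mag = 0.15 * np.cos(4 * np.pi * phase) + 0.02 * np.cos(2 * np.pi * phase)
print(dominant_harmonic(phase, mag))  # → 2
```

A dominant second harmonic is the signature of the bimodal (two maxima, two minima per rotation) lightcurves that the study finds at large amplitudes.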
Abdrakhmanov, Sarsenbay K; Beisembayev, Kanatzhan K; Кorennoy, Fedor I; Yessembekova, Gulzhan N; Кushubaev, Dosym B; Кadyrov, Ablaikhan S
2016-05-31
This study estimated the basic reproductive ratio of rabies at the population level in wild animals (foxes), farm animals (cattle, camels, horses, sheep) and what we classified as domestic animals (cats, dogs) in the Republic of Kazakhstan (RK). It also aimed at forecasting the possible number of new outbreaks in case of emergence of the disease in new territories. We considered cases of rabies in animals in RK from 2010 to 2013, recorded by regional veterinary services. Statistically significant space-time clusters of outbreaks in three subpopulations were detected by means of Kulldorff Scan statistics. Theoretical curves were then fitted to epidemiological data within each cluster assuming exponential initial growth, followed by calculation of the basic reproductive ratio R0. For farm animals, the value of R0 was 1.62 (1.11-2.26) and for wild animals 1.84 (1.08-3.13), while it was close to 1 for domestic animals. Using the values obtained, an initial phase of a possible epidemic was simulated in order to predict the expected number of secondary cases if the disease were introduced into a new area. The possible number of new cases over 20 weeks was estimated at 5 (1-16) for farm animals, 17 (1-113) for wild animals and about 1 in the category of domestic animals. These results have been used to produce a set of recommendations for organising preventive and anti-epizootic measures against rabies, to be applied by state veterinary services.
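A minimal sketch of the R0 estimation step, assuming the classic approximation R0 ≈ 1 + rT for an exponentially growing outbreak (the study's actual within-cluster curve fitting may differ); the weekly case counts and the generation time here are hypothetical, not Kazakh surveillance data.

```python
import numpy as np

def estimate_r0(weekly_cases, generation_time_weeks):
    """Fit exponential initial growth N(t) = N0 * exp(r*t) by log-linear
    regression, then apply the approximation R0 ≈ 1 + r*T."""
    t = np.arange(len(weekly_cases))
    r = np.polyfit(t, np.log(weekly_cases), 1)[0]  # growth rate per week
    return 1.0 + r * generation_time_weeks

# Hypothetical early-phase counts growing roughly 25% per week, T = 4 weeks
cases = [4, 5, 6, 8, 10, 12]
print(round(estimate_r0(cases, 4.0), 2))
```

Values of R0 just above 1, as reported for farm and wild animals, imply slow initial growth and a modest number of expected secondary cases.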
Han, Xueying; Williams, Sharon R; Zuckerman, Brian L
2018-01-01
The translation of biomedical research from basic knowledge to application has been a priority at the National Institutes of Health (NIH) for many years. Tracking the progress of scientific research and knowledge through the translational process is difficult due to variation in the definition of translational research as well as the identification of benchmarks for the spread and application of biomedical research; quantitatively tracking this process is even more difficult. Using a simple and reproducible method to assess whether publications are translational, we examined NIH R01 behavioral and social science research (BSSR) awards funded between 2008 and 2014 to determine whether there are differences in the percent of translational research publications produced by basic and applied research awards. We also assessed the percent of translational research publications produced by the Clinical and Translational Science Awards (CTSA) program to evaluate whether targeted translational research awards result in increased translational research. We found that 3.9% of publications produced by basic research awards were translational; that the percent of translational research publications produced by applied research awards is approximately double that of basic research awards (7.4%); and that targeted translational research awards from the CTSA program produced the highest percentage of translational research publications (13.4%). In addition, we assessed differences in time to first publication, time to first citation, and publication quality by award type (basic vs. applied), and whether an award (or publication) is translational.
PMID:29742129
Developing Competency of Teachers in Basic Education Schools
ERIC Educational Resources Information Center
Yuayai, Rerngrit; Chansirisira, Pacharawit; Numnaphol, Kochaporn
2015-01-01
This study aims to develop competency of teachers in basic education schools. The research instruments included the semi-structured in-depth interview form, questionnaire, program developing competency, and evaluation competency form. The statistics used for data analysis were percentage, mean, and standard deviation. The research found that…
Semantic memory: a feature-based analysis and new norms for Italian.
Montefinese, Maria; Ambrosini, Ettore; Fairfield, Beth; Mammarella, Nicola
2013-06-01
Semantic norms for properties produced by native speakers are valuable tools for researchers interested in the structure of semantic memory and in category-specific semantic deficits in individuals following brain damage. The aims of this study were threefold. First, we sought to extend existing semantic norms by adopting an empirical approach to category (Exp. 1) and concept (Exp. 2) selection, in order to obtain a more representative set of semantic memory features. Second, we extensively outlined a new set of semantic production norms collected from Italian native speakers for 120 artifactual and natural basic-level concepts, using numerous measures and statistics following a feature-listing task (Exp. 3b). Finally, we aimed to create a new publicly accessible database, since only a few existing databases are publicly available online.
ERIC Educational Resources Information Center
DETERLINE, WILLIAM A.
A programed course in methods and techniques of preparing programed instructional materials was presented in this document. An attempt was made to teach basic procedures well enough to produce an embryo programer and to provide him with the references he would need in order to produce programs. Included were programed instructions on preparatory…
ERIC Educational Resources Information Center
Ministerio de Educacion Nacional, Bogota (Colombia). Instituto Colombiano de Pedagogia.
This document provides statistical data on the distribution and education of teaching personnel working in the elementary schools of Cordoba, Colombia, between 1958 and 1967. The statistics cover the number of men and women, public and private schools, urban and rural location, and the amount of education of the teachers. For overall statistics in…
ERIC Educational Resources Information Center
Ministerio de Educacion Nacional, Bogota (Colombia). Instituto Colombiano de Pedagogia.
This document provides statistical data on the distribution and education of teaching personnel working in the elementary schools of Narino, Colombia, between 1958 and 1967. The statistics cover the number of men and women, public and private schools, urban and rural location, and the amount of education of the teachers. For overall statistics in…
ERIC Educational Resources Information Center
Ministerio de Educacion Nacional, Bogota (Colombia). Instituto Colombiano de Pedagogia.
This document provides statistical data on the distribution and education of teaching personnel working in the elementary schools of Cauca, Colombia, between 1958 and 1967. The statistics cover the number of men and women, public and private schools, urban and rural location, and the amount of education of the teachers. For overall statistics in…
ERIC Educational Resources Information Center
Ministerio de Educacion Nacional, Bogota (Colombia). Instituto Colombiano de Pedagogia.
This document provides statistical data on the distribution and education of teaching personnel working in the elementary schools of Caldas, Colombia, between 1958 and 1967. The statistics cover the number of men and women, public and private schools, urban and rural location, and the amount of education of the teachers. For overall statistics in…
ERIC Educational Resources Information Center
Ministerio de Educacion Nacional, Bogota (Colombia). Instituto Colombiano de Pedagogia.
This document provides statistical data on the distribution and education of teaching personnel working in the elementary schools of Boyaca, Colombia, between 1958 and 1967. The statistics cover the number of men and women, public and private schools, urban and rural location, and the amount of education of the teachers. For overall statistics in…
ERIC Educational Resources Information Center
Ministerio de Educacion Nacional, Bogota (Colombia). Instituto Colombiano de Pedagogia.
This document provides statistical data on the distribution and education of teaching personnel working in the elementary schools of Huila, Colombia, between 1958 and 1967. The statistics cover the number of men and women, public and private schools, urban and rural location, and the amount of education of the teachers. For overall statistics in…
ERIC Educational Resources Information Center
National Center for Health Statistics (DHEW/PHS), Hyattsville, MD.
This report is a part of the program of the National Center for Health Statistics to provide current statistics as baseline data for the evaluation, planning, and administration of health programs. Part I presents data concerning the occupational fields: (1) administration, (2) anthropology and sociology, (3) data processing, (4) basic sciences,…
ERIC Educational Resources Information Center
Ministerio de Educacion Nacional, Bogota (Colombia). Instituto Colombiano de Pedagogia.
This document provides statistical data on the distribution and education of teacher personnel working in Colombian elementary schools between 1940 and 1968. The statistics cover the number of men and women, public and private schools, urban and rural location, and the amount of education of teachers. (VM)
Explorations in Statistics: Standard Deviations and Standard Errors
ERIC Educational Resources Information Center
Curran-Everett, Douglas
2008-01-01
Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This series in "Advances in Physiology Education" provides an opportunity to do just that: we will investigate basic concepts in statistics using the free software package R. Because this series uses R solely as a vehicle…
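The series itself works in R; the same kind of active exploration of standard deviations versus standard errors can be sketched in Python (the population parameters, sample size, and number of replicates below are arbitrary): the standard deviation of many sample means should approach σ/√n.

```python
import statistics
import math
import random

random.seed(1)

# Draw many samples from a normal population and check that the
# standard deviation of the sample means approaches sigma / sqrt(n).
mu, sigma, n = 100.0, 15.0, 25
means = []
for _ in range(2000):
    sample = [random.gauss(mu, sigma) for _ in range(n)]
    means.append(statistics.mean(sample))

theoretical_se = sigma / math.sqrt(n)   # 3.0
empirical_se = statistics.stdev(means)  # should be close to 3.0
print(theoretical_se, round(empirical_se, 2))
```

The exercise makes the distinction concrete: the standard deviation describes spread in the data, while the standard error describes uncertainty in an estimate.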
ERIC Educational Resources Information Center
Cassel, Russell N.
This paper relates educational and psychological statistics to certain "Research Statistical Tools" (RSTs) necessary to accomplish and understand general research in the behavioral sciences. Emphasis is placed on acquiring an effective understanding of the RSTs, and to this end they are ordered on a continuum scale in terms of individual…
Estimates of School Statistics, 1971-72.
ERIC Educational Resources Information Center
Flanigan, Jean M.
This report presents public school statistics for the 50 States, the District of Columbia, and the regions and outlying areas of the United States. The text presents national data for each of the past 10 years and defines the basic series of statistics. Tables present the revised estimates by State and region for 1970-71 and the preliminary…
Combining statistical inference and decisions in ecology
Williams, Perry J.; Hooten, Mevin B.
2016-01-01
Statistical decision theory (SDT) is a sub-field of decision theory that formally incorporates statistical investigation into a decision-theoretic framework to account for uncertainties in a decision problem. SDT provides a unifying analysis of three types of information: statistical results from a data set, knowledge of the consequences of potential choices (i.e., loss), and prior beliefs about a system. SDT links the theoretical development of a large body of statistical methods including point estimation, hypothesis testing, and confidence interval estimation. The theory and application of SDT have mainly been developed and published in the fields of mathematics, statistics, operations research, and other decision sciences, but have had limited exposure in ecology. Thus, we provide an introduction to SDT for ecologists and describe its utility for linking the conventionally separate tasks of statistical investigation and decision making in a single framework. We describe the basic framework of both Bayesian and frequentist SDT, its traditional use in statistics, and discuss its application to decision problems that occur in ecology. We demonstrate SDT with two types of decisions: Bayesian point estimation, and an applied management problem of selecting a prescribed fire rotation for managing a grassland bird species. Central to SDT, and decision theory in general, are loss functions. Thus, we also provide basic guidance and references for constructing loss functions for an SDT problem.
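The link between loss functions and Bayesian point estimation can be illustrated with a small sketch (the gamma "posterior" and the grid search below are illustrative assumptions, not from the article): minimizing expected squared-error loss recovers the posterior mean, while minimizing expected absolute loss recovers the posterior median.

```python
import numpy as np

rng = np.random.default_rng(0)
# Skewed hypothetical posterior, represented by Monte Carlo draws
posterior = rng.gamma(shape=2.0, scale=1.5, size=50_000)

def expected_loss(action, draws, loss):
    """Approximate posterior expected loss of an action by averaging
    the loss over posterior draws."""
    return np.mean(loss(action, draws))

grid = np.linspace(0.0, 10.0, 1001)
sq = min(grid, key=lambda a: expected_loss(a, posterior, lambda a, th: (a - th) ** 2))
ab = min(grid, key=lambda a: expected_loss(a, posterior, lambda a, th: np.abs(a - th)))

print(round(sq, 2), round(np.mean(posterior), 2))    # squared loss → posterior mean
print(round(ab, 2), round(np.median(posterior), 2))  # absolute loss → posterior median
```

Because the posterior is skewed, the two loss functions give visibly different "best" actions, which is exactly why the choice of loss matters in applied decisions such as setting a fire-rotation interval.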
Wu, Yazhou; Zhou, Liang; Li, Gaoming; Yi, Dali; Wu, Xiaojiao; Liu, Xiaoyu; Zhang, Yanqi; Liu, Ling; Yi, Dong
2015-01-01
Background: Although a substantial number of studies focus on the teaching and application of medical statistics in China, few studies comprehensively evaluate the recognition of and demand for medical statistics. In addition, the results of these various studies differ and are insufficiently comprehensive and systematic. Objectives: This investigation aimed to evaluate the general cognition of and demand for medical statistics by undergraduates, graduates, and medical staff in China. Methods: We performed a comprehensive database search related to the cognition of and demand for medical statistics from January 2007 to July 2014 and conducted a meta-analysis of non-controlled studies with sub-group analysis for undergraduates, graduates, and medical staff. Results: There are substantial differences with respect to the cognition of theory in medical statistics among undergraduates (73.5%), graduates (60.7%), and medical staff (39.6%). The demand for theory in medical statistics is high among graduates (94.6%), undergraduates (86.1%), and medical staff (88.3%). Regarding specific statistical methods, the cognition of basic statistical methods is higher than that of advanced statistical methods. The demand for certain advanced statistical methods, including (but not limited to) multiple analysis of variance (ANOVA), multiple linear regression, and logistic regression, is higher than that for basic statistical methods. The use rates of the Statistical Package for the Social Sciences (SPSS) software and statistical analysis software (SAS) are only 55% and 15%, respectively. Conclusion: The overall statistical competence of undergraduates, graduates, and medical staff is insufficient, and their ability to practically apply their statistical knowledge is limited, which constitutes an unsatisfactory state of affairs for medical statistics education. Because the demand for skills in this area is increasing, the need to reform medical statistics education in China has become urgent.
PMID:26053876
Senior Computational Scientist | Center for Cancer Research
The Basic Science Program (BSP) pursues independent, multidisciplinary research in basic and applied molecular biology, immunology, retrovirology, cancer biology, and human genetics. Research efforts and support are an integral part of the Center for Cancer Research (CCR) at the Frederick National Laboratory for Cancer Research (FNLCR). The Cancer & Inflammation Program (CIP), Basic Science Program, HLA Immunogenetics Section, under the leadership of Dr. Mary Carrington, studies the influence of human leukocyte antigens (HLA) and specific KIR/HLA genotypes on risk of and outcomes to infection, cancer, autoimmune disease, and maternal-fetal disease. Recent studies have focused on the impact of HLA gene expression in disease, the molecular mechanism regulating expression levels, and the functional basis for the effect of differential expression on disease outcome. The lab's further focus is on the genetic basis for resistance/susceptibility to disease conferred by immunogenetic variation. KEY ROLES/RESPONSIBILITIES The Senior Computational Scientist will provide research support to the CIP-BSP-HLA Immunogenetics Section, performing biostatistical design, analysis, and reporting of research projects conducted in the lab. This individual will be involved in the implementation of statistical models and data preparation. The successful candidate should have: 5 or more years of competent, innovative biostatistics/bioinformatics research experience beyond doctoral training; considerable experience with statistical software, such as SAS, R, and S-Plus; and sound knowledge and demonstrated experience of theoretical and applied statistics. Duties include writing program code to analyze data using statistical analysis software and contributing to the interpretation and publication of research results.
Brennan, Jennifer Sousa
2010-01-01
This chapter is an introductory reference guide highlighting some of the most common statistical topics, broken down into both command-line syntax and graphical interface point-and-click commands. This chapter serves to supplement more formal statistics lessons and expedite using Stata to compute basic analyses.
System analysis for the Huntsville Operational Support Center distributed computer system
NASA Technical Reports Server (NTRS)
Ingels, E. M.
1983-01-01
A simulation model was developed and programmed in three languages: BASIC, PASCAL, and SLAM. Two of the programs, the BASIC and PASCAL versions, are included in this report; SLAM is not supported by NASA/MSFC facilities and hence was not included. The statistical comparisons of simulations of the same HOSC system configurations are in good agreement with each other and with the operational statistics of HOSC that were obtained. Three variations of the most recent HOSC configuration were run, and some conclusions were drawn as to the system performance under these variations.
NASA Astrophysics Data System (ADS)
Haven, Emmanuel; Khrennikov, Andrei
2013-01-01
Preface; Part I. Physics Concepts in Social Science? A Discussion: 1. Classical, statistical and quantum mechanics: all in one; 2. Econophysics: statistical physics and social science; 3. Quantum social science: a non-mathematical motivation; Part II. Mathematics and Physics Preliminaries: 4. Vector calculus and other mathematical preliminaries; 5. Basic elements of quantum mechanics; 6. Basic elements of Bohmian mechanics; Part III. Quantum Probabilistic Effects in Psychology: Basic Questions and Answers: 7. A brief overview; 8. Interference effects in psychology - an introduction; 9. A quantum-like model of decision making; Part IV. Other Quantum Probabilistic Effects in Economics, Finance and Brain Sciences: 10. Financial/economic theory in crisis; 11. Bohmian mechanics in finance and economics; 12. The Bohm-Vigier Model and path simulation; 13. Other applications to economic/financial theory; 14. The neurophysiological sources of quantum-like processing in the brain; Conclusion; Glossary; Index.
NASA Astrophysics Data System (ADS)
Cardall, Christian Y.; Budiardja, Reuben D.
2017-05-01
GenASiS Basics provides Fortran 2003 classes furnishing extensible object-oriented utilitarian functionality for large-scale physics simulations on distributed memory supercomputers. This functionality includes physical units and constants; display to the screen or standard output device; message passing; I/O to disk; and runtime parameter management and usage statistics. This revision, Version 2 of Basics, makes mostly minor additions to functionality and includes some simplifying name changes.
Basic Facts and Figures about the Educational System in Japan.
ERIC Educational Resources Information Center
National Inst. for Educational Research, Tokyo (Japan).
Tables, charts, and graphs convey supporting data that accompany text on various aspects of the Japanese educational system presented in this booklet. There are seven chapters: (1) Fundamental principles of education; (2) Organization of the educational system; (3) Basic statistics of education; (4) Curricula, textbooks, and instructional aids;…
Basic Understanding of Earth Tunneling by Melting : Volume 1. Basic Physical Principles.
DOT National Transportation Integrated Search
1974-07-01
A novel technique, which employs the melting of rocks and soils as a means of excavating or tunneling while simultaneously generating a glass tunnel lining and/or primary support, was studied. The object of the study was to produce a good basic under...
Guide to preparing SAND reports. Revised
DOE Office of Scientific and Technical Information (OSTI.GOV)
Locke, T.K.
1996-04-01
This guide contains basic information needed to produce a SAND report. Its guidelines reflect DOE regulation and Sandia policy. The guide includes basic writing instructions in an annotated sample report; guidance for organization, format, and layout of reports produced by line organizations; and information about conference papers, journal articles, and brochures. The appendixes contain sections on Sandia's preferred usage, equations, references, copyrights and permissions, and publishing terms.
Annual statistical report 2008 : based on data from CARE/EC
DOT National Transportation Integrated Search
2008-10-31
This Annual Statistical Report provides the basic characteristics of road accidents in 19 member states of the European Union for the period 1997-2006, on the basis of data collected and processed in the CARE database, the Community Road Accident...
Country Education Profiles: Algeria.
ERIC Educational Resources Information Center
International Bureau of Education, Geneva (Switzerland).
One of a series of profiles prepared by the Cooperative Educational Abstracting Service, this brief outline provides basic background information on educational principles, system of administration, structure and organization, curricula, and teacher training in Algeria. Statistics provided by the Unesco Office of Statistics show enrollment at all…
78 FR 23158 - Organization and Delegation of Duties
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-18
... management actions of major significance, such as those relating to changes in basic organization pattern... regard to rulemaking, enforcement, vehicle safety research and statistics and data analysis, provides... Administrator for the National Center for Statistics and Analysis, and the Associate Administrator for Vehicle...
ERIC Educational Resources Information Center
Hobden, Sally
2014-01-01
Information on the HIV/AIDS epidemic in Southern Africa is often interpreted through a veil of secrecy and shame and, I argue, with flawed understanding of basic statistics. This research determined the levels of statistical literacy evident in 316 future Mathematical Literacy teachers' explanations of the median in the context of HIV/AIDS…
Introduction to Statistics. Learning Packages in the Policy Sciences Series, PS-26. Revised Edition.
ERIC Educational Resources Information Center
Policy Studies Associates, Croton-on-Hudson, NY.
The primary objective of this booklet is to introduce students to basic statistical skills that are useful in the analysis of public policy data. A few, selected statistical methods are presented, and theory is not emphasized. Chapter 1 provides instruction for using tables, bar graphs, bar graphs with grouped data, trend lines, pie diagrams,…
NASA Astrophysics Data System (ADS)
Bouchaud, Elisabeth; Soukiassian, Patrick
2009-11-01
Although fracture is a very common experience in every day life, it still harbours many unanswered questions. New avenues of investigation arise concerning the basic mechanisms leading to deformation and failure in heterogeneous materials, particularly in non-metals. The processes involved are even more complex when plasticity, thermal fluctuations or chemical interactions between the material and its environment introduce a specific time scale. Sub-critical failure, which may be reached at unexpectedly low loads, is particularly important for silicate glasses. Another source of complications originates from dynamic fracture, when loading rates become so high that the acoustic waves produced by the crack interact with the material heterogeneities, in turn producing new waves that modify the propagation. Recent progress in experimental techniques, allowing one to test and probe materials at sufficiently small length or time scales or in three dimensions, has led to a quantitative understanding of the physical processes involved. In parallel, simulations have also progressed, by extending the time and length scales they are able to reach, and thus attaining experimentally accessible conditions. However, one central question remains the inclusion of these basic mechanisms into a statistical description. This is not an easy task, mostly because of the strong stress gradients present at the tip of a crack, and because the averaging of fracture properties over a heterogeneous material, containing more or less brittle phases, requires rare event statistics. Substantial progress has been made in models and simulations based on accurate experiments. From these models, scaling laws have been derived, linking the behaviour at a micro- or even nano-scale to the macroscopic and even to geophysical scales. 
The reviews in this Cluster Issue of Journal of Physics D: Applied Physics cover several of these important topics, including the physical processes in fracture mechanisms, the sub-critical failure issue, the dynamical fracture propagation, and the scaling laws from the micro- to the geophysical scales. Achievements and progress are reported, and the many open questions are discussed, which should provide a sound basis for present and future prospects.
Li, Zhen-shan; Fu, Hui-zhen; Qu, Xiao-yan
2011-09-15
Reliable and accurate determinations of the quantities and composition of wastes are required for the planning of municipal solid waste (MSW) management systems. A model, based on the interrelationships of expenditure on consumer goods, time distribution, daily activities, resident groups, and waste generation, was developed and employed to estimate MSW generation by different activities and resident groups in Beijing. The principle is that MSW is produced by consumption of consumer goods by residents in their daily activities: 'Maintenance' (meeting the basic needs of food, housing and personal care), 'Subsistence' (providing the financial requirements) and 'Leisure' (social and recreational pursuits). Three series of important parameters - waste generation per unit of consumer expenditure, consumer expenditure distribution to activities in unit time, and time assignment to activities by different resident groups - were determined using a statistical analysis, a sampling survey and the Analytic Hierarchy Process, respectively. Data for analysis were obtained from the Beijing Statistical Yearbook (2004-2008) and a questionnaire survey. The results reveal that 'Maintenance' activity produced the most MSW, distantly followed by 'Leisure' and 'Subsistence' activities. In 2008, in descending order of MSW generation, the different resident groups were floating population, non-civil servants, retired people, civil servants, college students (including both undergraduates and graduates), primary and secondary students, and preschoolers. The new estimation model, which was successful in fitting waste generation by different activities and resident groups over the investigated years, was amenable to MSW prediction. Copyright © 2011 Elsevier B.V. All rights reserved.
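The model's bookkeeping (waste as the product of time spent on an activity, spending per unit of time, and waste generated per unit of expenditure, summed over resident groups) can be sketched as follows. All rates, time shares, and populations below are hypothetical placeholders, not the paper's fitted parameters.

```python
# Hypothetical parameters illustrating the model's structure:
# waste = population * hours[activity] * spend_per_hour[activity]
#         * waste_per_yuan[activity]
waste_per_yuan = {"maintenance": 0.02, "subsistence": 0.005, "leisure": 0.01}  # kg/yuan
spend_per_hour = {"maintenance": 8.0, "subsistence": 1.0, "leisure": 6.0}      # yuan/h

groups = {  # population (millions), daily hours per activity
    "retired":  (2.0, {"maintenance": 11, "subsistence": 0, "leisure": 13}),
    "employed": (8.0, {"maintenance": 9,  "subsistence": 8, "leisure": 7}),
}

def daily_waste_tonnes(groups):
    """Sum per-capita waste over activities and scale by group population."""
    total = 0.0
    for pop_millions, hours in groups.values():
        for act, h in hours.items():
            kg_per_person = h * spend_per_hour[act] * waste_per_yuan[act]
            total += pop_millions * 1e6 * kg_per_person / 1000.0  # kg → tonnes
    return total

print(round(daily_waste_tonnes(groups)))
```

With this structure, changing any one series of parameters (expenditure rates, time budgets, or group populations) propagates directly into the citywide estimate, which is what makes the model usable for prediction.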
75 FR 33203 - Funding Formula for Grants to States
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-11
... as Social Security numbers, birth dates, and medical data. Docket: To read or download submissions or... Local Area Unemployment Statistics (LAUS), both of which are compiled by DOL's Bureau of Labor Statistics. Specifies how each State's basic JVSG allocation is calculated. Identifies the procedures...
Statistical Considerations for Establishing CBTE Cut-Off Scores.
ERIC Educational Resources Information Center
Trzasko, Joseph A.
This report gives the basic definition and purpose of competency-based teacher education (CBTE) cut-off scores. It describes the basic characteristics of CBTE as a yes-no dichotomous decision regarding the presence of a specific ability or knowledge, which necessitates the establishment of a cut-off point to designate competency vs. incompetency on…
Adult Basic Education. Program Summary.
ERIC Educational Resources Information Center
Office of Education (DHEW), Washington, DC.
A brief description is given of the federal Adult Basic Education program, under the Adult Education Act of 1966, at the national and state levels (including Puerto Rico, Guam, American Samoa, and the Virgin Islands) as provided by state education agencies. Statistics for fiscal years 1965 and 1966, and estimates for fiscal year 1967, indicate…
Action Research of Computer-Assisted-Remediation of Basic Research Concepts.
ERIC Educational Resources Information Center
Packard, Abbot L.; And Others
This study investigated the possibility of creating a computer-assisted remediation program to assist students having difficulties in basic college research and statistics courses. A team approach involving instructors and students drove the research into and creation of the computer program. The effect of student use was reviewed by looking at…
Introduction to Probability, Part 1 - Basic Concepts. Student Text. Revised Edition.
ERIC Educational Resources Information Center
Blakeslee, David W.; And Others
This book is designed to introduce the reader to some fundamental ideas about probability. The mathematical theory of probability plays an increasingly important role in science, government, industry, business, and economics. An understanding of the basic concepts of probability is essential for the study of statistical methods that are widely…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-20
... DEPARTMENT OF HEALTH AND HUMAN SERVICES Food and Drug Administration [Docket No. FDA-2012-D-0419... who conduct studies using active controls and have a basic understanding of statistical principles... clinical investigators who conduct studies using active controls and have a basic understanding of...
Combining statistical inference and decisions in ecology.
Williams, Perry J; Hooten, Mevin B
2016-09-01
Statistical decision theory (SDT) is a sub-field of decision theory that formally incorporates statistical investigation into a decision-theoretic framework to account for uncertainties in a decision problem. SDT provides a unifying analysis of three types of information: statistical results from a data set, knowledge of the consequences of potential choices (i.e., loss), and prior beliefs about a system. SDT links the theoretical development of a large body of statistical methods, including point estimation, hypothesis testing, and confidence interval estimation. The theory and application of SDT have mainly been developed and published in the fields of mathematics, statistics, operations research, and other decision sciences, but have had limited exposure in ecology. Thus, we provide an introduction to SDT for ecologists and describe its utility for linking the conventionally separate tasks of statistical investigation and decision making in a single framework. We describe the basic framework of both Bayesian and frequentist SDT, its traditional use in statistics, and discuss its application to decision problems that occur in ecology. We demonstrate SDT with two types of decisions: Bayesian point estimation and an applied management problem of selecting a prescribed fire rotation for managing a grassland bird species. Central to SDT, and decision theory in general, are loss functions. Thus, we also provide basic guidance and references for constructing loss functions for an SDT problem. © 2016 by the Ecological Society of America.
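Central to SDT are loss functions, as the abstract notes. As a hedged illustration (not code from the paper, and with invented posterior samples), the sketch below picks a Bayes point estimate by minimizing posterior expected loss over a grid of candidate actions, recovering the standard results that squared-error loss leads to the posterior mean while absolute-error loss leads to a posterior median:

```python
import statistics

# Illustrative posterior samples (invented for this sketch).
posterior_samples = [2.1, 2.4, 2.5, 2.7, 3.0, 3.2, 3.3, 3.9, 4.5, 6.0]

def expected_loss(action, samples, loss):
    # Monte Carlo estimate of posterior expected loss for one action.
    return sum(loss(action, s) for s in samples) / len(samples)

def bayes_estimate(samples, loss, grid):
    # The Bayes decision: the action minimizing posterior expected loss.
    return min(grid, key=lambda a: expected_loss(a, samples, loss))

grid = [i / 100 for i in range(100, 701)]  # candidate actions 1.00 .. 7.00

# Squared-error loss -> the optimum is the posterior mean.
sq = bayes_estimate(posterior_samples, lambda a, s: (a - s) ** 2, grid)
# Absolute-error loss -> the optimum is a posterior median.
ab = bayes_estimate(posterior_samples, lambda a, s: abs(a - s), grid)

print(sq, statistics.mean(posterior_samples))
print(ab, statistics.median(posterior_samples))
```

With an even number of samples the absolute-loss optimum is any point between the two middle order statistics, so the grid search returns one point of that interval.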
Peers versus professional training of basic life support in Syria: a randomized controlled trial.
Abbas, Fatima; Sawaf, Bisher; Hanafi, Ibrahem; Hajeer, Mohammad Younis; Zakaria, Mhd Ismael; Abbas, Wafaa; Alabdeh, Fadi; Ibrahim, Nazir
2018-06-18
Peer training has been identified as a useful tool for delivering undergraduate training in basic life support (BLS), which is fundamental as an initial response in cases of emergency. This study aimed to (1) evaluate the efficacy of the peer-led model in basic life support training among medical students in their first three years of study, compared to professional-led training, and (2) assess the efficacy of the course program and students' satisfaction with peer-led training. A randomized controlled trial with blinded assessors was conducted on 72 medical students from the pre-clinical years (1st to 3rd years in Syria) at the Syrian Private University. Students were randomly assigned to a peer-led or a professional-led training group for a one-day course of basic life support skills. Sixty-four students who underwent checklist-based assessment using an objective structured clinical examination (OSCE) design (practical assessment of BLS skills) and answered a BLS knowledge checkpoint questionnaire were included in the analysis. There was no statistically significant difference between the two groups in delivering BLS skills to medical students in practical (P = 0.850) or BLS knowledge questionnaire outcomes (P = 0.900). Both groups showed statistically significant improvement from pre- to post-course assessment in both practical skills and theoretical knowledge (P < 0.001). Students were satisfied with the peer model of training. Peer-led training of basic life support for medical students was beneficial, providing education as effective as training conducted by professionals. This method is applicable and desirable especially in resource-poor countries and in crisis situations.
Moyo, Nomaqhawe; Madzimbamuto, Farai D; Shumbairerwa, Samson
2016-01-28
The current gold standard treatment for acute postoperative pain after major abdominal surgery is multimodal analgesia using patient controlled analgesia delivery systems. Patient controlled analgesia systems are expensive and their routine use in very low income countries is not practical. The use of ultrasound in anaesthesia has made some regional anaesthesia blocks technically easy and safe to perform. This study aimed to determine whether adding an ultrasound guided transversus abdominis plane block as an adjunct to the current parenteral opioid based regimen would result in superior pain relief after a trans abdominal hysterectomy compared to using parenteral opioids alone. Thirty-two elective patients having trans abdominal hysterectomy were recruited into a prospective randomised double-blind, controlled study comparing a bilateral transversus abdominis plane block using 21 ml of 0.25% bupivacaine and 4.0 mg dexamethasone with a sham block containing 21 ml 0.9% saline. Sixteen patients were allocated to each group. Anaesthesia and postoperative analgesia was left to the attending anaesthetist's discretion. Primary outcome was visual analogue scale for pain at 2 h and 4 h. Secondary outcomes were time to first request for analgesia, visual analogue scale for comfort and bother. The data were analysed using the Statistical Package for Social Sciences (SPSS version 16). There was no statistically significant difference in the demographics of the two groups regarding weight, height, physical status and type of surgical incision. There was a statistically significant difference in visual analogue scale for pain at 4 h during movement with lower pain scales in the test group (p = 0.034). Women in the control group had an average pain free period of 56.8 min (median 56.5 min) before requesting a rescue analgesic compared to 116.5 min (median 103 min) in the study group. 
The between-group difference in the average total analgesia duration was statistically significant at the 0.05 level (p = 0.005). The addition of a bupivacaine-dexamethasone transversus abdominis plane block to intramuscular opioid does produce superior acute post-operative pain relief following a hysterectomy. However, a single-shot block has a limited duration of action, and we recommend a repeat block. Clinical trials registration was obtained: PACTR201501000965252. http//www.pactr.org/ATMWeb/appmanager/atm/atmregistry?_nfpb=true&_windowLabel=BasicSearchUpdateController_1&BasicSearchUpdateController_1_actionOverride=%2Fpageflows%2Ftrial%2FbasicSearchUpdate%2FviewTrail&BasicSearchUpdateController_1id=965. The trial was registered on the 12th Dec 2014.
An adaptive approach to the dynamic allocation of buffer storage. M.S. Thesis
NASA Technical Reports Server (NTRS)
Crooke, S. C.
1970-01-01
Several strategies for the dynamic allocation of buffer storage are simulated and compared. The basic algorithms investigated, using actual statistics observed in the Univac 1108 EXEC 8 System, include the buddy method and the first-fit method. Modifications are made to the basic methods in an effort to improve and to measure allocation performance. A simulation model of an adaptive strategy is developed which permits interchanging the two different methods, the buddy and the first-fit methods with some modifications. Using an adaptive strategy, each method may be employed in the statistical environment in which its performance is superior to the other method.
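The first-fit strategy mentioned above is a standard free-list scheme. As an illustrative sketch (not the thesis code, with a hypothetical 1024-unit pool), a minimal first-fit allocator keeps a sorted list of free holes and grants each request from the first hole large enough:

```python
# Minimal first-fit free-list allocator sketch. The free list holds
# (offset, size) holes; an allocation takes the first hole that fits,
# splitting off the remainder.
free_list = [(0, 1024)]  # one initial hole covering the whole buffer pool

def allocate(size):
    for i, (off, sz) in enumerate(free_list):
        if sz >= size:
            if sz == size:
                free_list.pop(i)                        # exact fit: hole removed
            else:
                free_list[i] = (off + size, sz - size)  # shrink the hole
            return off
    return None  # no hole is large enough

def free(off, size):
    # Return a block to the free list (coalescing of adjacent holes
    # is omitted for brevity).
    free_list.append((off, size))
    free_list.sort()

a = allocate(100)   # granted at offset 0
b = allocate(200)   # granted at offset 100
free(a, 100)
c = allocate(50)    # first fit reuses the freed hole at offset 0
print(a, b, c, free_list)
```

The buddy method differs in that it rounds requests up to powers of two and splits/merges fixed "buddy" pairs, trading internal fragmentation for cheap coalescing.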
Ultrasound Dopplerography of abdomen pathology using statistical computer programs
NASA Astrophysics Data System (ADS)
Dmitrieva, Irina V.; Arakelian, Sergei M.; Wapota, Alberto R. W.
1998-04-01
Modern ultrasound Dopplerography offers great possibilities for investigating hemodynamic changes at all stages of abdominal pathology. Much research has been devoted to the use of noninvasive methods in practical medicine, and ultrasound Dopplerography is now one of the basic ones. We investigated 250 patients aged 30 to 77 years, including 149 men and 101 women. The primary diagnosis of all patients was ischemic pancreatitis; secondary diagnoses included ischemic heart disease, hypertension, atherosclerosis, diabetes, and vascular disease of the extremities. We examined the abdominal aorta and its branches: arteria mesenterica superior (AMS), truncus coeliacus (TC), arteria hepatica communis (AHC), and arteria lienalis (AL). For the investigation we used the following equipment: ACUSON 128 XP/10c, BIOMEDIC, GENERAL ELECTRIC (USA, Japan). We analyzed the following components of hemodynamic change in the abdominal vessels: index of pulsation, index of resistance, systolic-diastolic ratio, and speed of blood circulation. The statistical programs used were 'basic statistics' and an 'analytic program.' In conclusion, we determined that all hemodynamic components of the abdominal vessels showed considerably greater changes in abdominal ischemia than in the normal situation. Using the computer program to determine the degree of hemodynamic change, we can recommend an individual plan of diagnostic and treatment programs.
Resilience Among Students at the Basic Enlisted Submarine School
2016-12-01
reported resilience. The Hayes' macro in the Statistical Package for the Social Sciences (SPSS) was used to uncover factors relevant to mediation analysis. Findings suggest that the encouragement of…
A Simple Statistical Thermodynamics Experiment
ERIC Educational Resources Information Center
LoPresto, Michael C.
2010-01-01
Comparing the predicted and actual rolls of combinations of both two and three dice can help to introduce many of the basic concepts of statistical thermodynamics, including multiplicity, probability, microstates, and macrostates, and demonstrate that entropy is indeed a measure of randomness, that disordered states (those of higher entropy) are…
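The dice analogy described above can be made concrete. In this hedged sketch (illustrative, not the author's materials), each ordered roll is a "microstate" and each sum is a "macrostate"; multiplicity counts microstates per macrostate, and a Boltzmann-style entropy is the log of that multiplicity:

```python
from itertools import product
from collections import Counter
import math

def multiplicities(n_dice):
    # Count how many ordered rolls (microstates) give each sum (macrostate).
    return Counter(sum(roll) for roll in product(range(1, 7), repeat=n_dice))

two = multiplicities(2)
# Sum 7 is the most probable macrostate (6 microstates); sum 2 the least (1).
print(two[7], two[2])

prob7 = two[7] / 6 ** 2          # probability = multiplicity / total microstates
entropy7 = math.log(two[7])      # entropy ~ ln(multiplicity)
entropy2 = math.log(two[2])      # the "ordered" macrostate has zero entropy
print(prob7, entropy7, entropy2)
```

Repeating with `multiplicities(3)` sharpens the peak, mirroring how larger systems concentrate on high-multiplicity (high-entropy) macrostates.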
Vetter, Thomas R
2017-11-01
Descriptive statistics are specific methods basically used to calculate, describe, and summarize collected research data in a logical, meaningful, and efficient way. Descriptive statistics are reported numerically in the manuscript text and/or in its tables, or graphically in its figures. This basic statistical tutorial discusses a series of fundamental concepts about descriptive statistics and their reporting. The mean, median, and mode are 3 measures of the center or central tendency of a set of data. In addition to a measure of its central tendency (mean, median, or mode), another important characteristic of a research data set is its variability or dispersion (ie, spread). In simplest terms, variability is how much the individual recorded scores or observed values differ from one another. The range, standard deviation, and interquartile range are 3 measures of variability or dispersion. The standard deviation is typically reported for a mean, and the interquartile range for a median. Testing for statistical significance, along with calculating the observed treatment effect (or the strength of the association between an exposure and an outcome), and generating a corresponding confidence interval are 3 tools commonly used by researchers (and their collaborating biostatistician or epidemiologist) to validly make inferences and more generalized conclusions from their collected data and descriptive statistics. A number of journals, including Anesthesia & Analgesia, strongly encourage or require the reporting of pertinent confidence intervals. A confidence interval can be calculated for virtually any variable or outcome measure in an experimental, quasi-experimental, or observational research study design. Generally speaking, in a clinical trial, the confidence interval is the range of values within which the true treatment effect in the population likely resides. 
In an observational study, the confidence interval is the range of values within which the true strength of the association between the exposure and the outcome (eg, the risk ratio or odds ratio) in the population likely resides. There are many possible ways to graphically display or illustrate different types of data. While there is often latitude as to the choice of format, ultimately, the simplest and most comprehensible format is preferred. Common examples include a histogram, bar chart, line chart or line graph, pie chart, scatterplot, and box-and-whisker plot. Valid and reliable descriptive statistics can answer basic yet important questions about a research data set, namely: "Who, What, Why, When, Where, How, How Much?"
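The quantities discussed in the tutorial are easy to demonstrate. As a hedged sketch with invented data (and a normal-approximation 95% CI; a real small-sample analysis would use a t-quantile):

```python
import statistics
import math

data = [4, 8, 15, 16, 23, 42, 15, 15, 16, 23]  # illustrative observations

# Measures of central tendency.
mean = statistics.mean(data)
median = statistics.median(data)
mode = statistics.mode(data)

# Measures of variability or dispersion.
sd = statistics.stdev(data)                   # sample standard deviation
q1, q2, q3 = statistics.quantiles(data, n=4)  # quartiles
iqr = q3 - q1                                 # interquartile range

# 95% confidence interval for the mean (normal approximation).
sem = sd / math.sqrt(len(data))
ci = (mean - 1.96 * sem, mean + 1.96 * sem)

print(mean, median, mode, sd, iqr, ci)
```

As the tutorial recommends, the standard deviation would accompany the mean, and the interquartile range the median.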
ON THE DYNAMICAL DERIVATION OF EQUILIBRIUM STATISTICAL MECHANICS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Prigogine, I.; Balescu, R.; Henin, F.
1960-12-01
Work on nonequilibrium statistical mechanics, which allows an extension of the kinetic proof to all results of equilibrium statistical mechanics involving a finite number of degrees of freedom, is summarized. As an introduction to the general N-body problem, the scattering theory in classical mechanics is considered. The general N-body problem is considered for the case of classical mechanics, quantum mechanics with Boltzmann statistics, and quantum mechanics including quantum statistics. Six basic diagrams, which describe the elementary processes of the dynamics of correlations, were obtained. (M.C.G.)
`New insight into statistical hydrology' preface to the special issue
NASA Astrophysics Data System (ADS)
Kochanek, Krzysztof
2018-04-01
Statistical methods are still the basic tool for investigating random, extreme events occurring in the hydrosphere. On 21-22 September 2017, the international workshop on Statistical Hydrology (StaHy) 2017 took place in Warsaw (Poland) under the auspices of the International Association of Hydrological Sciences. The authors of the presentations were invited to publish their research results in this Special Issue of Acta Geophysica, 'New Insight into Statistical Hydrology'. Five papers were selected for publication, touching on the most crucial issues of statistical methodology in hydrology.
Physics of negative absolute temperatures.
Abraham, Eitan; Penrose, Oliver
2017-01-01
Negative absolute temperatures were introduced into experimental physics by Purcell and Pound, who successfully applied this concept to nuclear spins; nevertheless, the concept has proved controversial: a recent article aroused considerable interest by its claim, based on a classical entropy formula (the "volume entropy") due to Gibbs, that negative temperatures violated basic principles of statistical thermodynamics. Here we give a thermodynamic analysis that confirms the negative-temperature interpretation of the Purcell-Pound experiments. We also examine the principal arguments that have been advanced against the negative temperature concept; we find that these arguments are not logically compelling, and moreover that the underlying "volume" entropy formula leads to predictions inconsistent with existing experimental results on nuclear spins. We conclude that, despite the counterarguments, negative absolute temperatures make good theoretical sense and did occur in the experiments designed to produce them.
Matrix population models from 20 studies of perennial plant populations
Ellis, Martha M.; Williams, Jennifer L.; Lesica, Peter; Bell, Timothy J.; Bierzychudek, Paulette; Bowles, Marlin; Crone, Elizabeth E.; Doak, Daniel F.; Ehrlen, Johan; Ellis-Adam, Albertine; McEachern, Kathryn; Ganesan, Rengaian; Latham, Penelope; Luijten, Sheila; Kaye, Thomas N.; Knight, Tiffany M.; Menges, Eric S.; Morris, William F.; den Nijs, Hans; Oostermeijer, Gerard; Quintana-Ascencio, Pedro F.; Shelly, J. Stephen; Stanley, Amanda; Thorpe, Andrea; Tamara, Ticktin; Valverde, Teresa; Weekley, Carl W.
2012-01-01
Demographic transition matrices are one of the most commonly applied population models for both basic and applied ecological research. The relatively simple framework of these models and simple, easily interpretable summary statistics they produce have prompted the wide use of these models across an exceptionally broad range of taxa. Here, we provide annual transition matrices and observed stage structures/population sizes for 20 perennial plant species which have been the focal species for long-term demographic monitoring. These data were assembled as part of the "Testing Matrix Models" working group through the National Center for Ecological Analysis and Synthesis (NCEAS). In sum, these data represent 82 populations with >460 total population-years of data. It is our hope that making these data available will help promote and improve our ability to monitor and understand plant population dynamics.
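The projection step behind such matrix models is simple linear algebra: n(t+1) = A n(t), and the long-run growth rate is the dominant eigenvalue of A. A hedged sketch with a hypothetical 3-stage matrix (not taken from the compiled data set), estimating that eigenvalue by power iteration:

```python
# Hypothetical stage-structured projection matrix (illustrative values).
A = [[0.0, 0.5, 2.0],    # fecundities contributed by stages 2 and 3
     [0.3, 0.4, 0.0],    # survival into / persistence in stage 2
     [0.0, 0.35, 0.9]]   # transition into / persistence in stage 3

def project(A, n):
    # One time step: matrix-vector product A n.
    return [sum(A[i][j] * n[j] for j in range(len(n))) for i in range(len(n))]

n = [100.0, 20.0, 10.0]          # initial stage abundances
for _ in range(200):             # iterate until the stage structure stabilizes
    n = project(A, n)
    total = sum(n)
    n = [x / total for x in n]   # renormalize to avoid overflow

# At convergence A n ~ lambda n, so total next-step size / current size ~ lambda.
growth = sum(project(A, n)) / sum(n)
print(growth)
```

A growth rate above 1 indicates a growing population under the assumed vital rates; sensitivity analyses perturb entries of A to ask which rates matter most.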
Přibil, Jiří; Přibilová, Anna; Frollo, Ivan
2018-04-05
This article compares open-air and whole-body magnetic resonance imaging (MRI) equipment working with a weak magnetic field with regard to the methods of magnetic field generation, the spectral properties of mechanical vibration and acoustic noise produced by the gradient coils during the scanning process, and the measured noise intensity. These devices are used for non-invasive MRI reconstruction of the human vocal tract during phonation with simultaneous speech recording. In this case, the vibration and noise negatively influence the quality of the speech signal. Two basic measurement experiments were performed in this work: mapping sound pressure levels in the vicinity of the MRI device and picking up vibration and noise signals in the MRI scanning area. The spectral characteristics of these signals are then analyzed statistically and compared visually and numerically.
Method for making polysilsesquioxanes and organohydridosilanes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Loy, Douglas A.; Rahimian, Kamyar
2001-01-01
A method for disproportionation of an oligohydridosiloxane to produce a polysilsesquioxane compound and an organohydridosilane compound when contacted with a basic catalyst. The basic catalyst can be a tetraalkylammonium hydroxide, an alkali metal hydroxide, and an alkali earth hydroxide. These basic catalysts are generally dissolved in an organic solvent for delivery. The hydroxide catalysts are attractive because many readily decompose by heating above 150 °C, thus being easily removed from the final materials. The oligohydridosiloxane is contacted with the basic catalyst under conditions effective to catalytically convert the oligohydridosiloxane into a polysilsesquioxane compound and an organohydridosilane compound. The reaction can occur in either an inert or oxidative atmosphere and can occur without heating, at room temperature. Both polysilsesquioxane foams and gels of the formula (RSiO1.5)n can be produced.
On prognostic models, artificial intelligence and censored observations.
Anand, S S; Hamilton, P W; Hughes, J G; Bell, D A
2001-03-01
The development of prognostic models for assisting medical practitioners with decision making is not a trivial task. Models need to possess a number of desirable characteristics and few, if any, current modelling approaches based on statistical or artificial intelligence can produce models that display all these characteristics. The inability of modelling techniques to provide truly useful models has led to interest in these models being purely academic in nature. This in turn has resulted in only a very small percentage of models that have been developed being deployed in practice. On the other hand, new modelling paradigms are being proposed continuously within the machine learning and statistical community and claims, often based on inadequate evaluation, being made on their superiority over traditional modelling methods. We believe that for new modelling approaches to deliver true net benefits over traditional techniques, an evaluation centric approach to their development is essential. In this paper we present such an evaluation centric approach to developing extensions to the basic k-nearest neighbour (k-NN) paradigm. We use standard statistical techniques to enhance the distance metric used and a framework based on evidence theory to obtain a prediction for the target example from the outcome of the retrieved exemplars. We refer to this new k-NN algorithm as Censored k-NN (Ck-NN). This reflects the enhancements made to k-NN that are aimed at providing a means for handling censored observations within k-NN.
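For orientation, the baseline paradigm the paper extends can be sketched in a few lines; the censored-observation handling and evidence-theoretic framework of Ck-NN are not reproduced here, and the data below are invented:

```python
import math
from collections import Counter

def knn_predict(train, query, k=3):
    # train: list of (feature_vector, label) pairs.
    # Plain Euclidean distance; Ck-NN instead enhances this metric
    # with standard statistical techniques.
    neighbours = sorted(train, key=lambda ex: math.dist(ex[0], query))[:k]
    # Majority vote; Ck-NN replaces this with an evidence-theoretic
    # combination of the retrieved exemplars' outcomes.
    votes = Counter(label for _, label in neighbours)
    return votes.most_common(1)[0][0]

train = [([1.0, 1.0], 'low'), ([1.2, 0.8], 'low'),
         ([4.0, 4.2], 'high'), ([3.8, 4.1], 'high'), ([4.2, 3.9], 'high')]
print(knn_predict(train, [1.1, 0.9]))
print(knn_predict(train, [4.0, 4.0]))
```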
Systematic comparisons between PRISM version 1.0.0, BAP, and CSMIP ground-motion processing
Kalkan, Erol; Stephens, Christopher
2017-02-23
A series of benchmark tests was run by comparing results of the Processing and Review Interface for Strong Motion data (PRISM) software version 1.0.0 to Basic Strong-Motion Accelerogram Processing Software (BAP; Converse and Brady, 1992), and to California Strong Motion Instrumentation Program (CSMIP) processing (Shakal and others, 2003, 2004). These tests were performed by using the MATLAB implementation of PRISM, which is equivalent to its public release version in the Java language. Systematic comparisons were made in the time and frequency domains of records processed in PRISM and BAP, and in CSMIP, by using a set of representative input motions with varying resolutions, frequency content, and amplitudes. Although the details of strong-motion records vary among the processing procedures, there are only minor differences among the waveforms for each component and within the frequency passband common to these procedures. A comprehensive statistical evaluation considering more than 1,800 ground-motion components demonstrates that differences in peak amplitudes of acceleration, velocity, and displacement time series obtained from PRISM and CSMIP processing are equal to or less than 4 percent for 99 percent of the data, and equal to or less than 2 percent for 96 percent of the data. Other statistical measures, including the Euclidean distance (L2 norm) and the windowed root mean square level of processed time series, also indicate that both processing schemes produce statistically similar products.
The Social Profile of Students in Basic General Education in Ecuador: A Data Analysis
ERIC Educational Resources Information Center
Buri, Olga Elizabeth Minchala; Stefos, Efstathios
2017-01-01
The objective of this study is to examine the social profile of students who are enrolled in Basic General Education in Ecuador. Both a descriptive and a multidimensional statistical analysis were carried out based on the data provided by the National Survey of Employment, Unemployment and Underemployment in 2015. The descriptive analysis shows the…
Improving Attendance and Punctuality of FE Basic Skill Students through an Innovative Scheme
ERIC Educational Resources Information Center
Ade-Ojo, Gordon O.
2005-01-01
This paper reports the findings of a study set up to establish the impact of a particular scheme on the attendance and punctuality performance of a group of Basic Skills learners against the backdrop of various theoretical postulations on managing undesirable behavior. Data collected on learners' performance was subjected to statistical analysis…
ERIC Educational Resources Information Center
Applied Management Sciences, Inc., Silver Spring, MD.
The amount of misreporting of Veterans Administration (VA) benefits was assessed, along with the impact of misreporting on the Basic Educational Opportunity Grant (BEOG) program. Accurate financial information is needed to determine appropriate awards. The analysis revealed: over 97% of VA beneficiaries misreported benefits; the total net loss to…
ERIC Educational Resources Information Center
Yingxiu, Yang
2006-01-01
Using statistical data on the implementing conditions of China's educational expenditure published by the state, this paper studies the Gini coefficient of the budget educational public expenditure per student in order to examine the concentration degree of the educational expenditure for China's basic education and analyze its balanced…
Ernest J. Gebhart
1980-01-01
Other members of this panel are going to reveal the basic statistics about the coal strip mining industry in Ohio so I will confine my remarks to the revegetation of the spoil banks. So it doesn't appear that Ohio confined its tree planting efforts to spoil banks alone, I will rely on a few statistics.
Idaho State University Statistical Portrait, Academic Year 1998-1999.
ERIC Educational Resources Information Center
Idaho State Univ., Pocatello. Office of Institutional Research.
This report provides basic statistical data for Idaho State University, and includes both point-of-time data as well as trend data. The information is divided into sections emphasizing students, programs, faculty and staff, finances, and physical facilities. Student data includes enrollment, geographical distribution, student/faculty ratios,…
Statistical Report. Fiscal Year 1995: September 1, 1994 - August 31, 1995.
ERIC Educational Resources Information Center
Texas Higher Education Coordinating Board, Austin.
This report provides statistical data on Texas public and independent higher education institutions for fiscal year 1995. An introductory section provides basic information on Texas higher education institutions, while nine major sections cover: (1) student enrollment, including 1990-94 headcount data; headcount by classification, ethnic origin,…
Statistical Report. Fiscal Year 1994: September 1, 1993 - August 31, 1994.
ERIC Educational Resources Information Center
Texas Higher Education Coordinating Board, Austin.
This report provides statistical data on Texas public and independent higher education institutions for fiscal year 1994. An introductory section provides basic information on Texas higher education institutions, while nine major sections cover: (1) student enrollment, including 1989-93 headcount data; headcount by classification, ethnic origin,…
29 CFR 1904.42 - Requests from the Bureau of Labor Statistics for data.
Code of Federal Regulations, 2011 CFR
2011-07-01
... ADMINISTRATION, DEPARTMENT OF LABOR RECORDING AND REPORTING OCCUPATIONAL INJURIES AND ILLNESSES Reporting Fatality, Injury and Illness Information to the Government § 1904.42 Requests from the Bureau of Labor Statistics for data. (a) Basic requirement. If you receive a Survey of Occupational Injuries and Illnesses...
29 CFR 1904.42 - Requests from the Bureau of Labor Statistics for data.
Code of Federal Regulations, 2013 CFR
2013-07-01
... ADMINISTRATION, DEPARTMENT OF LABOR RECORDING AND REPORTING OCCUPATIONAL INJURIES AND ILLNESSES Reporting Fatality, Injury and Illness Information to the Government § 1904.42 Requests from the Bureau of Labor Statistics for data. (a) Basic requirement. If you receive a Survey of Occupational Injuries and Illnesses...
29 CFR 1904.42 - Requests from the Bureau of Labor Statistics for data.
Code of Federal Regulations, 2014 CFR
2014-07-01
... ADMINISTRATION, DEPARTMENT OF LABOR RECORDING AND REPORTING OCCUPATIONAL INJURIES AND ILLNESSES Reporting Fatality, Injury and Illness Information to the Government § 1904.42 Requests from the Bureau of Labor Statistics for data. (a) Basic requirement. If you receive a Survey of Occupational Injuries and Illnesses...
29 CFR 1904.42 - Requests from the Bureau of Labor Statistics for data.
Code of Federal Regulations, 2012 CFR
2012-07-01
... ADMINISTRATION, DEPARTMENT OF LABOR RECORDING AND REPORTING OCCUPATIONAL INJURIES AND ILLNESSES Reporting Fatality, Injury and Illness Information to the Government § 1904.42 Requests from the Bureau of Labor Statistics for data. (a) Basic requirement. If you receive a Survey of Occupational Injuries and Illnesses...
Theoretical Frameworks for Math Fact Fluency
ERIC Educational Resources Information Center
Arnold, Katherine
2012-01-01
Recent education statistics indicate persistent low math scores for our nation's students. This drop in math proficiency includes deficits in basic number sense and automaticity of math facts. The decrease has been recorded across all grade levels with the elementary levels showing the greatest loss (National Center for Education Statistics,…
Basic Statistical Concepts and Methods for Earth Scientists
Olea, Ricardo A.
2008-01-01
INTRODUCTION Statistics is the science of collecting, analyzing, interpreting, modeling, and displaying masses of numerical data primarily for the characterization and understanding of incompletely known systems. Over the years, these objectives have led to a fair amount of analytical work to achieve, substantiate, and guide descriptions and inferences.
NASA Astrophysics Data System (ADS)
Saputra, K. V. I.; Cahyadi, L.; Sembiring, U. A.
2018-01-01
In this paper, we assess our traditional elementary statistics education and introduce an elementary statistics course with simulation-based inference. To assess our statistics class, we adapt the well-known CAOS (Comprehensive Assessment of Outcomes in Statistics) test, which serves as an external measure of students' basic statistical literacy and is generally accepted as such a measure. We also introduce a new teaching method for the elementary statistics class: in contrast to the traditional course, we use a simulation-based inference method to conduct hypothesis testing. The literature has shown that this new teaching method works very well in increasing students' understanding of statistics.
Waaramaa, Teija; Leisiö, Timo
2013-01-01
The present study focused on voice quality and the perception of the basic emotions from speech samples in cross-cultural conditions. It was examined whether voice quality, cultural or language background, age, or gender were related to the identification of the emotions. Professional actors (n = 2) and actresses (n = 2) produced nonsense sentences (n = 32) and protracted vowels (n = 8) expressing the six basic emotions, interest, and a neutral emotional state. The impact of musical interests on the ability to distinguish between emotions or valence (on an axis positivity – neutrality – negativity) from voice samples was studied. Listening tests were conducted on location in five countries: Estonia, Finland, Russia, Sweden, and the USA, with 50 randomly chosen participants (25 males and 25 females) in each country. The participants (total N = 250) completed a questionnaire eliciting their background information and musical interests. The responses in the listening test and the questionnaires were statistically analyzed. Voice quality parameters and the share of emotions and valence identified correlated significantly with each other for both genders. The percentage of emotions and valence identified was clearly above the chance level in each of the five countries studied; however, the countries differed significantly from each other in the emotions identified and in the gender of the speaker. The samples produced by females were identified significantly better than those produced by males. Listener's age was a significant variable. Only minor gender differences were found for the identification. Perceptual confusion between emotions in the listening test seemed to depend on their similar voice production types. Musical interests tended to have a positive effect on the identification of the emotions.
The results also suggest that identifying emotions from speech samples may be easier for those listeners who share a similar language or cultural background with the speaker. PMID:23801972
ERIC Educational Resources Information Center
Rosmiati, Rosmiati; Mahmud, Alimuddin; Talib, Syamsul B.
2016-01-01
The purpose of this study was to determine the effectiveness of a character-based learning model for basic education at the Universitas Muslim Indonesia. In addition, the research specifically examines the character traits of discipline, curiosity and responsibility. The specific target is to produce a basic education learning model…
ERIC Educational Resources Information Center
Evaluation and Training Inst., Los Angeles, CA.
This handbook was produced as a result of a project that studied California community college programs that teach basic skills in vocational education programs. The project included a literature review, a telephone survey, and 12 site visits. The handbook contains four sections: (1) steps for integrating basic skills and vocational instruction;…
Concrete pavement construction basics : tech notes.
DOT National Transportation Integrated Search
2006-08-01
This tech note has been produced for developers, consultants, and engineers planning concrete pavement construction projects, superintendents and supervisors who want a basic training aid and reference, and crew members new to the concrete paving ind...
Rebuilding Government Legitimacy in Post-conflict Societies: Case Studies of Nepal and Afghanistan
2015-09-09
administered via the verbal scales due to reduced time spent explaining the visual show cards. Statistical results corresponded with observations from...a three-step strategy for dealing with item non-response. First, basic descriptive statistics are calculated to determine the extent of item...descriptive statistics for all items in the survey), however this section of the report highlights just some of the findings. Thus, the results
Biostatistical and medical statistics graduate education
2014-01-01
The development of graduate education in biostatistics and medical statistics is discussed in the context of training within a medical center setting. The need for medical researchers to employ a wide variety of statistical designs in clinical, genetic, basic science and translational settings justifies the ongoing integration of biostatistical training into medical center educational settings and informs its content. The integration of large-data issues remains a challenge. PMID:24472088
Chen, Shi-Yi; Deng, Feilong; Huang, Ying; Li, Cao; Liu, Linhai; Jia, Xianbo; Lai, Song-Jia
2016-01-01
Although various computer tools have been elaborately developed to calculate a series of statistics in molecular population genetics for both small- and large-scale DNA data, there is no efficient and easy-to-use toolkit available yet for exclusively focusing on the steps of mathematical calculation. Here, we present PopSc, a bioinformatic toolkit for calculating 45 basic statistics in molecular population genetics, which could be categorized into three classes, including (i) genetic diversity of DNA sequences, (ii) statistical tests for neutral evolution, and (iii) measures of genetic differentiation among populations. In contrast to the existing computer tools, PopSc was designed to directly accept the intermediate metadata, such as allele frequencies, rather than the raw DNA sequences or genotyping results. PopSc is first implemented as the web-based calculator with user-friendly interface, which greatly facilitates the teaching of population genetics in class and also promotes the convenient and straightforward calculation of statistics in research. Additionally, we also provide the Python library and R package of PopSc, which can be flexibly integrated into other advanced bioinformatic packages of population genetics analysis. PMID:27792763
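As a rough illustration of the kind of calculation such a toolkit performs from allele frequencies alone, the following Python sketch computes two basic statistics of the sort described above: expected heterozygosity (gene diversity) and a two-population Wright's Fst. The function names are invented for this example and do not reflect PopSc's actual API.

```python
# Illustrative sketch only -- not PopSc's API. Both statistics are computed
# directly from allele frequencies, the intermediate metadata PopSc accepts.

def expected_heterozygosity(freqs):
    """Gene diversity: He = 1 - sum(p_i^2) over allele frequencies p_i."""
    return 1.0 - sum(p * p for p in freqs)

def fst_two_pops(freqs_a, freqs_b):
    """Wright's Fst for two populations: (Ht - Hs) / Ht, where Hs is the
    mean within-population heterozygosity and Ht is the heterozygosity
    of the pooled (averaged) allele frequencies."""
    hs = (expected_heterozygosity(freqs_a) + expected_heterozygosity(freqs_b)) / 2.0
    pooled = [(pa + pb) / 2.0 for pa, pb in zip(freqs_a, freqs_b)]
    ht = expected_heterozygosity(pooled)
    return (ht - hs) / ht

print(expected_heterozygosity([0.5, 0.5]))   # 0.5 for a biallelic locus at p = 0.5
print(fst_two_pops([0.9, 0.1], [0.1, 0.9]))  # strongly differentiated populations
```

Accepting frequencies rather than raw sequences, as PopSc does, keeps such calculations lightweight and independent of any particular genotyping pipeline.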
Watersheds in disordered media
NASA Astrophysics Data System (ADS)
Andrade, José, Jr.; Araújo, Nuno; Herrmann, Hans; Schrenk, Julian
2015-02-01
What is the best way to divide a rugged landscape? Since ancient times, watersheds separating adjacent water systems that flow, for example, toward different seas, have been used to delimit boundaries. Interestingly, serious and even tense border disputes between countries have relied on the subtle geometrical properties of these tortuous lines. For instance, slight and even anthropogenic modifications of landscapes can produce large changes in a watershed, and the effects can be highly nonlocal. Although the watershed concept arises naturally in geomorphology, where it plays a fundamental role in water management, landslide, and flood prevention, it also has important applications in seemingly unrelated fields such as image processing and medicine. Despite the far-reaching consequences of the scaling properties on watershed-related hydrological and political issues, it was only recently that a more profound and revealing connection has been disclosed between the concept of watershed and statistical physics of disordered systems. This review initially surveys the origin and definition of a watershed line in a geomorphological framework to subsequently introduce its basic geometrical and physical properties. Results on statistical properties of watersheds obtained from artificial model landscapes generated with long-range correlations are presented and shown to be in good qualitative and quantitative agreement with real landscapes.
How Persistent are Grammatical Gender Effects? The Case of German and Tamil.
Sedlmeier, Peter; Tipandjan, Arun; Jänchen, Anastasia
2016-04-01
Does the language we speak shape the way we think? The present research concentrated on the impact of grammatical gender on cognition and examined the persistence of the grammatical gender effect by (a) concentrating on German, a three-gendered language, for which previous results have been inconsistent, (b) statistically controlling for common alternative explanations, (c) employing three tasks that differed in how closely they are associated with grammatical gender, and (d) using Tamil, a nongendered language, as a baseline for comparison. We found a substantial grammatical gender effect for two commonly used tasks, even when alternative explanations were statistically controlled for. However, there was basically no effect for a task that was only very loosely connected to grammatical gender (similarity rating of word pairs). In contrast to previous studies that found effects of the German and Spanish grammatical gender in English (a nongendered language), our study did not produce such effects for Tamil, again after controlling for alternative explanations, which can be taken as additional evidence for the existence of a purely linguistic grammatical gender effect. These results indicate that general grammatical gender effects exist but that the size of these effects may be limited and their range restricted.
Walkabout the Galaxy: Podcasting for Informal and Accessible Astronomy Outreach and Education
NASA Astrophysics Data System (ADS)
Colwell, J. E.; Dove, A.; Kehoe, A.; Becker, T. M.
2014-12-01
"Walkabout the Galaxy" is a weekly podcast we have been publishing since May 2014 discussing astronomical news that is in the popular media at the time of recording. Episodes are 25-30 minutes in length and are informal in style: we emphasize one or two basic points while engaging in a free-form discussion of the topic with frequent tangential asides. The target audience is the interested layperson rather than a student, professional, or amateur of astronomy. The informal style is deliberately chosen to keep the podcast from sounding like a classroom lesson and to improve the reach of the podcast to a broader public. Guests have included both experts and laypeople. The number of episode downloads varies by nearly a factor of two from episode to episode (~450 to 750). We will present statistics on downloads and subscriptions, and correlations with episode length, subject matter, and style of episode title. The style of the content cannot influence download statistics, however, and it is not possible to track actual listenership data once the episodes are downloaded. We will discuss lessons learned in creating and producing an educational podcast as well as listener feedback.
An overview of meta-analysis for clinicians.
Lee, Young Ho
2018-03-01
The number of medical studies being published is increasing exponentially, and clinicians must routinely process large amounts of new information. Moreover, the results of individual studies are often insufficient to provide confident answers, as their results are not consistently reproducible. A meta-analysis is a statistical method for combining the results of different studies on the same topic and it may resolve conflicts among studies. Meta-analysis is being used increasingly and plays an important role in medical research. This review introduces the basic concepts, steps, advantages, and caveats of meta-analysis, to help clinicians understand it in clinical practice and research. A major advantage of a meta-analysis is that it produces a precise estimate of the effect size, with considerably increased statistical power, which is important when the power of the primary study is limited because of a small sample size. A meta-analysis may yield conclusive results when individual studies are inconclusive. Furthermore, meta-analyses investigate the source of variation and different effects among subgroups. In summary, a meta-analysis is an objective, quantitative method that provides less biased estimates on a specific topic. Understanding how to conduct a meta-analysis aids clinicians in the process of making clinical decisions.
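The pooling step described above can be sketched in a few lines, assuming a fixed-effect model with inverse-variance weighting; the study effects and variances below are hypothetical, and a real meta-analysis involves many further steps (heterogeneity assessment, publication-bias checks, subgroup analyses).

```python
import math

def fixed_effect_meta(effects, variances):
    """Inverse-variance weighted pooled effect and its standard error:
    each study is weighted by 1/variance, so precise studies count more."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return pooled, se

# Three hypothetical studies: log odds ratios with their variances.
effects = [0.30, 0.10, 0.25]
variances = [0.04, 0.09, 0.02]
pooled, se = fixed_effect_meta(effects, variances)
print(f"pooled effect = {pooled:.3f}, "
      f"95% CI = ({pooled - 1.96*se:.3f}, {pooled + 1.96*se:.3f})")
```

Note how the pooled standard error is smaller than any single study's, which is the source of the increased statistical power the abstract mentions.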
Alternative Fuels Data Center: Hydrogen Basics
Hydrogen Basics. Hydrogen (H2) is an alternative fuel that can be produced from diverse domestic resources. Its appeal stems from its potential for domestic production, its fast filling time, and the fuel cell's high efficiency.
Views of medical students: what, when and how do they want statistics taught?
Fielding, S; Poobalan, A; Prescott, G J; Marais, D; Aucott, L
2015-11-01
A key skill for a practising clinician is being able to do research, understand the statistical analyses and interpret results in the medical literature. Basic statistics has become essential within medical education, but when, what and in which format to teach it remains uncertain. To inform curriculum design and development, we undertook a quantitative survey of fifth-year medical students and followed it up with a series of focus groups to obtain their opinions on what statistics teaching they want, when and how. A total of 145 students undertook the survey, and five focus groups were held with between 3 and 9 participants each. Previous statistical training varied; students recognised that their knowledge was inadequate and were keen to see additional training implemented. Students were aware of the importance of statistics to their future careers but apprehensive about learning. Face-to-face teaching supported by online resources was popular. Focus groups indicated the need for statistical training early in the degree and highlighted students' lack of confidence and inconsistencies in support. The study found that students see the importance of statistics training in the medical curriculum but that timing and mode of delivery are key. The findings have informed the design of a new course to be implemented in the third undergraduate year. Teaching will be based around published studies, aiming to equip students with the basics required, with additional resources available through a virtual learning environment. © The Author(s) 2015.
NASA Technical Reports Server (NTRS)
Thomas, J. M.; Hawk, J. D.
1975-01-01
A generalized concept for cost-effective structural design is introduced. It is assumed that decisions affecting the cost effectiveness of aerospace structures fall into three basic categories: design, verification, and operation. Within these basic categories, certain decisions concerning items such as design configuration, safety factors, testing methods, and operational constraints are to be made. All or some of the variables affecting these decisions may be treated probabilistically. Bayesian statistical decision theory is used as the tool for determining the cost optimum decisions. A special case of the general problem is derived herein, and some very useful parametric curves are developed and applied to several sample structures.
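As a toy illustration of the decision-theoretic idea, one might compare candidate safety factors by expected total cost: build cost plus failure probability times failure cost. All numbers below are invented, and a full Bayesian treatment would additionally update the failure probabilities as test data arrive.

```python
# Hypothetical example of choosing a design decision by minimum expected
# cost, in the spirit of Bayesian statistical decision theory.

FAILURE_COST = 1_000_000.0  # assumed consequence cost of a structural failure

candidates = {
    # safety factor: (build cost, subjective failure probability)
    1.25: (100_000.0, 0.020),
    1.40: (120_000.0, 0.005),
    1.60: (150_000.0, 0.001),
}

def expected_cost(build_cost, p_fail):
    """Expected total cost of a candidate design."""
    return build_cost + p_fail * FAILURE_COST

best = min(candidates, key=lambda sf: expected_cost(*candidates[sf]))
for sf, (c, p) in sorted(candidates.items()):
    print(f"SF {sf}: expected cost = {expected_cost(c, p):,.0f}")
print("cost-optimum safety factor:", best)
```

With these invented numbers the cheapest design wins despite its higher failure probability, which is exactly the kind of trade-off the parametric curves in the paper are meant to expose.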
On a Quantum Model of Brain Activities
NASA Astrophysics Data System (ADS)
Fichtner, K.-H.; Fichtner, L.; Freudenberg, W.; Ohya, M.
2010-01-01
One of the main activities of the brain is the recognition of signals. A first attempt to explain the process of recognition in terms of quantum statistics was given in [6]. Subsequently, details of the mathematical model were presented in a (still incomplete) series of papers (cf. [7, 2, 5, 10]). In the present note we want to give a general view of the principal ideas of this approach. We will introduce the basic spaces and justify the choice of spaces and operations. Further, we bring the model face to face with basic postulates any statistical model of the recognition process should fulfill. These postulates are in accordance with the opinion widely accepted in psychology and neurology.
Excoffier, L; Smouse, P E; Quattro, J M
1992-06-01
We present here a framework for the study of molecular variation within a single species. Information on DNA haplotype divergence is incorporated into an analysis of variance format, derived from a matrix of squared-distances among all pairs of haplotypes. This analysis of molecular variance (AMOVA) produces estimates of variance components and F-statistic analogs, designated here as phi-statistics, reflecting the correlation of haplotypic diversity at different levels of hierarchical subdivision. The method is flexible enough to accommodate several alternative input matrices, corresponding to different types of molecular data, as well as different types of evolutionary assumptions, without modifying the basic structure of the analysis. The significance of the variance components and phi-statistics is tested using a permutational approach, eliminating the normality assumption that is conventional for analysis of variance but inappropriate for molecular data. Application of AMOVA to human mitochondrial DNA haplotype data shows that population subdivisions are better resolved when some measure of molecular differences among haplotypes is introduced into the analysis. At the intraspecific level, however, the additional information provided by knowing the exact phylogenetic relations among haplotypes or by a nonlinear translation of restriction-site change into nucleotide diversity does not significantly modify the inferred population genetic structure. Monte Carlo studies show that site sampling does not fundamentally affect the significance of the molecular variance components. The AMOVA treatment is easily extended in several different directions and it constitutes a coherent and flexible framework for the statistical analysis of molecular data.
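The permutational testing strategy can be sketched as follows. This toy example permutes observations between two groups to obtain a p-value for a difference in means without any normality assumption; it illustrates only the resampling idea, not the full AMOVA variance-component machinery.

```python
import random

def permutation_p_value(group_a, group_b, n_perm=10_000, seed=42):
    """Permutation p-value for |difference in group means|: reassign
    observations to groups at random and count how often the permuted
    statistic is at least as extreme as the observed one."""
    rng = random.Random(seed)
    observed = abs(sum(group_a)/len(group_a) - sum(group_b)/len(group_b))
    pooled = list(group_a) + list(group_b)
    n_a = len(group_a)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        perm = abs(sum(pooled[:n_a])/n_a - sum(pooled[n_a:])/(len(pooled) - n_a))
        if perm >= observed:
            hits += 1
    return (hits + 1) / (n_perm + 1)  # add-one correction

a = [2.9, 3.0, 3.2, 3.1, 2.8]
b = [3.8, 4.1, 3.9, 4.0, 4.2]
print(permutation_p_value(a, b))  # small p-value: the groups clearly differ
```

The same scheme applies to variance components and phi-statistics: compute the statistic on the real grouping, recompute it under many random regroupings, and read significance off the permutation distribution.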
ERIC Educational Resources Information Center
Cunningham, Phyllis M.
Intending to explore the interaction effects of self-esteem level and perceived program utility on the retention and cognitive achievement of adult basic education students, a self-esteem instrument, to be administered verbally, was constructed with content relevant items developed from and tested on a working class, undereducated, black, adult…
ERIC Educational Resources Information Center
Tighe, Elizabeth L.; Schatschneider, Christopher
2016-01-01
The purpose of this study was to investigate the joint and unique contributions of morphological awareness and vocabulary knowledge at five reading comprehension levels in adult basic education (ABE) students. We introduce the statistical technique of multiple quantile regression, which enabled us to assess the predictive utility of morphological…
A survey of the state-of-the-art and focused research in range systems
NASA Technical Reports Server (NTRS)
Kung, Yao; Balakrishnan, A. V.
1988-01-01
In this one-year renewal of NASA Contract No. 2-304, basic research, development, and implementation in the areas of modern estimation algorithms and digital communication systems were performed. In the first area, a basic study on converting general classes of practical signal processing algorithms into systolic array algorithms was carried out, producing four publications. The finite word length effects and convergence rates of lattice algorithms were also studied, producing two publications. In the second area, the use of an efficient importance sampling simulation technique for evaluating digital communication system performance was studied, producing two publications.
Summary Statistics of CPB-Qualified Public Radio Stations: Fiscal Year 1971.
ERIC Educational Resources Information Center
Lee, S. Young; Pedone, Ronald J.
Basic statistics on finance, employment, and broadcast and production activities of 103 Corporation for Public Broadcasting (CPB)--qualified radio stations in the United States and Puerto Rico for Fiscal Year 1971 are collected. The first section of the report deals with total funds, income, direct operating costs, capital expenditures, and other…
Using Statistics to Lie, Distort, and Abuse Data
ERIC Educational Resources Information Center
Bintz, William; Moore, Sara; Adams, Cheryll; Pierce, Rebecca
2009-01-01
Statistics is a branch of mathematics that involves organization, presentation, and interpretation of data, both quantitative and qualitative. Data do not lie, but people do. On the surface, quantitative data are basically inanimate objects, nothing more than lifeless and meaningless symbols that appear on a page, calculator, computer, or in one's…
What Software to Use in the Teaching of Mathematical Subjects?
ERIC Educational Resources Information Center
Berežný, Štefan
2015-01-01
We can consider two basic views when using mathematical software in the teaching of mathematical subjects. First: how to learn to use specific software for specific tasks, e.g., the software Statistica for the subjects of applied statistics, probability and mathematical statistics, or financial mathematics. Second: how to learn to use the…
Intrex Subject/Title Inverted-File Characteristics.
ERIC Educational Resources Information Center
Uemura, Syunsuke
The characteristics of the Intrex subject/title inverted file are analyzed. Basic statistics of the inverted file are presented including various distributions of the index words and terms from which the file was derived, and statistics on stems, the file growth process, and redundancy measurements. A study of stems both with extremely high and…
ERIC Educational Resources Information Center
Ramseyer, Gary C.; Tcheng, Tse-Kia
The present study was directed at determining the extent to which the Type I Error rate is affected by violations in the basic assumptions of the q statistic. Monte Carlo methods were employed, and a variety of departures from the assumptions were examined. (Author)
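A Monte Carlo study of this kind can be sketched in a few lines. The example below estimates the empirical Type I error rate of a pooled-variance t-test (a simpler statistic than q, used here only to illustrate the approach) when the equal-variance assumption is violated; the sample sizes, standard deviations, and critical value are illustrative choices.

```python
import random
import statistics

def pooled_t(x, y):
    """Two-sample t statistic with pooled variance (assumes equal variances)."""
    nx, ny = len(x), len(y)
    sp2 = ((nx - 1) * statistics.variance(x) +
           (ny - 1) * statistics.variance(y)) / (nx + ny - 2)
    return (statistics.mean(x) - statistics.mean(y)) / (sp2 * (1/nx + 1/ny)) ** 0.5

def type_one_error(n_x, n_y, sd_x, sd_y, t_crit, trials=20_000, seed=1):
    """Simulate under a true null (both means 0) and count false rejections."""
    rng = random.Random(seed)
    rejections = 0
    for _ in range(trials):
        x = [rng.gauss(0.0, sd_x) for _ in range(n_x)]
        y = [rng.gauss(0.0, sd_y) for _ in range(n_y)]
        if abs(pooled_t(x, y)) > t_crit:
            rejections += 1
    return rejections / trials

# df = 28, two-sided alpha = 0.05 -> t_crit ~ 2.048
print(type_one_error(5, 25, 3.0, 1.0, 2.048))   # unequal n and variances: inflated
print(type_one_error(15, 15, 3.0, 1.0, 2.048))  # equal n: near the nominal 0.05
```

The contrast between the two runs mirrors the general Monte Carlo finding: some assumption violations barely move the Type I error rate, while others distort it badly.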
ERIC Educational Resources Information Center
Dexter, Franklin; Masursky, Danielle; Wachtel, Ruth E.; Nussmeier, Nancy A.
2010-01-01
Operating room (OR) management differs from clinical anesthesia in that statistical literacy is needed daily to make good decisions. Two of the authors teach a course in operations research for surgical services to anesthesiologists, anesthesia residents, OR nursing directors, hospital administration students, and analysts to provide them with the…
Statistics and Data Interpretation for Social Work
ERIC Educational Resources Information Center
Rosenthal, James A.
2011-01-01
Written by a social worker for social work students, this is a nuts and bolts guide to statistics that presents complex calculations and concepts in clear, easy-to-understand language. It includes numerous examples, data sets, and issues that students will encounter in social work practice. The first section introduces basic concepts and terms to…
Using Excel in Teacher Education for Sustainability
ERIC Educational Resources Information Center
Aydin, Serhat
2016-01-01
In this study, the feasibility of using Excel software to teach an entire Basic Statistics Course, and its influence on the attitudes of pre-service science teachers towards statistics, were investigated. One hundred and two second-year pre-service science teachers participated in the study. The data were collected from the prospective…
Basic Math Skills and Performance in an Introductory Statistics Course
ERIC Educational Resources Information Center
Johnson, Marianne; Kuennen, Eric
2006-01-01
We identify the student characteristics most associated with success in an introductory business statistics class, placing special focus on the relationship between student math skills and course performance, as measured by student grade in the course. To determine which math skills are important for student success, we examine (1) whether the…
An Online Course of Business Statistics: The Proportion of Successful Students
ERIC Educational Resources Information Center
Pena-Sanchez, Rolando
2009-01-01
This article describes the students' academic progress in an online course of business statistics through interactive software assignments and diverse educational homework, which helps these students to build their own e-learning through basic competences; i.e. interpreting results and solving problems. Cross-tables were built for the categorical…
Wallace, Cynthia S.A.; Advised by Marsh, Stuart E.
2002-01-01
The research accomplished in this dissertation used both mathematical and statistical techniques to extract and evaluate measures of landscape temporal dynamics and spatial structure from remotely sensed data for the purpose of mapping wildlife habitat. By coupling the landscape measures gleaned from the remotely sensed data with various sets of animal sightings and population data, effective models of habitat preference were created. Measures of temporal dynamics of vegetation greenness, as measured by the National Oceanographic and Atmospheric Administration's Advanced Very High Resolution Radiometer (AVHRR) satellite, were used to effectively characterize and map season-specific habitat of the Sonoran pronghorn antelope, as well as produce preliminary models of potential yellow-billed cuckoo habitat in Arizona. Various measures that capture different aspects of the temporal dynamics of the landscape were derived from AVHRR Normalized Difference Vegetation Index composite data using three main classes of calculations: basic statistics, standardized principal components analysis, and Fourier analysis. Pronghorn habitat models based on the AVHRR measures correspond visually and statistically to GIS-based models produced using data that represent detailed knowledge of ground conditions. Measures of temporal dynamics also revealed statistically significant correlations with annual estimates of elk population in selected Arizona Game Management Units, suggesting elk respond to regional environmental changes that can be measured using satellite data. Such relationships, once verified and established, can be used to help indirectly monitor the population. Measures of landscape spatial structure derived from IKONOS high spatial resolution (1-m) satellite data using geostatistics effectively map details of Sonoran pronghorn antelope habitat.
Local estimates of the nugget, sill, and range variogram parameters, calculated within 25 x 25-meter image windows, describe the spatial autocorrelation of the image, permitting classification of all pixels into coherent units whose signature graphs exhibit a classic variogram shape. The variogram parameters captured in these signatures have been shown in previous studies to discriminate between different species-specific vegetation associations. The synoptic view of the landscape provided by satellite data can inform resource management efforts. The ability to characterize the spatial structure and temporal dynamics of habitat using repeatable remote sensing data allows closer monitoring of the relationship between a species and its landscape.
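The nugget, sill, and range parameters mentioned above are obtained by fitting a model to an empirical semivariogram. A minimal, hypothetical sketch of computing an empirical semivariogram for a 1-D transect of pixel values (the windowed 2-D case used in the dissertation follows the same principle over pixel pairs):

```python
# Illustrative only: empirical semivariogram of a short 1-D transect.
# gamma(h) rises from near the nugget at small lags toward the sill,
# reaching it at roughly the range.

def empirical_semivariogram(values, max_lag):
    """gamma(h) = mean of 0.5 * (z[i+h] - z[i])^2 over all pairs at lag h."""
    gammas = {}
    for h in range(1, max_lag + 1):
        diffs = [0.5 * (values[i + h] - values[i]) ** 2
                 for i in range(len(values) - h)]
        gammas[h] = sum(diffs) / len(diffs)
    return gammas

transect = [10, 12, 11, 15, 18, 17, 21, 20, 19, 23]  # invented pixel values
for lag, g in empirical_semivariogram(transect, 4).items():
    print(f"lag {lag}: gamma = {g:.2f}")
```

Fitting a variogram model to such curves within each 25 x 25-meter window yields the per-window nugget, sill, and range estimates used to classify the image.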
ERIC Educational Resources Information Center
Boiteau, Denise; Stansfield, David
This document describes mathematical programs on the basic concepts of algebra produced by Louisiana Public Broadcasting. Programs included are: (1) "Inverse Operations"; (2) "The Order of Operations"; (3) "Basic Properties" (addition and multiplication of numbers and variables); (4) "The Positive and Negative…
Health Literacy Impact on National Healthcare Utilization and Expenditure.
Rasu, Rafia S; Bawa, Walter Agbor; Suminski, Richard; Snella, Kathleen; Warady, Bradley
2015-08-17
Health literacy presents an enormous challenge in the delivery of effective healthcare and quality outcomes. We evaluated the impact of low health literacy (LHL) on healthcare utilization and healthcare expenditure. The database analysis used the Medical Expenditure Panel Survey (MEPS) from 2005-2008, which provides nationally representative estimates of healthcare utilization and expenditure. Health literacy scores (HLSs) were calculated based on a validated, predictive model and scored according to the National Assessment of Adult Literacy (NAAL). HLSs ranged from 0-500, and health literacy levels (HLLs) were categorized into two groups: below basic or basic (HLS < 226) and above basic (HLS ≥ 226). Healthcare utilization was expressed as physician, nonphysician, or emergency room (ER) visits and healthcare spending. Expenditures were adjusted to 2010 rates using the Consumer Price Index (CPI). A P value of 0.05 or less was the criterion for statistical significance in all analyses. Multivariate regression models assessed the impact of the predicted HLLs on outpatient healthcare utilization and expenditures. All analyses were performed with SAS and STATA® 11.0 statistical software. The study evaluated 22 599 samples representing 503 374 648 weighted individuals nationally from 2005-2008. The cohort had an average age of 49 years and included more females (57%). Caucasians were the predominant racial/ethnic group (83%), and 37% of the cohort were from the South region of the United States of America. The proportion of the cohort with basic or below basic health literacy was 22.4%. Annual predicted numbers of physician visits, nonphysician visits, and ER visits were 6.6, 4.8, and 0.2, respectively, for basic or below basic health literacy, compared to 4.4, 2.6, and 0.1 for above basic. Predicted office and ER visit expenditures were $1284 and $151, respectively, for basic or below basic and $719 and $100 for above basic (P < .05).
The extrapolated national estimates show that annual costs for prescriptions alone for adults with LHL, possibly associated with basic and below basic health literacy, could potentially reach about $172 billion. Health literacy is inversely associated with healthcare utilization and expenditure. Individuals with below basic or basic HLLs have greater healthcare utilization and expenditures, spending more on prescriptions, than individuals with above basic HLLs. Public health strategies promoting appropriate education among individuals with LHL may help to improve health outcomes and reduce unnecessary healthcare visits and costs. © 2015 by Kerman University of Medical Sciences.
Somaraj, Vinej; Shenoy, Rekha P; Panchmal, Ganesh Shenoy; Jodalli, Praveen S; Sonde, Laxminarayan; Karkal, Ravichandra
2017-01-01
This cross-sectional study aimed to assess the knowledge, attitude and anxiety pertaining to basic life support (BLS) and medical emergencies among interns in dental colleges of Mangalore city, Karnataka, India. The study subjects comprised interns who volunteered from the four dental colleges. The knowledge and attitude of interns were assessed using a 30-item questionnaire prepared from the Basic Life Support Manual of the American Heart Association, and the anxiety of interns pertaining to BLS and medical emergencies was assessed using the State-Trait Anxiety Inventory (STAI) questionnaire. The chi-square test was performed on SPSS 21.0 (IBM Statistics, 2012) to determine statistically significant differences (P < 0.05) between assessed knowledge and anxiety. Out of 183 interns, 39.89% had below-average knowledge. A total of 123 (67.21%) reported unavailability of professional training. The majority (180, 98.36%) felt an urgent need for training in basic life support procedures. Assessment of stress showed a total of 27.1% of participants to be above the high-stress level. The comparison of assessed knowledge and stress was found to be nonsignificant (P = 0.983). There was an evident lack of knowledge pertaining to the management of medical emergencies among the interns. As oral health care providers moving out into society, dental interns should receive focused training in Basic Life Support procedures.
Dunn, Thomas M; Dalton, Alice; Dorfman, Todd; Dunn, William W
2004-01-01
To be a first step in determining whether emergency medical technician (EMT)-Basics are capable of using a protocol that allows for selective immobilization of the cervical spine. Such protocols are coming into use at an advanced life support level and could be beneficial when used by basic life support providers. A convenience sample of participants (n=95) from 11 emergency medical services agencies and one college class participated in the study. All participants evaluated six patients in written scenarios and decided which should be placed into spinal precautions according to a selective spinal immobilization protocol. Systems without an existing selective spinal immobilization protocol received a one-hour continuing education lecture regarding the topic. College students received a similar lecture written so laypersons could understand the protocol. All participants showed proficiency when applying a selective immobilization protocol to patients in paper-based scenarios. Furthermore, EMT-Basics performed at the same level as paramedics when following the protocol. Statistical analysis revealed no significant differences between EMT-Basics and paramedics. A follow-up group of college students (added to have a non-EMS comparison group) also performed as well as paramedics when making decisions to use spinal precautions. Differences between college students and paramedics were also statistically nonsignificant. The results suggest that EMT-Basics are as accurate as paramedics when making decisions regarding selective immobilization of the cervical spine during paper-based scenarios. That laypersons are also proficient when using the protocol could indicate that it is extremely simple to follow. This study is a first step toward the necessary additional studies evaluating the efficacy of EMT-Basics using selective immobilization as a regular practice.
Abowd, John M.; Vilhuber, Lars
2010-01-01
The Quarterly Workforce Indicators (QWI) are local labor market data produced and released every quarter by the United States Census Bureau. Unlike any other local labor market series produced in the U.S. or the rest of the world, QWI measure employment flows for workers (accessions and separations), jobs (creations and destructions) and earnings for demographic subgroups (age and gender), economic industry (NAICS industry groups), detailed geography (block (experimental), county, Core-Based Statistical Area, and Workforce Investment Area), and ownership (private, all) with fully interacted publication tables. The current QWI data cover 47 states, about 98% of the private workforce in those states, and about 92% of all private employment in the entire economy. State participation is sufficiently extensive to permit us to present the first national estimates constructed from these data. We focus on worker, job, and excess (churning) reallocation rates, rather than on levels of the basic variables. This permits comparison to existing series from the Job Openings and Labor Turnover Survey and the Business Employment Dynamics Series from the Bureau of Labor Statistics (BLS). The national estimates from the QWI are an important enhancement to existing series because they include demographic and industry detail for both worker and job flow data compiled from underlying micro-data that have been integrated at the job and establishment levels by the Longitudinal Employer-Household Dynamics Program at the Census Bureau. The estimates presented herein were compiled exclusively from public-use data series and are available for download. PMID:21516213
Kim, Kiyeon; Omori, Ryosuke; Ito, Kimihito
2017-12-01
The estimation of the basic reproduction number is essential to understand epidemic dynamics, and time series data of infected individuals are usually used for the estimation. However, such data are not always available. Several methods have been proposed to estimate the basic reproduction number using genealogies constructed from nucleotide sequences of pathogens. Here, we propose a new method to estimate epidemiological parameters of outbreaks using the time series change of Tajima's D statistic on the nucleotide sequences of pathogens. To relate the time evolution of Tajima's D to the number of infected individuals, we constructed a parsimonious mathematical model describing both the transmission process of pathogens among hosts and the evolutionary process of the pathogens. As a case study, we applied this method to the field data of nucleotide sequences of pandemic influenza A (H1N1) 2009 viruses collected in Argentina. The Tajima's D-based method estimated the basic reproduction number to be 1.55 with 95% highest posterior density (HPD) between 1.31 and 2.05, and the date of the epidemic peak to be 10th July with 95% HPD between 22nd June and 9th August. The estimated basic reproduction number was consistent with estimation by the birth-death skyline plot and with estimation using the time series of the number of infected individuals. These results suggest that Tajima's D statistic on nucleotide sequences of pathogens could be useful to estimate epidemiological parameters of outbreaks. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
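The Tajima's D statistic that drives this method can be computed directly from a sequence alignment. Below is a minimal, self-contained sketch of the standard calculation (mean pairwise diversity pi against Watterson's estimator S/a1, normalized by Tajima's variance constants); it illustrates the statistic itself, not the authors' coalescent model linking D to the basic reproduction number, and the example sequences are invented.

```python
from itertools import combinations
from math import sqrt

def tajimas_d(seqs):
    """Tajima's D from a list of equal-length aligned sequences.

    pi = mean number of pairwise nucleotide differences
    S  = number of segregating (polymorphic) sites
    D compares pi with Watterson's estimator S/a1; D < 0 suggests an
    expanding population (as in an outbreak), D > 0 the opposite.
    """
    n = len(seqs)
    pairs = list(combinations(seqs, 2))
    pi = sum(sum(a != b for a, b in zip(s1, s2)) for s1, s2 in pairs) / len(pairs)
    S = sum(len(set(col)) > 1 for col in zip(*seqs))
    # Tajima's (1989) normalizing constants.
    a1 = sum(1 / i for i in range(1, n))
    a2 = sum(1 / i**2 for i in range(1, n))
    b1 = (n + 1) / (3 * (n - 1))
    b2 = 2 * (n**2 + n + 3) / (9 * n * (n - 1))
    c1 = b1 - 1 / a1
    c2 = b2 - (n + 2) / (a1 * n) + a2 / a1**2
    e1 = c1 / a1
    e2 = c2 / (a1**2 + a2)
    return (pi - S / a1) / sqrt(e1 * S + e2 * S * (S - 1))
```

For example, `tajimas_d(["AAAA", "AAAT", "AATT", "ATTT"])` has three segregating sites and mean pairwise diversity 10/6, giving a small positive D.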
The Canadian Precipitation Analysis (CaPA): Evaluation of the statistical interpolation scheme
NASA Astrophysics Data System (ADS)
Evans, Andrea; Rasmussen, Peter; Fortin, Vincent
2013-04-01
CaPA (Canadian Precipitation Analysis) is a data assimilation system which employs statistical interpolation to combine observed precipitation with gridded precipitation fields produced by Environment Canada's Global Environmental Multiscale (GEM) climate model into a final gridded precipitation analysis. Precipitation is important in many fields and applications, including agricultural water management projects, flood control programs, and hydroelectric power generation planning. Precipitation is a key input to hydrological models, and there is a desire to have access to the best available information about precipitation in time and space. The principal goal of CaPA is to produce this type of information. In order to perform the necessary statistical interpolation, CaPA requires the estimation of a semi-variogram. This semi-variogram is used to describe the spatial correlations between precipitation innovations, defined as the observed precipitation amounts minus the GEM forecasted amounts predicted at the observation locations. Currently, CaPA uses a single isotropic variogram across the entire analysis domain. The present project investigates the implications of this choice by first conducting a basic variographic analysis of precipitation innovation data across the Canadian prairies, with specific interest in identifying and quantifying potential anisotropy within the domain. This focus is further expanded by identifying the effect of storm type on the variogram. The ultimate goal of the variographic analysis is to develop improved semi-variograms for CaPA that better capture the spatial complexities of precipitation over the Canadian prairies. CaPA presently applies a Box-Cox data transformation to both the observations and the GEM data, prior to the calculation of the innovations. The data transformation is necessary to satisfy the normal distribution assumption, but introduces a significant bias. 
The second part of the investigation aims at devising a bias correction scheme based on a moving-window averaging technique. For both the variogram and bias correction components of this investigation, a series of trial runs are conducted to evaluate the impact of these changes on the resulting CaPA precipitation analyses.
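The variographic analysis described here rests on the empirical semivariogram of the precipitation innovations. The following is a minimal omnidirectional sketch, not CaPA's implementation (an anisotropy study would additionally bin point pairs by direction); the coordinates, values, and bin edges are illustrative.

```python
import numpy as np

def empirical_semivariogram(coords, values, bin_edges):
    """Empirical semivariogram: gamma(h) = 0.5 * mean[(z_i - z_j)^2]
    over point pairs whose separation distance falls in each bin.

    coords: (n, 2) array of station locations; values: innovations
    (observed minus GEM-forecast precipitation) at those stations.
    """
    coords = np.asarray(coords, dtype=float)
    z = np.asarray(values, dtype=float)
    i, j = np.triu_indices(len(z), k=1)          # all unordered pairs
    d = np.hypot(*(coords[i] - coords[j]).T)     # pairwise distances
    sq = 0.5 * (z[i] - z[j]) ** 2                # semivariance terms
    gamma = np.full(len(bin_edges) - 1, np.nan)
    for k in range(len(bin_edges) - 1):
        mask = (d >= bin_edges[k]) & (d < bin_edges[k + 1])
        if mask.any():
            gamma[k] = sq[mask].mean()
    return gamma
```

A fitted model variogram (exponential, spherical, etc.) would then be estimated from these binned values before use in the statistical interpolation.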
The effects of motion and g-seat cues on pilot simulator performance of three piloting tasks
NASA Technical Reports Server (NTRS)
Showalter, T. W.; Parris, B. L.
1980-01-01
Data are presented that show the effects of motion system cues, g-seat cues, and pilot experience on pilot performance during takeoffs with engine failures, during in-flight precision turns, and during landings with wind shear. Eight groups of USAF pilots flew a simulated KC-135 using four different cueing systems. The basic cueing system was a fixed-base type (no-motion cueing) with visual cueing. The other three systems were produced by the presence of either a motion system or a g-seat, or both. Extensive statistical analysis of the data was performed and representative performance means were examined. These data show that the addition of motion system cueing results in significant improvement in pilot performance for all three tasks; however, the use of g-seat cueing, either alone or in conjunction with the motion system, provides little if any performance improvement for these tasks and for this aircraft type.
2012-01-01
This article describes the sampling frames and basic data collection methods for petroleum price data reported by the Energy Information Administration (EIA) and other Government agencies. In addition, it compares and contrasts annual average prices reported by EIA with comparable prices from the Bureau of Labor Statistics (BLS) Consumer Price Index (CPI) for the retail prices of residential No. 2 distillate, on-highway diesel fuel, and motor gasoline (all grades). Further, it compares refiner wholesale/resale prices for No. 2 fuel oil, No. 2 diesel fuel, motor gasoline (all grades), kerosene-type jet fuel, and residual fuel oil reported by EIA with comparable prices from the BLS Producer Price Index (PPI). A discussion of the various crude oil prices and spot/futures prices published by EIA and other Government agencies is also included in the article.
How can the struggling research community adapt to the information age?
NASA Astrophysics Data System (ADS)
Bouma, Johan
2017-04-01
The widespread use of the internet and social media has fundamentally changed the relationship of research with society, culminating in "fact-free politics". Rather than operating from the position of distant experts who are graciously willing to serve mankind, expecting gratitude and admiration in return, scientists encounter knowledgeable stakeholders practicing "citizen science". Some see science as just producing "yet another opinion". It is time now to re-establish and advocate the basic power of the scientific effort, involving stakeholders systematically, by: analysing a problem, shaping it into a researchable item, applying scientifically sound data and methods, testing results statistically, and presenting results while realizing that "the" truth does not exist. The seventeen UN Sustainable Development Goals (SDGs) provide an attractive focus for inter- and transdisciplinary research approaches, defining a series of options covering several SDGs in a systems analysis. Involved stakeholders and policy makers remain responsible for selecting their favorite option.
ERIC Educational Resources Information Center
Rubinson, Laura E.
2010-01-01
More than one third of American children cannot read at a basic level by fourth grade (Lee, Grigg, & Donahue, 2007) and those numbers are even higher for African American, Hispanic and poor White students (Boorman et al., 2007). These are alarming statistics given that the ability to read is the most basic and fundamental skill for academic…
ERIC Educational Resources Information Center
Chukwu, Leo C.; Eze, Thecla A. Y.; Agada, Fidelia Chinyelugo
2016-01-01
The study examined the availability of instructional materials at the basic education level in Enugu Education Zone of Enugu State, Nigeria. One research question and one hypothesis guided the study. The research question was answered using mean and grand mean ratings, while the hypothesis was tested using t-test statistics at 0.05 level of…
Development of a funding, cost, and spending model for satellite projects
NASA Technical Reports Server (NTRS)
Johnson, Jesse P.
1989-01-01
The need for a predictive budget/funding model is obvious. The current models used by the Resource Analysis Office (RAO) are used to predict the total costs of satellite projects. An effort was conducted to extend the modeling capabilities from total budget analysis to analysis of total budget and budget outlays over time. A statistically based, data-driven methodology was used to derive and develop the model. The budget data for the last 18 GSFC-sponsored satellite projects were analyzed and used to build a funding model that describes the historical spending patterns. The raw data consisted of dollars spent in each specific year and their 1989-dollar equivalents. These data were converted to the standard format used by the RAO group and placed in a database. A simple statistical analysis was performed to calculate the gross statistics associated with project length and project cost and the conditional statistics on project length and project cost. The modeling approach used is derived from the theory of embedded statistics, which states that properly analyzed data will produce the underlying generating function. The process of funding large-scale projects over extended periods of time is described by Life Cycle Cost Models (LCCM). The data were analyzed to find a model in the generic form of an LCCM. The model developed is based on a Weibull function whose parameters are found by both nonlinear optimization and nonlinear regression. In order to use this model it is necessary to transform the problem from a dollar/time space to a percentage-of-total-budget/time space. This transformation is equivalent to moving to a probability space. By using the basic rules of probability, the validity of both the optimization and the regression steps is ensured. This statistically significant model is then integrated and inverted. The resulting output represents a project schedule which relates the amount of money spent to the percentage of project completion.
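The Weibull spending profile lends itself to a compact illustration. The sketch below fits a Weibull cumulative-spend curve to hypothetical checkpoint data using the classic Weibull-plot linearization (ordinary least squares on ln(-ln(1-F)) against ln t), a simpler stand-in for the nonlinear optimization and regression described above; all figures are invented, not GSFC data.

```python
import numpy as np

# Hypothetical cumulative-spend fractions at five normalized times of a
# project's life (illustrative values only, not actual budget data).
t = np.array([0.2, 0.4, 0.6, 0.8, 1.0])
frac = np.array([0.05, 0.30, 0.65, 0.90, 0.98])

# Weibull plot: F(t) = 1 - exp(-(t/lam)^k)  implies
# ln(-ln(1 - F)) = k*ln(t) - k*ln(lam), a straight line in ln(t).
y = np.log(-np.log(1.0 - frac))
k, c = np.polyfit(np.log(t), y, 1)   # slope = k, intercept = -k*ln(lam)
lam = np.exp(-c / k)

def spend_profile(time, k=k, lam=lam):
    """Predicted fraction of total budget spent by normalized time."""
    return 1.0 - np.exp(-((time / lam) ** k))
```

Integrating or inverting `spend_profile` then relates money spent to percentage of project completion, as the abstract describes.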
Basic statistics with Microsoft Excel: a review.
Divisi, Duilio; Di Leonardo, Gabriella; Zaccagna, Gino; Crisci, Roberto
2017-06-01
The scientific world is enriched daily with new knowledge, due to new technologies and continuous discoveries. The mathematical functions explain the statistical concepts particularly those of mean, median and mode along with those of frequency and frequency distribution associated to histograms and graphical representations, determining elaborative processes on the basis of the spreadsheet operations. The aim of the study is to highlight the mathematical basis of statistical models that regulate the operation of spreadsheets in Microsoft Excel.
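For concreteness, the spreadsheet functions the review centers on (Excel's AVERAGE, MEDIAN, MODE, and FREQUENCY) have direct standard-library counterparts; a small sketch with invented data:

```python
from statistics import mean, median, mode
from collections import Counter

data = [4, 2, 7, 4, 9, 4, 2, 7, 5, 4]  # illustrative sample

# Counterparts of Excel's AVERAGE, MEDIAN and MODE worksheet functions.
avg = mean(data)    # AVERAGE: arithmetic mean
med = median(data)  # MEDIAN: mean of the two middle values for even n
mod = mode(data)    # MODE: most frequent value

# FREQUENCY / histogram input: counts of each distinct value.
freq = Counter(data)
```

The same quantities underlie the frequency distributions, histograms, and graphical representations discussed in the review.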
Mogil, Jeffrey S
2017-03-22
The poor record of basic-to-clinical translation in recent decades has led to speculation that preclinical research is "irreproducible", and this irreproducibility in turn has largely been attributed to deficiencies in reporting and statistical practices. There are, however, a number of other reasonable explanations of both poor translation and difficulties in one laboratory replicating the results of another. This article examines these explanations as they pertain to preclinical pain research. I submit that many instances of apparent irreproducibility are actually attributable to interactions between the phenomena and interventions under study and "latent" environmental factors affecting the rodent subjects. These environmental variables, which often cause stress and relate to both animal husbandry and the specific testing context, differ greatly between labs and continue to be identified, suggesting that our knowledge of their existence is far from complete. In pain research in particular, laboratory stressors can produce great variability of unpredictable direction, as stress is known to produce increases (stress-induced hyperalgesia) or decreases (stress-induced analgesia) in pain depending on its parameters. Much greater attention needs to be paid to the study of the laboratory environment if replication and translation are to be improved.
A. C. C. Fact Book: A Statistical Profile of Allegany Community College and the Community It Serves.
ERIC Educational Resources Information Center
Andersen, Roger C.
This document is intended to be an authoritative compilation of frequently referenced basic facts concerning Allegany Community College (ACC) in Maryland. It is a statistical profile of ACC and the community it serves, divided into six sections: enrollment, students, faculty, community, support services, and general college related information.…
Basic Mathematics Test Predicts Statistics Achievement and Overall First Year Academic Success
ERIC Educational Resources Information Center
Fonteyne, Lot; De Fruyt, Filip; Dewulf, Nele; Duyck, Wouter; Erauw, Kris; Goeminne, Katy; Lammertyn, Jan; Marchant, Thierry; Moerkerke, Beatrijs; Oosterlinck, Tom; Rosseel, Yves
2015-01-01
In the psychology and educational science programs at Ghent University, only 36.1% of the new incoming students in 2011 and 2012 passed all exams. Despite availability of information, many students underestimate the scientific character of social science programs. Statistics courses are a major obstacle in this matter. Not all enrolling students…
ERIC Educational Resources Information Center
Schweizer, Karl; Steinwascher, Merle; Moosbrugger, Helfried; Reiss, Siegbert
2011-01-01
The development of research methodology competency is a major aim of the psychology curriculum at universities. Usually, three courses concentrating on basic statistics, advanced statistics and experimental methods, respectively, serve the achievement of this aim. However, this traditional curriculum-based course structure gives rise to the…
ERIC Educational Resources Information Center
Maric, Marija; Wiers, Reinout W.; Prins, Pier J. M.
2012-01-01
Despite guidelines and repeated calls from the literature, statistical mediation analysis in youth treatment outcome research is rare. Even more concerning is that many studies that "have" reported mediation analyses do not fulfill basic requirements for mediation analysis, providing inconclusive data and clinical implications. As a result, after…
Statistical estimators for monitoring spotted owls in Oregon and Washington in 1987.
Timothy A. Max; Ray A. Souter; Kathleen A. O'Halloran
1990-01-01
Spotted owls (Strix occidentalis) were monitored on 11 National Forests in the Pacific Northwest Region of the USDA Forest Service between March and August of 1987. The basic intent of monitoring was to provide estimates of occupancy and reproduction rates for pairs of spotted owls. This paper documents the technical details of the statistical...
Statistical techniques for sampling and monitoring natural resources
Hans T. Schreuder; Richard Ernst; Hugo Ramirez-Maldonado
2004-01-01
We present the statistical theory of inventory and monitoring from a probabilistic point of view. We start with the basics and show the interrelationships between designs and estimators illustrating the methods with a small artificial population as well as with a mapped realistic population. For such applications, useful open source software is given in Appendix 4....
Peer-Assisted Learning in Research Methods and Statistics
ERIC Educational Resources Information Center
Stone, Anna; Meade, Claire; Watling, Rosamond
2012-01-01
Feedback from students on a Level 1 Research Methods and Statistics module, studied as a core part of a BSc Psychology programme, highlighted demand for additional tutorials to help them to understand basic concepts. Students in their final year of study commonly request work experience to enhance their employability. All students on the Level 1…
Adult Basic and Secondary Education Program Statistics. Fiscal Year 1976.
ERIC Educational Resources Information Center
Cain, Sylvester H.; Whalen, Barbara A.
Reports submitted to the National Center for Education Statistics provided data for this compilation and tabulation of data on adult participants in U.S. educational programs in fiscal year 1976. In the summary section introducing the charts, it is noted that adult education programs funded under P.L. 91-230 served over 1.6 million persons--an…
ERIC Educational Resources Information Center
Goodman, Leroy V., Ed.
This is the third edition of the Education Almanac, an assemblage of statistics, facts, commentary, and basic background information about the conduct of schools in the United States. Features of this variegated volume include an introductory section on "Education's Newsiest Developments," followed by some vital educational statistics, a set of…
Theory of Financial Risk and Derivative Pricing
NASA Astrophysics Data System (ADS)
Bouchaud, Jean-Philippe; Potters, Marc
2009-01-01
Foreword; Preface; 1. Probability theory: basic notions; 2. Maximum and addition of random variables; 3. Continuous time limit, Ito calculus and path integrals; 4. Analysis of empirical data; 5. Financial products and financial markets; 6. Statistics of real prices: basic results; 7. Non-linear correlations and volatility fluctuations; 8. Skewness and price-volatility correlations; 9. Cross-correlations; 10. Risk measures; 11. Extreme correlations and variety; 12. Optimal portfolios; 13. Futures and options: fundamental concepts; 14. Options: hedging and residual risk; 15. Options: the role of drift and correlations; 16. Options: the Black and Scholes model; 17. Options: some more specific problems; 18. Options: minimum variance Monte-Carlo; 19. The yield curve; 20. Simple mechanisms for anomalous price statistics; Index of most important symbols; Index.
Theory of Financial Risk and Derivative Pricing - 2nd Edition
NASA Astrophysics Data System (ADS)
Bouchaud, Jean-Philippe; Potters, Marc
2003-12-01
Foreword; Preface; 1. Probability theory: basic notions; 2. Maximum and addition of random variables; 3. Continuous time limit, Ito calculus and path integrals; 4. Analysis of empirical data; 5. Financial products and financial markets; 6. Statistics of real prices: basic results; 7. Non-linear correlations and volatility fluctuations; 8. Skewness and price-volatility correlations; 9. Cross-correlations; 10. Risk measures; 11. Extreme correlations and variety; 12. Optimal portfolios; 13. Futures and options: fundamental concepts; 14. Options: hedging and residual risk; 15. Options: the role of drift and correlations; 16. Options: the Black and Scholes model; 17. Options: some more specific problems; 18. Options: minimum variance Monte-Carlo; 19. The yield curve; 20. Simple mechanisms for anomalous price statistics; Index of most important symbols; Index.
CORSSA: Community Online Resource for Statistical Seismicity Analysis
NASA Astrophysics Data System (ADS)
Zechar, J. D.; Hardebeck, J. L.; Michael, A. J.; Naylor, M.; Steacy, S.; Wiemer, S.; Zhuang, J.
2011-12-01
Statistical seismology is critical to the understanding of seismicity, the evaluation of proposed earthquake prediction and forecasting methods, and the assessment of seismic hazard. Unfortunately, despite its importance to seismology, especially to those aspects with great impact on public policy, statistical seismology is mostly ignored in the education of seismologists, and there is no central repository for the existing open-source software tools. To remedy these deficiencies, and with the broader goal to enhance the quality of statistical seismology research, we have begun building the Community Online Resource for Statistical Seismicity Analysis (CORSSA, www.corssa.org). We anticipate that the users of CORSSA will range from beginning graduate students to experienced researchers. More than 20 scientists from around the world met for a week in Zurich in May 2010 to kick-start the creation of CORSSA: the format and initial table of contents were defined, a governing structure was organized, and workshop participants began drafting articles. CORSSA materials are organized with respect to six themes, each of which will contain between four and eight articles. CORSSA now includes seven articles, with an additional six in draft form, along with forums for discussion, a glossary, and news about upcoming meetings, special issues, and recent papers. Each article is peer-reviewed and presents a balanced discussion, including illustrative examples and code snippets. Topics in the initial set of articles include: introductions to both CORSSA and statistical seismology; basic statistical tests and their role in seismology; understanding seismicity catalogs and their problems; basic techniques for modeling seismicity; and methods for testing earthquake predictability hypotheses. We have also begun curating a collection of statistical seismology software packages.
Predicting Success in Psychological Statistics Courses.
Lester, David
2016-06-01
Many students perform poorly in courses on psychological statistics, and it is useful to be able to predict which students will have difficulty. In a study of 93 undergraduates enrolled in Statistical Methods (18 men, 75 women; M age = 22.0 years, SD = 5.1), performance was significantly associated with sex (female students performed better) and proficiency in algebra in a linear regression analysis. Anxiety about statistics was not associated with course performance, indicating that basic mathematical skills are the best correlate of performance in statistics courses and can be used to stream students into classes by ability. © The Author(s) 2016.
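The reported analysis, a linear regression of course performance on algebra proficiency and sex, can be sketched with ordinary least squares. The data below are synthetic (the study's data are not available), so only the shape of the analysis is illustrated, not its results.

```python
import numpy as np

# Synthetic illustration: course grade regressed on algebra proficiency
# score and sex (0 = male, 1 = female). Coefficients below are invented.
rng = np.random.default_rng(0)
n = 93  # matches the study's sample size only for flavor
algebra = rng.uniform(40, 100, n)
sex = rng.integers(0, 2, n)
grade = 20 + 0.6 * algebra + 5 * sex + rng.normal(0, 5, n)

# Design matrix with intercept; ordinary least squares fit.
X = np.column_stack([np.ones(n), algebra, sex])
beta, *_ = np.linalg.lstsq(X, grade, rcond=None)
```

`beta` recovers the intercept, algebra, and sex coefficients; in a real analysis one would also report standard errors and p values for each predictor.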
da Silva, R C V; de Sá, C C; Pascual-Vaca, Á O; de Souza Fontes, L H; Herbella Fernandes, F A M; Dib, R A; Blanco, C R; Queiroz, R A; Navarro-Rodriguez, T
2013-07-01
The treatment of gastroesophageal reflux disease may be clinical or surgical. The clinical treatment consists basically of the use of drugs; however, there are new techniques to complement this treatment, and osteopathic intervention in the diaphragmatic muscle is one of these. The objective of the study was to compare pressure values in the esophageal manometry examination of the lower esophageal sphincter (LES) before and immediately after osteopathic intervention in the diaphragm muscle. Thirty-eight patients with gastroesophageal reflux disease, 16 submitted to a sham technique and 22 to the osteopathic technique, were randomly selected. The average respiratory pressure (ARP) and the maximum expiratory pressure (MEP) of the LES were measured by manometry before and after the osteopathic technique at the point of highest pressure. Statistical analysis was performed using Student's t-test and the Mann-Whitney test, and the magnitude of the proposed technique's effect was measured using Cohen's index. A statistically significant difference for the osteopathic technique, relative to the group of patients who received the sham technique, was found in three out of four comparisons of LES pressure measures, including ARP (P = 0.027). The MEP showed no statistical difference (P = 0.146). The values of Cohen's d for the same measures were d = 0.80 for ARP and d = 0.52 for MEP. The osteopathic manipulative technique produces a positive increment in the LES region soon after its performance. © 2012 Copyright the Authors. Journal compilation © 2012, Wiley Periodicals, Inc. and the International Society for Diseases of the Esophagus.
New Methodology for Estimating Fuel Economy by Vehicle Class
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chin, Shih-Miao; Dabbs, Kathryn; Hwang, Ho-Ling
2011-01-01
This study supported the Office of Highway Policy Information in developing a new methodology to generate annual estimates of average fuel efficiency and number of motor vehicles registered by vehicle class for Table VM-1 of the Highway Statistics annual publication. This paper describes the new methodology developed under this effort and compares the results of the existing manual method and the new systematic approach. The methodology developed under this study takes a two-step approach. First, preliminary fuel efficiency rates are estimated based on vehicle stock models for different classes of vehicles. Then, a reconciliation model is used to adjust the initial fuel consumption rates from the vehicle stock models and match the VMT information for each vehicle class and the reported total fuel consumption. This reconciliation model utilizes a systematic approach that produces documentable and reproducible results. The basic framework utilizes a mathematical programming formulation to minimize the deviations between the fuel economy estimates published in the previous year's Highway Statistics and the results from the vehicle stock models, subject to the constraint that fuel consumption for different vehicle classes must sum to the total fuel consumption estimate published in Table MF-21 of the current year's Highway Statistics. The results generated from this new approach provide a smoother time series for the fuel economies by vehicle class. It also utilizes the most up-to-date and best available data with sound econometric models to generate MPG estimates by vehicle class.
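The reconciliation step can be illustrated with a deliberately simplified variant: adjust the class-level fuel consumption totals (rather than the MPG estimates directly) by equality-constrained least squares, which has a closed-form solution. This is a sketch of the idea only; the actual methodology's mathematical-programming formulation, data, and constraints are richer, and all numbers below are invented.

```python
import numpy as np

def reconcile(vmt, mpg_prior, total_fuel):
    """Reconcile class-level fuel use with a published control total.

    F0_c = vmt_c / mpg_prior_c is the unreconciled fuel estimate for
    vehicle class c. Minimizing sum((F_c - F0_c)^2) subject to
    sum(F_c) = total_fuel has the closed-form solution
    F_c = F0_c + (total_fuel - sum(F0)) / n, i.e. the gap to the
    control total is spread equally across classes. Returns the
    reconciled MPG by class, vmt_c / F_c.
    """
    vmt = np.asarray(vmt, dtype=float)
    f0 = vmt / np.asarray(mpg_prior, dtype=float)
    f = f0 + (total_fuel - f0.sum()) / f0.size
    return vmt / f
```

For instance, with VMT of 100 and 200, prior MPGs of 25 and 10, and a control total of 26, the unreconciled fuel estimates 4 and 20 are nudged to 5 and 21, giving reconciled MPGs of 20.0 and about 9.52.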
NASA Astrophysics Data System (ADS)
Wright, Robyn; Thornberg, Steven M.
SEDIDAT is a series of compiled IBM-BASIC (version 2.0) programs that direct the collection, statistical calculation, and graphic presentation of particle settling velocity and equivalent spherical diameter for samples analyzed using the settling tube technique. The programs follow a menu-driven format that is easily understood by students and scientists with little previous computer experience. Settling velocity is measured directly (cm/sec) and also converted into Chi units. Equivalent spherical diameter (reported in Phi units) is calculated using a modified Gibbs equation for different particle densities. Input parameters, such as water temperature, settling distance, particle density, run time, and Phi/Chi interval, are changed easily at operator discretion. Optional output to a dot-matrix printer includes a summary of moment and graphic statistical parameters, a tabulation of individual and cumulative weight percents, a listing of major distribution modes, and cumulative and histogram plots of raw time, settling velocity, Chi, and Phi data.
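For reference, a sketch of the two unit conversions mentioned, assuming the usual logarithmic definitions (Chi as the negative base-2 log of settling velocity in cm/s, and Phi as the negative base-2 log of grain diameter in mm); the modified Gibbs velocity-to-diameter conversion itself is not reproduced here.

```python
from math import log2

def chi(settling_velocity_cm_s):
    """Chi units: negative base-2 log of settling velocity in cm/s."""
    return -log2(settling_velocity_cm_s)

def phi(diameter_mm):
    """Phi units: negative base-2 log of grain diameter in mm."""
    return -log2(diameter_mm)
```

On these scales each whole unit halves the quantity, so a 0.25 mm grain is 2 Phi and a 2 mm grain is -1 Phi.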
[Comment on] Statistical discrimination
NASA Astrophysics Data System (ADS)
Chinn, Douglas
In the December 8, 1981, issue of Eos, a news item reported the conclusion of a National Research Council study that sexual discrimination against women with Ph.D.'s exists in the field of geophysics. Basically, the item reported that even when allowances are made for motherhood the percentage of female Ph.D.'s holding high university and corporate positions is significantly lower than the percentage of male Ph.D.'s holding the same types of positions. The sexual discrimination conclusion, based only on these statistics, assumes that there are no basic psychological differences between men and women that might cause different populations in the employment group studied. Therefore, the reasoning goes, after taking into account possible effects from differences related to anatomy, such as women stopping their careers in order to bear and raise children, the statistical distributions of positions held by male and female Ph.D.'s ought to be very similar to one another. Any significant differences between the distributions must be caused primarily by sexual discrimination.
1987-08-01
HVAC duct hanger system over an extensive frequency range. The finite element, component mode synthesis, and statistical energy analysis methods are ... (800-5,000 Hz) analysis was conducted with Statistical Energy Analysis (SEA) coupled with a closed-form harmonic beam analysis program. ... resonances may be obtained by using a finer frequency increment. The basic assumption used in SEA analysis is that within each band
ERIC Educational Resources Information Center
Taylor, Marjorie; And Others
Anodizing, Inc., Teamsters Local 162, and Mt. Hood Community College (Oregon) developed a workplace literacy program for workers at Anodizing. These workers did not have the basic skill competencies to benefit from company training efforts in statistical process control and quality assurance and were not able to advance to lead and supervisory…
ERIC Educational Resources Information Center
Vizenor, Gerald
Opportunities Unlimited is a State-wide program to provide adult basic education (ABE) and training for Indians on Minnesota reservations and in Indian communities. An administrative center in Bemidji serves communities on the Red Lake, White Earth, and Leech Lake Reservations, and a Duluth center provides ABE and training for communities on the…
A quantitative comparison of corrective and perfective maintenance
NASA Technical Reports Server (NTRS)
Henry, Joel; Cain, James
1994-01-01
This paper presents a quantitative comparison of corrective and perfective software maintenance activities. The comparison utilizes basic data collected throughout the maintenance process. The data collected are extensive and allow the impact of both types of maintenance to be quantitatively evaluated and compared. Basic statistical techniques test relationships between and among process and product data. The results show interesting similarities and important differences in both process and product characteristics.
ERIC Educational Resources Information Center
Joireman, Jeff; Abbott, Martin L.
This report examines the overlap between student test results on the Iowa Test of Basic Skills (ITBS) and the Washington Assessment of Student Learning (WASL). The two tests were compared and contrasted in terms of content and measurement philosophy, and analyses studied the statistical relationship between the ITBS and the WASL. The ITBS assesses…
Fundamentals in Biostatistics for Research in Pediatric Dentistry: Part I - Basic Concepts.
Garrocho-Rangel, J A; Ruiz-Rodríguez, M S; Pozos-Guillén, A J
The purpose of this report was to provide the reader with some basic concepts in order to better understand the significance and reliability of the results of any article on Pediatric Dentistry. Currently, Pediatric Dentists need the best evidence available in the literature on which to base their diagnoses and treatment decisions for the children's oral care. Basic understanding of Biostatistics plays an important role during the entire Evidence-Based Dentistry (EBD) process. This report describes Biostatistics fundamentals in order to introduce the basic concepts used in statistics, such as summary measures, estimation, hypothesis testing, effect size, level of significance, p value, confidence intervals, etc., which are available to Pediatric Dentists interested in reading or designing original clinical or epidemiological studies.
Computer programs for computing particle-size statistics of fluvial sediments
Stevens, H.H.; Hubbell, D.W.
1986-01-01
Two versions of computer programs for inputting data and computing particle-size statistics of fluvial sediments are presented. The FORTRAN 77 versions are for use on the Prime computer, and the BASIC versions are for use on microcomputers. The size-statistics program computes Inman, Trask, and Folk statistical parameters from phi values and sizes determined for 10 specified percent-finer values from the input size and percent-finer data. The program also determines the percentages of gravel, sand, silt, and clay, and the Meyer-Peter effective diameter. Documentation and listings for both versions of the programs are included. (Author's abstract)
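The graphic parameters mentioned above are simple percentile formulas. Below is a minimal sketch of two standard Folk and Ward measures (graphic mean and inclusive graphic standard deviation) computed from phi percentiles; the percentile inputs are illustrative, and this is not a transcription of the published FORTRAN 77 or BASIC code.

```python
def folk_mean(phi16, phi50, phi84):
    """Folk and Ward graphic mean grain size (phi units)."""
    return (phi16 + phi50 + phi84) / 3.0

def folk_sorting(phi5, phi16, phi84, phi95):
    """Folk and Ward inclusive graphic standard deviation (phi units)."""
    return (phi84 - phi16) / 4.0 + (phi95 - phi5) / 6.6

# Illustrative percentiles from a hypothetical percent-finer curve
print(folk_mean(1.0, 2.0, 3.0))                  # 2.0 (symmetric distribution)
print(round(folk_sorting(0.0, 1.0, 3.0, 4.0), 3))  # 1.106
```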
ERIC Educational Resources Information Center
Gadway, Charles J.; Wilson, H.A.
This document provides statistical data on the 1974 and 1975 Mini-Assessment of Functional Literacy, which was designed to determine the extent of functional literacy among seventeen-year-olds in America. Also presented are data from comparable test items from the 1971 assessment. Three standards are presented, to allow different methods of…
ERIC Educational Resources Information Center
Novak, Elena; Johnson, Tristan E.; Tenenbaum, Gershon; Shute, Valerie J.
2016-01-01
The study explored instructional benefits of a storyline gaming characteristic (GC) on learning effectiveness, efficiency, and engagement with the use of an online instructional simulation for graduate students in an introductory statistics course. A storyline is a game-design element that connects scenes with the educational content. In order to…
ERIC Educational Resources Information Center
Waesche, Jessica S. Brown; Schatschneider, Christopher; Maner, Jon K.; Ahmed, Yusra; Wagner, Richard K.
2011-01-01
Rates of agreement among alternative definitions of reading disability and their 1- and 2-year stabilities were examined using a new measure of agreement, the affected-status agreement statistic. Participants were 288,114 first through third grade students. Reading measures were "Dynamic Indicators of Basic Early Literacy Skills" Oral…
ERIC Educational Resources Information Center
Biehler, Rolf; Frischemeier, Daniel; Podworny, Susanne
2017-01-01
Connecting data and chance is fundamental in statistics curricula. The use of software like TinkerPlots can bridge both worlds because the TinkerPlots Sampler supports learners in expressive modeling. We conducted a study with elementary preservice teachers with a basic university education in statistics. They were asked to set up and evaluate…
ERIC Educational Resources Information Center
Averitt, Sallie D.
This instructor guide, which was developed for use in a manufacturing firm's advanced technical preparation program, contains the materials required to present a learning module that is designed to prepare trainees for the program's statistical process control module by improving their basic math skills in working with line graphs and teaching…
ERIC Educational Resources Information Center
Novak, Elena
2012-01-01
The study explored instructional benefits of a storyline gaming characteristic (GC) on learning effectiveness, efficiency, and engagement with the use of an online instructional simulation for graduate students in an introductory statistics course. In addition, the study focused on examining the effects of a storyline GC on specific learning…
A statistical mechanics approach to autopoietic immune networks
NASA Astrophysics Data System (ADS)
Barra, Adriano; Agliari, Elena
2010-07-01
In this work we aim to bridge theoretical immunology and disordered statistical mechanics. We introduce a model for the behavior of B-cells which naturally merges the clonal selection theory and the autopoietic network theory as a whole. From the analysis of its features we recover several basic phenomena such as low-dose tolerance, dynamical memory of antigens and self/non-self discrimination.
Statistical inference of the generation probability of T-cell receptors from sequence repertoires.
Murugan, Anand; Mora, Thierry; Walczak, Aleksandra M; Callan, Curtis G
2012-10-02
Stochastic rearrangement of germline V-, D-, and J-genes to create variable coding sequence for certain cell surface receptors is at the origin of immune system diversity. This process, known as "VDJ recombination", is implemented via a series of stochastic molecular events involving gene choices and random nucleotide insertions between, and deletions from, genes. We use large sequence repertoires of the variable CDR3 region of human CD4+ T-cell receptor beta chains to infer the statistical properties of these basic biochemical events. Because any given CDR3 sequence can be produced in multiple ways, the probability distribution of hidden recombination events cannot be inferred directly from the observed sequences; we therefore develop a maximum likelihood inference method to achieve this end. To separate the properties of the molecular rearrangement mechanism from the effects of selection, we focus on nonproductive CDR3 sequences in T-cell DNA. We infer the joint distribution of the various generative events that occur when a new T-cell receptor gene is created. We find a rich picture of correlation (and absence thereof), providing insight into the molecular mechanisms involved. The generative event statistics are consistent between individuals, suggesting a universal biochemical process. Our probabilistic model predicts the generation probability of any specific CDR3 sequence by the primitive recombination process, allowing us to quantify the potential diversity of the T-cell repertoire and to understand why some sequences are shared between individuals. We argue that the use of formal statistical inference methods, of the kind presented in this paper, will be essential for quantitative understanding of the generation and evolution of diversity in the adaptive immune system.
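The central inference obstacle noted above, that a given CDR3 sequence can be produced in multiple ways, can be illustrated with a toy marginalization: the probability of a sequence is the sum over its hidden event paths, so path probabilities cannot be read directly from sequence counts. The paths and probabilities below are invented for illustration and are not from the paper's model.

```python
def prob_product(path):
    """Joint probability of one path of independent generative events."""
    p = 1.0
    for event_prob in path:
        p *= event_prob
    return p

def sequence_probability(paths):
    """Marginal probability of a sequence: sum over its hidden event paths."""
    return sum(prob_product(path) for path in paths)

# Two distinct recombination paths yielding the same (hypothetical) CDR3 sequence
paths = [(0.2, 0.5), (0.1, 0.3)]
print(round(sequence_probability(paths), 6))  # 0.13
```

Because only this marginal is observable, the paper resorts to maximum likelihood over the hidden events rather than direct counting.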
NASA Astrophysics Data System (ADS)
Hendikawati, P.; Arifudin, R.; Zahid, M. Z.
2018-03-01
This study aims to design an Android statistics data analysis application that can be accessed through mobile devices, making it easier for users to access. The application covers various topics in basic statistics along with parametric statistical data analysis. The output of the system is parametric statistical analysis that can be used by students, lecturers, and other users who need the results of statistical calculations quickly and in an easily understood form. The Android application is developed in the Java programming language. The server side uses PHP with the CodeIgniter framework, and the database is MySQL. The system development methodology is the Waterfall methodology, with stages of analysis, design, coding, testing, implementation, and system maintenance. This statistical data analysis application is expected to support statistics lectures and make statistical analysis easier for students to understand on mobile devices.
Basic Publication Fundamentals.
ERIC Educational Resources Information Center
Savedge, Charles E., Ed.
Designed for students who produce newspapers and newsmagazines in junior high, middle, and elementary schools, this booklet is both a scorebook and a fundamentals text. The scorebook provides realistic criteria for judging publication excellence at these educational levels. All the basics for good publications are included in the text of the…
27 CFR 1.21 - Domestic producers, rectifiers, blenders, and warehousemen.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 27 Alcohol, Tobacco Products and Firearms 1 2010-04-01 2010-04-01 false Domestic producers, rectifiers, blenders, and warehousemen. 1.21 Section 1.21 Alcohol, Tobacco Products and Firearms ALCOHOL AND... BOTTLING OF DISTILLED SPIRITS Basic Permits When Required § 1.21 Domestic producers, rectifiers, blenders...
Sound texture perception via statistics of the auditory periphery: Evidence from sound synthesis
McDermott, Josh H.; Simoncelli, Eero P.
2014-01-01
Rainstorms, insect swarms, and galloping horses produce “sound textures” – the collective result of many similar acoustic events. Sound textures are distinguished by temporal homogeneity, suggesting they could be recognized with time-averaged statistics. To test this hypothesis, we processed real-world textures with an auditory model containing filters tuned for sound frequencies and their modulations, and measured statistics of the resulting decomposition. We then assessed the realism and recognizability of novel sounds synthesized to have matching statistics. Statistics of individual frequency channels, capturing spectral power and sparsity, generally failed to produce compelling synthetic textures. However, combining them with correlations between channels produced identifiable and natural-sounding textures. Synthesis quality declined if statistics were computed from biologically implausible auditory models. The results suggest that sound texture perception is mediated by relatively simple statistics of early auditory representations, presumably computed by downstream neural populations. The synthesis methodology offers a powerful tool for their further investigation. PMID:21903084
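The time-averaged statistics described above can be sketched on toy data: per-channel marginal moments plus pairwise correlations between channels. The two "channels" below are synthetic lists, not the output of an auditory filter bank, and this is not the paper's synthesis procedure.

```python
import math

def channel_stats(channels):
    """Return per-channel (mean, variance) and pairwise channel correlations."""
    moments = []
    for ch in channels:
        n = len(ch)
        mu = sum(ch) / n
        var = sum((x - mu) ** 2 for x in ch) / n
        moments.append((mu, var))
    corr = {}
    for i in range(len(channels)):
        for j in range(i + 1, len(channels)):
            mi, vi = moments[i]
            mj, vj = moments[j]
            cov = sum((a - mi) * (b - mj)
                      for a, b in zip(channels[i], channels[j])) / len(channels[i])
            corr[(i, j)] = cov / math.sqrt(vi * vj)
    return moments, corr

stats, corr = channel_stats([[0, 1, 2, 3], [0, 2, 4, 6]])
print(round(corr[(0, 1)], 6))  # 1.0: perfectly correlated channels
```

The paper's finding maps onto this structure: the marginal moments alone were insufficient, while adding the cross-channel correlations produced identifiable textures.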
Facts about Congenital Heart Defects
du Prel, Jean-Baptist; Röhrig, Bernd; Blettner, Maria
2009-02-01
In the era of evidence-based medicine, one of the most important skills a physician needs is the ability to analyze scientific literature critically. This is necessary to keep medical knowledge up to date and to ensure optimal patient care. The aim of this paper is to present an accessible introduction to the critical appraisal of scientific articles. Using a selection of international literature, the reader is introduced to the principles of critical reading of scientific articles in medicine. For the sake of conciseness, detailed description of statistical methods is omitted. Widely accepted principles for critically appraising scientific articles are outlined. Basic knowledge of study design, the structure of an article, the role of its different sections, statistical presentation, and sources of error and limitations are presented. The reader does not require extensive methodological knowledge. As far as necessary for critical appraisal of scientific articles, differences between research areas such as epidemiology, clinical research, and basic research are outlined. Further useful references are presented. Basic methodological knowledge is required to select and interpret scientific articles correctly.
A procedure for classifying textural facies in gravel‐bed rivers
Buffington, John M.; Montgomery, David R.
1999-01-01
Textural patches (i.e., grain‐size facies) are commonly observed in gravel‐bed channels and are of significance for both physical and biological processes at subreach scales. We present a general framework for classifying textural patches that allows modification for particular study goals, while maintaining a basic degree of standardization. Textures are classified using a two‐tier system of ternary diagrams that identifies the relative abundance of major size classes and subcategories of the dominant size. An iterative procedure of visual identification and quantitative grain‐size measurement is used. A field test of our classification indicates that it affords reasonable statistical discrimination of median grain size and variance of bed‐surface textures. We also explore the compromise between classification simplicity and accuracy. We find that statistically meaningful textural discrimination requires use of both tiers of our classification. Furthermore, we find that simplified variants of the two‐tier scheme are less accurate but may be more practical for field studies which do not require a high level of textural discrimination or detailed description of grain‐size distributions. Facies maps provide a natural template for stratifying other physical and biological measurements and produce a retrievable and versatile database that can be used as a component of channel monitoring efforts.
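The following is a hedged sketch of the spirit of the first classification tier: label a texture by its dominant size class, prefixed by a subordinate class when that class is sufficiently abundant. The 30% cutoff and the naming convention are illustrative assumptions, not the authors' exact ternary-diagram boundaries.

```python
def classify_texture(fractions, subordinate_cutoff=0.3):
    """Name a textural patch from the relative abundance of size classes.

    fractions: dict mapping size class (e.g. 'gravel') to proportion;
    proportions should sum to 1. Cutoff and naming are illustrative.
    """
    ranked = sorted(fractions, key=fractions.get, reverse=True)
    dominant = ranked[0]
    if len(ranked) > 1 and fractions[ranked[1]] >= subordinate_cutoff:
        return f"{ranked[1]}-rich {dominant}"  # e.g. "sand-rich gravel"
    return dominant

print(classify_texture({"gravel": 0.6, "sand": 0.35, "fines": 0.05}))  # sand-rich gravel
print(classify_texture({"gravel": 0.9, "sand": 0.08, "fines": 0.02}))  # gravel
```

The authors' second tier further subdivides by subcategories of the dominant size, which this one-tier sketch omits.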
NASA Astrophysics Data System (ADS)
Allgood, Glenn O.; Treece, Dale A.; Pearce, Fred J.; Bentley, Timothy B.
2000-08-01
Walter Reed Army Institute of Research and Oak Ridge National Laboratory have developed a prototype pulmonary diagnostic system capable of extracting signatures from adventitious lung sounds that characterize obstructive and/or restrictive flow. Examples of disorders that have been detailed include emphysema, asthma, pulmonary fibrosis, and pneumothorax. The system is based on the premise that acoustic signals associated with pulmonary disorders can be characterized by a set of embedded signatures unique to the disease. The concept is being extended to include cardio signals correlated with pulmonary data to provide an accurate and timely diagnosis of pulmonary function and distress in critically injured soldiers, which will allow medical personnel to anticipate the need for accurate therapeutic intervention as well as monitor soldiers whose injuries may lead to pulmonary compromise later. The basic operation of the diagnostic system is as follows: (1) create an image from the acoustic signature based on higher order statistics, (2) deconstruct the image based on a predefined map, (3) compare the deconstructed image with stored images of pulmonary symptoms, and (4) classify the disorder based on a clustering of known symptoms and provide a statistical measure of confidence. The system has produced conformity between adults and infants and provided effective measures of physiology in the presence of noise.
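Step (1) above builds an image from higher-order statistics. As a hedged illustration of such features, the sketch below computes the third- and fourth-order standardized moments (skewness and kurtosis) of a signal window; the actual system's image construction is more elaborate and is not reproduced here.

```python
import math

def higher_order_moments(signal):
    """Return (skewness, kurtosis) of a signal window: the standardized
    third and fourth central moments, using population normalization."""
    n = len(signal)
    mu = sum(signal) / n
    var = sum((x - mu) ** 2 for x in signal) / n
    sd = math.sqrt(var)
    skew = sum(((x - mu) / sd) ** 3 for x in signal) / n
    kurt = sum(((x - mu) / sd) ** 4 for x in signal) / n
    return skew, kurt

# A symmetric toy "lung sound" window: zero skewness expected
skew, kurt = higher_order_moments([0.0, 1.0, 0.0, -1.0, 0.0, 2.0, 0.0, -2.0])
print(round(skew, 3), round(kurt, 3))  # 0.0 2.72
```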
Goe, Leon C; Baysac, Mary Anne S; Todd, Knox H; Linton, John A
2005-08-01
The lack of epidemiological studies has made it difficult to assess the extent of public health problems in North Korea. In the absence of empirical data, less intrusive study designs acceptable to the North Korean government could be developed to gauge the public's health. To this end we developed a basic oral health survey to assess the prevalence of untreated dental caries among children. A cross-sectional survey of 854 elementary school students was conducted in the city of Wonsan, North Korea. Students were screened and classified into one of three states of oral health: no caries, minor caries, or severe dental caries. Verbal surveys were concurrently administered to the children to collect basic information on oral health behaviours and demographic characteristics. Statistical analyses were performed to determine whether any variables were significant predictors of oral health status category. Among the 854 students screened, we found 255 students with no caries (29.9%), 316 students with minor caries (37.0%), and 283 students with severe caries (33.1%). The majority of students (70.1%) screened had dental caries. Almost all of the students (98.5%) claimed to brush their teeth daily, and 71.2% of students had visited a dentist in the past year. There were no significant predictors of oral health status. The oral health of children in Wonsan, North Korea is comparable to, if not slightly better than, that of children of similar age in countries with similar socioeconomic status (SES). Basic oral health screens are useful to produce a snapshot of general oral health status among children in North Korea and may provide insight into the general health of these children.
Consequences of common data analysis inaccuracies in CNS trauma injury basic research.
Burke, Darlene A; Whittemore, Scott R; Magnuson, David S K
2013-05-15
The development of successful treatments for humans after traumatic brain or spinal cord injuries (TBI and SCI, respectively) requires animal research. This effort can be hampered when promising experimental results cannot be replicated because of incorrect data analysis procedures. To identify and hopefully avoid these errors in future studies, the articles in seven journals with the highest number of basic science central nervous system TBI and SCI animal research studies published in 2010 (N=125 articles) were reviewed for their data analysis procedures. After identifying the most common statistical errors, the implications of those findings were demonstrated by reanalyzing previously published data from our laboratories using the identified inappropriate statistical procedures, then comparing the two sets of results. Overall, 70% of the articles contained at least one type of inappropriate statistical procedure. The highest percentage involved incorrect post hoc t-tests (56.4%), followed by inappropriate parametric statistics (analysis of variance and t-test; 37.6%). Repeated Measures analysis was inappropriately missing in 52.0% of all articles and, among those with behavioral assessments, 58% were analyzed incorrectly. Reanalysis of our published data using the most common inappropriate statistical procedures resulted in a 14.1% average increase in significant effects compared to the original results. Specifically, an increase of 15.5% occurred with Independent t-tests and 11.1% after incorrect post hoc t-tests. Utilizing proper statistical procedures can allow more-definitive conclusions, facilitate replicability of research results, and enable more accurate translation of those results to the clinic.
Interpretation of correlations in clinical research.
Hung, Man; Bounsanga, Jerry; Voss, Maren Wright
2017-11-01
Critically analyzing research is a key skill in evidence-based practice and requires knowledge of research methods, results interpretation, and applications, all of which rely on a foundation based in statistics. Evidence-based practice makes high demands on trained medical professionals to interpret an ever-expanding array of research evidence. As clinical training emphasizes medical care rather than statistics, it is useful to review the basics of statistical methods and what they mean for interpreting clinical studies. We reviewed the basic concepts of correlational associations, violations of normality, unobserved variable bias, sample size, and alpha inflation. The foundations of causal inference were discussed and sound statistical analyses were examined. We discuss four ways in which correlational analysis is misused, including causal inference overreach, over-reliance on significance, alpha inflation, and sample size bias. Recent published studies in the medical field provide evidence of causal assertion overreach drawn from correlational findings. The findings present a primer on the assumptions and nature of correlational methods of analysis and urge clinicians to exercise appropriate caution as they critically analyze the evidence before them and evaluate evidence that supports practice. Critically analyzing new evidence requires statistical knowledge in addition to clinical knowledge. Studies can overstate relationships, expressing causal assertions when only correlational evidence is available. Failure to account for the effect of sample size in the analyses tends to overstate the importance of predictive variables. It is important not to overemphasize the statistical significance without consideration of effect size and whether differences could be considered clinically meaningful.
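The sample-size point can be made concrete with the standard t statistic for testing a correlation: with a large enough n, a trivially small r becomes statistically "significant" while explaining almost none of the variance. The r and n values below are illustrative, not from the studies the article reviews.

```python
import math

def t_for_r(r, n):
    """t statistic for testing H0: rho = 0, with n paired observations."""
    return r * math.sqrt((n - 2) / (1.0 - r * r))

r, n = 0.05, 10000              # r^2 = 0.0025: 0.25% of variance explained
t = t_for_r(r, n)
print(round(t, 2))               # 5.01, far beyond the ~1.96 two-tailed cutoff
print(round(r * r * 100, 2))     # 0.25 (percent of variance explained)
```

Hence the article's advice: report effect sizes and clinical meaningfulness alongside p values, since significance alone says little at large n.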
Structural Indicators on Achievement in Basic Skills in Europe--2016. Eurydice Report
ERIC Educational Resources Information Center
Parveva, Teodora
2017-01-01
This publication reviews key structures, policies and reforms in the area of achievement in the basic skills (literacy, mathematics and science). It contains fi ve indicators on policies for organising nationally standardised tests, producing national reports on achievement, using student performance data in school evaluation, addressing…
Basic Understanding of Earth Tunneling by Melting : Volume 2. Earth Structure and Design Solutions.
DOT National Transportation Integrated Search
1974-07-01
A novel technique, which employs the melting of rocks and soils as a means of excavating or tunneling while simultaneously generating a glass tunnel lining and/or primary support, was studied. The object of the study was to produce a good basic under...
Investigating Complexity Using Excel and Visual Basic.
ERIC Educational Resources Information Center
Zetie, K. P.
2001-01-01
Shows how some of the simple ideas in complexity can be investigated using a spreadsheet and a macro written in Visual Basic. Shows how the sandpile model of Bak, Chao, and Wiesenfeld can be simulated and animated. The model produces results that cannot easily be predicted from its properties. (Author/MM)
Building the Community Online Resource for Statistical Seismicity Analysis (CORSSA)
NASA Astrophysics Data System (ADS)
Michael, A. J.; Wiemer, S.; Zechar, J. D.; Hardebeck, J. L.; Naylor, M.; Zhuang, J.; Steacy, S.; Corssa Executive Committee
2010-12-01
Statistical seismology is critical to the understanding of seismicity, the testing of proposed earthquake prediction and forecasting methods, and the assessment of seismic hazard. Unfortunately, despite its importance to seismology - especially to those aspects with great impact on public policy - statistical seismology is mostly ignored in the education of seismologists, and there is no central repository for the existing open-source software tools. To remedy these deficiencies, and with the broader goal to enhance the quality of statistical seismology research, we have begun building the Community Online Resource for Statistical Seismicity Analysis (CORSSA). CORSSA is a web-based educational platform that is authoritative, up-to-date, prominent, and user-friendly. We anticipate that the users of CORSSA will range from beginning graduate students to experienced researchers. More than 20 scientists from around the world met for a week in Zurich in May 2010 to kick-start the creation of CORSSA: the format and initial table of contents were defined; a governing structure was organized; and workshop participants began drafting articles. CORSSA materials are organized with respect to six themes, each containing between four and eight articles. The CORSSA web page, www.corssa.org, officially unveiled on September 6, 2010, debuts with an initial set of approximately 10 to 15 articles available online for viewing and commenting with additional articles to be added over the coming months. Each article will be peer-reviewed and will present a balanced discussion, including illustrative examples and code snippets. Topics in the initial set of articles will include: introductions to both CORSSA and statistical seismology, basic statistical tests and their role in seismology; understanding seismicity catalogs and their problems; basic techniques for modeling seismicity; and methods for testing earthquake predictability hypotheses. 
A special article will compare and review available statistical seismology software packages.
ERIC Educational Resources Information Center
Papaphotis, Georgios; Tsaparlis, Georgios
2008-01-01
Part 1 presents the findings of a quantitative study (n = 125) on basic quantum chemical concepts taught in the twelfth grade (age 17-18 years) in Greece. A paper-and-pencil test of fourteen questions was used. The study compared performance in five questions that tested recall of knowledge or application of algorithmic procedures (type-A…
Rockfalls in the Duratón canyon, central Spain: Inventory and statistical analysis
NASA Astrophysics Data System (ADS)
Tanarro, Luis M.; Muñoz, Julio
2012-10-01
This paper presents an initial analysis of the rockfall processes affecting the walls of the canyon of the River Duratón. This 34 km long meandering canyon in the basin of the River Duero in central Spain (41°18' N, 3°45' W) has evolved in a large-scale outcrop of Late Cretaceous calcareous rocks (dolomite and limestone) deformed into a series of asymmetrical folds. Its vertical scarps range from 80 to 100 m; its width varies from 150 to 300 m; and its floor is between 30 and 50 m wide. The research consisted of drawing up an inventory of rockfalls from a field survey and mapping the fallen blocks deposited on the basal talus or on the canyon floor, which in turn allowed the original location of each block on the scarps to be identified and located on the available orthophotos. A Digital Elevation Model (DEM) was produced using a Geographic Information System (GIS), and aspect and slope maps were derived from it. The aspect of each rockfall data point was determined, and this initial database was completed with other significant parameters (location on the valley side, relationship with the tectonic structure, and relative age). An approximate delimitation of the potential rockfall source area was also produced by reclassifying the slopes according to morphometric criteria. The result is a geomorphic rockfall inventory map showing the distribution of the rockfalls, together with a basic statistical analysis that allows a preliminary evaluation of the rockfall characteristics in relation to their topoclimatic location (aspect), their structural location (with or counter to the dip of the strata), and the current geomorphic dynamic through a study of recent scars on the scarps. Recent rockfalls have also been related to the meteorological conditions in which they occurred.
Multi-view 3D echocardiography compounding based on feature consistency
NASA Astrophysics Data System (ADS)
Yao, Cheng; Simpson, John M.; Schaeffter, Tobias; Penney, Graeme P.
2011-09-01
Echocardiography (echo) is a widely available method to obtain images of the heart; however, echo can suffer due to the presence of artefacts, high noise and a restricted field of view. One method to overcome these limitations is to use multiple images, using the 'best' parts from each image to produce a higher quality 'compounded' image. This paper describes our compounding algorithm which specifically aims to reduce the effect of echo artefacts as well as improving the signal-to-noise ratio, contrast and extending the field of view. Our method weights image information based on a local feature coherence/consistency between all the overlapping images. Validation has been carried out using phantom, volunteer and patient datasets consisting of up to ten multi-view 3D images. Multiple sets of phantom images were acquired, some directly from the phantom surface, and others by imaging through hard and soft tissue mimicking material to degrade the image quality. Our compounding method is compared to the original, uncompounded echocardiography images, and to two basic statistical compounding methods (mean and maximum). Results show that our method is able to take a set of ten images, degraded by soft and hard tissue artefacts, and produce a compounded image of equivalent quality to images acquired directly from the phantom. Our method on phantom, volunteer and patient data achieves almost the same signal-to-noise improvement as the mean method, while simultaneously almost achieving the same contrast improvement as the maximum method. We show a statistically significant improvement in image quality by using an increased number of images (ten compared to five), and visual inspection studies by three clinicians showed very strong preference for our compounded volumes in terms of overall high image quality, large field of view, high endocardial border definition and low cavity noise.
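The two baseline methods the paper compares against (mean and maximum compounding) reduce to pixel-wise operations across the overlapping images. Below is a toy sketch, with 2x2 "images" standing in for 3D echo volumes; the paper's feature-consistency weighting is not reproduced here.

```python
def compound(images, mode="mean"):
    """Pixel-wise compounding of equally sized 2-D images.

    mode="mean" improves signal-to-noise; mode="max" improves contrast
    and border definition, at the cost of retaining bright artefacts.
    """
    rows, cols = len(images[0]), len(images[0][0])
    out = [[0.0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            vals = [img[r][c] for img in images]
            out[r][c] = sum(vals) / len(vals) if mode == "mean" else max(vals)
    return out

a = [[1, 2], [3, 4]]
b = [[3, 2], [1, 0]]
print(compound([a, b], "mean"))  # [[2.0, 2.0], [2.0, 2.0]]
print(compound([a, b], "max"))   # [[3, 2], [3, 4]]
```

The paper's contribution is effectively a per-pixel weighting between these extremes, driven by local feature coherence across the views.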
Producing the target seed: Seed collection, treatment, and storage
Robert P. Karrfalt
2011-01-01
The role of high quality seeds in producing target seedlings is reviewed. Basic seed handling and upgrading techniques are summarized. Current advances in seed science and technology as well as those on the horizon are discussed.
Polyimide foams provide thermal insulation and fire protection
NASA Technical Reports Server (NTRS)
Rosser, R. W.
1972-01-01
Chemical reactions to produce polyimide foams for application as thermal insulation and fire prevention materials are discussed. Thermal and physical properties of the polyimides are described. Methods for improving basic formulations to produce desired qualities are included.
Multimedia Instruction Puts Teachers in the Director's Chair.
ERIC Educational Resources Information Center
Trotter, Andrew
1990-01-01
Teachers can produce and direct their own instructional videos using computer-driven multimedia. Outlines the basics in combining audio and video technologies to produce videotapes that mix animated and still graphics, sound, and full-motion video. (MLF)
Some Basic Laws of Isotropic Turbulent Flow
NASA Technical Reports Server (NTRS)
Loitsianskii, L. G.
1945-01-01
An investigation is made of the diffusion of artificially produced turbulence behind screens or other turbulence producers. The method is based on the author's concept of disturbance moment as a certain theoretically well-founded measure of turbulent disturbances.
Bello, Jibril Oyekunle
2013-11-14
Nigeria is one of the top three countries in Africa in terms of science research output, and Nigerian urologists' biomedical research output contributes to this. Each year, urologists in Nigeria gather to present their recent research at the conference of the Nigerian Association of Urological Surgeons (NAUS). These abstracts are not vetted as thoroughly as full-length manuscripts published in peer-reviewed journals, but the information they disseminate may affect the clinical practice of attendees. This study aims to describe the characteristics of abstracts presented at the annual conferences of NAUS, the quality of the abstracts as determined by the subsequent publication of full-length manuscripts in peer-reviewed indexed journals, and the factors that influence such successful publication. Abstracts presented at the 2007 to 2010 NAUS conferences were identified through conference abstract books. Using a strict search protocol, publication in peer-reviewed journals was determined. The abstracts' characteristics were analyzed and their quality judged by the subsequent successful publication of full-length manuscripts. Statistical analysis was performed using SPSS 16.0 software to determine factors predictive of successful publication. Only 75 abstracts were presented at the NAUS 2007 to 2010 conferences; a quarter (24%) of the presented abstracts were subsequently published as full-length manuscripts. Median time to publication was 15 months (range 2-40 months). Manuscripts whose result data were analyzed with statistics 'beyond basic' frequencies and averages were more likely to be published than those with basic or no statistics. Quality of the abstracts, and thus subsequent publication success, is influenced by the use of 'beyond basic' statistics in the analysis of the result data presented. There is a need for improvement in the quality of urological research from Nigeria.
NASA Astrophysics Data System (ADS)
Astuti, W.; Andika, R.; Nurjaman, F.
2018-01-01
The effect of basicity and reductant amount on the nickel and iron recovery of nickel pig iron (NPI) production from Indonesian limonite ore was investigated in an experimental study using a submerged electric arc furnace (SAF). The Indonesian limonite ore used in this study originated from Sulawesi Island, with a composition of Ni (1.26%) and Fe (43%). Metallurgical coke was used as the reductant. This study showed that the highest nickel and iron recovery, as well as the highest metal yield, were obtained at a basicity of 0.8 and a reductant amount of 0.23 kg coke/kg limonite ore. Nickel content in the NPI produced was around 3-4%. It was concluded that this process can produce medium-grade NPI.
Zheng, Jie; Harris, Marcelline R; Masci, Anna Maria; Lin, Yu; Hero, Alfred; Smith, Barry; He, Yongqun
2016-09-14
Statistics play a critical role in biological and clinical research. However, most reports of scientific results in the published literature make it difficult for the reader to reproduce the statistical analyses performed in achieving those results because they provide inadequate documentation of the statistical tests and algorithms applied. The Ontology of Biological and Clinical Statistics (OBCS) is put forward here as a step towards solving this problem. The terms in OBCS, including 'data collection', 'data transformation in statistics', 'data visualization', 'statistical data analysis', and 'drawing a conclusion based on data', cover the major types of statistical processes used in basic biological research and clinical outcome studies. OBCS is aligned with the Basic Formal Ontology (BFO) and extends the Ontology of Biomedical Investigations (OBI), an OBO (Open Biological and Biomedical Ontologies) Foundry ontology supported by over 20 research communities. Currently, OBCS comprises 878 terms, representing 20 BFO classes, 403 OBI classes, 229 OBCS-specific classes, and 122 classes imported from ten other OBO ontologies. We discuss two examples illustrating how the ontology is being applied. In the first (biological) use case, we describe how OBCS was applied to represent the high-throughput microarray data analysis of immunological transcriptional profiles in human subjects vaccinated with an influenza vaccine. In the second (clinical outcomes) use case, we applied OBCS to represent the processing of electronic health care data to determine the associations between hospital staffing levels and patient mortality. Our case studies were designed to show how OBCS can be used for the consistent representation of statistical analysis pipelines under two different research paradigms. Other ongoing projects using OBCS for statistical data processing are also discussed. The OBCS source code and documentation are available at: https://github.com/obcs/obcs.
The Ontology of Biological and Clinical Statistics (OBCS) is a community-based open source ontology in the domain of biological and clinical statistics. OBCS is a timely ontology that represents statistics-related terms and their relations in a rigorous fashion, facilitates standard data analysis and integration, and supports reproducible biological and clinical research.
Non-Markovian near-infrared Q branch of HCl diluted in liquid Ar.
Padilla, Antonio; Pérez, Justo
2013-08-28
By using a non-Markovian spectral theory based on the Kubo cumulant expansion technique, we have qualitatively studied the infrared Q branch observed in the fundamental absorption band of HCl diluted in liquid Ar. The statistical parameters of the anisotropic interaction present in this spectral theory were calculated by means of molecular dynamics techniques, and the values of the anisotropic correlation times were found to be significantly greater (by a factor of two) than those previously obtained by fitting procedures or microscopic cell models. This fact is decisive for the observation in the theoretical spectral band of a central Q resonance, which is absent in the extensive previous research carried out with the usual theories based on Kubo cumulant expansion techniques. Although the theory used in this work only allows a qualitative study of the Q branch, we can employ it to study the unknown characteristics of the Q resonance which are difficult to obtain with the quantum simulation techniques recently developed. For example, in this study we have found that the Q branch is basically a non-Markovian (or memory) effect produced by the spectral line interferences, where the PR interferential profile basically determines the Q branch spectral shape. Furthermore, we have found that the Q resonance is principally generated by the first rotational states of the first two vibrational levels, those most affected by the action of the solvent.
Using Internet Search Data to Produce State-Level Measures: The Case of Tea Party Mobilization
ERIC Educational Resources Information Center
DiGrazia, Joseph
2017-01-01
This study proposes using Internet search data from search engines like Google to produce state-level metrics that are useful in social science research. Generally, state-level research relies on demographic statistics, official statistics produced by government agencies, or aggregated survey data. However, each of these data sources has serious…
ERIC Educational Resources Information Center
National Science Foundation, Washington, DC. Div. of Science Resources Studies.
Detailed statistical tables on federal funds for research and development (R&D) activities are provided in this document. Tables are organized into the following sections: research, development, and R&D plant; R&D--agency, character of work, and performer; total research--agency, performer, and field of science; basic research--agency,…
ERIC Educational Resources Information Center
National Science Foundation, Washington, DC. Div. of Science Resources Studies.
Detailed statistical tables showing the funding levels of 92 federal agencies for research and development (R&D) are provided in this document. These tables are organized into the following sections: research, development, and R&D plant; R&D agency, character of work, and performer; total basic and applied research--agency,…
WASP (Write a Scientific Paper) using Excel -5: Quartiles and standard deviation.
Grech, Victor
2018-03-01
The almost inevitable descriptive statistics exercise that is undergone once data collection is complete, prior to inferential statistics, requires the acquisition of basic descriptors which may include standard deviation and quartiles. This paper provides pointers as to how to do this in Microsoft Excel™ and explains the relationship between the two. Copyright © 2018 Elsevier B.V. All rights reserved.
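For readers working outside Excel, the same basic descriptors are available in Python's standard library; the `method='inclusive'` interpolation convention should correspond to Excel's legacy QUARTILE/QUARTILE.INC behaviour (a sketch, not taken from the paper):

```python
import statistics

def describe(data):
    """Basic descriptors: mean, sample standard deviation, quartiles.

    statistics.quantiles with n=4 and method='inclusive' interpolates
    on (n-1)*p positions, which should match Excel's QUARTILE.INC;
    statistics.stdev is the sample SD, like Excel's STDEV.S.
    """
    q1, q2, q3 = statistics.quantiles(data, n=4, method='inclusive')
    return {
        'mean': statistics.mean(data),
        'stdev': statistics.stdev(data),
        'q1': q1, 'median': q2, 'q3': q3,
    }

# Example usage on a small dataset.
summary = describe([2, 4, 4, 4, 5, 5, 7, 9])
```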
The maximum entropy production principle: two basic questions.
Martyushev, Leonid M
2010-05-12
The overwhelming majority of maximum entropy production applications to ecological and environmental systems are based on thermodynamics and statistical physics. Here, we briefly discuss the maximum entropy production principle and raise two questions: (i) can this principle be used as the basis for non-equilibrium thermodynamics and statistical mechanics, and (ii) is it possible to 'prove' the principle? We adduce one more proof, which is the most concise available today.
ERIC Educational Resources Information Center
National Science Foundation, Washington, DC. Div. of Science Resources Studies.
Detailed statistical tables on federal funds for research and development (R&D) are provided in this document. Tables are organized into the following sections: research, development, and R&D plant; R&D--agency, character of work, and performer; total research--agency, performer, and field of science; basic research--agency, performer,…
ERIC Educational Resources Information Center
Rupp, Andre A.
2007-01-01
One of the most revolutionary advances in psychometric research during the last decades has been the systematic development of statistical models that allow for cognitive psychometric research (CPR) to be conducted. Many of the models currently available for such purposes are extensions of basic latent variable models in item response theory…
Current state of the art for statistical modeling of species distributions [Chapter 16
Troy M. Hegel; Samuel A. Cushman; Jeffrey Evans; Falk Huettmann
2010-01-01
Over the past decade the number of statistical modelling tools available to ecologists to model species' distributions has increased at a rapid pace (e.g. Elith et al. 2006; Austin 2007), as have the number of species distribution models (SDM) published in the literature (e.g. Scott et al. 2002). Ten years ago, basic logistic regression (Hosmer and Lemeshow 2000)...
ERIC Educational Resources Information Center
Rahim, Syed A.
Based in part on a list developed by the United Nations Educational, Scientific, and Cultural Organization (UNESCO) for use in Afghanistan, this document presents a comprehensive checklist of items of statistical and descriptive data required for planning a national communication system. It is noted that such a system provides the vital…
Basic Sciences Fertilizing Clinical Microbiology and Infection Management
2017-01-01
Abstract Basic sciences constitute the most abundant sources of creativity and innovation, as they are based on the passion of knowing. Basic knowledge, in close and fertile contact with medical and public health needs, produces distinct advancements in applied sciences. Basic sciences play the role of stem cells, providing material and semantics to construct differentiated tissues and organisms and enabling specialized functions and applications. However, eventually processes of “practice deconstruction” might reveal basic questions, as in de-differentiation of tissue cells. Basic sciences, microbiology, infectious diseases, and public health constitute an epistemological gradient that should also be an investigational continuum. The coexistence of all these interests and their cross-fertilization should be favored by interdisciplinary, integrative research organizations working simultaneously in the analytical and synthetic dimensions of scientific knowledge. PMID:28859345
SIGPI. Fault Tree Cut Set System Performance
DOE Office of Scientific and Technical Information (OSTI.GOV)
Patenaude, C.J.
1992-01-13
SIGPI computes the probabilistic performance of complex systems by combining cut set or other binary product data with probability information on each basic event. SIGPI is designed to work with either coherent systems, where the system fails when certain combinations of components fail, or noncoherent systems, where at least one cut set occurs only if at least one component of the system is operating properly. The program can handle conditionally independent components, dependent components, or a combination of component types and has been used to evaluate responses to environmental threats and seismic events. The three data types that can be input are cut set data in disjoint normal form, basic component probabilities for independent basic components, and mean and covariance data for statistically dependent basic components.
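Because the cut set data are supplied in disjoint normal form, the failure probability for independent basic components reduces to a sum over cut sets of the product of the component probabilities. A minimal sketch of that computation (the component names and input format here are invented for illustration and are not SIGPI's actual interface):

```python
def system_failure_probability(cut_sets, p):
    """P(system failure) from cut sets in *disjoint* normal form with
    independent basic events: since the cut sets are mutually exclusive
    events, sum over cut sets the product of component probabilities.
    Hypothetical sketch of the kind of computation described."""
    total = 0.0
    for cs in cut_sets:
        prod = 1.0
        for comp in cs:
            prod *= p[comp]
        total += prod
    return total

# Example: two disjoint cut sets over invented components.
probs = {'pump': 0.01, 'valve': 0.02, 'sensor': 0.05}
p_fail = system_failure_probability([['pump', 'valve'], ['sensor']], probs)
```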
Probability sampling in legal cases: Kansas cellphone users
NASA Astrophysics Data System (ADS)
Kadane, Joseph B.
2012-10-01
Probability sampling is a standard statistical technique. This article introduces the basic ideas of probability sampling, and shows in detail how probability sampling was used in a particular legal case.
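The core idea of probability sampling is that every unit has a known inclusion probability, so design-based estimates and their standard errors follow mechanically. A generic sketch of simple random sampling without replacement, with a finite population correction (this is an illustration, not the sampling plan from the Kansas case):

```python
import random

def srs_estimate(population, n, seed=0):
    """Simple random sample without replacement: every size-n subset is
    equally likely, so each unit has inclusion probability n/N.
    Returns the sample mean and its estimated standard error with the
    finite population correction (1 - n/N)."""
    rng = random.Random(seed)
    N = len(population)
    sample = rng.sample(population, n)
    mean = sum(sample) / n
    var = sum((x - mean) ** 2 for x in sample) / (n - 1)
    se = ((1 - n / N) * var / n) ** 0.5
    return mean, se

# Example: estimate the share of 1s in a 50/50 population from n=20.
est, stderr = srs_estimate([0] * 50 + [1] * 50, 20)
```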
... infections—down from 41,800 in 2010. Gay, bisexual, and other men who have sex with ... HIV infections by transmission category, we see that gay, bisexual, and other men who have sex with ...
Understanding your cancer prognosis
... about: Treatment Palliative care Personal matters such as finances Knowing what to expect may make it easier ... treatment. www.cancer.net/navigating-cancer-care/cancer-basics/understanding-statistics-used-guide-prognosis-and-evaluate-treatment . ...
Powerlaw: a Python package for analysis of heavy-tailed distributions.
Alstott, Jeff; Bullmore, Ed; Plenz, Dietmar
2014-01-01
Power laws are theoretically interesting probability distributions that are also frequently used to describe empirical data. In recent years, effective statistical methods for fitting power laws have been developed, but appropriate use of these techniques requires significant programming and statistical insight. In order to greatly decrease the barriers to using good statistical methods for fitting power law distributions, we developed the powerlaw Python package. This software package provides easy commands for basic fitting and statistical analysis of distributions. Notably, it also seeks to support a variety of user needs by being exhaustive in the options available to the user. The source code is publicly available and easily extensible.
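The central fit the package automates is compact enough to sketch directly: the continuous maximum-likelihood estimator for the power-law exponent over the tail above xmin. A pure-Python illustration of that estimator; the package itself additionally estimates xmin and performs model comparison (with the package, the equivalent is roughly `powerlaw.Fit(data)` followed by inspecting `fit.power_law.alpha`):

```python
import math
import random

def powerlaw_alpha_mle(data, xmin):
    """Continuous MLE for the power-law exponent over the tail x >= xmin:
    alpha = 1 + n / sum(ln(x_i / xmin)). A tiny sketch of the core fit
    the powerlaw package automates."""
    tail = [x for x in data if x >= xmin]
    return 1 + len(tail) / sum(math.log(x / xmin) for x in tail)

# Synthetic check: draw power-law data by inverse-CDF sampling
# (x = xmin * (1-u)^(-1/(alpha-1))), then recover alpha.
rng = random.Random(42)
alpha_true, xmin = 2.5, 1.0
data = [xmin * (1 - rng.random()) ** (-1 / (alpha_true - 1)) for _ in range(20000)]
alpha_hat = powerlaw_alpha_mle(data, xmin)
```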
Statistics and Discoveries at the LHC (1/4)
Cowan, Glen
2018-02-09
The lectures will give an introduction to statistics as applied in particle physics and will provide all the necessary basics for data analysis at the LHC. Special emphasis will be placed on the problems and questions that arise when searching for new phenomena, including p-values, discovery significance, limit-setting procedures, and the treatment of small signals in the presence of large backgrounds. Specific issues that will be addressed include the advantages and drawbacks of different statistical test procedures (cut-based, likelihood-ratio, etc.), the look-elsewhere effect, and the treatment of systematic uncertainties.
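The p-value/significance conversion that recurs throughout these lectures is a one-liner given the standard normal quantile function; for example, the conventional 5-sigma discovery threshold corresponds to a one-sided p-value of roughly 2.9e-7. A quick sketch (an illustration, not part of the lecture materials):

```python
from statistics import NormalDist

def p_to_z(p):
    """One-sided p-value -> Gaussian discovery significance: Z = Phi^{-1}(1 - p)."""
    return NormalDist().inv_cdf(1 - p)

def z_to_p(z):
    """Gaussian significance -> one-sided p-value: p = 1 - Phi(Z)."""
    return 1 - NormalDist().cdf(z)

# Example: the p-value corresponding to a 5-sigma discovery.
p_five_sigma = z_to_p(5)
```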
Statistics and Discoveries at the LHC (3/4)
Cowan, Glen
2018-02-19
The lectures will give an introduction to statistics as applied in particle physics and will provide all the necessary basics for data analysis at the LHC. Special emphasis will be placed on the problems and questions that arise when searching for new phenomena, including p-values, discovery significance, limit-setting procedures, and the treatment of small signals in the presence of large backgrounds. Specific issues that will be addressed include the advantages and drawbacks of different statistical test procedures (cut-based, likelihood-ratio, etc.), the look-elsewhere effect, and the treatment of systematic uncertainties.
Statistics and Discoveries at the LHC (4/4)
Cowan, Glen
2018-05-22
The lectures will give an introduction to statistics as applied in particle physics and will provide all the necessary basics for data analysis at the LHC. Special emphasis will be placed on the problems and questions that arise when searching for new phenomena, including p-values, discovery significance, limit-setting procedures, and the treatment of small signals in the presence of large backgrounds. Specific issues that will be addressed include the advantages and drawbacks of different statistical test procedures (cut-based, likelihood-ratio, etc.), the look-elsewhere effect, and the treatment of systematic uncertainties.
Statistics and Discoveries at the LHC (2/4)
Cowan, Glen
2018-04-26
The lectures will give an introduction to statistics as applied in particle physics and will provide all the necessary basics for data analysis at the LHC. Special emphasis will be placed on the problems and questions that arise when searching for new phenomena, including p-values, discovery significance, limit-setting procedures, and the treatment of small signals in the presence of large backgrounds. Specific issues that will be addressed include the advantages and drawbacks of different statistical test procedures (cut-based, likelihood-ratio, etc.), the look-elsewhere effect, and the treatment of systematic uncertainties.
Understanding quantitative research: part 1.
Hoe, Juanita; Hoare, Zoë
This article, which is the first in a two-part series, provides an introduction to understanding quantitative research, basic statistics and terminology used in research articles. Critical appraisal of research articles is essential to ensure that nurses remain up to date with evidence-based practice to provide consistent and high-quality nursing care. This article focuses on developing critical appraisal skills and understanding the use and implications of different quantitative approaches to research. Part two of this article will focus on explaining common statistical terms and the presentation of statistical data in quantitative research.
Dexter, Franklin; Shafer, Steven L
2017-03-01
Considerable attention has been drawn to poor reproducibility in the biomedical literature. One explanation is inadequate reporting of statistical methods by authors and inadequate assessment of statistical reporting and methods during peer review. In this narrative review, we examine scientific studies of several well-publicized efforts to improve statistical reporting. We also review several retrospective assessments of the impact of these efforts. These studies show that instructions to authors and statistical checklists are not sufficient; no findings suggested that either improves the quality of statistical methods and reporting. Second, even basic statistics, such as power analyses, are frequently missing or incorrectly performed. Third, statistical review is needed for all papers that involve data analysis. A consistent finding in the studies was that nonstatistical reviewers (eg, "scientific reviewers") and journal editors generally poorly assess statistical quality. We finish by discussing our experience with statistical review at Anesthesia & Analgesia from 2006 to 2016.
Lü, Yiran; Hao, Shuxin; Zhang, Guoqing; Liu, Jie; Liu, Yue; Xu, Dongqun
2018-01-01
To implement an online statistical analysis function in the information system for air pollution and health impact monitoring, and to obtain data analysis results in real time. Descriptive statistical methods, time-series analysis and multivariate regression analysis were implemented online using SQL and visual tools on top of the database software. The system generates basic statistical tables and summary tables of air pollution exposure and health impact data online; generates trend charts of each data section online, with interactive connections to the database; and generates interface sheets that can be exported directly to R, SAS and SPSS. The information system for air pollution and health impact monitoring implements its statistical analysis function online and can provide real-time analysis results to its users.
The Evolution of Random Number Generation in MUVES
2017-01-01
...MUVES, including the mathematical basis and statistical justification for algorithms used in the code. The working code provided produces results identical to the current...questionable numerical and statistical properties. The development of the modern system is traced through software change requests, resulting in a random number...
Katapultos: Teaching Basic Statistics with Ballistics.
ERIC Educational Resources Information Center
Fitzgerald, Mike
2001-01-01
Describes the use of catapults as a way to increase math, science, and technology correlations within the classroom. Includes detailed instructions, a list of materials for building a catapult, and print and Internet resources. (JOW)
... Hearing Loss Homepage Basics Noise-Induced Hearing Loss Genetics of Hearing Loss Screening & Diagnosis Types of Hearing Loss About Sound Treatment & Intervention Services Learning Language Bacterial Meningitis Studies Data & Statistics EHDI Annual Data 2016 2015 2014 2013 ...
77 FR 61791 - System of Records; Presidential Management Fellows Program
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-11
... program personnel for the following reasons: a. To determine basic program eligibility and to evaluate... descriptive statistics and analytical studies in support of the function for which the records are collected...
... Honor Donation Donate by phone at 1-800-DIABETES (1-800-342-2383) Donate by mail Why Give? ... My Health Advisor Tools to Know Your Risk Diabetes Basics Symptoms Type 1 Type 2 Gestational Myths Statistics Common Terms Genetics ...
Using basic statistics on the individual patient's own numeric data.
Hart, John
2012-12-01
This theoretical report gives an example of how coefficient of variation (CV) and quartile analysis (QA) for the assessment of outliers might be used to analyze numeric data in practice for an individual patient. A patient was examined over 8 visits using infrared instrumentation for measurement of mastoid fossa temperature differential (MFTD) readings. CV and QA were applied to the readings. The participant also completed the Short Form-12 health perception survey on each visit, and these findings were correlated with CV to determine whether CV had outcomes support (clinical significance). An outlier MFTD reading was observed on the eighth visit according to QA, coinciding with the largest CV value for the MFTDs. Correlations between the Short Form-12 and CV were low to negligible, positive, and statistically nonsignificant. This case provides an example of how basic statistical analyses could be applied to numerical data in chiropractic practice for an individual patient. This might add objectivity to analyzing an individual patient's data in practice, particularly when the clinical significance of a numerical finding is unknown.
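One plausible reading of the two analyses in code form: coefficient of variation plus quartile analysis with the common 1.5×IQR Tukey fence. The fence multiplier is an assumption here, and the report's exact QA rule may differ:

```python
import statistics

def cv_and_outliers(readings):
    """Coefficient of variation plus Tukey-fence quartile analysis.

    CV = sample SD / mean; an outlier is any reading outside
    [Q1 - 1.5*IQR, Q3 + 1.5*IQR]. A generic sketch of the two
    per-patient statistics discussed (one common fence choice)."""
    mean = statistics.mean(readings)
    cv = statistics.stdev(readings) / mean
    q1, _, q3 = statistics.quantiles(readings, n=4, method='inclusive')
    iqr = q3 - q1
    lo, hi = q1 - 1.5 * iqr, q3 + 1.5 * iqr
    outliers = [x for x in readings if x < lo or x > hi]
    return cv, outliers

# Example: eight hypothetical visit readings, the last one anomalous.
cv, flagged = cv_and_outliers([0.2, 0.3, 0.25, 0.22, 0.28, 0.26, 0.24, 1.0])
```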
Basic biostatistics for post-graduate students
Dakhale, Ganesh N.; Hiware, Sachin K.; Shinde, Abhijit T.; Mahatme, Mohini S.
2012-01-01
Statistical methods are important to draw valid conclusions from the obtained data. This article provides background information related to fundamental methods and techniques in biostatistics for the use of postgraduate students. The main focus is on types of data, measurement of central variation, and basic tests, which are useful for analysis of different types of observations. A few parameters, such as the normal distribution, calculation of sample size, level of significance, the null hypothesis, indices of variability, and different tests, are explained in detail with suitable examples. Using these guidelines, we are confident that postgraduate students will be able to classify the distribution of data and apply the proper test. Information is also given regarding various free software programs and websites useful for statistical calculations. Thus, postgraduate students will benefit whether they opt for academics or for industry. PMID:23087501
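As a worked example of one calculation the article covers, here is the sample size needed to estimate a mean to within a given margin under a normal approximation (a generic textbook formula, not the authors' exact procedure):

```python
import math
from statistics import NormalDist

def sample_size_mean(sigma, margin, confidence=0.95):
    """Minimum n to estimate a mean to within +/-margin at the given
    confidence, assuming an approximately normal estimator:
    n = ceil((z_{alpha/2} * sigma / margin)^2)."""
    z = NormalDist().inv_cdf(1 - (1 - confidence) / 2)
    return math.ceil((z * sigma / margin) ** 2)

# Example: sigma = 10, desired margin +/-2 at 95% confidence.
n_required = sample_size_mean(10, 2)
```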
An Exercise in Exploring Big Data for Producing Reliable Statistical Information.
Rey-Del-Castillo, Pilar; Cardeñosa, Jesús
2016-06-01
The availability of copious data about many human, social, and economic phenomena is considered an opportunity for the production of official statistics. National statistical organizations and other institutions are more and more involved in new projects for developing what is sometimes seen as a possible change of paradigm in the way statistical figures are produced. Nevertheless, there are hardly any systems in production using Big Data sources. Arguments of confidentiality, data ownership, representativeness, and others make it a difficult task to get results in the short term. Using Call Detail Records from Ivory Coast as an illustration, this article shows some of the issues that must be dealt with when producing statistical indicators from Big Data sources. A proposal of a graphical method to evaluate one specific aspect of the quality of the computed figures is also presented, demonstrating that the visual insight provided improves the results obtained using other traditional procedures.
Calculation of streamflow statistics for Ontario and the Great Lakes states
Piggott, Andrew R.; Neff, Brian P.
2005-01-01
Basic, flow-duration, and n-day frequency statistics were calculated for 779 current and historical streamflow gages in Ontario and 3,157 streamflow gages in the Great Lakes states with length-of-record daily mean streamflow data ending on December 31, 2000 and September 30, 2001, respectively. The statistics were determined using the U.S. Geological Survey’s SWSTAT and IOWDM, ANNIE, and LIBANNE software and Linux shell and PERL programming that enabled the mass processing of the data and calculation of the statistics. Verification exercises were performed to assess the accuracy of the processing and calculations. The statistics and descriptions, longitudes and latitudes, and drainage areas for each of the streamflow gages are summarized in ASCII text files and ESRI shapefiles.
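The flow-duration portion of these statistics can be illustrated with a small exceedance-quantile routine: the discharge equaled or exceeded p% of the time. The Weibull plotting position used below is one common convention, and SWSTAT's exact method may differ:

```python
def flow_duration(flows, percents=(10, 50, 90)):
    """Flow-duration statistics: the flow equaled or exceeded p% of the
    time. Sorts daily mean flows descending and interpolates at the
    Weibull plotting position m/(n+1). A generic sketch."""
    ranked = sorted(flows, reverse=True)
    n = len(ranked)
    out = {}
    for p in percents:
        pos = p / 100 * (n + 1)          # 1-based fractional rank
        k = min(max(int(pos), 1), n - 1)  # clamp so interpolation is valid
        frac = pos - k
        out[p] = ranked[k - 1] + frac * (ranked[k] - ranked[k - 1])
    return out

# Example: 99 synthetic daily flows of 1..99 units.
fdc = flow_duration(list(range(1, 100)))
```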
Kane, Lesley A; Yung, Christina K; Agnetti, Giulio; Neverova, Irina; Van Eyk, Jennifer E
2006-11-01
Separation of basic proteins with 2-DE presents technical challenges involving protein precipitation, load limitations, and streaking. Cardiac mitochondria are enriched in basic proteins and difficult to resolve by 2-DE. We investigated two methods, cup and paper bridge, for sample loading of this subproteome into the basic range (pH 6-11) gels. Paper bridge loading consistently produced improved resolution of both analytical and preparative protein loads. A unique benefit of this technique is that proteins retained in the paper bridge after loading basic gels can be reloaded onto lower pH gradients (pH 4-7), allowing valued samples to be analyzed on multiple pH ranges.
Eisner, Emily; Drake, Richard; Lobban, Fiona; Bucci, Sandra; Emsley, Richard; Barrowclough, Christine
2018-02-01
Early signs interventions show promise but could be further developed. A recent review suggested that 'basic symptoms' should be added to conventional early signs to improve relapse prediction. This study builds on preliminary evidence that basic symptoms predict relapse and aimed to: 1. examine which phenomena participants report prior to relapse and how they describe them; 2. determine the best way of identifying pre-relapse basic symptoms; 3. assess current practice by comparing self- and casenote-reported pre-relapse experiences. Participants with non-affective psychosis were recruited from UK mental health services. In-depth interviews (n=23), verbal checklists of basic symptoms (n=23) and casenote extracts (n=208) were analysed using directed content analysis and non-parametric statistical tests. Three-quarters of interviewees reported basic symptoms and all reported conventional early signs and 'other' pre-relapse experiences. Interviewees provided rich descriptions of basic symptoms. Verbal checklist interviews asking specifically about basic symptoms identified these experiences more readily than open questions during in-depth interviews. Only 5% of casenotes recorded basic symptoms; interviewees were 16 times more likely to report basic symptoms than their casenotes did. The majority of interviewees self-reported pre-relapse basic symptoms when asked specifically about these experiences but very few casenotes reported these symptoms. Basic symptoms may be potent predictors of relapse that clinicians miss. A self-report measure would aid monitoring of basic symptoms in routine clinical practice and would facilitate a prospective investigation comparing basic symptoms and conventional early signs as predictors of relapse. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
Background Information and User’s Guide for MIL-F-9490
1975-01-01
requirements, although different analysis results will apply to each requirement. Basic differences between the two reliability requirements are: MIL-F-8785B...provides the rationale for establishing such limits. The specific risk analysis comprises the same data which formed the average risk analysis, except...statistical analysis will be based on statistical data taken using limited exposure times of components and equipment. The exposure times and resulting
Transparency, Accountability, and Engagement: A Recipe for Building Trust in Policing
2017-06-01
Toward Community-orientated Policing: Potential, Basic Requirements, and Threshold Questions,” Crime and Delinquency 33 (1987): 6–30. 49 More, Current...States,” in Sourcebook of Criminal Justice Statistics Online, accessed June 4, 2017, http://www.albany.edu/sourcebook/csv/t2332011.csv. 89 Gary...to-date crime statistics, and empowered them to think creatively to develop individualized plans to address crime trends and conditions. His focus
Lee, Fu-Jung; Wu, Chih-Cheng; Peng, Shih-Yen; Fan, Kuo-Tung
2007-09-01
Many anesthesiologists in medical centers (MC) or anesthesiologist-training hospitals (ATH) customarily present their research data as poster abstracts at the annual meetings of the Taiwan Society of Anesthesiologists (TSA) to represent their academic output over a designated period. However, an orphaned P value, reported without mention of the associated statistical test, has frequently been found in these articles. This article explores the difference between MC/ATH and non-MC/non-ATH institutions in reporting the statistical test behind a P value across three recent consecutive TSA annual meetings. We collected the proceedings handbooks of the TSA annual meetings over a 3-year period (2003 to 2005) and analyzed the hospital characteristics of the first byline institute of each poster abstract. Data were analyzed with Fisher's exact test, and statistical significance was assumed if P < 0.05. Included were 101 poster abstracts with bylines from 20 hospitals. Only 2 of the 20 hospitals were accredited as non-ATH and 4 as non-MC. There were 64 (63%) abstracts that did not specify the statistical test behind a reported P value, with no significant difference between categories (P = 0.47 for ATH vs. non-ATH and P = 0.07 for MC vs. non-MC). The basic concept of reporting a P value together with its statistical test was not applied comprehensively in poster abstracts of the annual conferences. We suggest that anesthesia administrators and senior anesthesiologists at ATH or MC, and the members of the committee responsible for academic affairs in the TSA, should pay attention to this problem and work together to improve the basic statistics in poster presentations.
Development of polytoxicomania in function of defence from psychoticism.
Nenadović, Milutin M; Sapić, Rosa
2011-01-01
Polytoxicomania has been growing steadily in youth subpopulations in recent decades, and this trend is pan-continental. Psychoticism is a psychological construct that assumes special basic dimensions of disintegration of personality and cognitive functions. Psychoticism may, in general, be the basis of pathological functioning of youth and influence the patterns of thought, feelings and actions that cause dysfunction. The aim of this study was to determine the distribution of basic dimensions of psychoticism underlying the commitment of youth to abuse of psychoactive substances (PAS) in order to reduce disturbing intrapsychic experiences or manifestations of psychotic symptoms. For the purpose of this study, two groups of respondents were formed, balanced by age, gender and family structure of origin (at least one parent alive). The study applied the DELTA-9 instrument for assessment of cognitive disintegration in order to establish and operationalize psychoticism. The results were statistically analyzed. From the parameters of descriptive statistics, the arithmetic mean was calculated with measures of dispersion. A cross-tabular analysis of the tested variables was performed, and statistical significance was assessed with Pearson's chi-squared test and analysis of variance. Age structure and gender were approximately equally represented in the polytoxicomaniac group and the control group; testing did not confirm a statistically significant difference (p > 0.5). Statistical analysis established that polytoxicomaniacs differed significantly from the control group on most variables of psychoticism. Testing confirmed a high statistical significance of differences in psychoticism variables between the groups (p < 0.001 to p < 0.01). A statistically significant representation of the dimension of psychoticism in the polytoxicomaniac group was established.
The presence of factors concerning common executive dysfunction was emphasized.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Powell, Danny H; Elwood Jr, Robert H
2011-01-01
Analysis of the material protection, control, and accountability (MPC&A) system is necessary to understand the limits and vulnerabilities of the system to internal threats. A self-appraisal helps the facility be prepared to respond to internal threats and reduce the risk of theft or diversion of nuclear material. The material control and accountability (MC&A) system effectiveness tool (MSET) fault tree was developed to depict the failure of the MPC&A system as a result of poor practices and random failures in the MC&A system. It can also be employed as a basis for assessing deliberate threats against a facility. MSET uses fault tree analysis, which is a top-down approach to examining system failure. The analysis starts with identifying a potential undesirable event called a 'top event' and then determining the ways it can occur (e.g., 'Fail To Maintain Nuclear Materials Under The Purview Of The MC&A System'). The analysis proceeds by determining how the top event can be caused by individual or combined lower level faults or failures. These faults, which are the causes of the top event, are 'connected' through logic gates. The MSET model uses AND-gates and OR-gates and propagates the effect of event failure using Boolean algebra. To enable the fault tree analysis calculations, the basic events in the fault tree are populated with probability risk values derived by conversion of questionnaire data to numeric values. The basic events are treated as independent variables. This assumption affects the Boolean algebraic calculations used to calculate results. All the necessary calculations are built into the fault tree codes, but it is often useful to estimate the probabilities manually as a check on code functioning. The probability of failure of a given basic event is the probability that the basic event primary question fails to meet the performance metric for that question.
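The gate logic described above can be sketched numerically. The following is a minimal illustration (not MSET or SAPHIRE code), assuming a toy tree of four hypothetical basic events with made-up probabilities; for independent events an AND-gate multiplies probabilities and an OR-gate combines them as 1 - Π(1 - p_i):

```python
# Illustrative fault tree calculation for independent basic events.
# AND-gate: P = product(p_i); OR-gate: P = 1 - product(1 - p_i)
from math import prod

def and_gate(probs):
    """Probability that ALL child events fail (independent events)."""
    return prod(probs)

def or_gate(probs):
    """Probability that AT LEAST ONE child event fails."""
    return 1.0 - prod(1.0 - p for p in probs)

# Hypothetical basic-event failure probabilities (placeholders, not real data)
p_inventory = 0.05   # inventory procedure fails its performance metric
p_records   = 0.02   # records system fails
p_access    = 0.10   # access control fails
p_guard     = 0.08   # guard response fails

# Toy top event: (inventory OR records) AND (access OR guard)
top = and_gate([or_gate([p_inventory, p_records]),
                or_gate([p_access, p_guard])])
print(f"Top-event probability: {top:.4f}")
```

This is the kind of manual spot check the text recommends for verifying the fault tree code's output.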
The failure probability is related to how well the facility performs the task identified in that basic event over time (not just one performance or exercise). Fault tree calculations provide a failure probability for the top event in the fault tree. The basic fault tree calculations establish a baseline relative risk value for the system. This probability depicts relative risk, not absolute risk. Subsequent calculations are made to evaluate the change in relative risk that would occur if system performance is improved or degraded. During the development effort of MSET, the fault tree analysis program used was SAPHIRE, an acronym for 'Systems Analysis Programs for Hands-on Integrated Reliability Evaluations.' Version 1 of the SAPHIRE code was sponsored by the Nuclear Regulatory Commission in 1987 as an innovative way to draw, edit, and analyze graphical fault trees, primarily for safe operation of nuclear power reactors. When the fault tree calculations are performed, the fault tree analysis program will produce several reports that can be used to analyze the MPC&A system. SAPHIRE produces reports showing risk importance factors for all basic events in the operational MC&A system. The risk importance information is used to examine the potential impacts when performance of certain basic events increases or decreases. The initial results produced by the SAPHIRE program are considered relative risk values. None of the results can be interpreted as absolute risk values since the basic event probability values represent estimates of risk associated with the performance of MPC&A tasks throughout the material balance area (MBA). The risk reduction ratio (RRR) for a basic event represents the decrease in total system risk that would result from improvement of that one event to a perfect performance level. Improvement of the basic event with the greatest RRR value produces a greater decrease in total system risk than improvement of any other basic event.
Basic events with the greatest potential for system risk reduction are assigned performance improvement values, and new fault tree calculations show the improvement in total system risk. The operational impact or cost-effectiveness from implementing the performance improvements can then be evaluated. The improvements being evaluated can be system performance improvements, or they can be potential, or actual, upgrades to the system. The risk increase ratio (RIR) for a basic event represents the increase in total system risk that would result from failure of that one event. Failure of the basic event with the greatest RIR value produces a greater increase in total system risk than failure of any other basic event. Basic events with the greatest potential for system risk increase are assigned failure performance values, and new fault tree calculations show the increase in total system risk. This evaluation shows the importance of preventing performance degradation of the basic events. SAPHIRE identifies combinations of basic events where concurrent failure of the events results in failure of the top event.
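A hedged sketch of how the RRR and RIR importance measures described above can be computed for a toy fault tree (event names and probabilities are invented for illustration): RRR is the baseline top-event probability divided by the probability with the event made perfect (p = 0), and RIR is the probability with the event failed (p = 1) divided by the baseline.

```python
# Risk importance measures on a toy (a OR b) AND (c OR d) fault tree.
from math import prod

def or_gate(ps):
    return 1.0 - prod(1.0 - p for p in ps)

def top_event(p):
    # Toy tree: (a OR b) AND (c OR d)
    return or_gate([p["a"], p["b"]]) * or_gate([p["c"], p["d"]])

base = {"a": 0.05, "b": 0.02, "c": 0.10, "d": 0.08}  # hypothetical values
baseline = top_event(base)

importance = {}
for event in base:
    perfect = dict(base, **{event: 0.0})  # event never fails
    failed = dict(base, **{event: 1.0})   # event always fails
    rrr = baseline / top_event(perfect)   # risk reduction ratio
    rir = top_event(failed) / baseline    # risk increase ratio
    importance[event] = (rrr, rir)
    print(f"{event}: RRR={rrr:.2f}  RIR={rir:.2f}")
```

Ranking events by RRR then points to the single improvement with the greatest payoff, exactly as the text describes.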
Time domain structures in a colliding magnetic flux rope experiment
NASA Astrophysics Data System (ADS)
Tang, Shawn Wenjie; Gekelman, Walter; Dehaas, Timothy; Vincena, Steve; Pribyl, Patrick
2017-10-01
Electron phase-space holes, regions of positive potential on the scale of the Debye length, have been observed in auroras as well as in laboratory experiments. These potential structures, also known as Time Domain Structures (TDS), are packets of intense electric field spikes that have significant components parallel to the local magnetic field. In an ongoing investigation at UCLA, TDS were observed on the surface of two magnetized flux ropes produced within the Large Plasma Device (LAPD). A barium oxide (BaO) cathode was used to produce an 18 m long magnetized plasma column and a lanthanum hexaboride (LaB6) source was used to create 11 m long kink unstable flux ropes. Using two probes capable of measuring the local electric and magnetic fields, correlation analysis was performed on tens of thousands of these structures and their propagation velocities, probability distribution function and spatial distribution were determined. The TDS became abundant as the flux ropes collided and appear to emanate from the reconnection region in between them. In addition, a preliminary analysis of the permutation entropy and statistical complexity of the data suggests that the TDS signals may be chaotic in nature. Work done at the Basic Plasma Science Facility (BaPSF) at UCLA which is supported by DOE and NSF.
Descriptive and inferential statistical methods used in burns research.
Al-Benna, Sammy; Al-Ajam, Yazan; Way, Benjamin; Steinstraesser, Lars
2010-05-01
Burns research articles utilise a variety of descriptive and inferential methods to present and analyse data. The aim of this study was to determine the descriptive methods (e.g. mean, median, SD, range, etc.) and survey the use of inferential methods (statistical tests) used in articles in the journal Burns. This study defined its population as all original articles published in the journal Burns in 2007. Letters to the editor, brief reports, reviews, and case reports were excluded. Study characteristics, use of descriptive statistics and the number and types of statistical methods employed were evaluated. Of the 51 articles analysed, 11 (22%) were randomised controlled trials, 18 (35%) were cohort studies, 11 (22%) were case control studies and 11 (22%) were case series. The study design and objectives were defined in all articles. All articles made use of continuous and descriptive data. Inferential statistics were used in 49 (96%) articles. Data dispersion was calculated by standard deviation in 30 (59%). Standard error of the mean was quoted in 19 (37%). The statistical software product was named in 33 (65%). Of the 49 articles that used inferential statistics, the tests were named in 47 (96%). The 6 most common tests used (Student's t-test (53%), analysis of variance/co-variance (33%), chi-square test (27%), Wilcoxon and Mann-Whitney tests (22%), Fisher's exact test (12%)) accounted for the majority (72%) of statistical methods employed. A specified significance level was named in 43 (88%) and the exact significance levels were reported in 28 (57%). Descriptive analysis and basic statistical techniques account for most of the statistical tests reported. This information should prove useful in deciding which tests should be emphasised in educating burn care professionals. These results highlight the need for burn care professionals to have a sound understanding of basic statistics, which is crucial in interpreting and reporting data.
Advice should be sought from professionals in the fields of biostatistics and epidemiology when using more advanced statistical techniques. Copyright 2009 Elsevier Ltd and ISBI. All rights reserved.
T's and Blues. Specialized Information Service.
ERIC Educational Resources Information Center
Do It Now Foundation, Phoenix, AZ.
This compilation of journal articles provides basic information on abuse of Talwin, a mild prescription painkiller (T's), and Pyribenzamine, a nonprescription antihistamine (Blues). These two drugs, taken in combination, produce an effect similar to that produced by heroin. Stories from "Drug Survival News," "Emergency…
Donald A. Haines; William A. Main; Eugene F. McNamara
1978-01-01
Describes factors that contribute to forest fires in Pennsylvania. Includes an analysis of basic statistics; distribution of fires during normal, drought, and wet years; fire cause, fire activity by day-of-week; multiple-fire day; and fire climatology.
CADDIS Volume 4. Data Analysis: Basic Analyses
Covers the use of statistical tests to determine whether an observation is outside the normal range of expected values, along with details of classification and regression trees (CART), regression analysis, the use of quantile regression analysis, CART in causal analysis, and simplifying or pruning the resulting trees.
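As a minimal illustration of the first item (testing whether an observation lies outside the normal range of expected values), the sketch below applies a simple z-score with a two-sided 95% cutoff to hypothetical reference-site data; it is not CADDIS code, and all values are invented:

```python
# Flag an observation that falls outside the normal range of reference values.
import statistics

reference = [7.1, 6.8, 7.4, 7.0, 6.9, 7.2, 7.3, 6.7, 7.1, 7.0]  # hypothetical
obs = 8.4  # new observation to test

mean = statistics.fmean(reference)
sd = statistics.stdev(reference)
z = (obs - mean) / sd          # standardized distance from the reference mean
outside = abs(z) > 1.96        # ~95% two-sided normal cutoff
print(f"z = {z:.2f}, outside normal range: {outside}")
```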
Geothermal Systems for School.
ERIC Educational Resources Information Center
Dinse, David H.
1998-01-01
Describes an award-winning school heating and cooling system in which two energy-efficient technologies, variable-flow pumping and geothermal heat pumps, were combined. The basic system schematic and annual energy use and cost savings statistics are provided. (GR)
Implementation of a worksite educational program focused on promoting healthy eating habits.
Tanagra, Dimitra; Panidis, Dimitris; Tountas, Yannis; Remoudaki, Elina; Alexopoulos, Evangelos C
2013-01-01
To estimate the effectiveness of a short-term educational-counseling worksite program focused on lipid intake, by monitoring possible changes in nutrition knowledge and eating habits. An 8-week educational program based on the Health Belief Model was implemented in a honey packaging and sales company in Greece. Twenty of the 29 employees initially enrolled completed the program. Knowledge level and eating habits were evaluated before and after the intervention by the "Nutrition Knowledge Questionnaire" and the "Food Habits Questionnaire". ANOVA, Spearman's rho and the paired Wilcoxon test were employed in the statistical analysis. Non-smokers and those with a higher educational level had healthier eating habits. Knowledge following the intervention was significantly improved concerning recommendations and basic food ingredients, but eating habit scores were not improved significantly, while intake of fried food increased. Short-term interventions may produce substantial improvement in knowledge but not necessarily modifications of unhealthy eating habits.
Modeling the Solar Convective Dynamo and Emerging Flux
NASA Astrophysics Data System (ADS)
Fan, Y.
2017-12-01
Significant advances have been made in recent years in global-scale fully dynamic three-dimensional convective dynamo simulations of the solar/stellar convective envelopes to reproduce some of the basic features of the Sun's large-scale cyclic magnetic field. It is found that the presence of the dynamo-generated magnetic fields plays an important role for the maintenance of the solar differential rotation, without which the differential rotation tends to become anti-solar (with a faster rotating pole instead of the observed faster rotation at the equator). Convective dynamo simulations are also found to produce emergence of coherent super-equipartition toroidal flux bundles with a statistically significant mean tilt angle that is consistent with the mean tilt of solar active regions. The emerging flux bundles are sheared by the giant cell convection into a forward leaning loop shape with its leading side (in the direction of rotation) pushed closer to the strong downflow lanes. Such asymmetric emerging flux pattern may lead to the observed asymmetric properties of solar active regions.
Spatially distributed fiber sensor with dual processed outputs
NASA Astrophysics Data System (ADS)
Xu, X.; Spillman, William B., Jr.; Claus, Richard O.; Meissner, K. E.; Chen, K.
2005-05-01
Given the rapid aging of the world's population, improvements in technology for automation of patient care and documentation are badly needed. We have previously demonstrated a 'smart bed' that can non-intrusively monitor a patient in bed, determining the patient's respiration, heart rate and movement without restrictive medical measurements. This is an application of spatially distributed integrating fiber optic sensors. The basic concept is that any patient movement that also moves an optical fiber within a specified area will produce a change in the optical signal. Two modal modulation approaches were considered: a statistical mode (STM) sensor and a high order mode excitation (HOME) sensor. The present design combines an STM sensor with a HOME sensor, using both modal modulation approaches. A special lens system allows only the high order modes of the optical fiber to be excited and coupled into the sensor. For handling output from the dual STM-HOME sensor, computer processing methods are discussed that offer comprehensive perturbation analysis for more reliable patient monitoring.
Techniques for generation of control and guidance signals derived from optical fields, part 2
NASA Technical Reports Server (NTRS)
Hemami, H.; Mcghee, R. B.; Gardner, S. R.
1971-01-01
The development is reported of a high resolution technique for the detection and identification of landmarks from spacecraft optical fields. By making use of nonlinear regression analysis, a method is presented whereby a sequence of synthetic images produced by a digital computer can be automatically adjusted to provide a least squares approximation to a real image. The convergence of the method is demonstrated by means of a computer simulation for both elliptical and rectangular patterns. Statistical simulation studies with elliptical and rectangular patterns show that the computational techniques developed are able to at least match human pattern recognition capabilities, even in the presence of large amounts of noise. Unlike most pattern recognition techniques, this ability is unaffected by arbitrary pattern rotation, translation, and scale change. Further development of the basic approach may eventually allow a spacecraft or robot vehicle to be provided with an ability to very accurately determine its spatial relationship to arbitrary known objects within its optical field of view.
ERIC Educational Resources Information Center
Earl, Lorna L.
This series of manuals describing and illustrating the Statistical Package for the Social Sciences (SPSS) was planned as a self-teaching instrument, beginning with the basics and progressing to an advanced level. Information on what the searcher must know to define the data and write a program for preliminary analysis is contained in manual 1,…
ERIC Educational Resources Information Center
Commonwealth Inst., London (England).
Commonwealth Ministries of Education were asked to report on how they are undertaking the improvement of the quality of basic education in their respective countries. The papers in this volume focus on: (1) Antigua; (2) Bermuda; (3) India; (4) St. Kitts and Nevis; and (5) Turks and Caicos Islands. Charts and statistical data support each country's…
How Does Sam Feel?: Children's Labelling and Drawing of Basic Emotions
ERIC Educational Resources Information Center
Brechet, Claire; Baldy, Rene; Picard, Delphine
2009-01-01
This study compares the ability of children aged from 6 to 11 to freely produce emotional labels based on detailed scenarios (labelling task), and their ability to depict basic emotions in their human figure drawing (subsequent drawing task). This comparison assesses the relevance of the use of a human figure drawing task in order to test…
Applied vs Basic Research: On Maintaining Your Balance with a Foot in Each Camp.
ERIC Educational Resources Information Center
Martin, David W.
The paper discusses a number of issues concerning the practical usefulness of cognitive psychology research, and presents a case study of pilot training methods to illustrate a model of research processes that produces outcomes which contribute to both basic and applied research goals. Research studies are described as varying in the degree to…
1991-12-01
effective (19:15). Figure 2 details a flowchart of the basic steps in prototyping. The basic concept behind prototyping is to quickly produce a working...One approach to overcoming this is to structure the document relative to the experience level of the user (14:49). A "novice" or beginner would
ERIC Educational Resources Information Center
Berney, Tomi D.; Barrera, Marbella
In its second year, the Bilingual Academic Services and Integrated Career Systems (BASICS) Program served 104 limited-English-proficient students at Bayside High School in Queens (New York City). Project goals were to develop English literacy skills, produce an organizing framework of thinking and language skills across the curriculum, generate a…
Enhancing Maintenance and Generalization of Incremental Rehearsal through Theory-Based Modifications
ERIC Educational Resources Information Center
Petersen-Brown, Shawna M.
2013-01-01
The attainment of basic early literacy skills at an early age is one way to ensure children become proficient readers as adults. Word recognition is an important basic early literacy skill that is related to reading fluency and overall reading competency. Incremental rehearsal (IR) is a flashcard technique that has produced strong outcomes for a…
Apes are intuitive statisticians.
Rakoczy, Hannes; Clüver, Annette; Saucke, Liane; Stoffregen, Nicole; Gräbener, Alice; Migura, Judith; Call, Josep
2014-04-01
Inductive learning and reasoning, as we use it both in everyday life and in science, is characterized by flexible inferences based on statistical information: inferences from populations to samples and vice versa. Many forms of such statistical reasoning have been found to develop late in human ontogeny, to depend on formal education and language, and to be fragile even in adults. Recent research, however, suggests that even preverbal human infants make use of intuitive statistics. Here, we conducted the first investigation of such intuitive statistical reasoning with non-human primates. In a series of 7 experiments, bonobos, chimpanzees, gorillas and orangutans drew flexible statistical inferences from populations to samples. These inferences, furthermore, were truly based on statistical information regarding the relative frequency distributions in a population, and not on absolute frequencies. Intuitive statistics in its most basic form is thus an evolutionarily ancient rather than a uniquely human capacity. Copyright © 2014 Elsevier B.V. All rights reserved.
The Development of Statistics Textbook Supported with ICT and Portfolio-Based Assessment
NASA Astrophysics Data System (ADS)
Hendikawati, Putriaji; Yuni Arini, Florentina
2016-02-01
This was development research aimed at designing and producing a model Statistics textbook supported with information and communication technology (ICT) and portfolio-based assessment. The book was designed for mathematics students at the college level to improve their ability in mathematical connection and communication. There were three stages in this research: define, design, and develop. The textbook consists of 10 chapters, each containing an introduction and core material with examples and exercises. The development phase began with an early design of the book (draft 1), which was then validated by experts. Revision of draft 1 produced draft 2, which was given a limited readability test. Revision of draft 2 then produced draft 3, which was trialled on a small sample to produce a valid model textbook. The data were analysed with descriptive statistics. The analysis showed that the Statistics textbook model supported with ICT and portfolio-based assessment is valid and meets the criteria of practicality.
High cumulants of conserved charges and their statistical uncertainties
NASA Astrophysics Data System (ADS)
Li-Zhu, Chen; Ye-Yin, Zhao; Xue, Pan; Zhi-Ming, Li; Yuan-Fang, Wu
2017-10-01
We study the influence of measured high cumulants of conserved charges on their associated statistical uncertainties in relativistic heavy-ion collisions. With a given number of events, the measured cumulants randomly fluctuate with an approximately normal distribution, while the estimated statistical uncertainties are found to be correlated with corresponding values of the obtained cumulants. Generally, with a given number of events, the larger the cumulants we measure, the larger the statistical uncertainties that are estimated. The error-weighted averaged cumulants are dependent on statistics. Despite this effect, however, it is found that the three sigma rule of thumb is still applicable when the statistics are above one million. Supported by NSFC (11405088, 11521064, 11647093), Major State Basic Research Development Program of China (2014CB845402) and Ministry of Science and Technology (MoST) (2016YFE0104800)
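The interplay between a measured cumulant and its estimated statistical uncertainty can be illustrated with a toy calculation (not the paper's analysis; sample size and distribution are invented): estimate the fourth cumulant C4 = m4 - 3*m2^2 of a sample of "events" and bootstrap its statistical error.

```python
# Toy estimate of a high cumulant (C4) and its bootstrapped uncertainty.
import random
import statistics

def c4(xs):
    """Fourth cumulant from central moments: C4 = m4 - 3*m2^2."""
    mu = statistics.fmean(xs)
    m2 = statistics.fmean([(x - mu) ** 2 for x in xs])
    m4 = statistics.fmean([(x - mu) ** 4 for x in xs])
    return m4 - 3.0 * m2 * m2

random.seed(1)
sample = [random.gauss(0.0, 1.0) for _ in range(5000)]  # toy "events"
estimate = c4(sample)  # fluctuates around 0 for Gaussian data

# Bootstrap the statistical uncertainty by resampling the events
boot = [c4(random.choices(sample, k=len(sample))) for _ in range(100)]
error = statistics.stdev(boot)
print(f"C4 = {estimate:.3f} +/- {error:.3f}")
```

Repeating this with different seeds shows the correlation the paper discusses: samples that happen to yield larger cumulant values also tend to yield larger bootstrapped errors.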
Detector noise statistics in the non-linear regime
NASA Technical Reports Server (NTRS)
Shopbell, P. L.; Bland-Hawthorn, J.
1992-01-01
The statistical behavior of an idealized linear detector in the presence of threshold and saturation levels is examined. It is assumed that the noise is governed by the statistical fluctuations in the number of photons emitted by the source during an exposure. Since physical detectors cannot have infinite dynamic range, our model illustrates that all devices have non-linear regimes, particularly at high count rates. The primary effect is a decrease in the statistical variance about the mean signal due to a portion of the expected noise distribution being removed via clipping. Higher order statistical moments are also examined, in particular, skewness and kurtosis. In principle, the expected distortion in the detector noise characteristics can be calibrated using flatfield observations with count rates matched to the observations. For this purpose, some basic statistical methods that utilize Fourier analysis techniques are described.
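A brief sketch of the saturation effect described above (illustrative parameters only): Poisson-distributed photon counts clipped at a saturation level show a variance below the unclipped Poisson variance, because part of the expected noise distribution is removed.

```python
# Simulate clipping of Poisson photon counts at a detector saturation level.
import math
import random
import statistics

random.seed(42)

def poisson(lam):
    """Knuth's method; adequate for moderate lambda."""
    threshold, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= threshold:
            return k
        k += 1

mean_rate = 100.0   # expected photon count per exposure (hypothetical)
saturation = 105    # detector full-well clips the upper tail (hypothetical)

counts = [poisson(mean_rate) for _ in range(5000)]
clipped = [min(c, saturation) for c in counts]

var_raw = statistics.variance(counts)     # ~ mean_rate for Poisson noise
var_clipped = statistics.variance(clipped)  # smaller: upper tail removed
print(f"raw variance: {var_raw:.1f}, clipped variance: {var_clipped:.1f}")
```

The clipped sample also becomes skewed, mirroring the higher-moment distortions (skewness, kurtosis) examined in the abstract.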
Statistical learning and language acquisition
Romberg, Alexa R.; Saffran, Jenny R.
2011-01-01
Human learners, including infants, are highly sensitive to structure in their environment. Statistical learning refers to the process of extracting this structure. A major question in language acquisition in the past few decades has been the extent to which infants use statistical learning mechanisms to acquire their native language. There have been many demonstrations showing infants’ ability to extract structures in linguistic input, such as the transitional probability between adjacent elements. This paper reviews current research on how statistical learning contributes to language acquisition. Current research is extending the initial findings of infants’ sensitivity to basic statistical information in many different directions, including investigating how infants represent regularities, learn about different levels of language, and integrate information across situations. These current directions emphasize studying statistical language learning in context: within language, within the infant learner, and within the environment as a whole. PMID:21666883
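The transitional-probability statistic mentioned above can be made concrete with a small sketch (an artificial syllable stream in the style of Saffran-type experiments; the "words" here are invented): TP(x→y) = count(xy) / count(x).

```python
# Transitional probabilities between adjacent syllables in a toy stream.
from collections import Counter

# Hypothetical artificial-language stream built from three-syllable "words"
stream = "bidakupadotigolabubidakugolabupadoti" * 5
syllables = [stream[i:i + 2] for i in range(0, len(stream), 2)]

pair_counts = Counter(zip(syllables, syllables[1:]))
syll_counts = Counter(syllables[:-1])

def transitional_probability(x, y):
    """TP(x -> y) = count(xy) / count(x)."""
    return pair_counts[(x, y)] / syll_counts[x]

# Within-word transitions are high; transitions across word boundaries are lower.
print(transitional_probability("bi", "da"))  # within the "word" bidaku
print(transitional_probability("ku", "pa"))  # across a word boundary
```

Dips in transitional probability like the second one are the statistical cue infants are thought to exploit for word segmentation.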
Sangamesh, N C; Vidya, K C; Pathi, Jugajyoti; Singh, Arpita
2017-01-01
To assess the awareness, attitude, and knowledge of basic life support (BLS) among medical, dental, and nursing students and faculties, and to propose the inclusion of BLS skills in the academic curriculum of undergraduate (UG) courses. Recognition, prevention, and effective management of life-threatening emergencies are the responsibility of health-care professionals. These situations can be successfully managed with proper knowledge and training in BLS skills, which can be imparted through structured resuscitation programs that are currently lacking in the academic curriculum. A questionnaire study consisting of 20 questions was conducted among 659 participants at the Kalinga Institute of Dental Sciences and Kalinga Institute of Medical Sciences, KIIT University. Medical junior residents, BDS faculties, interns, nursing faculties, and 3rd-year and final-year UG students from both medical and dental colleges were chosen. The statistical analysis was carried out using SPSS software version 20.0 (Armonk, NY: IBM Corp). After collecting the data, the values were statistically analyzed and tabulated. Statistical analysis was performed using the Mann-Whitney U-test, and results with P < 0.05 were considered statistically significant. Our participants were aware of BLS and showed a positive attitude toward it, whereas knowledge about BLS was lacking, with a statistically significant P value. By introducing BLS regularly into the academic curriculum and through routine hands-on workshops, all health-care providers could become well versed in the BLS skills needed to manage life-threatening emergencies effectively.
Effect of between-category similarity on basic-level superiority in pigeons
Lazareva, Olga F.; Soto, Fabián A.; Wasserman, Edward A.
2010-01-01
Children categorize stimuli at the basic level faster than at the superordinate level. We hypothesized that between-category similarity may affect this basic-level superiority effect. Dissimilar categories may be easy to distinguish at the basic level but be difficult to group at the superordinate level, whereas similar categories may be easy to group at the superordinate level but be difficult to distinguish at the basic level. Consequently, similar basic-level categories may produce a superordinate-before-basic learning trend, whereas dissimilar basic-level categories may result in a basic-before-superordinate learning trend. We tested this hypothesis in pigeons by constructing superordinate-level categories out of basic-level categories with known similarity. In Experiment 1, we experimentally evaluated the between-category similarity of four basic-level photographic categories using multiple fixed interval-extinction training (Astley & Wasserman, 1992). We used the resultant similarity matrices in Experiment 2 to construct two superordinate-level categories from basic-level categories with high between-category similarity (cars and persons; chairs and flowers). We then trained pigeons to concurrently classify those photographs into either the proper basic-level category or the proper superordinate-level category. Under these conditions, the pigeons learned the superordinate-level discrimination faster than the basic-level discrimination, confirming our hypothesis that basic-level superiority is affected by between-category similarity. PMID:20600696
Adaptive statistical pattern classifiers for remotely sensed data
NASA Technical Reports Server (NTRS)
Gonzalez, R. C.; Pace, M. O.; Raulston, H. S.
1975-01-01
A technique for the adaptive estimation of nonstationary statistics necessary for Bayesian classification is developed. The basic approach to the adaptive estimation procedure consists of two steps: (1) an optimal stochastic approximation of the parameters of interest and (2) a projection of the parameters in time or position. A divergence criterion is developed to monitor algorithm performance. Comparative results of adaptive and nonadaptive classifier tests are presented for simulated four dimensional spectral scan data.
Increasing Effectiveness and Efficiency Through Risk-Based Deployments
2015-12-01
Shaw and Henry McKay, both University of Chicago professors, began using maps to better understand juvenile delinquency in Chicago, IL. In the... "André-Michel Guerry's Ordonnateur Statistique: The First Statistical Calculator?," The American Statistician 66, no. 3 (August 1, 2012): 195-200... micro or macro levels using basic inferential statistics." 5. Protecting Civil Rights and Liberties: It is also important to note that a risk
1988-12-01
... a computer simulation for a small value of r. Figure 5. A typical pulse shape for r = 8192. Figure 6. Pulse duration as a function of r from the statistical simulations, assuming a spontaneous lifetime of 1 s. ... scaling factor from the statistical simulations. Figure 10. Basic pulse characteristics and associated Bloch vector angles.
Transfer of SIMNET Training in the Armor Officer Basic Course
1991-01-01
group correctly performed more tasks in the posttest, but the difference was not statistically significant for these small samples. Gains from pretest... to posttest were not compared statistically, but the field-trained group showed little average gain. Based on these results and other supporting data... that serve as a control group, and (b) SIMNET classes after the change that serve as a treatment group. The comparison is termed quasi-experimental
Statistical Characteristics of Single Sort of Grape Bulgarian Wines
NASA Astrophysics Data System (ADS)
Boyadzhiev, D.
2008-10-01
The aim of this paper is to evaluate the differences in the values of the 8 basic physicochemical indices of single-sort (single grape variety) Bulgarian wines, white and red, which are obligatory for the standardization of finished production in the winery. Statistically significant differences in the values across sorts and vintages are established, and possibilities for identifying the sort and the vintage on the basis of these indices by applying discriminant analysis are discussed.
Assembly of Ultra-Dense Nanowire-Based Computing Systems
2006-06-30
Characterized basic device element properties and statistics; demonstrated product of sums (POS), validating assembled 2-bit adder structures; demonstrated... linear region (Vds = 10 mV) from the peak g = 3 µS at |Vg − VT| = 0.13 V using the charge control model, which represents more than a factor of 10 improvement over... disrupted by ionizing particles or thermal fluctuation. Further, when working with such small charges, it is statistically possible that logic
Code of Federal Regulations, 2010 CFR
2010-01-01
... either a fee simple interest or life estate interest in the farm for which FSA established a farm basic... Transition Payment Producer Contract, a Tobacco Transition Payment Quota Holder Successor In Interest Contract, or a Tobacco Transition Payment Producer Successor In Interest Contract. Contract payment means a...
[Bayesian statistics in medicine -- part II: main applications and inference].
Montomoli, C; Nichelatti, M
2008-01-01
Bayesian statistics is not only used when one is dealing with 2-way tables, but it can be used for inferential purposes. Using the basic concepts presented in the first part, this paper aims to give a simple overview of Bayesian methods by introducing its foundation (Bayes' theorem) and then applying this rule to a very simple practical example; whenever possible, the elementary processes at the basis of analysis are compared to those of frequentist (classical) statistical analysis. The Bayesian reasoning is naturally connected to medical activity, since it appears to be quite similar to a diagnostic process.
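The diagnostic analogy in the abstract can be made concrete with Bayes' theorem applied to a test result; the prevalence, sensitivity, and specificity below are made-up illustrative numbers, not taken from the paper.

```python
# Illustration of Bayesian diagnostic reasoning: Bayes' theorem turns a prior
# (disease prevalence) into a posterior given a positive test. Numbers are
# invented for the example.
def posterior_given_positive(prevalence, sensitivity, specificity):
    """P(disease | positive test) via Bayes' theorem."""
    p_positive = (sensitivity * prevalence
                  + (1 - specificity) * (1 - prevalence))  # total probability
    return sensitivity * prevalence / p_positive

post = posterior_given_positive(prevalence=0.01, sensitivity=0.95, specificity=0.90)
print(f"P(disease | +) = {post:.3f}")  # ≈ 0.088 despite 95% sensitivity
```

The counterintuitively low posterior for a rare disease is exactly the kind of result that makes the Bayesian view of diagnosis instructive next to the frequentist one.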
Effect of Silica Fume on two-stage Concrete Strength
NASA Astrophysics Data System (ADS)
Abdelgader, H. S.; El-Baden, A. S.
2015-11-01
Two-stage concrete (TSC) is an innovative concrete that does not require vibration for placing and compaction. TSC is a simple concept; it is made from the same basic constituents as traditional concrete: cement, coarse aggregate, sand and water, as well as mineral and chemical admixtures. As its name suggests, it is produced through a two-stage process. First, washed coarse aggregate is placed into the formwork in situ. A specifically designed self-compacting grout is then introduced into the form from the lowest point under gravity pressure to fill the voids, cementing the aggregate into a monolith. The hardened concrete is dense, homogeneous, and in general has improved engineering properties and durability. This paper presents the results of a research effort studying the effect of silica fume (SF) and superplasticizer admixtures (SP) on the compressive and tensile strength of TSC using various combinations of water-to-cement ratio (w/c) and cement-to-sand ratio (c/s). Thirty-six concrete mixes with different grout constituents were tested. From each mix, twenty-four standard cylinder samples of size 150 mm × 300 mm containing crushed aggregate were produced. The tested samples were made from combinations of w/c equal to 0.45, 0.55 and 0.85, and three c/s values: 0.5, 1 and 1.5. Silica fume was added at a dosage of 6% of the weight of cement, while superplasticizer was added at a dosage of 2% of cement weight. Results indicated that both the tensile and compressive strength of TSC can be statistically derived as a function of w/c and c/s with good correlation coefficients. The basic principle of traditional concrete, that an increase in water-to-cement ratio leads to a reduction in compressive strength, was shown to hold true for the TSC specimens tested. Using a combination of both silica fume and superplasticizers caused a significant increase in strength relative to control mixes.
Critical thinking skills of basic baccalaureate and Accelerated second-degree nursing students.
Newton, Sarah E; Moore, Gary
2013-01-01
The purpose of this study was to describe the critical thinking (CT) skills of basic baccalaureate (basic-BSN) and accelerated second-degree (ASD) nursing students at nursing program entry. Many authors propose that CT in nursing should be viewed as a developmental process that increases as students' experiences with it change. However, there is a dearth of literature describing basic-BSN and ASD students' CT skills from an evolutionary perspective. The study design was exploratory descriptive. The results indicated that ASD students had higher CT scores on a quantitative critical thinking assessment at program entry than basic-BSN students. CT data are needed across the nursing curriculum from basic-BSN and ASD students in order for nurse educators to develop cohort-specific pedagogical approaches that facilitate critical thinking in nursing and produce nurses with good CT skills for the future.
Basic Sciences Fertilizing Clinical Microbiology and Infection Management.
Baquero, Fernando
2017-08-15
Basic sciences constitute the most abundant sources of creativity and innovation, as they are based on the passion of knowing. Basic knowledge, in close and fertile contact with medical and public health needs, produces distinct advancements in applied sciences. Basic sciences play the role of stem cells, providing material and semantics to construct differentiated tissues and organisms and enabling specialized functions and applications. However, eventually processes of "practice deconstruction" might reveal basic questions, as in de-differentiation of tissue cells. Basic sciences, microbiology, infectious diseases, and public health constitute an epistemological gradient that should also be an investigational continuum. The coexistence of all these interests and their cross-fertilization should be favored by interdisciplinary, integrative research organizations working simultaneously in the analytical and synthetic dimensions of scientific knowledge. © The Author 2017. Published by Oxford University Press for the Infectious Diseases Society of America.
ERIC Educational Resources Information Center
Allswang, John M.
1986-01-01
This article provides two short microcomputer gradebook programs. The programs, written in BASIC for the IBM-PC and Apple II, provide statistical information about class performance and calculate grades either on a normal distribution or based on teacher-defined break points. (JDH)
Urban Data Book : Volume 1. Urban Data - Atlanta-Miami
DOT National Transportation Integrated Search
1975-11-01
A quick reference compilation of certain population, socio-economic, employment, and modal split characteristics of the 35 largest Standard Metropolitan Statistical Areas (SMSA) in the United States is presented. The three basic groups of urban data ...
NASA Astrophysics Data System (ADS)
Goldbery, R.; Tehori, O.
SEDPAK provides a comprehensive software package for operation of a settling tube and sand analyzer (2-0.063 mm) and includes data-processing programs for statistical and graphic output of results. The programs are menu-driven and written in APPLESOFT BASIC, conforming with APPLE 3.3 DOS. Data storage and retrieval from disc is an important feature of SEDPAK. Additional features include condensation of raw settling data via standard size-calibration curves to yield statistical grain-size parameters, plots of grain-size frequency distributions, and cumulative log/probability curves. The program also has a module for processing grain-size frequency data from sieved samples, as well as an option for automatic data processing and graphic output of a sequential or nonsequential array of samples on one side of a disc.
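The kind of condensation SEDPAK performs, turning a cumulative grain-size curve into a few statistical parameters, can be sketched as percentile interpolation followed by Folk-and-Ward-style graphic measures; the cumulative curve below is synthetic, and the formulas are the simplified two- and three-percentile variants.

```python
# Sketch of condensing a cumulative grain-size curve (phi units) into graphic
# mean and sorting. The curve is synthetic; interpolation is linear.
def percentile_phi(phi, cum, p):
    """Linearly interpolate the phi value at cumulative percent p."""
    for i in range(1, len(cum)):
        if cum[i] >= p:
            f = (p - cum[i - 1]) / (cum[i] - cum[i - 1])
            return phi[i - 1] + f * (phi[i] - phi[i - 1])
    return phi[-1]

phi = [0.0, 1.0, 2.0, 3.0, 4.0]        # grain size in phi units
cum = [2.0, 25.0, 55.0, 85.0, 100.0]   # cumulative weight percent (synthetic)

p16, p50, p84 = (percentile_phi(phi, cum, p) for p in (16, 50, 84))
mean = (p16 + p50 + p84) / 3           # graphic mean (Folk & Ward form)
sorting = (p84 - p16) / 2              # simplified graphic sorting
print(f"mean = {mean:.2f} phi, sorting = {sorting:.2f} phi")
```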
Hezel, Marcus; von Usslar, Kathrin; Kurzweg, Thiemo; Lörincz, Balazs B; Knecht, Rainald
2016-04-01
This article reviews the methodical and statistical basics of designing a trial, with a special focus on the process of defining and choosing endpoints and cutpoints as the foundations of clinical research, and ultimately that of evidence-based medicine. There has been a significant progress in the treatment of head and neck cancer in the past few decades. Currently available treatment options can have a variety of different goals, depending e.g. on tumor stage, among other factors. The outcome of a specific treatment in clinical trials is measured using endpoints. Besides classical endpoints, such as overall survival or organ preservation, other endpoints like quality of life are becoming increasingly important in designing and conducting a trial. The present work is based on electronic research and focuses on the solid methodical and statistical basics of a clinical trial, on the structure of study designs and on the presentation of various endpoints.
NASA Astrophysics Data System (ADS)
Glushak, P. A.; Markiv, B. B.; Tokarchuk, M. V.
2018-01-01
We present a generalization of Zubarev's nonequilibrium statistical operator method based on the principle of maximum Renyi entropy. In the framework of this approach, we obtain transport equations for the basic set of parameters of the reduced description of nonequilibrium processes in a classical system of interacting particles using Liouville equations with fractional derivatives. For a classical system of particles in a medium with a fractal structure, we obtain a non-Markovian diffusion equation with fractional spatial derivatives. For a concrete model of the frequency dependence of a memory function, we obtain a generalized Cattaneo-type diffusion equation with the spatial and temporal fractality taken into account. We present a generalization of nonequilibrium thermofield dynamics within Zubarev's nonequilibrium statistical operator method in the framework of Renyi statistics.
Inhalers and nebulizers: basic principles and preliminary measurements
NASA Astrophysics Data System (ADS)
Misik, Ondrej; Lizal, Frantisek; Asl, Vahid Farhikhteh; Belka, Miloslav; Jedelsky, Jan; Elcner, Jakub; Jicha, Miroslav
2018-06-01
Inhalers are hand-held devices which are used for administration of therapeutic aerosols via inhalation. Nebulizers are larger devices serving for home and hospital care using inhaled medication. This contribution describes the basic principles of dispersion of aerosol particles used in various types of inhalers and nebulizers, and lists the basic physical mechanisms contributing to the deposition of inhaled particles in the human airways. The second part of this article presents experimental setup, methodology and preliminary results of particle size distributions produced by several selected inhalers and nebulizers.
Pennington, Kyla; McGregor, Emma; Beasley, Clare L; Everall, Ian; Cotter, David; Dunn, Michael J
2004-01-01
A major cause of poor resolution in the alkaline pH range of two-dimensional electrophoresis (2-DE) gels is unsatisfactory separation of basic proteins in the first dimension. We have compared methods for the separation of basic proteins in the isoelectric focusing dimension using human brain proteins. The combined use of anodic cup-loading and the hydroxyethyl disulfide-containing solution DeStreak produced better resolution in both analytical and micropreparative protein-loaded 2-DE gels than the other methods investigated.
The use of simulation in teaching the basic sciences.
Eason, Martin P
2013-12-01
To assess the current use of simulation in medical education, specifically the teaching of the basic sciences, toward the goal of improved integration. Simulation is increasingly being used by institutions to teach the basic sciences, and preliminary data suggest that it is an effective tool with increased retention and learner satisfaction. Medical education is undergoing tremendous change. One direction of that change is increasing integration of the basic and clinical sciences to improve the efficiency and quality of medical education, and ultimately to improve patient care. Integration is thought to improve the understanding of basic science conceptual knowledge and to better prepare learners for clinical practice. Simulation, because of its unique effects on learning, is currently being used successfully by many institutions as a means to produce that integration through the teaching of the basic sciences. Preliminary data indicate that simulation is an effective tool for basic science education and garners high learner satisfaction.
Janssen, Stefan; Meyer, Andreas; Vordermark, Dirk; Steinmann, Diana
2010-12-01
The internet has emerged as a source of medical information in recent years. There is a confusing number of medical websites of widely varying quality. Websites of radiotherapy institutions could offer a safe and easy-to-control way to address patients' requests. A total of 205 internet presences of German radiotherapy institutions were analyzed in June 2009 (nonuniversity hospitals n = 108, medical practices n = 62, university hospitals n = 35). Each homepage was evaluated using verifiable criteria concerning basic information, service, and medical issues. The quality of information published via the internet by different radiotherapy institutions varied widely. Basic information such as telephone numbers, operating hours, and directions was provided by 96.7%, 40%, and 50.7% of sites, respectively. 85% of the websites introduced the staff, 50.2% supplied photos, and 14% provided further information on the attending physicians. The mean number of links to other websites was 5.4, and the mean number of articles supplying medical information for patients was 4.6. Medical practices and university hospitals had statistically significantly more informative articles and links to other websites than nonuniversity hospitals. No statistically significant differences were found in most other categories, such as service issues and basic information. Internet presences of radiotherapy institutions offer the chance to supply patients with professional and individualized medical information. While some websites are already using this opportunity, others lack basic information or user-friendliness.
Vertical integration of basic science in final year of medical education.
Rajan, Sudha Jasmine; Jacob, Tripti Meriel; Sathyendra, Sowmya
2016-01-01
Development of health professionals with the ability to integrate, synthesize, and apply knowledge gained through medical college is greatly hampered by a system of delivery that is compartmentalized and piecemeal. There is a need to integrate basic sciences with clinical teaching to enable application in clinical care. To study the benefit and acceptance of vertical integration of basic science in the final-year MBBS undergraduate curriculum, after Institutional Ethics Clearance, neuroanatomy refresher classes with clinical application to neurological diseases were held as part of the final-year posting in two medical units. Feedback was collected, and pre- and post-tests that assessed application and synthesis were conducted. Summative assessment was compared with a control group of students who had standard teaching in the other two medical units. In-depth interviews were conducted with two willing participants and two teachers who did neurology bedside teaching. The majority (>80%) found the classes useful and interesting. There was a statistically significant improvement in the post-test scores, and a statistically significant difference between the intervention and control groups' scores during summative assessment (76.2 vs. 61.8, P < 0.01). Students felt that the classes reinforced knowledge, motivated self-directed learning, enabled correlations, improved understanding, put things in perspective, gave confidence, aided application, and enabled them to follow discussions during clinical teaching. Vertical integration of basic science in the final year was beneficial and resulted in knowledge gain and improved summative scores. The classes were found useful and interesting, and were thought by the majority of students to help in clinical care and application.
ERIC Educational Resources Information Center
Soroker, N.; Kasher, A.; Giora, R.; Batori, G.; Corn, C.; Gil, M.; Zaidel, E.
2005-01-01
We examined the effect of localized brain lesions on processing of the basic speech acts (BSAs) of question, assertion, request, and command. Both left and right cerebral damage produced significant deficits relative to normal controls, and left brain damaged patients performed worse than patients with right-sided lesions. This finding argues…
Gender Issues in Basic Education and Vocational Training. The Gender Manual Series.
ERIC Educational Resources Information Center
Anderson, Mary B.
When and how to integrate girls and women into Agency for International Development (A.I.D.) projects in basic education and vocational training is the focus of this manual. The volume generally follows the format developed in the Topical Reference Guide series produced by A.I.D.'s Center for Development Information and Evaluation (CDIE). The…
NASA Astrophysics Data System (ADS)
Manea, L. R.; Hristian, L.; Leon, A. L.; Popa, A.
2016-08-01
The most important applications of electrospun polymeric nanofibers are by far those in the biomedical field. From the biological point of view, almost all human tissues and organs consist of nanofibrous structures; examples include bone, dentine, cartilage, tendons and skin. All of these are characterized by different fibrous structures, hierarchically organized at the nanometer scale. Electrospinning is one of the nanotechnologies that make it possible to obtain such structures for cell cultures, alongside other technologies such as self-assembly and phase separation. The basic materials used to produce electrospun nanofibers can be natural or synthetic, and polymeric, ceramic or composite in nature. These materials are selected depending on the nature and structure of the tissue to be regenerated: for the regeneration of smooth tissues, polymeric basic materials are processed by electrospinning, while supports for the regeneration of hard tissues mainly require ceramic materials or composite structures that permit embedding the bioactive substances in distinct zones of the matrix. This work presents recent studies concerning the basic materials used to obtain electrospun polymeric nanofibers, and real possibilities for producing and implementing these nanofibers in medical bioengineering applications.
Sounds Exaggerate Visual Shape
ERIC Educational Resources Information Center
Sweeny, Timothy D.; Guzman-Martinez, Emmanuel; Ortega, Laura; Grabowecky, Marcia; Suzuki, Satoru
2012-01-01
While perceiving speech, people see mouth shapes that are systematically associated with sounds. In particular, a vertically stretched mouth produces a /woo/ sound, whereas a horizontally stretched mouth produces a /wee/ sound. We demonstrate that hearing these speech sounds alters how we see aspect ratio, a basic visual feature that contributes…
Is the Interculturalisation of Chile's Universities a Real Possibility?
ERIC Educational Resources Information Center
Williamson, Guillermo
2017-01-01
Knowledge deemed worthy of classification as "truth" is not produced only in classic positivist research, or in research recognised by the official state accreditation system; it is also produced by research-action, research + development, experimentation and systematisation. One of the most basic aspects of academic work is…
Heterogeneity and scale of sustainable development in cities.
Brelsford, Christa; Lobo, José; Hand, Joe; Bettencourt, Luís M A
2017-08-22
Rapid worldwide urbanization is at once the main cause and, potentially, the main solution to global sustainable development challenges. The growth of cities is typically associated with increases in socioeconomic productivity, but it also creates strong inequalities. Despite a growing body of evidence characterizing these heterogeneities in developed urban areas, not much is known systematically about their most extreme forms in developing cities and their consequences for sustainability. Here, we characterize the general patterns of income and access to services in a large number of developing cities, with an emphasis on an extensive, high-resolution analysis of the urban areas of Brazil and South Africa. We use detailed census data to construct sustainable development indices in hundreds of thousands of neighborhoods and show that their statistics are scale-dependent and point to the critical role of large cities in creating higher average incomes and greater access to services within their national context. We then quantify the general statistical trajectory toward universal basic service provision at different scales to show that it is characterized by varying levels of inequality, with initial increases in access being typically accompanied by growing disparities over characteristic spatial scales. These results demonstrate how extensions of these methods to other goals and data can be used over time and space to produce a simple but general quantitative assessment of progress toward internationally agreed sustainable development goals.
Double-row vs single-row rotator cuff repair: a review of the biomechanical evidence.
Wall, Lindley B; Keener, Jay D; Brophy, Robert H
2009-01-01
A review of the current literature shows a difference between the biomechanical properties of double-row and single-row rotator cuff repairs. Rotator cuff tears commonly necessitate surgical repair; however, the optimal repair technique continues to be investigated. Recently, double-row repairs have been considered an alternative to single-row repair, allowing a greater coverage area for healing and a possibly stronger repair. We reviewed the literature for all biomechanical studies comparing double-row vs single-row repair techniques. Inclusion criteria were studies using cadaveric, animal, or human models that directly compared double-row vs single-row repair techniques, written in the English language, and published in peer-reviewed journals. Identified articles were reviewed to provide a comprehensive conclusion about the biomechanical strength and integrity of the repair techniques. Fifteen studies were identified and reviewed. Nine studies showed a statistically significant advantage for double-row repair with regard to biomechanical strength, failure, and gap formation. Three studies produced results that did not show any statistical advantage. Five studies that directly compared footprint reconstruction all demonstrated that the double-row repair was superior to a single-row repair in restoring anatomy. The current literature reveals that the biomechanical properties of a double-row rotator cuff repair are superior to those of a single-row repair. Basic Science Study; single-row vs. double-row rotator cuff repair.
NASA Astrophysics Data System (ADS)
Stapp, Henry P.
2011-11-01
The principle of sufficient reason asserts that anything that happens does so for a reason: no definite state of affairs can come into being unless there is a sufficient reason why that particular thing should happen. This principle is usually attributed to Leibniz, although the first recorded Western philosopher to use it was Anaximander of Miletus. The demand that nature be rational, in the sense that it be compatible with the principle of sufficient reason, conflicts with a basic feature of contemporary orthodox physical theory, namely the notion that nature's response to the probing action of an observer is determined by pure chance, and hence on the basis of absolutely no reason at all. This appeal to pure chance can be deemed to have no rational fundamental place in reason-based Western science. It is argued here, on the basis of the other basic principles of quantum physics, that in a world that conforms to the principle of sufficient reason, the usual quantum statistical rules will naturally emerge at the pragmatic level in cases where the reason behind nature's choice of response is unknown, but that the usual statistics can become biased in an empirically manifest way when the reason for the choice is empirically identifiable. It is shown here that if the statistical laws of quantum mechanics were biased in this way, then the basically forward-in-time unfolding of empirical reality described by orthodox quantum mechanics would generate the appearance of backward-in-time effects of the kind that have been reported in the scientific literature.
Paré, Pierre; Lee, Joanna; Hawes, Ian A
2010-03-01
To determine whether strategies to counsel and empower patients with heartburn-predominant dyspepsia could improve health-related quality of life. Using a cluster randomized, parallel group, multicentre design, nine centres were assigned to provide either basic or comprehensive counselling to patients (age range 18 to 50 years) presenting with heartburn-predominant upper gastrointestinal symptoms, who would be considered for drug therapy without further investigation. Patients were treated for four weeks with esomeprazole 40 mg once daily, followed by six months of treatment that was at the physician's discretion. The primary end point was the baseline change in Quality of Life in Reflux and Dyspepsia (QOLRAD) questionnaire score. A total of 135 patients from nine centres were included in the intention-to-treat analysis. There was a statistically significant baseline improvement in all domains of the QOLRAD questionnaire in both study arms at four and seven months (P<0.0001). After four months, the overall mean change in QOLRAD score appeared greater in the comprehensive counselling group than in the basic counselling group (1.77 versus 1.47, respectively); however, this difference was not statistically significant (P=0.07). After seven months, the overall mean baseline change in QOLRAD score between the comprehensive and basic counselling groups was not statistically significant (1.69 versus 1.56, respectively; P=0.63). A standardized, comprehensive counselling intervention showed a positive initial trend in improving quality of life in patients with heartburn-predominant uninvestigated dyspepsia. Further investigation is needed to confirm the potential benefits of providing patients with comprehensive counselling regarding disease management.
Feasibility of digital image colorimetry--application for water calcium hardness determination.
Lopez-Molinero, Angel; Tejedor Cubero, Valle; Domingo Irigoyen, Rosa; Sipiera Piazuelo, Daniel
2013-01-15
The interpretation and relevance of the basic RGB colors in digital image-based colorimetry are treated in this paper. The studies were carried out using the chromogenic model formed by the reaction between Ca(II) ions and glyoxal bis(2-hydroxyanil), which produced orange-red colored solutions in alkaline media. The individual basic color data (RGB) and the total color intensity, I(tot), were the original variables treated by factorial analysis. The evaluation showed that the highest variance of the system and the highest analytical sensitivity were associated with the G color. However, after study by Fourier transform, the basic R color was recognized as an important feature of the information: it manifested as an intrinsic characteristic that appeared differentiated in terms of low frequency in the Fourier transform. The principal components analysis showed that the variance of the system could be mostly retained in the first principal component, but was dependent on all basic colors. The colored complex was also applied and validated as a digital image colorimetric method for the determination of Ca(II) ions. RGB intensities were linearly correlated with Ca(II) in the range 0.2-2.0 mg L(-1). Under the best conditions, using the green color, a simple and reliable method for Ca determination could be developed. Its detection limit was established (3s criterion) as 0.07 mg L(-1), and the reproducibility was lower than 6% for 1.0 mg L(-1) Ca. Other chromatic parameters were evaluated as dependent calibration variables; their representativeness, variance and sensitivity were discussed in order to select the best analytical variable. The potential of the procedure as a field-ready method, suitable for 'in situ' application with minimal experimental needs, was demonstrated. Analyses of Ca in different real water samples were carried out.
Tap water from the city network, bottled mineral water, and natural river water were analyzed, and the results were compared and evaluated statistically. Validity was assessed against the alternative techniques of flame atomic absorption spectroscopy and titrimetry. Differences were observed, but they were consistent with the applied methods. Copyright © 2012 Elsevier B.V. All rights reserved.
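The linear RGB calibration and 3s detection-limit criterion described in this record can be sketched numerically. The green-channel intensities and blank readings below are hypothetical stand-ins, not the paper's data.

```python
# Sketch: channel intensity regressed linearly on Ca(II) concentration,
# detection limit by the 3s criterion (3 * blank SD / slope).
# All readings below are invented for illustration.
import statistics

def calibrate(concs, intensities):
    """Ordinary least-squares slope and intercept for a univariate calibration."""
    n = len(concs)
    mx, my = sum(concs) / n, sum(intensities) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(concs, intensities))
             / sum((x - mx) ** 2 for x in concs))
    return slope, my - slope * mx

def detection_limit(blank_readings, slope):
    """3s criterion: LOD = 3 * standard deviation of the blank / slope."""
    return 3 * statistics.stdev(blank_readings) / abs(slope)

# Hypothetical green-channel readings over the 0.2-2.0 mg/L working range.
concs = [0.2, 0.5, 1.0, 1.5, 2.0]
green = [12.1, 29.8, 60.3, 90.0, 119.7]   # roughly linear response
slope, intercept = calibrate(concs, green)
lod = detection_limit([0.9, 1.1, 1.0, 0.8, 1.2], slope)
```

With a steeper calibration slope the same blank noise yields a lower detection limit, which is why the abstract selects the most sensitive (green) channel.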
SPAGETTA: a Multi-Purpose Gridded Stochastic Weather Generator
NASA Astrophysics Data System (ADS)
Dubrovsky, M.; Huth, R.; Rotach, M. W.; Dabhi, H.
2017-12-01
SPAGETTA is a new multisite/gridded multivariate parametric stochastic weather generator (WG). Site-specific precipitation occurrence and amount are modelled by a Markov chain and a Gamma distribution, the non-precipitation variables are modelled by an autoregressive (AR) model conditioned on precipitation occurrence, and the spatial coherence of all variables is modelled following Wilks' (2009) approach. SPAGETTA may be run in two modes. Mode 1: it is run as a classical WG, which is calibrated using weather series from multiple sites; only then may it produce arbitrarily long synthetic series mimicking the spatial and temporal structure of the calibration data. To generate weather series representing the future climate, the WG parameters are modified according to a climate change scenario, typically derived from GCM or RCM simulations. Mode 2: the user provides only basic information (not necessarily realistic) on the temporal and spatial auto-correlation structure of the weather variables and their mean annual cycle; the generator itself derives the parameters of the underlying AR model, which produces the multi-site weather series. Optionally, the user may add a spatially varying trend, which is superimposed on the synthetic series. The contribution consists of the following parts: (a) Model of the WG. (b) Validation of the WG in terms of spatial temperature and precipitation characteristics, including characteristics of spatial hot/cold/dry/wet spells. (c) Results of the climate change impact experiment, in which the WG parameters representing spatial and temporal variability are modified using the climate change scenarios and the effect on the above spatial validation indices is analysed. In this experiment, the WG is calibrated using the E-OBS gridded daily weather data for several European regions, and the climate change scenarios are derived from selected RCM simulations (CORDEX database).
(d) The second mode of operation will be demonstrated by results obtained while developing a methodology for assessing the collective significance of trends in multi-site weather series. The performance of the proposed test statistics is assessed based on a large number of realisations of synthetic series produced by the WG, assuming a given statistical structure and trend of the weather series.
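The site-specific precipitation model described above (Markov-chain occurrence plus Gamma-distributed amounts) can be sketched for a single site. Parameter values are illustrative, and SPAGETTA's spatial-coherence layer is omitted entirely.

```python
# Single-site sketch: two-state first-order Markov chain for wet/dry
# occurrence, Gamma distribution for wet-day amounts. All parameter
# values are invented for illustration.
import random

def generate_precip(n_days, p_wet_after_dry, p_wet_after_wet,
                    shape, scale, seed=0):
    rng = random.Random(seed)
    series, wet = [], False
    for _ in range(n_days):
        p = p_wet_after_wet if wet else p_wet_after_dry
        wet = rng.random() < p                      # Markov occurrence step
        series.append(rng.gammavariate(shape, scale) if wet else 0.0)
    return series

series = generate_precip(10000, p_wet_after_dry=0.2, p_wet_after_wet=0.6,
                         shape=0.8, scale=5.0)
wet_fraction = sum(1 for x in series if x > 0) / len(series)
```

For these transition probabilities the stationary wet-day probability is 0.2 / (1 - 0.6 + 0.2) = 1/3, so the simulated wet fraction should settle near that value over a long series.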
Multispectral Terrain Background Simulation Techniques For Use In Airborne Sensor Evaluation
NASA Astrophysics Data System (ADS)
Weinberg, Michael; Wohlers, Ronald; Conant, John; Powers, Edward
1988-08-01
A background simulation code developed at Aerodyne Research, Inc., called AERIE, is designed to reflect the major sources of clutter that are of concern to staring and scanning sensors of the type being considered for various airborne threat-warning applications (against both aircraft and missiles). The code is a first-principles model that can be used to produce a consistent image of the terrain for various spectral bands, i.e., provide the proper scene correlation both spectrally and spatially. The code utilizes both topographic and cultural features to model terrain, typically from DMA data, with a statistical overlay of the critical underlying surface properties (reflectance, emittance, and thermal factors) to simulate the resulting texture in the scene. Strong solar scattering from water surfaces is included, with allowance for wind-driven surface roughness. Clouds can be superimposed on the scene using physical cloud models and an analytical representation of the reflectivity obtained from scattering off spherical particles. The scene generator is augmented by collateral codes that allow for the generation of images at finer resolution. These codes provide interpolation of the basic DMA databases using fractal procedures that preserve the high-frequency power-spectral-density behavior of the original scene. Scenes are presented illustrating variations in altitude, radiance, resolution, material, thermal factors, and emissivities. The basic models utilized for simulation of the various scene components incorporate various "engineering level" approximations to reduce the computational complexity of the simulation.
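The fractal interpolation idea mentioned above can be illustrated with a one-dimensional random midpoint-displacement sketch that refines a coarse elevation profile while injecting high-frequency roughness. The profile values and roughness parameters are invented, and AERIE's actual procedure may differ.

```python
# Hedged sketch of fractal refinement by random midpoint displacement:
# each pass doubles the resolution, displacing new midpoints by noise
# whose amplitude shrinks geometrically (Hurst-like scaling).
import random

def refine(profile, roughness, hurst, levels, seed=0):
    rng = random.Random(seed)
    amp = roughness
    for _ in range(levels):
        finer = []
        for a, b in zip(profile, profile[1:]):
            finer.append(a)
            finer.append((a + b) / 2 + rng.gauss(0, amp))  # displaced midpoint
        finer.append(profile[-1])
        profile = finer
        amp *= 2 ** -hurst          # smaller displacements at finer scales
    return profile

coarse = [100.0, 140.0, 120.0, 160.0]   # invented coarse elevations (m)
fine = refine(coarse, roughness=10.0, hurst=0.8, levels=3)
```

Each level maps n points to 2n - 1, so three levels take the 4-point profile to 25 points; the endpoints of the coarse profile are preserved exactly.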
Recent trends in publications of US vascular surgery program directors.
Hingorani, Anil; DerDerian, Trevor; Gallagher, James; Ascher, Enrico
2014-08-01
We reviewed the number of vascular publications listed in PubMed from 2001 to 2009 for US program directors in vascular surgery and suggest that this can be used as a benchmark. PubMed listed 3284 citations published during this time period. The average number of citations in PubMed per program director was 3.68 per year. The top third produced 67% of the publications. Journal of Vascular Surgery publications made up 37%. No statistical differences could be ascertained between the regions of the country and the number of publications. Compared to the first six years, the number of citations decreased during the last three years (13%). During the first period, there were no programs with no publications and seven with no Journal of Vascular Surgery publication. During the last three years, there were seven programs with no publications and 19 programs with no Journal of Vascular Surgery publications. The number of aortic-endovascular citations peaked in 2002 and 2003, while the number of open and basic science citations decreased. Imaging citations peaked in 2003-2005, and carotid-endovascular, vein-endovascular, and thoracic aortic-endovascular citations climbed. The decrease in the number of citations/program/year raises concern about the level of academic activity in vascular surgery. Overall, the annual distribution of the topic of these citations represents a continued shift from open to endovascular cases and decreasing basic science citations. © The Author(s) 2013 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.
ERIC Educational Resources Information Center
Joyner, Jeane; Leiva, Miriam
1988-01-01
Plastic Easter eggs are useful devices for teaching basic mathematics skills, from counting activities to graphing. Eggs are used to reinforce addition, subtraction, and multiplication skills; column addition, estimation, statistics, and other topics are introduced. Sample activities are described. (JL)
ERIC Educational Resources Information Center
Guerin, Stephen M.; Guerin, Clark L.
1979-01-01
Discusses a phenomenon called Extrasensory Perception (ESP) whereby information is gained directly by the mind without the use of the ordinary senses. Experiments in ESP and the basic equipment and methods are presented. Statistical evaluation of ESP experimental results are also included. (HM)
Risk of Bacterial Meningitis in Children with Cochlear Implants
Future Newspaper Managers Learn Basics at Oregon.
ERIC Educational Resources Information Center
Halverson, Roy
1978-01-01
Describes an experimental program that prepares students for careers in newspaper management with a sequence of courses in journalism, accounting, marketing, management, finance, and statistics, ending with an internship in the business office of a daily or weekly newspaper. (RL)
Quality Control Statistics
CIDR is dedicated to producing the highest quality data for our investigators. These cumulative quality control statistics are based on data from 419 released CIDR Program
The Emergence of Contextual Social Psychology.
Pettigrew, Thomas F
2018-07-01
Social psychology experiences recurring so-called "crises." This article maintains that these episodes actually mark advances in the discipline; these "crises" have enhanced relevance and led to greater methodological and statistical sophistication. New statistical tools have allowed social psychologists to begin to achieve a major goal: placing psychological phenomena in their larger social contexts. This growing trend is illustrated with numerous recent studies; they demonstrate how cultures and social norms moderate basic psychological processes. Contextual social psychology is finally emerging.
GenASiS Basics: Object-oriented utilitarian functionality for large-scale physics simulations
Cardall, Christian Y.; Budiardja, Reuben D.
2015-06-11
Aside from numerical algorithms and problem setup, large-scale physics simulations on distributed-memory supercomputers require more basic utilitarian functionality, such as physical units and constants; display to the screen or standard output device; message passing; I/O to disk; and runtime parameter management and usage statistics. Here we describe and make available Fortran 2003 classes furnishing extensible object-oriented implementations of this sort of rudimentary functionality, along with individual 'unit test' programs and larger example problems demonstrating their use. These classes compose the Basics division of our developing astrophysics simulation code GenASiS (General Astrophysical Simulation System), but their fundamental nature makes them useful for physics simulations in many fields.
Generalized Models for Rock Joint Surface Shapes
Du, Shigui; Hu, Yunjin; Hu, Xiaofei
2014-01-01
Generalized models of joint surface shapes are the foundation for mechanism studies on the mechanical effects of rock joint surface shapes. Based on extensive field investigations of rock joint surface shapes, generalized models for three levels of shape, named macroscopic outline, surface undulating shape, and microcosmic roughness, were established through statistical analyses of 20,078 rock joint surface profiles. The relative amplitude of the profile curves was used as the borderline for the division of the different levels of shape. The study results show that the macroscopic outline has three basic forms (planar, arc-shaped, and stepped); the surface undulating shape has three basic forms (planar, undulating, and stepped); and the microcosmic roughness has two basic forms (smooth and rough). PMID:25152901
[Statistical analysis using freely-available "EZR (Easy R)" software].
Kanda, Yoshinobu
2015-10-01
Clinicians must often perform statistical analyses for purposes such as evaluating preexisting evidence and designing or executing clinical studies. R is a free software environment for statistical computing. R supports many statistical analysis functions, but does not incorporate a statistical graphical user interface (GUI). The R commander provides an easy-to-use basic-statistics GUI for R. However, the statistical functionality of the R commander is limited, especially in the field of biostatistics. Therefore, the author added several important statistical functions to the R commander and named it "EZR (Easy R)", which is now being distributed on the following website: http://www.jichi.ac.jp/saitama-sct/. EZR allows the application of statistical functions that are frequently used in clinical studies, such as survival analyses (including competing risk analyses and the use of time-dependent covariates), by point-and-click access. In addition, by saving the script automatically created by EZR, users can learn R script writing, maintain the traceability of the analysis, and ensure that the statistical process is overseen by a supervisor.
Alternative Fuels Data Center: Natural Gas Fuel Basics
Unlike fossil-derived natural gas, renewable natural gas (RNG), also known as biomethane, is produced from organic materials, such as waste from landfills and livestock, through anaerobic digestion.
ERIC Educational Resources Information Center
Roy, Ken
2010-01-01
Batteries commonly used in flashlights and other household devices produce hydrogen gas as a product of zinc electrode corrosion. The amount of gas produced is affected by the batteries' design and charge rate. Dangerous levels of hydrogen gas can be released if battery types are mixed, batteries are damaged, batteries are of different ages, or…
12 CFR 618.8325 - Disclosure of loan documents.
Code of Federal Regulations, 2011 CFR
2011-01-01
...) Loan means a loan made to a farmer, rancher, or producer or harvester of aquatic products, for any agricultural or aquatic purpose and other credit needs of the borrower, including financing for basic..., ranchers, and producers or harvesters of aquatic products. (4) Loan contract means any written agreement...
12 CFR 618.8325 - Disclosure of loan documents.
Code of Federal Regulations, 2010 CFR
2010-01-01
...) Loan means a loan made to a farmer, rancher, or producer or harvester of aquatic products, for any agricultural or aquatic purpose and other credit needs of the borrower, including financing for basic..., ranchers, and producers or harvesters of aquatic products. (4) Loan contract means any written agreement...
Extensive Air Showers in the Classroom
ERIC Educational Resources Information Center
Badala, A.; Blanco, F.; La Rocca, P.; Pappalardo, G. S.; Pulvirenti, A.; Riggi, F.
2007-01-01
The basic properties of extensive air showers of particles produced in the interaction of a high-energy primary cosmic ray in the Earth's atmosphere are discussed in the context of educational cosmic ray projects involving undergraduate students and high-school teams. Simulation results produced by an air shower development code were made…
Jager, Tjalling
2013-02-05
The individuals of a species are not equal. These differences frustrate experimental biologists and ecotoxicologists who wish to study the response of a species (in general) to a treatment. In the analysis of data, differences between model predictions and observations on individual animals are usually treated as random measurement error around the true response. These deviations, however, are mainly caused by real differences between the individuals (e.g., differences in physiology and in initial conditions). Understanding these intraspecies differences, and accounting for them in the data analysis, will improve our understanding of the response to the treatment we are investigating and allow for a more powerful, less biased, statistical analysis. Here, I explore a basic scheme for statistical inference to estimate parameters governing stress that allows individuals to differ in their basic physiology. This scheme is illustrated using a simple toxicokinetic-toxicodynamic model and a data set for growth of the springtail Folsomia candida exposed to cadmium in food. This article should be seen as proof of concept; a first step in bringing more realism into the statistical inference for process-based models in ecotoxicology.
Czerniecki, Joseph M; Turner, Aaron P; Williams, Rhonda M; Thompson, Mary Lou; Landry, Greg; Hakimi, Kevin; Speckman, Rebecca; Norvell, Daniel C
2017-01-01
The objective of this study was the development of AMPREDICT-Mobility, a tool to predict the probability of independence in either basic or advanced (iBASIC or iADVANCED) mobility 1 year after dysvascular major lower extremity amputation. Two prospective cohort studies during consecutive 4-year periods (2005-2009 and 2010-2014) were conducted at seven medical centers. Multiple demographic and biopsychosocial predictors were collected in the periamputation period among individuals undergoing their first major amputation because of complications of peripheral arterial disease or diabetes. The primary outcomes were iBASIC and iADVANCED mobility, as measured by the Locomotor Capabilities Index. Combined data from both studies were used for model development and internal validation. Backwards stepwise logistic regression was used to develop the final prediction models. The discrimination and calibration of each model were assessed. Internal validity of each model was assessed with bootstrap sampling. Twelve-month follow-up was reached by 157 of 200 (79%) participants. Among these, 54 (34%) did not achieve iBASIC mobility, 103 (66%) achieved at least iBASIC mobility, and 51 (32%) also achieved iADVANCED mobility. Predictive factors associated with reduced odds of achieving iBASIC mobility were increasing age, chronic obstructive pulmonary disease, dialysis, diabetes, prior history of treatment for depression or anxiety, and very poor to fair self-rated health. Those who were white, were married, and had at least a high-school degree had a higher probability of achieving iBASIC mobility. The odds of achieving iBASIC mobility increased with increasing body mass index up to 30 kg/m² and decreased with increasing body mass index thereafter. The prediction model of iADVANCED mobility included the same predictors with the exception of diabetes, chronic obstructive pulmonary disease, and education level.
Both models showed strong discrimination with C statistics of 0.85 and 0.82, respectively. The mean difference in predicted probabilities for those who did and did not achieve iBASIC and iADVANCED mobility was 33% and 29%, respectively. Tests for calibration and observed vs predicted plots suggested good fit for both models; however, the precision of the estimates of the predicted probabilities was modest. Internal validation through bootstrapping demonstrated some overoptimism of the original model development, with the optimism-adjusted C statistic for iBASIC and iADVANCED mobility being 0.74 and 0.71, respectively, and the discrimination slope 19% and 16%, respectively. AMPREDICT-Mobility is a user-friendly prediction tool that can inform the patient undergoing a dysvascular amputation and the patient's provider about the probability of independence in either basic or advanced mobility at each major lower extremity amputation level. Copyright © 2016 Society for Vascular Surgery. All rights reserved.
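The C statistics quoted above measure discrimination: the probability that a randomly chosen participant who achieved the outcome received a higher predicted probability than one who did not. A minimal sketch, using invented predictions rather than the AMPREDICT data:

```python
# C statistic (area under the ROC curve) computed directly from its
# pairwise-concordance definition. Predictions below are invented.
def c_statistic(probs, outcomes):
    """Fraction of (event, non-event) pairs ranked correctly; ties count half."""
    pos = [p for p, y in zip(probs, outcomes) if y == 1]
    neg = [p for p, y in zip(probs, outcomes) if y == 0]
    concordant = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return concordant / (len(pos) * len(neg))

probs    = [0.9, 0.8, 0.7, 0.6, 0.4, 0.3, 0.2]   # hypothetical predictions
outcomes = [1,   1,   0,   1,   0,   0,   0]      # 1 = achieved mobility
c = c_statistic(probs, outcomes)                   # 11 of 12 pairs concordant
```

A C statistic of 0.5 is chance-level ranking and 1.0 is perfect separation, which is why the internally validated values of 0.71-0.74 above still indicate useful, if weaker, discrimination.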
Interpretation of statistical results.
García Garmendia, J L; Maroto Monserrat, F
2018-02-21
The appropriate interpretation of statistical results is crucial to understanding advances in medical science. Statistical tools allow us to transform the uncertainty and apparent chaos of nature into measurable parameters applicable to our clinical practice. Understanding the meaning and actual scope of these instruments is essential for researchers, for research funders, and for professionals who require continual updating based on good evidence and support for decision making. Various aspects of study designs, results and statistical analyses are reviewed, aiming to facilitate their comprehension from the basics through to what is most common but least well understood, and offering a constructive, non-exhaustive but realistic view. Copyright © 2018 Elsevier España, S.L.U. y SEMICYUC. All rights reserved.
1978-12-01
...Poisson processes. The method is valid for Poisson processes with any given intensity function. The basic thinning algorithm is modified to exploit several refinements which reduce computer execution time by approximately one-third. The basic and modified thinning programs are compared with the Poisson decomposition and gap-statistics algorithm, which is easily implemented for Poisson processes with intensity functions of the form exp(a0 + a1*t + a2*t^2). The thinning programs are competitive in both execution
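The basic thinning algorithm this record describes can be sketched in a few lines: generate candidate events from a homogeneous Poisson process at a bounding rate, then accept each candidate t with probability lambda(t) divided by that rate. The intensity coefficients below are made-up example values, not those of the report.

```python
# Thinning for a nonhomogeneous Poisson process: candidates arrive at
# rate lam_max; each is kept with probability intensity(t) / lam_max.
# Valid whenever intensity(t) <= lam_max on [0, t_end].
import math
import random

def thin_poisson(intensity, lam_max, t_end, seed=0):
    rng = random.Random(seed)
    events, t = [], 0.0
    while True:
        t += rng.expovariate(lam_max)          # next candidate arrival
        if t > t_end:
            return events
        if rng.random() < intensity(t) / lam_max:
            events.append(t)                   # accept this candidate

# Intensity of the form exp(a0 + a1*t + a2*t^2); with a2 < 0 its maximum
# on [0, 10] is at t = -a1/(2*a2) = 2, giving the exact bound exp(1.2).
a0, a1, a2 = 1.0, 0.2, -0.05
lam = lambda t: math.exp(a0 + a1 * t + a2 * t * t)
events = thin_poisson(lam, lam_max=math.exp(1.2), t_end=10.0)
```

The tighter the bound lam_max, the fewer candidates are rejected, which is the kind of refinement the report exploits to cut execution time.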
Multilevel modelling: Beyond the basic applications.
Wright, Daniel B; London, Kamala
2009-05-01
Over the last 30 years statistical algorithms have been developed to analyse datasets that have a hierarchical/multilevel structure. Particularly within developmental and educational psychology these techniques have become common where the sample has an obvious hierarchical structure, like pupils nested within a classroom. We describe two areas beyond the basic applications of multilevel modelling that are important to psychology: modelling the covariance structure in longitudinal designs and using generalized linear multilevel modelling as an alternative to methods from signal detection theory (SDT). Detailed code for all analyses is described using packages for the freeware R.
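The abstract's detailed R code is not reproduced here; as a language-neutral illustration of the hierarchical structure such models address, a random-intercept setting (pupils nested within classrooms) can be simulated and its intraclass correlation estimated with the classical one-way ANOVA estimator. All parameter values are invented for the example.

```python
# Simulate pupils nested in classrooms under a random-intercept model,
# then estimate the intraclass correlation (ICC): the share of outcome
# variance attributable to the classroom level. Parameters are invented.
import random

rng = random.Random(42)
n_class, n_pupil = 40, 25
sigma_class, sigma_pupil = 2.0, 4.0      # true ICC = 4 / (4 + 16) = 0.2

data = []
for _ in range(n_class):
    class_effect = rng.gauss(0, sigma_class)        # classroom deviation
    data.append([rng.gauss(50 + class_effect, sigma_pupil)
                 for _ in range(n_pupil)])

grand = sum(sum(g) for g in data) / (n_class * n_pupil)
means = [sum(g) / n_pupil for g in data]
msb = n_pupil * sum((m - grand) ** 2 for m in means) / (n_class - 1)
msw = sum((x - m) ** 2 for g, m in zip(data, means)
          for x in g) / (n_class * (n_pupil - 1))
icc = (msb - msw) / (msb + (n_pupil - 1) * msw)     # ANOVA ICC estimator
```

A non-negligible ICC is precisely what makes single-level analyses misleading for nested samples and motivates the multilevel models the article discusses.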
SPARSKIT: A basic tool kit for sparse matrix computations
NASA Technical Reports Server (NTRS)
Saad, Youcef
1990-01-01
Presented here are the main features of a tool package for manipulating and working with sparse matrices. One of the goals of the package is to provide basic tools to facilitate the exchange of software and data between researchers in sparse matrix computations. The starting point is the Harwell/Boeing collection of matrices for which the authors provide a number of tools. Among other things, the package provides programs for converting data structures, printing simple statistics on a matrix, plotting a matrix profile, and performing linear algebra operations with sparse matrices.
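SPARSKIT itself is a Fortran package; purely as an illustration of the kind of "simple statistics on a matrix" such a tool kit prints, here is a sketch over a coordinate-format (COO) sparse matrix. The statistics chosen are typical examples, not SPARSKIT's exact output.

```python
# Simple statistics for a sparse matrix stored as (row, col, value)
# triples: nonzero count, density, row fill, and bandwidth.
def matrix_stats(nrows, ncols, entries):
    """entries: list of (row, col, value) triples for the nonzeros."""
    row_counts = [0] * nrows
    for r, _, _ in entries:
        row_counts[r] += 1
    return {
        "nnz": len(entries),
        "density": len(entries) / (nrows * ncols),
        "max_nonzeros_per_row": max(row_counts),
        "bandwidth": max(abs(r - c) for r, c, _ in entries),
    }

# A 4x4 tridiagonal-style example matrix.
entries = [(0, 0, 2.0), (0, 1, -1.0), (1, 0, -1.0), (1, 1, 2.0),
           (1, 2, -1.0), (2, 1, -1.0), (2, 2, 2.0), (3, 3, 2.0)]
stats = matrix_stats(4, 4, entries)
```

Such summaries (density, bandwidth, fill per row) are exactly what one inspects before choosing a storage scheme or solver for a Harwell/Boeing-style matrix.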
NASA Astrophysics Data System (ADS)
Collados-Lara, Antonio-Juan; Pulido-Velazquez, David; Pardo-Iguzquiza, Eulogio
2017-04-01
Assessing the impacts of potential future climate change scenarios on precipitation and temperature is essential for designing adaptive strategies in water resources systems. The objective of this work is to analyze the possibilities of different statistical downscaling methods to generate potential future scenarios for an Alpine catchment from historical data and the available climate model simulations performed in the frame of the EU CORDEX project. The initial information employed to define these downscaling approaches is the historical climatic data (taken from the Spain02 project for the period 1971-2000, with a spatial resolution of 12.5 km) and the future series provided by climate models for the horizon period 2071-2100. We have used information coming from nine climate model simulations (obtained from five different regional climate models (RCMs) nested within four different global climate models (GCMs)) from the European CORDEX project. In our application we have focused on the Representative Concentration Pathway (RCP) 8.5 emissions scenario, which is the most unfavorable scenario considered in the Fifth Assessment Report (AR5) of the Intergovernmental Panel on Climate Change (IPCC). For each RCM we have generated future climate series for the period 2071-2100 by applying two different approaches, bias correction and delta change, and five different transformation techniques (first-moment correction, first- and second-moment correction, regression functions, quantile mapping using distribution-derived transformation, and quantile mapping using empirical quantiles) for both of them. Ensembles of the obtained series are proposed to obtain more representative potential future climate scenarios to be employed in studying potential impacts.
In this work we propose a non-equally-weighted combination of the future series, giving more weight to those coming from models (in the delta change approaches) or combinations of models and techniques that provide a better approximation to the basic and drought statistics of the historical data. A multi-objective analysis using basic statistics (mean, standard deviation and asymmetry coefficient) and drought statistics (duration, magnitude and intensity) has been performed to identify which models are better in terms of goodness of fit in reproducing the historical series. The drought statistics have been obtained from the Standardized Precipitation Index (SPI) series using the theory of runs. This analysis allows us to discriminate the best RCM and the best combination of model and correction technique in the bias-correction method. We have also analyzed the possibilities of using different stochastic weather generators to approximate the basic and drought statistics of the historical series. These analyses have been performed in our case study in a lumped and in a distributed way in order to assess sensitivity to the spatial scale. The statistics of the future temperature series obtained with the different ensemble options are quite homogeneous, but the precipitation shows a higher sensitivity to the adopted method and spatial scale. The global increments in the mean temperature values are 31.79 %, 31.79 %, 31.03 % and 31.74 % for the distributed bias-correction, distributed delta-change, lumped bias-correction and lumped delta-change ensembles respectively, and in the precipitation they are -25.48 %, -28.49 %, -26.42 % and -27.35 % respectively. Acknowledgments: This research work has been partially supported by the GESINHIMPADAPT project (CGL2013-48424-C2-2-R) with Spanish MINECO funds. We would also like to thank the Spain02 and CORDEX projects for the data provided for this study, and the R package qmap.
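One of the transformation techniques listed above, quantile mapping with empirical quantiles, can be sketched as follows: a modelled value is replaced by the observed value at the same empirical quantile. The observed and modelled series here are invented stand-ins, not the Spain02 or CORDEX data.

```python
# Empirical quantile mapping: map a value through the model's empirical
# CDF into the observed distribution. Series below are invented.
def quantile_map(value, model_hist, obs_hist):
    """Return the observed value at `value`'s quantile in the model series."""
    ms, os = sorted(model_hist), sorted(obs_hist)
    rank = sum(1 for m in ms if m <= value) / len(ms)   # empirical CDF value
    idx = max(int(rank * len(os)) - 1, 0)
    return os[idx]

obs   = [2.0, 3.0, 5.0, 7.0, 11.0]   # "observed" precipitation (mm/day)
model = [1.0, 1.5, 2.5, 3.5, 5.5]    # "model" underestimates by about half
corrected = quantile_map(2.5, model, obs)   # model median -> observed median
```

Because the correction is quantile-wise, it repairs systematic bias in the whole distribution rather than only in the mean, which is why the study compares it against the simpler first- and second-moment corrections.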
ERIC Educational Resources Information Center
Heston, Wilma
The three-volume set of materials describes and presents the results to date of a federally-funded project to develop Pashto-English and English-Pashto dictionaries. The goal was to produce a list of 12,000 basic Pashto words for English-speaking users. Words were selected based on frequency in various kinds of oral and written materials, and were…
Quality control analysis : part II : soil and aggregate base course.
DOT National Transportation Integrated Search
1966-07-01
This is the second of the three reports on the quality control analysis of highway construction materials. : It deals with the statistical evaluation of results from several construction projects to determine the basic pattern of variability with res...
Urban Data Book : Volume 2. Urban Data - Milwaukee-Washington, Notes and Technical Appendixes
DOT National Transportation Integrated Search
1975-11-01
A quick reference compilation of certain population, socio-economic, employment, and modal split characteristics of the 35 largest Standard Metropolitan Statistical Areas (SMSA) in the United States is presented. The three basic groups of urban data ...
Quality control analysis : part III : concrete and concrete aggregates.
DOT National Transportation Integrated Search
1966-11-01
This is the third and last report on the Quality Control Analysis of highway construction materials. : It deals with the statistical evaluation of data from several construction projects to determine the basic pattern of variability with respect to s...
Didierlaurent, Ludovic; Houzet, Laurent; Morichaud, Zakia; Darlix, Jean-Luc; Mougel, Marylène
2008-01-01
Reverse transcription of the genomic RNA by reverse transcriptase occurs soon after HIV-1 infection of target cells. The viral nucleocapsid (NC) protein chaperones this process via its nucleic acid annealing activities and its interactions with the reverse transcriptase enzyme. To function, NC needs its two conserved zinc fingers and flanking basic residues. We recently reported a new role for NC, whereby it negatively controls reverse transcription in the course of virus formation. Indeed, deleting its zinc fingers causes reverse transcription activation in virus producer cells. To investigate this new NC function, we used viruses with subtle mutations in the conserved zinc fingers and its flanking domains. We monitored by quantitative PCR the HIV-1 DNA content in producer cells and in produced virions. Results showed that the two intact zinc-finger structures are required for the temporal control of reverse transcription by NC throughout the virus replication cycle. The N-terminal basic residues also contributed to this new role of NC, while Pro-31 residue between the zinc fingers and Lys-59 in the C-terminal region did not. These findings further highlight the importance of NC as a major target for anti-HIV-1 drugs. PMID:18641038
Microbial production of biopolymers from the renewable resource wheat straw.
Gasser, E; Ballmann, P; Dröge, S; Bohn, J; König, H
2014-10-01
Production of poly-β-hydroxybutyrate (PHB) and the basic chemical compound lactate from the agricultural crop wheat straw as a renewable carbon resource. A thermal pressure hydrolysis procedure for the breakdown of wheat straw was applied. By this means, the wheat straw was converted into a partially solubilized hemicellulosic fraction, consisting of sugar monomers, and an insoluble cellulosic fraction, containing cellulose, lignin and a small portion of hemicellulose. The insoluble cellulosic fraction was further hydrolysed into monomers by commercial enzymes. The production of PHB from the sugar monomers originating from hemicellulose or cellulose was achieved by the isolates Bacillus licheniformis IMW KHC 3 and Bacillus megaterium IMW KNaC 2. The basic chemical compound lactate, a starting compound for the production of polylactide (PLA), was formed by some heterofermentative lactic acid bacteria (LAB) able to grow on xylose from the hemicellulosic wheat straw hydrolysate. Two strains were selected which were able to produce PHB from the sugars of both the hemicellulosic and the cellulosic fractions of the wheat straw. In addition, some of the LAB tested were capable of producing lactate from the hemicellulosic hydrolysate. The renewable resource wheat straw could serve as a substrate for microbiologically produced basic chemicals and biodegradable plastics. © 2014 The Society for Applied Microbiology.
Densely calculated facial soft tissue thickness for craniofacial reconstruction in Chinese adults.
Shui, Wuyang; Zhou, Mingquan; Deng, Qingqiong; Wu, Zhongke; Ji, Yuan; Li, Kang; He, Taiping; Jiang, Haiyan
2016-09-01
Craniofacial reconstruction (CFR) is used to recreate a likeness of the original facial appearance for an unidentified skull; the technique has been applied in both forensics and archeology. Many CFR techniques rely on the average facial soft tissue thickness (FSTT) at anatomical landmarks, which is related to ethnicity, age, sex, body mass index (BMI), etc. Previous studies typically employed FSTT at sparsely distributed anatomical landmarks, where differing landmark definitions can make results difficult to compare. In the present study, a total of 90,198 one-to-one correspondence skull vertices were established on 171 head CT-scans and the FSTT of each corresponding vertex was calculated (hereafter referred to as densely calculated FSTT) for statistical analysis and CFR. Basic descriptive statistics (i.e., mean and standard deviation) for densely calculated FSTT are reported separately by sex and age. Results show that at 76.12% of vertices the FSTT is greater in males than in females, with the exception of vertices around the zygoma, zygomatic arch and mid-lateral orbit. Statistically significant sex-related differences are found at 55.12% of all vertices, and statistically significant age-related differences between the three age groups are observed at a majority of vertices (73.31% for males and 63.43% for females). Five non-overlapping categories are defined and their descriptive statistics (i.e., mean, standard deviation, local standard deviation and percentage) are reported. Multiple appearances are produced using the densely calculated FSTT of various age and sex groups, and a quantitative assessment examines how the choice of FSTT affects the accuracy of CFR. In conclusion, this study provides a new perspective on the distribution of FSTT and constructs a new densely calculated FSTT database for craniofacial reconstruction. Copyright © 2016. Published by Elsevier Ireland Ltd.
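The per-vertex descriptive statistics described above (mean and standard deviation of FSTT across subjects at each corresponding vertex) can be sketched in miniature. The data below are purely hypothetical toy values, not measurements from the study, which used 90,198 vertices across 171 CT scans:

```python
from statistics import mean, stdev

# Hypothetical FSTT values (mm) for 3 corresponding vertices,
# each measured across 5 subjects (toy data for illustration only).
fstt = [
    [4.1, 4.5, 3.9, 4.3, 4.0],   # vertex 0
    [6.8, 7.2, 6.5, 7.0, 6.9],   # vertex 1
    [3.2, 3.0, 3.4, 3.1, 3.3],   # vertex 2
]

# Basic descriptive statistics per vertex, of the kind the study reports
# separately by sex and age group.
stats = [(mean(v), stdev(v)) for v in fstt]
for i, (m, s) in enumerate(stats):
    print(f"vertex {i}: mean = {m:.2f} mm, sd = {s:.2f} mm")
```

In the study these statistics are additionally stratified by sex and age group before being used for reconstruction.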
Hayat, Matthew J.; Powell, Amanda; Johnson, Tessa; Cadwell, Betsy L.
2017-01-01
Statistical literacy and knowledge are needed to read and understand the public health literature. The purpose of this study was to quantify the basic and advanced statistical methods used in public health research. We randomly sampled 216 published articles from seven top-tier general public health journals. Studies were reviewed by two readers, and a standardized data collection form was completed for each article. Data were analyzed with descriptive statistics and frequency distributions. Results were summarized for the statistical methods used in the literature, including descriptive and inferential statistics, modeling, advanced statistical techniques, and statistical software used. Approximately 81.9% of articles reported an observational study design and 93.1% of articles were substantively focused. Descriptive statistics in table or graphical form were reported in more than 95% of the articles, and statistical inference was reported in more than 76% of the studies reviewed. These results reveal the types of statistical methods currently used in the public health literature. Although this study did not obtain information on what should be taught, information on the statistical methods being used is useful for curriculum development in graduate health sciences education, as well as for making informed decisions about continuing education for public health professionals. PMID:28591190
Thermodynamics and statistical mechanics. [thermodynamic properties of gases]
NASA Technical Reports Server (NTRS)
1976-01-01
The basic thermodynamic properties of gases are reviewed and the relations between them are derived from the first and second laws. The elements of statistical mechanics are then formulated and the partition function is derived. The classical form of the partition function is used to obtain the Maxwell-Boltzmann distribution of kinetic energies in the gas phase and the equipartition of energy theorem is given in its most general form. The thermodynamic properties are all derived as functions of the partition function. Quantum statistics are reviewed briefly and the differences between the Boltzmann distribution function for classical particles and the Fermi-Dirac and Bose-Einstein distributions for quantum particles are discussed.
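The chain sketched in this abstract, from the partition function to the thermodynamic properties, can be summarized in standard statistical-mechanics notation (a summary in conventional symbols, not equations quoted from the NASA report itself):

```latex
% Canonical partition function and the thermodynamic properties derived from it
Z = \sum_i e^{-\varepsilon_i / k_B T},
\qquad
F = -k_B T \ln Z,
\qquad
U = k_B T^2 \left( \frac{\partial \ln Z}{\partial T} \right)_V

S = \frac{U - F}{T},
\qquad
p = k_B T \left( \frac{\partial \ln Z}{\partial V} \right)_T

% Maxwell--Boltzmann distribution of kinetic energies, and equipartition:
f(\varepsilon) \propto \sqrt{\varepsilon}\, e^{-\varepsilon / k_B T},
\qquad
\langle \varepsilon \rangle = \tfrac{1}{2} k_B T
\ \text{per quadratic degree of freedom}
```

The Fermi-Dirac and Bose-Einstein distributions mentioned at the end replace the Boltzmann factor with $1/(e^{(\varepsilon-\mu)/k_B T} \pm 1)$ for quantum particles.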
Calculation of precise firing statistics in a neural network model
NASA Astrophysics Data System (ADS)
Cho, Myoung Won
2017-08-01
A precise prediction of neural firing dynamics is required to understand the function of, and the learning process in, a biological neural network, which operates on exact spike timings. Basically, predicting firing statistics is a delicate many-body problem because the firing probability of a neuron at a given time is determined by a summation over all effects of past firing states. A neural network model based on the Feynman path integral formulation was recently introduced. In this paper, we present several methods for calculating firing statistics in the model. We apply the methods to some cases and compare the theoretical predictions with simulation results.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stoilova, N. I.
Generalized quantum statistics, such as paraboson and parafermion statistics, are characterized by triple relations which are related to Lie (super)algebras of type B. The correspondence of the Fock spaces of parabosons and parafermions, as well as the Fock space of a combined system of parafermions and parabosons, to irreducible representations of (super)algebras of type B will be pointed out. An example of generalized quantum statistics connected to the basic classical Lie superalgebra B(1|1) ≡ osp(3|2), with interesting physical properties such as noncommutative coordinates, will be given. The article therefore focuses on a question already posed by Wigner in 1950: do the equations of motion determine the quantum mechanical commutation relations?
Network meta-analysis: a technique to gather evidence from direct and indirect comparisons
2017-01-01
Systematic reviews and pairwise meta-analyses of randomized controlled trials, at the intersection of clinical medicine, epidemiology and statistics, sit at the top of the evidence-based practice hierarchy. They are important tools for drug approval, for formulating clinical protocols and guidelines, and for decision-making. However, this traditional technique yields only part of the information that clinicians, patients and policy-makers need to make informed decisions, since it usually compares only two interventions at a time. On the market, regardless of the clinical condition under evaluation, many interventions are usually available, and few of them have been compared in head-to-head studies. This scenario precludes conclusions being drawn about the full profile of all interventions (e.g. efficacy and safety). The recent development of a new technique, usually referred to as network meta-analysis, indirect meta-analysis, or multiple or mixed treatment comparisons, allows the estimation of metrics for all possible comparisons in the same model, simultaneously gathering direct and indirect evidence. In recent years this statistical tool has matured as a technique, with models available for all types of raw data, producing different pooled effect measures, using both frequentist and Bayesian frameworks, with different software packages. However, the conduct, reporting and interpretation of a network meta-analysis still pose multiple challenges that should be carefully considered, especially because the technique inherits all the assumptions of pairwise meta-analysis but with increased complexity. Thus, we aim to provide a basic explanation of how a network meta-analysis is conducted, highlighting its risks and benefits for evidence-based practice, and including information on the evolution of the statistical methods, the assumptions, and the steps for performing the analysis. PMID:28503228
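The indirect-comparison idea underlying network meta-analysis can be illustrated with the simplest building block, the Bucher adjusted indirect comparison: given effect estimates for A vs. B and A vs. C, the B vs. C effect is obtained by differencing, with the variances adding. A minimal sketch; all effect estimates and standard errors below are hypothetical numbers, not data from any real trials:

```python
import math

def bucher_indirect(d_ab, se_ab, d_ac, se_ac):
    """Bucher adjusted indirect comparison of B vs. C via common comparator A.
    d_ab, d_ac: effect estimates (e.g. log odds ratios) with standard errors."""
    d_bc = d_ac - d_ab                        # indirect point estimate
    se_bc = math.sqrt(se_ab**2 + se_ac**2)    # variances of independent estimates add
    ci = (d_bc - 1.96 * se_bc, d_bc + 1.96 * se_bc)  # approximate 95% CI
    return d_bc, se_bc, ci

# Hypothetical log-odds-ratio estimates from two sets of trials
d_bc, se_bc, ci = bucher_indirect(d_ab=-0.5, se_ab=0.2, d_ac=-0.9, se_ac=0.15)
print(round(d_bc, 2), round(se_bc, 2))  # indirect estimate with inflated SE
```

Note that the indirect standard error is larger than either direct one, which is why the abstract stresses that network meta-analysis inherits pairwise assumptions with added complexity; full network models fit all such contrasts simultaneously.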
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sathaye, Jayant A.
2000-04-01
Integrated assessment (IA) modeling of climate policy is increasingly global in nature, with models incorporating regional disaggregation. The existing empirical basis for IA modeling, however, largely arises from research on industrialized economies. Given the growing importance of developing countries in determining long-term global energy and carbon emissions trends, filling this gap with improved statistical information on developing countries' energy and carbon-emissions characteristics is an important priority for enhancing IA modeling. Earlier research at LBNL on this topic has focused on assembling and analyzing statistical data on productivity trends and technological change in the energy-intensive manufacturing sectors of five developing countries: India, Brazil, Mexico, Indonesia, and South Korea. The proposed work will extend this analysis to the agriculture and electric power sectors in India, South Korea, and two other developing countries. They will also examine the impact of alternative model specifications on estimates of productivity growth and technological change for each of the three sectors, and estimate the contribution of various capital inputs (imported vs. indigenous, rigid vs. malleable) to productivity growth and technological change. The project has already produced a data resource on the manufacturing sector which is being shared with IA modelers. This will be extended to the agriculture and electric power sectors, which would also be made accessible to IA modeling groups seeking to enhance the empirical descriptions of developing country characteristics. The project will entail basic statistical and econometric analysis of productivity and energy trends in these developing country sectors, with parameter estimates also made available to modeling groups. The parameter estimates will be developed using alternative model specifications that could be directly utilized by the existing IAMs for the manufacturing, agriculture, and electric power sectors.
Malik, Mohsan M; Hachach-Haram, Nadine; Tahir, Muaaz; Al-Musabi, Musab; Masud, Dhalia; Mohanna, Pari-Naz
2017-04-01
Acquisition of the fine motor skills required in microsurgery can be challenging in the current training system. There is therefore increased demand for novel training and assessment methods that optimise learning outside the clinical setting. Here, we present a randomised controlled trial of three microsurgical training models: a laboratory tabletop training microscope (Laboratory Microscope, LM), a low-cost jewellers microscope (Home Microscope, HM) and an iPad trainer (Home Tablet, HT). Thirty-nine participants were allocated to four groups: control n = 9, LM n = 10, HM n = 10 and HT n = 10. The participants performed a chicken femoral artery anastomosis at baseline and at the completion of training. Performance was assessed by the structured assessment of microsurgery skills (SAMS) score, the time taken to complete the anastomosis and the time per suture placement. No statistically significant difference was noted between the groups at baseline. There was a statistically significant improvement in all training arms between baseline and post-training for SAMS score, time taken to complete the anastomosis and time per suture placement. In addition, a reduction was observed in the leak rate. No statistical difference was observed among the training arms. Our study demonstrated that at the early stages of microsurgical skill acquisition, home training using either the jewellers microscope or the iPad produces results comparable to laboratory-based training using a tabletop microscope. Home microsurgical training is therefore a viable, easily accessible and cost-effective modality that allows trainees to practice and take ownership of their technical skill development in this area. Crown Copyright © 2017. Published by Elsevier Ltd. All rights reserved.
USGS Research on Saline Waters Co-Produced with Energy Resources
1997-01-01
The United States energy industry faces the challenge of satisfying our expanding thirst for energy while protecting the environment. This challenge is magnified by the increasing volumes of saline water produced with oil and gas in the Nation's aging petroleum fields. Ultimately, energy-producing companies are responsible for disposing of these waters. USGS research provides basic information, for use by regulators, industry, and the public, about the chemistry of co-produced waters and environmentally acceptable ways of handling them.
The relevance of basic sciences in undergraduate medical education.
Lynch, C; Grant, T; McLoughlin, P; Last, J
2016-02-01
Evolving and changing undergraduate medical curricula raise concerns that there will no longer be a place for the basic sciences. National and international trends show that 5-year programmes with a prerequisite of school chemistry are growing more prevalent, while national reports in Ireland show a decline in the availability of school chemistry and physics. This observational cohort study considers whether the basic sciences of physics, chemistry and biology should be a prerequisite for entering medical school, be part of the core medical curriculum, or whether they have a place in the practice of medicine. Comparisons of means, correlation and linear regression analysis assessed the degree of association between predictors (school and university basic sciences) and outcomes (year and degree GPA) for entrants to a 6-year Irish medical programme between 2006 and 2009 (n = 352). We found no statistically significant difference in medical programme performance between students with and without prior basic science knowledge. The Irish school exit exam and its components were mainly weak predictors of performance (-0.043 ≤ r ≤ 0.396). Success in year one of medicine, which includes a basic science curriculum, was indicative of later success (0.194 ≤ r² ≤ 0.534). University basic sciences were found to be more predictive than school sciences of undergraduate medical performance in our institution. The increasing emphasis on basic sciences in medical practice and the declining availability of school sciences should prompt medical schools in Ireland to consider how removing basic sciences from the curriculum might affect future applicants.
Experimental statistics for biological sciences.
Bang, Heejung; Davidian, Marie
2010-01-01
In this chapter, we cover basic and fundamental principles and methods in statistics, from "What are Data and Statistics?" to "ANOVA and linear regression," which are the basis of any statistical thinking and undertaking. Readers can easily find the selected topics in most introductory statistics textbooks, but we have tried to assemble and structure them in a succinct and reader-friendly manner in a stand-alone chapter. This text has long been used in real classroom settings for both undergraduate and graduate students who do or do not major in the statistical sciences. We hope that from this chapter readers will understand the key statistical concepts and terminologies, how to design a study (experimental or observational), how to analyze the data (e.g., describe the data and/or estimate the parameters and make inferences), and how to interpret the results. The text would be most useful as supplemental material while readers take their own statistics courses, or as a reference text accompanying the manual of any statistical software, as a self-teaching guide.
The modern Japanese color lexicon.
Kuriki, Ichiro; Lange, Ryan; Muto, Yumiko; Brown, Angela M; Fukuda, Kazuho; Tokunaga, Rumi; Lindsey, Delwin T; Uchikawa, Keiji; Shioiri, Satoshi
2017-03-01
Despite numerous prior studies, important questions about the Japanese color lexicon persist, particularly about the number of Japanese basic color terms and their deployment across color space. Here, 57 native Japanese speakers provided monolexemic terms for 320 chromatic and 10 achromatic Munsell color samples. Through k-means cluster analysis we revealed 16 statistically distinct Japanese chromatic categories. These included eight chromatic basic color terms (aka/red, ki/yellow, midori/green, ao/blue, pink, orange, cha/brown, and murasaki/purple) plus eight additional terms: mizu ("water")/light blue, hada ("skin tone")/peach, kon ("indigo")/dark blue, matcha ("green tea")/yellow-green, enji/maroon, oudo ("sand or mud")/mustard, yamabuki ("globeflower")/gold, and cream. Of these additional terms, mizu was used by 98% of informants, and emerged as a strong candidate for a 12th Japanese basic color term. Japanese and American English color-naming systems were broadly similar, except for color categories in one language (mizu, kon, teal, lavender, magenta, lime) that had no equivalent in the other. Our analysis revealed two statistically distinct Japanese motifs (or color-naming systems), which differed mainly in the extension of mizu across our color palette. Comparison of the present data with an earlier study by Uchikawa & Boynton (1987) suggests that some changes in the Japanese color lexicon have occurred over the last 30 years.
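The clustering step described above can be illustrated with a minimal k-means sketch. This is an illustrative toy (Lloyd's algorithm on hypothetical 2-D points, with deterministic farthest-point initialization), not the study's actual pipeline, which clustered naming responses over 330 Munsell samples from 57 speakers:

```python
def d2(p, q):
    """Squared Euclidean distance between 2-D points."""
    return (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2

def kmeans(points, k, iters=20):
    """Minimal k-means (Lloyd's algorithm) with deterministic
    farthest-point initialization, for illustration only."""
    centers = [points[0]]
    while len(centers) < k:
        # Next center: the point farthest from all chosen centers
        centers.append(max(points, key=lambda p: min(d2(p, c) for c in centers)))
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            clusters[min(range(k), key=lambda i: d2(p, centers[i]))].append(p)
        # Recompute each center as the mean of its assigned points
        centers = [
            (sum(p[0] for p in c) / len(c), sum(p[1] for p in c) / len(c)) if c else centers[i]
            for i, c in enumerate(clusters)
        ]
    return centers, clusters

# Two well-separated groups of hypothetical 2-D "color coordinates"
pts = [(0.1, 0.2), (0.2, 0.1), (0.15, 0.15), (5.0, 5.1), (5.2, 4.9), (4.9, 5.0)]
centers, clusters = kmeans(pts, k=2)
print(sorted(len(c) for c in clusters))  # the two groups are recovered
```

In the study, the same basic idea (partitioning samples so that within-cluster naming behavior is homogeneous) yielded 16 statistically distinct chromatic categories.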
The United Nations Basic Space Science Initiative
NASA Astrophysics Data System (ADS)
Haubold, H. J.
2006-08-01
Pursuant to recommendations of the United Nations Conference on the Exploration and Peaceful Uses of Outer Space (UNISPACE III) and deliberations of the United Nations Committee on the Peaceful Uses of Outer Space (UNCOPUOS), annual UN/European Space Agency workshops on basic space science have been held around the world since 1991. These workshops contribute to the development of astrophysics and space science, particularly in developing nations. Following a process of prioritization, the workshops identified the following elements as particularly important for international cooperation in the field: (i) operation of astronomical telescope facilities implementing TRIPOD, (ii) virtual observatories, (iii) astrophysical data systems, (iv) concurrent design capabilities for the development of international space missions, and (v) theoretical astrophysics such as applications of nonextensive statistical mechanics. Beginning in 2005, the workshops focus on preparations for the International Heliophysical Year 2007 (IHY2007). The workshops continue to facilitate the establishment of astronomical telescope facilities as pursued by Japan and the development of low-cost, ground-based, world-wide instrument arrays as led by the IHY secretariat. Wamsteker, W., Albrecht, R. and Haubold, H.J.: Developing Basic Space Science World-Wide: A Decade of UN/ESA Workshops. Kluwer Academic Publishers, Dordrecht 2004. http://ihy2007.org http://www.unoosa.org/oosa/en/SAP/bss/ihy2007/index.html http://www.cbpf.br/GrupPesq/StatisticalPhys/biblio.htm
Army Operations in China. December 1941 - December 1943
1956-05-31
in the China Theater from the time of the outbreak of the Pacific War until the end of 1943. Under the direction of the Reports and Statistical... Section of the Demobilization Bureau, the basic manuscript was written by former Lt Col Heizo Ishiwari, a former member of the War History Section of... Table of Contents: CHAPTER 1 - Operations in Chinese Theater After the Outbreak of the Pacific War; Situation in China After the Outbreak of
Data Analysis Techniques for Physical Scientists
NASA Astrophysics Data System (ADS)
Pruneau, Claude A.
2017-10-01
Preface; How to read this book; 1. The scientific method; Part I. Foundation in Probability and Statistics: 2. Probability; 3. Probability models; 4. Classical inference I: estimators; 5. Classical inference II: optimization; 6. Classical inference III: confidence intervals and statistical tests; 7. Bayesian inference; Part II. Measurement Techniques: 8. Basic measurements; 9. Event reconstruction; 10. Correlation functions; 11. The multiple facets of correlation functions; 12. Data correction methods; Part III. Simulation Techniques: 13. Monte Carlo methods; 14. Collision and detector modeling; List of references; Index.
Nelson, Barnaby; Thompson, Andrew; Chanen, Andrew M; Amminger, Günther Paul; Yung, Alison R
2013-08-01
Research in the phenomenological tradition suggests that the schizophrenia spectrum is characterized by disturbance of the 'basic' self, whereas borderline personality disorder involves disturbance of the 'narrative' self. The current study investigated this proposal in an ultra-high risk for psychosis sample. The sample consisted of 42 ultra-high-risk participants with a mean age of 19.22 years. Basic self-disturbance was measured using the Examination of Anomalous Self-Experience. Borderline personality pathology was measured using the borderline personality disorder items from the structured clinical interview for DSM-IV (Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition) Axis II Personality Questionnaire. No correlation was found between the measures of basic self-disturbance and borderline personality pathology. The finding is consistent with the proposal that different (although not mutually exclusive) types of self-disturbance characterize the schizophrenia spectrum and borderline personality disorder. Further research should further examine the question of basic self-disturbance in patients with established borderline personality disorder. © 2013 Wiley Publishing Asia Pty Ltd.
Comparison of traditional six-year and new four-year dental curricula in South Korea.
Komabayashi, Takashi; Ahn, Chul; Kim, Kang-Ju; Oh, Hyo-Won
2012-01-01
This study aimed to compare the dental curriculum of the traditional six-year system with that of the new four-year (graduate-entry) system in South Korea. There are 11 dental schools in South Korea: six are public and five are private. Eight offer the new four-year program and the other three offer the traditional six-year program. Descriptive analyses were conducted using bibliographic data and local information, along with statistical analyses such as chi-square tests. In the six-year programs, clinical dentistry subjects were taught almost equally in practical and didactic courses, while the basic science courses were taught more often as practical courses (P < 0.0001). In the four-year programs, both the basic science and clinical dentistry subjects were more often taught didactically, and more dentistry subjects were taught than basic sciences (P = 0.004). The four-year program model in South Korea is more focused on dentistry than on basic science, while basic science and clinical dentistry subjects were taught equally in the six-year program.
The 1985 Army Experience Survey. Data Sourcebook and User’s Manual
1986-01-01
on the survey data file produced for the 1985 AES. ... The survey data are available in Operating System (OS) as well as Statistical Analysis System ... version of the survey data files was produced using the Statistical Analysis System (SAS). The survey data were also produced in Operating System (OS) ... impacts upon future enlistments. ... The OS data file was designed to make the survey data accessible on any IBM-compatible computer system.
Fetal Alcohol Spectrum Disorders (FASDs): Data and Statistics
... alcohol screening and counseling for all women. Data & Statistics: Prevalence of ... conducted annually by the National Center for Health Statistics (NCHS), CDC, to produce national estimates for a ...
Ballabeni, Andrea; Boggio, Andrea; Hemenway, David
2014-01-01
Basic research in the biomedical field generates both knowledge that has a value per se regardless of its possible practical outcome and knowledge that has the potential to produce more practical benefits. Policies can increase the benefit potential to society of basic biomedical research by offering various kinds of incentives to basic researchers. In this paper we argue that soft incentives or “nudges” are particularly promising. However, to be well designed, these incentives must take into account the motivations, goals and views of the basic scientists. In the paper we present the results of an investigation that involved more than 300 scientists at Harvard Medical School and affiliated institutes. The results of this study suggest that some soft incentives could be valuable tools to increase the transformative value of fundamental investigations without affecting the spirit of the basic research and scientists’ work satisfaction. After discussing the findings, we discuss a few examples of nudges for basic researchers in the biomedical fields. PMID:24795807
Alternative Fuels Data Center: Ethanol Fuel Basics
ethanol. Ethanol Energy Balance: In the United States, 95% of ethanol is produced from the starch in corn ... demonstrates a positive energy balance, meaning that the process of producing ethanol fuel does not require ... energy balance of ethanol because the feedstocks are either waste, co-products of another industry (wood ...
NASA Astrophysics Data System (ADS)
Goudarzi, A. M.; Mazandarani, P.; Panahi, R.; Behsaz, H.; Rezania, A.; Rosendahl, L. A.
2013-07-01
Traditional fire stoves are characterized by low efficiency. In this experimental study, the combustion chamber of the stove is augmented by two devices. An electric fan increases the air-to-fuel ratio in order to increase the system's efficiency and decrease air pollution by providing complete combustion of the wood. In addition, thermoelectric generators (TEGs) produce power that can be used to satisfy basic needs. In this study, a water-based cooling system is designed to increase the efficiency of the TEGs and also produce hot water for residential use. Through a range of tests, an average of 7.9 W was achieved by a commercial TEG with a substrate area of 56 mm × 56 mm, which can produce 14.7 W output power at the maximum matched load. The total power generated by the stove is 166 W. A reasonable ratio of fuel to time is also described for residential use. The presented prototype is designed to fulfill the basic needs of domestic electricity, hot water, and essential heat for warming the room and cooking.
Russian Schools: The Information Revolution Continues
ERIC Educational Resources Information Center
Zair-Bek, S. I.; Belikov, A. A.; Plekhanov, A. A.
2017-01-01
This article presents educational statistics that reflect the basic indicators describing the state of information technology infrastructure in secondary general education in 2014. This research seeks to analyze how Russia's Federal State Educational Standards governing secondary general education facilitate the creation of information-based…
ERIC Educational Resources Information Center
Perry, Mike; Kader, Gary
1998-01-01
Presents an activity on the simplification of penguin counting by employing the basic ideas and principles of sampling to teach students to understand and recognize its role in statistical claims. Emphasizes estimation, data analysis and interpretation, and central limit theorem. Includes a list of items for classroom discussion. (ASK)
45 CFR 63.1 - Purpose and scope.
Code of Federal Regulations, 2010 CFR
2010-10-01
... services. Such information is obtained through the conduct of basic and applied research, statistical... apply to all grant awards of Federal assistance made by the Assistant Secretary for Planning and... may be delegated to the Assistant Secretary for policy research activities. (b) Exceptions to...
DESIGN AND ANALYSIS FOR THEMATIC MAP ACCURACY ASSESSMENT: FUNDAMENTAL PRINCIPLES
Before being used in scientific investigations and policy decisions, thematic maps constructed from remotely sensed data should be subjected to a statistically rigorous accuracy assessment. The three basic components of an accuracy assessment are: 1) the sampling design used to s...
ERIC Educational Resources Information Center
Bureau of Naval Personnel, Washington, DC.
Basic information on petroleum is presented in this book, prepared for naval logistics officers. Petroleum in national defense is discussed in connection with consumption statistics, productive capacity, the world's resources, and steps in logistics. Chemical and geological analyses are presented to familiarize readers with methods of refining, measuring,…
Pigeons, Facebook and the Birthday Problem
ERIC Educational Resources Information Center
Russell, Matthew
2013-01-01
The unexpectedness of the birthday problem has long been used by teachers of statistics in discussing basic probability calculation. An activity is described that engages students in understanding probability and sampling using the popular Facebook social networking site. (Contains 2 figures and 1 table.)
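The "unexpectedness" mentioned above comes straight from the basic probability calculation: the chance that n people all have distinct birthdays shrinks faster than intuition suggests. A short sketch of the standard computation (uniform, independent birthdays assumed):

```python
def p_shared_birthday(n, days=365):
    """Probability that at least two of n people share a birthday,
    assuming birthdays are uniform over `days` and independent."""
    p_distinct = 1.0
    for i in range(n):
        p_distinct *= (days - i) / days   # i-th person avoids the first i birthdays
    return 1.0 - p_distinct

print(round(p_shared_birthday(23), 3))  # ≈ 0.507: past 50% with only 23 people
```

The crossover at n = 23 is the surprise typically exploited in class, whether the "birthdays" come from a clipboard survey or a sample of Facebook friends.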
ERIC Educational Resources Information Center
Coyle, Daniel
1979-01-01
Lists and annotates recurrent federal publications that contain basic statistical data on pollution levels and controls, natural resources and wildlife conservation, water resources supply and development, weather and ocean conditions, federal aid programs, and the environmental impact of energy development. It also lists continuing bibliographies…
Mechanics: Statics; A Syllabus.
ERIC Educational Resources Information Center
Compo, Louis
The instructor's guide presents material for structuring an engineering fundamentals course covering the basic laws of statics as part of a mechanical technology program. Detailed behavioral objectives are described for the following five areas of course content: principles of mechanics, two-dimensional equilibrium, equilibrium of internal…
Dissociation between facial and bodily expressions in emotion recognition: A case study.
Leiva, Samanta; Margulis, Laura; Micciulli, Andrea; Ferreres, Aldo
2017-12-21
Existing single-case studies have reported deficits in recognizing basic emotions through facial expressions and unaffected performance with body expressions, but not the opposite pattern. The aim of this paper is to present a case study with impaired emotion recognition through body expressions and intact performance with facial expressions. In this single-case study we assessed a 30-year-old patient with autism spectrum disorder, without intellectual disability, and a healthy control group (n = 30) with four tasks of basic and complex emotion recognition through face and body movements, and two non-emotional control tasks. To analyze the dissociation between facial and body expressions, we used Crawford and Garthwaite's operational criteria, and we compared the patient's and the control group's performance with a modified one-tailed t-test designed specifically for single-case studies. There were no statistically significant differences between the patient's and the control group's performances on the non-emotional body movement task or the facial perception task. For both kinds of emotions (basic and complex) when the patient's performance was compared to the control group's, statistically significant differences were only observed for the recognition of body expressions. There were no significant differences between the patient's and the control group's correct answers for emotional facial stimuli. Our results showed a profile of impaired emotion recognition through body expressions and intact performance with facial expressions. This is the first case study that describes the existence of this kind of dissociation pattern between facial and body expressions of basic and complex emotions.
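The modified one-tailed t-test cited here, Crawford and Howell's method for comparing a single case with a small control sample, has a simple closed form. A sketch with illustrative numbers, not the study's data:

```python
from math import sqrt
from statistics import mean, stdev

def crawford_howell_t(case_score: float, controls: list[float]) -> tuple[float, int]:
    # Crawford-Howell t comparing one case with a control sample of size n;
    # the (n + 1)/n factor accounts for the case lying outside the sample
    n = len(controls)
    t = (case_score - mean(controls)) / (stdev(controls) * sqrt((n + 1) / n))
    return t, n - 1  # t statistic and degrees of freedom
```

The resulting t is referred to a t distribution with n − 1 degrees of freedom to obtain a one-tailed p-value.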
Harrison-Bernard, Lisa M; Naljayan, Mihran V; Eason, Jane M; Mercante, Donald E; Gunaldo, Tina P
2017-12-01
The primary purpose of conducting an interprofessional education (IPE) experience during the renal physiology block of a graduate-level course was to provide basic science, physical therapy, and physician assistant graduate students with an opportunity to work as a team in the diagnosis, treatment, and collaborative care of a patient with acute kidney injury. The secondary purpose was to enhance the understanding of basic renal physiology principles with a patient case presentation of renal pathophysiology. The overall purpose was to assess the value of IPE integration within a basic science course by examining student perceptions and program evaluation. Graduate-level students operated in interprofessional teams while working through an acute kidney injury patient case. The following Interprofessional Education Collaborative subcompetencies were targeted: Roles/Responsibilities (RR) Behavioral Expectations (RR1, RR4) and Interprofessional Communication (CC) Behavioral Expectations (CC4). Clinical and IPE stimulus questions were discussed both within and between teams with assistance provided by faculty facilitators. Students were given a pre- and postsurvey to determine their knowledge of IPE. There were statistically significant increases from pre- to postsurvey scores for all six IPE questions for all students. Physical therapy and physician assistant students had a statistically significant increase in pre- to postsurvey scores, indicating a more favorable perception of their interprofessional competence for RR1, RR4, and CC4. No changes were noted in pre- to postsurvey scores for basic science graduate students. Incorporating planned IPE experiences into multidisciplinary health science courses represents an appropriate venue to have students learn and apply interprofessional competencies. Copyright © 2017 the American Physiological Society.
Vertical integration of basic science in final year of medical education
Rajan, Sudha Jasmine; Jacob, Tripti Meriel; Sathyendra, Sowmya
2016-01-01
Background: Development of health professionals with the ability to integrate, synthesize, and apply knowledge gained through medical college is greatly hampered by a system of delivery that is compartmentalized and piecemeal. There is a need to integrate basic sciences with clinical teaching to enable application in clinical care. Aim: To study the benefit and acceptance of vertical integration of basic science in the final year MBBS undergraduate curriculum. Materials and Methods: After Institutional Ethics Clearance, neuroanatomy refresher classes with clinical application to neurological diseases were held as part of the final year posting in two medical units. Feedback was collected. Pre- and post-tests which tested application and synthesis were conducted. Summative assessment was compared with the control group of students who had standard teaching in the other two medical units. In-depth interviews were conducted with 2 willing participants and 2 teachers who did neurology bedside teaching. Results: The majority (>80%) found the classes useful and interesting. There was statistically significant improvement in the post-test scores. There was a statistically significant difference between the intervention and control groups' scores during summative assessment (76.2 vs. 61.8, P < 0.01). Students felt that it reinforced, motivated self-directed learning, enabled correlations, improved understanding, put things in perspective, gave confidence, aided application, and enabled them to follow discussions during clinical teaching. Conclusion: Vertical integration of basic science in the final year was beneficial and resulted in knowledge gain and improved summative scores. The classes were found to be useful, interesting and thought to help in clinical care and application by the majority of students. PMID:27563584
Byun, Jung-Eun; Kang, Eun-Bum
2016-06-01
This study investigated the impact of a 12-week senior brain health exercise (SBHE) program on basic physical fitness, cognitive function and brain-derived neurotrophic factor (BDNF) in elderly women. The subjects were 24 women aged 65-79 who could perform normal daily activities and communication but had not participated in regular exercise in the previous 6 months. They were divided into an exercise group (EG, n=13) and a control group (CG, n=11). The exercise program consisted of SBHE performed 4 times weekly, 50 minutes per session, at an intensity of 9-14 on the rating of perceived exertion (RPE) scale. First, the 12-week SBHE program produced a statistically significant increase in basic physical fitness in the EG compared with the CG, namely in lower body strength, upper body strength and aerobic endurance, but not in flexibility, agility or dynamic balance. Second, for the Mini-Mental State Examination Korean version (MMSE-K) and BDNF, there was a statistically significant increase in the EG compared with the CG. In this study, the 12-week SBHE program had a positive effect on basic physical fitness (strength and aerobic endurance), cognitive function and BDNF. If the program were extended with movements that enhance flexibility, dynamic balance and agility, it could serve as a practical exercise program to help seniors maintain an overall healthy lifestyle.
Tools for computer graphics applications
NASA Technical Reports Server (NTRS)
Phillips, R. L.
1976-01-01
Extensive research in computer graphics has produced a collection of basic algorithms and procedures whose utility spans many disciplines. These tools are described in terms of their fundamental aspects, implementations, applications, and availability. Programs which are discussed include basic data plotting, curve smoothing, and depiction of three dimensional surfaces. As an aid to potential users of these tools, particular attention is given to discussing their availability and, where applicable, their cost.
Integration of basic sciences and clinical sciences in oral radiology education for dental students.
Baghdady, Mariam T; Carnahan, Heather; Lam, Ernest W N; Woods, Nicole N
2013-06-01
Educational research suggests that cognitive processing in diagnostic radiology requires a solid foundation in the basic sciences and knowledge of the radiological changes associated with disease. Although it is generally assumed that dental students must acquire both sets of knowledge, little is known about the most effective way to teach them. Currently, the basic and clinical sciences are taught separately. This study was conducted to compare the diagnostic accuracy of students when taught basic sciences segregated from or integrated with clinical features. Predoctoral dental students (n=51) were taught four confusable intrabony abnormalities using basic science descriptions either integrated with the radiographic features or segregated from the radiographic features. The students were tested with diagnostic images, and memory tests were performed immediately after learning and one week later. On immediate and delayed testing, participants in the integrated basic science group outperformed those from the segregated group. A main effect of learning condition was found to be significant (p<0.05). The results of this study support the critical role of integrating biomedical knowledge in diagnostic radiology and show that teaching basic sciences integrated with clinical features produces higher diagnostic accuracy in novices than teaching basic sciences segregated from clinical features.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stapp, Henry P.
2011-05-10
The principle of sufficient reason asserts that anything that happens does so for a reason: no definite state of affairs can come into being unless there is a sufficient reason why that particular thing should happen. This principle is usually attributed to Leibniz, although the first recorded Western philosopher to use it was Anaximander of Miletus. The demand that nature be rational, in the sense that it be compatible with the principle of sufficient reason, conflicts with a basic feature of contemporary orthodox physical theory, namely the notion that nature's response to the probing action of an observer is determined by pure chance, and hence on the basis of absolutely no reason at all. This appeal to pure chance can be deemed to have no rational fundamental place in reason-based Western science. It is argued here, on the basis of the other basic principles of quantum physics, that in a world that conforms to the principle of sufficient reason, the usual quantum statistical rules will naturally emerge at the pragmatic level, in cases where the reason behind nature's choice of response is unknown, but that the usual statistics can become biased in an empirically manifest way when the reason for the choice is empirically identifiable. It is shown here that if the statistical laws of quantum mechanics were to be biased in this way then the basically forward-in-time unfolding of empirical reality described by orthodox quantum mechanics would generate the appearances of backward-time-effects of the kind that have been reported in the scientific literature.
Martinson, Brian C; Mohr, David C; Charns, Martin P; Nelson, David; Hagel-Campbell, Emily; Bangerter, Ann; Bloomfield, Hanna E; Owen, Richard; Thrush, Carol R
2017-01-01
Assessing the integrity of research climates and sharing such information with research leaders may support research best practices. We report here results of a pilot trial testing the effectiveness of a reporting and feedback intervention using the Survey of Organizational Research Climate (SOuRCe). We randomized 41 Veterans Health Administration (VA) facilities to a phone-based intervention designed to help research leaders understand their survey results (enhanced arm) or to an intervention in which results were simply distributed to research leaders (basic arm). Primary outcomes were (1) whether leaders took action, (2) whether actions taken were consistent with the feedback received, and (3) whether responses differed by receptivity to quality improvement input. Research leaders from 25 of 42 (59%) VA facilities consented to participate in the study intervention and follow-up, of which 14 were at facilities randomized to the enhanced arm. We completed follow-up interviews with 21 of the 25 leaders (88%), 12 from enhanced arm facilities. While not statistically significant, the proportion of leaders reporting taking some action in response to the feedback was twice as high in the enhanced arm as in the basic arm (67% vs. 33%, p = .20). While also not statistically significant, a higher proportion of actions taken among facilities in the enhanced arm were responsive to the survey results than in the basic arm (42% vs. 22%, p = .64). Enhanced feedback of survey results appears to be a promising intervention that may increase the likelihood of responsive action to improve organizational climates. Due to the small sample size of this pilot study, even large percentage-point differences between study arms are not statistically distinguishable. This hypothesis should be tested in a larger trial.
Anandakrishnan, Ramu; Onufriev, Alexey
2008-03-01
In statistical mechanics, the equilibrium properties of a physical system of particles can be calculated as the statistical average over accessible microstates of the system. In general, these calculations are computationally intractable since they involve summations over an exponentially large number of microstates. Clustering algorithms are one of the methods used to numerically approximate these sums. The most basic clustering algorithms first sub-divide the system into a set of smaller subsets (clusters). Then, interactions between particles within each cluster are treated exactly, while all interactions between different clusters are ignored. These smaller clusters have far fewer microstates, making the summation over these microstates tractable. These algorithms have been previously used for biomolecular computations, but remain relatively unexplored in this context. Presented here is a theoretical analysis of the error and computational complexity for the two most basic clustering algorithms that were previously applied in the context of biomolecular electrostatics. We derive a tight, computationally inexpensive, error bound for the equilibrium state of a particle computed via these clustering algorithms. For some practical applications, it is the root mean square error, which can be significantly lower than the error bound, that may be more important. We show that there is a strong empirical relationship between error bound and root mean square error, suggesting that the error bound could be used as a computationally inexpensive metric for predicting the accuracy of clustering algorithms for practical applications. An example of error analysis for such an application, the computation of the average charge of ionizable amino acids in proteins, is given, demonstrating that the clustering algorithm can be accurate enough for practical purposes.
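The basic clustering idea described above (treat intra-cluster interactions exactly, ignore inter-cluster ones) can be illustrated on a toy two-state system. This is a generic sketch of the technique, not the authors' biomolecular implementation:

```python
from itertools import product
from math import exp

def exact_Z(n_sites: int, J: dict, beta: float = 1.0) -> float:
    # Exact partition function: sum over all 2^n microstates
    Z = 0.0
    for s in product((0, 1), repeat=n_sites):
        E = sum(Jij * s[i] * s[j] for (i, j), Jij in J.items())
        Z += exp(-beta * E)
    return Z

def clustered_Z(clusters: list, J: dict, beta: float = 1.0) -> float:
    # Clustering approximation: interactions between clusters are dropped,
    # so Z factorises into a product of small exact sums over each cluster
    Z = 1.0
    for cl in clusters:
        idx = {site: a for a, site in enumerate(cl)}
        Zc = 0.0
        for s in product((0, 1), repeat=len(cl)):
            E = sum(Jij * s[idx[i]] * s[idx[j]]
                    for (i, j), Jij in J.items()
                    if i in idx and j in idx)
            Zc += exp(-beta * E)
        Z *= Zc
    return Z
```

When the coupling dictionary J contains no inter-cluster terms, the approximation is exact; any dropped cross-coupling introduces the error the paper bounds.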
Taylor, Kirsten I; Devereux, Barry J; Acres, Kadia; Randall, Billi; Tyler, Lorraine K
2012-03-01
Conceptual representations are at the heart of our mental lives, involved in every aspect of cognitive functioning. Despite their centrality, a long-standing debate persists as to how the meanings of concepts are represented and processed. Many accounts agree that the meanings of concrete concepts are represented by their individual features, but disagree about the importance of different feature-based variables: some views stress the importance of the information carried by distinctive features in conceptual processing, others the features which are shared over many concepts, and still others the extent to which features co-occur. We suggest that previously disparate theoretical positions and experimental findings can be unified by an account which claims that task demands determine how concepts are processed in addition to the effects of feature distinctiveness and co-occurrence. We tested these predictions in a basic-level naming task which relies on distinctive feature information (Experiment 1) and a domain decision task which relies on shared feature information (Experiment 2). Both used large-scale regression designs with the same visual objects, and mixed-effects models incorporating participant, session, stimulus-related and feature statistic variables to model the performance. We found that concepts with relatively more distinctive and more highly correlated distinctive relative to shared features facilitated basic-level naming latencies, while concepts with relatively more shared and more highly correlated shared relative to distinctive features speeded domain decisions. These findings demonstrate that the feature statistics of distinctiveness (shared vs. distinctive) and correlational strength, as well as the task demands, determine how concept meaning is processed in the conceptual system. Copyright © 2011 Elsevier B.V. All rights reserved.
Yang, J J
1995-01-01
Norway is governed by a three-tier parliamentary system where each tier is governed by a popularly elected body: the national parliament, the county councils, and the municipality councils. This three-tier system is in many ways also reflected in the organization, management, and financing of health and social services. A large amount of information (e.g., statistics and annual reports) flows between the three levels of management. In order to have a proper and efficient information flow, the Norwegian Ministry of Health and Social Affairs has, since 1992, been conducting a nation-wide project for information collection from and feedback to municipal health and social services (see Figure 1). In this presentation, we will present the basic idea behind The Wheel. We will also discuss some of the major activities in and experiences from the project of using Information Technology to implement an electronic Wheel. The following are basic issues to consider in implementing such a system [1]: Obtaining a unified information basis, to increase the data quality and to compile "definition catalogs" that contain commonly agreed-upon definitions of central concepts and data sets that are used in the municipal health and social services [2]. Achieving electronic data collection, both in terms of the automatic selection and aggregation of relevant data from operational systems in the municipalities and in terms of using Electronic Forms. Experimenting with various ways of electronically feeding back the statistics and other comparative data to the municipalities. Providing the municipal users with appropriate tools for using the statistics that are fed back.
Analysis of Variance: What Is Your Statistical Software Actually Doing?
ERIC Educational Resources Information Center
Li, Jian; Lomax, Richard G.
2011-01-01
Users assume statistical software packages produce accurate results. In this article, the authors systematically examined Statistical Package for the Social Sciences (SPSS) and Statistical Analysis System (SAS) for 3 analysis of variance (ANOVA) designs, mixed-effects ANOVA, fixed-effects analysis of covariance (ANCOVA), and nested ANOVA. For each…
Numerical study of canister filters with alternative filter cap configurations
NASA Astrophysics Data System (ADS)
Mohammed, A. N.; Daud, A. R.; Abdullah, K.; Seri, S. M.; Razali, M. A.; Hushim, M. F.; Khalid, A.
2017-09-01
The air filtration system and filter play an important role in supplying good-quality air to turbomachinery such as gas turbines. The filtration system and filter improve the quality of the air and protect gas turbine parts from contaminants that could cause damage. During separation of contaminants from the air, a pressure drop cannot be avoided, but it can be minimized, which helps to reduce the intake losses of the engine [1]. This study focuses on the configuration of the filter in order to obtain the minimum pressure drop along the filter. The configurations used are the basic filter geometry provided by Salutary Avenue Manufacturing Sdn. Bhd. and two modified canister filter caps designed based on the basic filter model. The geometries of the filter are generated using SOLIDWORKS software, and Computational Fluid Dynamics (CFD) software is used to analyse and simulate the flow through the filter. In this study, the inlet velocities are 0.032 m/s, 0.063 m/s, 0.094 m/s and 0.126 m/s. The total pressure drops produced by the basic filter and modified filters 1 and 2 are 292.3 Pa, 251.11 Pa and 274.7 Pa, respectively. The pressure drop reduction for modified filter 1 is 41.19 Pa, 14.1% lower than the basic filter, and the pressure drop reduction for modified filter 2 is 17.6 Pa, 6.02% lower than the basic filter. The pressure drops for the basic filter differ slightly from those of the Salutary Avenue filter due to limited data and experiment details. CFD simulation is far more practical than producing prototypes and conducting experiments, reducing the overall time and cost of this study.
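The quoted reduction percentages follow directly from the reported pressure drops; as a quick arithmetic check (values from the abstract):

```python
def pressure_drop_reduction(basic_pa: float, modified_pa: float) -> tuple[float, float]:
    # Absolute (Pa) and relative (%) pressure-drop reduction
    # of a modified filter versus the basic filter
    delta = basic_pa - modified_pa
    return delta, 100.0 * delta / basic_pa

# Modified filter 1: 292.3 Pa -> 251.11 Pa gives 41.19 Pa, about 14.1%
# Modified filter 2: 292.3 Pa -> 274.7 Pa gives 17.6 Pa, about 6.02%
```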
DESIGNING ENVIRONMENTAL MONITORING DATABASES FOR STATISTICAL ASSESSMENT
Databases designed for statistical analyses have characteristics that distinguish them from databases intended for general use. EMAP uses a probabilistic sampling design to collect data to produce statistical assessments of environmental conditions. In addition to supporting the ...
[The extraneuronal cholinergic system of the skin. Basic facts and clinical relevance].
Kurzen, H
2004-05-01
Acetylcholine (ACh) is a prototypical neurotransmitter that has recently been recognized to occur extraneuronally in a large variety of cells. ACh and its nicotinic and muscarinic receptors are produced in the epidermis and in the adnexal structures of the skin in a highly complicated pattern. They are also produced in melanocytes, fibroblasts, endothelial cells and immune cells. Through autocrine, paracrine and endocrine mechanisms, the cholinergic system is involved in the basic functions of the skin, such as keratinocyte differentiation, epidermal barrier formation, sweating, sebum production, blood circulation, angiogenesis and a variety of immune reactions. Hence diseases like acne vulgaris, vitiligo, psoriasis, pemphigus vulgaris and atopic dermatitis may be influenced. The exploration of the extraneuronal cholinergic system of the skin has only just begun.
A κ-generalized statistical mechanics approach to income analysis
NASA Astrophysics Data System (ADS)
Clementi, F.; Gallegati, M.; Kaniadakis, G.
2009-02-01
This paper proposes a statistical mechanics approach to the analysis of income distribution and inequality. A new distribution function, having its roots in the framework of κ-generalized statistics, is derived that is particularly suitable for describing the whole spectrum of incomes, from the low-middle income region up to the high income Pareto power-law regime. Analytical expressions for the shape, moments and some other basic statistical properties are given. Furthermore, several well-known econometric tools for measuring inequality, which all exist in a closed form, are considered. A method for parameter estimation is also discussed. The model is shown to fit remarkably well the data on personal income for the United States, and the analysis of inequality performed in terms of its parameters is revealed as very powerful.
NASA Astrophysics Data System (ADS)
Raab, Alexandra; Schneider, Anna; Bonhage, Alexander; Takla, Melanie; Hirsch, Florian; Müller, Frank; Rösler, Horst; Heußner, Karl-Uwe
2016-04-01
Archaeological excavations have revealed more than a thousand charcoal kiln remains (CKRs) in the prefield of the active opencast lignite mine Jänschwalde, situated about 150 km SE of Berlin (SE Brandenburg, Germany). The charcoal was mainly produced for the ironwork Peitz nearby, which operated from the 16th to the mid-19th centuries. In a first approach to estimate the scale of the charcoal production, CKRs were mapped on shaded-relief maps (SRMs) derived from high-resolution LiDAR data (Raab et al. 2015). Subsequently, for a selected test area, CKRs identified on the SRMs were compared with archaeologically excavated CKRs in the field. This survey showed a considerable number of falsely detected sites. Therefore, the data was critically re-evaluated using additional relief visualisations. Further, we extended the CKR mapping to areas which are not archaeologically investigated. The study area, the former royal forest district Tauer, consists of two separate areas: the Tauersche Heide (c. 96 km2 area) N of Peitz and the area Jänschwalde (c. 32 km2 area) NE of Peitz. The study area is characterized by a flat topography. Different former and current anthropogenic uses (e.g., military training, solar power plant, forestry measures) have affected the study area, resulting in extensive disturbances of the terrain surface. The revised CKR abundance in the study area Jänschwalde was considerably smaller than the numbers produced by our first approach. Further, the CKR mapping revealed that a complete record of the CKRs is not possible for various reasons. Despite these limitations, a solid database can be provided for a much larger area than before. Basic statistical parameters of the CKR diameters and all comparative statistical tests were calculated using SPSS. To detect underlying spatial relationships in the CKR site distribution, we applied the Getis-Ord Gi* statistic, a method to test for local spatial autocorrelation between neighbouring sites.
The test is available as part of the ArcGIS 10.1 spatial statistics toolbox. The outcomes are discussed in consideration of our archaeological, archival and dendrochronological research results. Raab, A., Takla, M., Raab, T., Nicolay, A., Schneider, A., Rösler, H., et al. (2015). Pre-industrial charcoal production in Lower Lusatia (Brandenburg, Germany): Detection and evaluation of a large charcoal-burning field by combining archaeological studies, GIS-based analyses of shaded-relief maps and dendrochronological age determination. Quaternary International, doi:10.1016/j.quaint.2014.09.041.
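The Getis-Ord Gi* hot-spot statistic used above compares each site's neighbourhood sum against the global mean. A simplified sketch with binary distance-band weights (a site counts as its own neighbour, hence the "star"); the ArcGIS toolbox implements the full version:

```python
from math import sqrt

def getis_ord_gi_star(values: list, coords: list, d: float) -> list:
    # Gi* z-score per site: positive for clusters of high values (hot spots),
    # negative for clusters of low values (cold spots)
    n = len(values)
    xbar = sum(values) / n
    s = sqrt(sum(v * v for v in values) / n - xbar ** 2)
    zs = []
    for (xi, yi) in coords:
        # binary weights: 1 for sites within distance d (including self)
        w = [1.0 if (xi - xj) ** 2 + (yi - yj) ** 2 <= d * d else 0.0
             for (xj, yj) in coords]
        sw = sum(w)
        swx = sum(wi * v for wi, v in zip(w, values))
        num = swx - xbar * sw
        den = s * sqrt((n * sum(wi * wi for wi in w) - sw ** 2) / (n - 1))
        zs.append(num / den if den else 0.0)
    return zs
```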
Propagation of Interplanetary Disturbances in the Outer Heliosphere
NASA Technical Reports Server (NTRS)
Wang, Chi
2005-01-01
Contents include the following: 1. We have developed a one-dimensional, spherically symmetric, multi-fluid MHD model that includes solar wind protons and electrons, pickup ions, and interstellar neutral hydrogen. This model advances the existing solar wind models for the outer heliosphere in two important ways: one is that it distinguishes solar wind protons from pickup ions, and the other is that it allows for energy transfer from pickup ions to the solar wind protons. Model results compare favorably with the Voyager 2 observations. 2. Solar wind slowdown and interstellar neutral density. The solar wind in the outer heliosphere is fundamentally different from that in the inner heliosphere since the effects of interstellar neutrals become significant. 3. ICME propagation from the inner to outer heliosphere. Large coronal mass ejections (CMEs) have major effects on the structure of the solar wind and the heliosphere. The plasma and magnetic field can be compressed ahead of interplanetary CMEs. 4. During the current solar cycle (Cycle 23), several major CMEs associated with solar flares produced large transient shocks which were observed by widely-separated spacecraft such as Wind at Earth and Voyager 2 beyond 60 AU. Using data from these spacecraft, we use the multi-fluid model to investigate shock propagation and interaction in the heliosphere. Specifically, we studied the Bastille Day 2000, April 2001 and Halloween 2003 events. 5. Statistical properties of the solar wind in the outer heliosphere. In a collaboration with L.F. Burlaga of GSFC, it is shown that the basic statistical properties of the solar wind in the outer heliosphere can be well reproduced by our model. We studied the large-scale heliospheric magnetic field strength fluctuations as a function of distance from the Sun during the declining phase of a solar cycle, using our numerical model with observations made at 1 AU during 1995 as input. 6. Radial heliospheric magnetic field events.
The heliospheric magnetic field (HMF) direction, on average, conforms well to the Parker spiral.
NASA Astrophysics Data System (ADS)
Century, Daisy Nelson
This study focused on alternative and traditional assessments and their comparative impacts on students' attitudes and science learning outcomes. Four basic questions were asked: What type of science learning stemming from the instruction can best be assessed by the use of a traditional paper-and-pencil test? What type of science learning stemming from the instruction can best be assessed by the use of alternative assessment? What are the differences in the types of learning outcomes that can be assessed by the use of a paper-and-pencil test and an alternative assessment test? Is there a difference in students' attitude towards learning science when assessment of outcomes is by alternative assessment means compared to traditional means? A mixed methodology involving quantitative and qualitative techniques was utilized. However, the study was essentially a case study. Quantitative data analysis included content achievement and attitude results, to which non-parametric statistics were applied. Analysis of qualitative data was done as a case study utilizing pre-set protocols resulting in a narrative summary style of report. These outcomes were combined in order to produce conclusions. This study revealed that the traditional method yielded more concrete cognitive content learning than did the alternative assessment. The alternative assessment yielded more psychomotor, cooperative learning and critical thinking skills. In both the alternative and the traditional methods the students' attitudes toward science were positive. There were no significant differences favoring either group. The quantitative findings of no statistically significant differences suggest that at a minimum there is no loss in the use of alternative assessment methods, in this instance, performance testing.
Adding the results from the qualitative analysis to this suggests (1) that class groups were more satisfied when alternative methods were employed, and (2) that the two assessment methodologies are complementary to each other, and thus should probably be used together to produce maximum benefit.
Environmental flow allocation and statistics calculator
Konrad, Christopher P.
2011-01-01
The Environmental Flow Allocation and Statistics Calculator (EFASC) is a computer program that calculates hydrologic statistics based on a time series of daily streamflow values. EFASC will calculate statistics for daily streamflow in an input file or will generate synthetic daily flow series from an input file based on rules for allocating and protecting streamflow and then calculate statistics for the synthetic time series. The program reads dates and daily streamflow values from input files. The program writes statistics out to a series of worksheets and text files. Multiple sites can be processed in series as one run. EFASC is written in Microsoft® Visual Basic® for Applications and implemented as a macro in Microsoft® Office Excel® 2007. EFASC is intended as a research tool for users familiar with computer programming. The code for EFASC is provided so that it can be modified for specific applications. All users should review how output statistics are calculated and recognize that the algorithms may not comply with conventions used to calculate streamflow statistics published by the U.S. Geological Survey.
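As an illustration of the kind of statistic such a tool derives from a daily streamflow series, a hypothetical low-flow computation (EFASC's actual algorithms are in its published VBA source, which the report urges users to review):

```python
def min_7day_mean(daily_flows: list) -> float:
    # Minimum 7-day mean flow, a common low-flow statistic
    # computed by sliding a 7-day window over a daily series
    if len(daily_flows) < 7:
        raise ValueError("need at least 7 daily values")
    return min(sum(daily_flows[i:i + 7]) / 7.0
               for i in range(len(daily_flows) - 6))
```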
Ector, Hugo
2010-12-01
I still remember my first book on statistics: "Elementary Statistics with Applications in Medicine and the Biological Sciences" by Frederick E. Croxton. For me, it was the start of a pursuit of understanding statistics in daily life and in medical practice. It was the first volume in a long series of books. In his introduction, Croxton claims that "nearly everyone involved in any aspect of medicine needs to have some knowledge of statistics". The reality is that for many clinicians, statistics are limited to "P < 0.05 = ok". I do not blame my colleagues who omit the paragraph on statistical methods. They have never had the opportunity to learn concise and clear descriptions of the key features. I have experienced how some authors can describe difficult methods in clearly understandable language. Others fail completely. As a teacher, I tell my students that life is impossible without a basic knowledge of statistics. This feeling has resulted in an annual seminar of 90 minutes. This tutorial is the summary of that seminar: a summary and a transcription of the best pages I have found.
The Technologies of EXPER SIM.
ERIC Educational Resources Information Center
Hedberg, John G.
EXPER SIM has been translated into two basic software systems: the Michigan Experimental Simulation Supervisor (MESS) and the Louisville Experiment Simulation Supervisor (LESS). MESS and LESS have been programmed to facilitate student interaction with the computer for research purposes. The programs contain models for several statistical analyses, and…
Design Considerations for Creating a Chemical Information Workstation.
ERIC Educational Resources Information Center
Mess, John A.
1995-01-01
Discusses what a functional chemical information workstation should provide to support the users in an academic library and examines how it can be implemented. Highlights include basic design considerations; natural language interface, including grammar-based, context-based, and statistical methodologies; expert system interface; and programming…
Electronic holographic moire in the micron range
NASA Astrophysics Data System (ADS)
Sciammarella, Cesar A.; Sciammarella, Federico M.
2001-06-01
The basic theory behind microscopic electronic holographic moire is presented. Conditions of observation are discussed, and optimal parameters are established. An application is presented as an example in which experimental results are statistically analyzed and successfully correlated with an independent method of measurement of the same quantity.
Installing a Practical Research Project and Interpreting Research Results
Kasten R. Dumroese; David L. Wenny
2003-01-01
The basic concepts of the scientific method and research process are reviewed. An example from a bareroot nursery demonstrates how a practical research project can be done at any type of nursery, meshing sound statistical principles with the limitations of busy nursery managers.
Landowner interest in multifunctional agroforestry riparian buffers.
Katie Trozzo; John Munsell; James Chamberlain
2014-01-01
Adoption of temperate agroforestry practices generally remains limited despite considerable advances in basic science. This study builds on temperate agroforestry adoption research by empirically testing a statistical model of interest in native fruit and nut tree riparian buffers using technology and agroforestry adoption theory. Data...
77 FR 59989 - Labor Surplus Area Classification Under Executive Orders
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-01
... DEPARTMENT OF LABOR Employment and Training Administration Labor Surplus Area Classification Under... Bureau of Labor Statistics are used in making these classifications. The average unemployment rate for all states includes data for the Commonwealth of Puerto Rico. The basic LSA classification criteria...
Burdo, Joseph R
2013-01-01
Since 2009 at Boston College, we have been offering a Research in Neuroscience course using cultured neurons in an in vitro model of stroke. The students work in groups to learn how to perform sterile animal cell culture and run several basic bioassays to assess cell viability. They are then tasked with analyzing the scientific literature in an attempt to identify and predict the intracellular pathways involved in neuronal death, and identify dietary antioxidant compounds that may provide protection based on their known effects in other cells. After each group constructs a hypothesis pertaining to the potential neuroprotection, we purchase one compound per group and the students test their hypotheses using a commonly performed viability assay. The groups generate quantitative data and perform basic statistics on that data to analyze it for statistical significance. Finally, the groups compile their data and other elements of their research experience into a poster for our departmental research celebration at the end of the spring semester.
Burdo, Joseph R.
2013-01-01
Since 2009 at Boston College, we have been offering a Research in Neuroscience course using cultured neurons in an in vitro model of stroke. The students work in groups to learn how to perform sterile animal cell culture and run several basic bioassays to assess cell viability. They are then tasked with analyzing the scientific literature in an attempt to identify and predict the intracellular pathways involved in neuronal death, and identify dietary antioxidant compounds that may provide protection based on their known effects in other cells. After each group constructs a hypothesis pertaining to the potential neuroprotection, we purchase one compound per group and the students test their hypotheses using a commonly performed viability assay. The groups generate quantitative data and perform basic statistics on that data to analyze it for statistical significance. Finally, the groups compile their data and other elements of their research experience into a poster for our departmental research celebration at the end of the spring semester. PMID:23805059
ecode - Electron Transport Algorithm Testing v. 1.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Franke, Brian C.; Olson, Aaron J.; Bruss, Donald Eugene
2016-10-05
ecode is a Monte Carlo code used for testing algorithms related to electron transport. The code can read basic physics parameters, such as energy-dependent stopping powers and screening parameters. The code permits simple planar geometries of slabs or cubes. Parallelization consists of domain replication, with work distributed at the start of the calculation and statistical results gathered at the end of the calculation. Some basic routines (such as input parsing, random number generation, and statistics processing) are shared with the Integrated Tiger Series codes. A variety of algorithms for uncertainty propagation are incorporated based on the stochastic collocation and stochastic Galerkin methods. These permit uncertainty only in the total and angular scattering cross sections. The code contains algorithms for simulating stochastic mixtures of two materials. The physics is approximate, ranging from mono-energetic and isotropic scattering to screened Rutherford angular scattering and Rutherford energy-loss scattering (simple electron transport models). No production of secondary particles is implemented, and no photon physics is implemented.
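The domain-replication pattern the abstract describes (every worker runs the same problem with a different random seed, and per-replica tallies are combined at the end) can be sketched in a few lines. This is a pure-Python stand-in with an invented "penetration" score, not ecode itself:

```python
# Sketch of domain-replicated Monte Carlo statistics gathering.
# The 30% per-history penetration probability is fictitious.
import random, math

def worker_tally(seed, histories=10000):
    rng = random.Random(seed)
    # toy "transport" score: fraction of histories penetrating a slab
    return sum(rng.random() < 0.3 for _ in range(histories)) / histories

tallies = [worker_tally(seed) for seed in range(8)]   # 8 replicated "domains"
mean = sum(tallies) / len(tallies)
sample_var = sum((t - mean) ** 2 for t in tallies) / (len(tallies) - 1)
stderr = math.sqrt(sample_var / len(tallies))          # standard error of the mean
print(mean, stderr)
```

The key property of replication (as opposed to domain decomposition) is that combining results reduces to averaging independent tallies, which is why the statistics processing can be shared with the Integrated Tiger Series routines.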
Awareness of basic life support among Saudi dental students and interns.
Al-Shamiri, Hashem Motahir; Al-Maweri, Sadeq Ali; Shugaa-Addin, Bassam; Alaizari, Nader Ahmed; Hunaish, Abdulrahman
2017-01-01
Fatal medical emergencies may occur at any time in the dental clinic. The present study assessed the level of awareness of and attitudes toward basic life support (BLS) among Saudi dental students and interns. A self-administered questionnaire comprising 23 closed-ended questions was used in this survey. The first part of the questionnaire assessed the demographic profile of the students, such as age, gender, and educational level. The second part investigated their knowledge and awareness of BLS. Data from 203 respondents were analyzed using the Statistical Package for the Social Sciences (SPSS) version 22.0. The response rate was 81.2%. Overall, the respondents showed a low level of knowledge, with significant differences between males and females (P < 0.001). Surprisingly, final-year dental students showed relatively better knowledge than interns, though the differences were not statistically significant. The present study demonstrates poor knowledge of BLS among dental students and shows the urgent need for continuing refresher courses on this critical topic.
Multivariate assessment of event-related potentials with the t-CWT method.
Bostanov, Vladimir
2015-11-05
Event-related brain potentials (ERPs) are usually assessed with univariate statistical tests although they are essentially multivariate objects. Brain-computer interface applications are a notable exception to this practice, because they are based on multivariate classification of single-trial ERPs. Multivariate ERP assessment can be facilitated by feature extraction methods. One such method is t-CWT, a mathematical-statistical algorithm based on the continuous wavelet transform (CWT) and Student's t-test. This article begins with a geometric primer on some basic concepts of multivariate statistics as applied to ERP assessment in general and to the t-CWT method in particular. Further, it presents for the first time a detailed, step-by-step, formal mathematical description of the t-CWT algorithm. A new multivariate outlier rejection procedure based on principal component analysis in the frequency domain is presented as an important pre-processing step. The MATLAB and GNU Octave implementation of t-CWT is also made publicly available for the first time as free and open source code. The method is demonstrated on some example ERP data obtained in a passive oddball paradigm. Finally, some conceptually novel applications of the multivariate approach in general and of the t-CWT method in particular are suggested and discussed. Hopefully, the publication of both the t-CWT source code and its underlying mathematical algorithm along with a didactic geometric introduction to some basic concepts of multivariate statistics would make t-CWT more accessible to both users and developers in the field of neuroscience research.
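The core idea of t-CWT (project single-trial ERPs onto wavelet features, then compare the two conditions' features with Student's t-test) can be illustrated with a toy, standard-library-only sketch. This is not the published algorithm or its MATLAB/Octave code; the wavelet, scale, and data below are all invented:

```python
# Toy illustration of wavelet-feature extraction followed by a t-test,
# the combination at the heart of t-CWT. All values are synthetic.
import math

def mexican_hat(t, scale):
    x = t / scale
    return (1 - x * x) * math.exp(-x * x / 2)

def wavelet_feature(trial, center, scale):
    # inner product of one ERP trial with a wavelet centered at 'center'
    return sum(v * mexican_hat(i - center, scale) for i, v in enumerate(trial))

def t_statistic(a, b):
    # two-sample Student's t with pooled variance
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    sp = math.sqrt(((na - 1) * va + (nb - 1) * vb) / (na + nb - 2))
    return (ma - mb) / (sp * math.sqrt(1 / na + 1 / nb))

# toy "trials": condition A has an ERP-like bump at sample 10, condition B does not
trials_a = [[math.exp(-((i - 10) ** 2) / 8) + 0.01 * k for i in range(20)] for k in range(5)]
trials_b = [[0.01 * k for i in range(20)] for k in range(5)]
feats_a = [wavelet_feature(tr, 10, 3) for tr in trials_a]
feats_b = [wavelet_feature(tr, 10, 3) for tr in trials_b]
print(t_statistic(feats_a, feats_b) > 2)  # the wavelet feature separates the conditions
```

The actual t-CWT scans all wavelet positions and scales and selects features by extremal t-values in the full continuous wavelet transform, which this single-feature sketch omits.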
SNP_tools: A compact tool package for analysis and conversion of genotype data for MS-Excel
Chen, Bowang; Wilkening, Stefan; Drechsel, Marion; Hemminki, Kari
2009-01-01
Background Single nucleotide polymorphism (SNP) genotyping is a major activity in biomedical research. Scientists prefer to have easy access to the results, which may require conversions between data formats. First-hand SNP data are often entered or saved in the MS-Excel format, but this software lacks genetic and epidemiological functions. A general tool to do basic genetic and epidemiological analysis and data conversion in MS-Excel is needed. Findings The SNP_tools package is prepared as an add-in for MS-Excel. The code is written in Visual Basic for Applications, embedded in the Microsoft Office package. This add-in is an easy-to-use tool for users with basic computer knowledge (and requirements for basic statistical analysis). Conclusion Our implementation for Microsoft Excel 2000-2007 in Microsoft Windows 2000, XP, Vista and Windows 7 beta can handle files in different formats and convert them into other formats. It is free software. PMID:19852806
SNP_tools: A compact tool package for analysis and conversion of genotype data for MS-Excel.
Chen, Bowang; Wilkening, Stefan; Drechsel, Marion; Hemminki, Kari
2009-10-23
Single nucleotide polymorphism (SNP) genotyping is a major activity in biomedical research. Scientists prefer to have easy access to the results, which may require conversions between data formats. First-hand SNP data are often entered or saved in the MS-Excel format, but this software lacks genetic and epidemiological functions. A general tool to do basic genetic and epidemiological analysis and data conversion in MS-Excel is needed. The SNP_tools package is prepared as an add-in for MS-Excel. The code is written in Visual Basic for Applications, embedded in the Microsoft Office package. This add-in is an easy-to-use tool for users with basic computer knowledge (and requirements for basic statistical analysis). Our implementation for Microsoft Excel 2000-2007 in Microsoft Windows 2000, XP, Vista and Windows 7 beta can handle files in different formats and convert them into other formats. It is free software.
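As one concrete example of the "basic genetic analysis" such a tool automates, here is a Hardy-Weinberg equilibrium chi-square test, sketched in Python rather than in SNP_tools' Visual Basic; the genotype counts are made up for illustration:

```python
# Hardy-Weinberg chi-square test from genotype counts (illustrative counts).
def hardy_weinberg_chi2(n_aa, n_ab, n_bb):
    n = n_aa + n_ab + n_bb
    p = (2 * n_aa + n_ab) / (2 * n)          # allele A frequency
    q = 1 - p
    expected = [p * p * n, 2 * p * q * n, q * q * n]
    observed = [n_aa, n_ab, n_bb]
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

chi2 = hardy_weinberg_chi2(800, 190, 10)     # hypothetical genotype counts
print(round(chi2, 3))
```

With one degree of freedom, a chi-square value below 3.84 means the observed genotype counts are consistent with Hardy-Weinberg equilibrium at the 5% level.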
The Lake Tahoe Basin Land Use Simulation Model
Forney, William M.; Oldham, I. Benson
2011-01-01
This U.S. Geological Survey Open-File Report describes the final modeling product for the Tahoe Decision Support System project for the Lake Tahoe Basin funded by the Southern Nevada Public Land Management Act and the U.S. Geological Survey's Geographic Analysis and Monitoring Program. This research was conducted by the U.S. Geological Survey Western Geographic Science Center. The purpose of this report is to describe the basic elements of the novel Lake Tahoe Basin Land Use Simulation Model, publish samples of the data inputs, basic outputs of the model, and the details of the Python code. The results of this report include a basic description of the Land Use Simulation Model, descriptions and summary statistics of model inputs, two figures showing the graphical user interface from the web-based tool, samples of the two input files, seven tables of basic output results from the web-based tool and descriptions of their parameters, and the fully functional Python code.
1989-08-01
Using chi-square tests of homogeneity, a selected sample of Army Basic Trainees at Ft. Jackson was studied to determine if there was a...Period of training for sample soldiers was January to May 1985. Results of testing for the female trainees indicated no significant difference in incidence...of ARD among three barracks groups. Results of testing for male trainees indicated statistically significant differences of ARD among each of three
Basic principles of Hasse diagram technique in chemistry.
Brüggemann, Rainer; Voigt, Kristina
2008-11-01
Principles of partial order applied to ranking are explained. The Hasse diagram technique (HDT) is the application of partial order theory based on a data matrix. In this paper, HDT is introduced in a stepwise procedure, and some elementary theorems are exemplified. The focus is to show how the multivariate character of a data matrix is realized by HDT and in which cases one should apply other mathematical or statistical methods. Many simple examples illustrate the basic theoretical ideas. Finally, it is shown that HDT is a useful alternative for the evaluation of antifouling agents, which was originally performed by amoeba diagrams.
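The partial-order core of HDT can be sketched compactly: one object sits below another when every indicator (column of the data matrix) agrees, and the Hasse diagram keeps only the cover relations. A minimal illustration with an invented two-indicator data matrix:

```python
# Minimal sketch of the Hasse diagram technique's core relation.
# The "chemicals" data matrix below is invented for illustration.
def dominates(x, y):
    """True if y >= x component-wise (x <= y in the product order)."""
    return all(a <= b for a, b in zip(x, y))

def hasse_edges(objects):
    """Cover pairs (x, y): x strictly below y with no object strictly in between."""
    items = list(objects.items())
    edges = []
    for nx, x in items:
        for ny, y in items:
            if nx != ny and dominates(x, y) and x != y:
                between = any(
                    nz not in (nx, ny) and x != z and z != y
                    and dominates(x, z) and dominates(z, y)
                    for nz, z in items)
                if not between:
                    edges.append((nx, ny))
    return sorted(edges)

chemicals = {"a": (1, 1), "b": (2, 1), "c": (1, 2), "d": (2, 2)}
print(hasse_edges(chemicals))  # b and c are incomparable, so no edge joins them
```

Incomparable objects like "b" and "c" here are exactly what HDT preserves and what a single aggregated ranking score would hide.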
[Nurse supervision in health basic units].
Correia, Valesca Silveira; Servo, Maria Lúcia Silva
2006-01-01
This qualitative study evaluates the pattern of nurse supervision in basic health units in the city of Feira de Santana (Bahia, Brazil) between August 2001 and June 2002. The objective was to describe supervision by nurses and whether a supervision systematics exists. A questionnaire was used to collect information from a group of sixteen (16) nurses in active professional practice. Descriptive statistical procedures were used for data analysis. It can be concluded that systematic supervision is practiced by 64% of the nurses, while in 36% of the cases systematic supervision does not occur.
Entrepreneurs: Women and Minorities.
ERIC Educational Resources Information Center
Akers, Lilialyce
A program was designed to meet the needs of Kentucky women who wished to supplement their incomes by producing articles in their homes for sale. Its three-phase objective was to identify women who already had knitting skills and train them to produce a finished product; to provide basic knowledge about how to run a small business; and to provide…
USDA-ARS?s Scientific Manuscript database
Strains of Pseudomonas fluorescens that produce the antibiotic 2,4-diacetylphloroglucinol (DAPG) are biocontrol agents of a variety of soilborne pathogens. DAPG is active against a broad spectrum of organisms ranging from bacteria to higher plants. This suggests that the antibiotic may target basic...
Nanoplasmonic Catalysis for Synthetic Fuel Production
2010-02-22
understanding of the basic mechanism underlying this enhancement, with the ultimate goal of producing synthetic fuels such as hydrogen, methane and methanol using visible illumination. Objectives: - Fabricate arrays of metal...in our energy infrastructure. For photocatalysis, this area is especially exciting because it presents a possible route to direct solar-to-fuel
ERIC Educational Resources Information Center
Frank, Boris
1970-01-01
There are problems in adult basic education--little specially produced professional material available to it; little time spent on preplanning stages; severe budgetary limitations; and an elementary school syndrome. (EB)
Basic physics of laser interaction with vital tissue.
Wigdor, Harvey
2008-09-01
It is essential for any practitioner who uses lasers in their clinical practice to understand the basic physics of lasers. It is this knowledge that allows for an educated assessment of the clinical outcomes that lasers produce in our patients. It is also this understanding that provides a scientific basis for the visual feedback the clinician uses to vary parameters as needed to get the desired clinical results. It is the intent of this paper to discuss the very basic reasons why lasers affect tissues the way they do, and to synthesize the plethora of information dental practitioners are seeing regularly in dental journals.
The SPARC Intercomparison of Middle Atmosphere Climatologies
NASA Technical Reports Server (NTRS)
Randel, William; Fleming, Eric; Geller, Marvin; Gelman, Mel; Hamilton, Kevin; Karoly, David; Ortland, Dave; Pawson, Steve; Swinbank, Richard; Udelhofen, Petra
2003-01-01
Our current confidence in 'observed' climatological winds and temperatures in the middle atmosphere (over altitudes approx. 10-80 km) is assessed by detailed intercomparisons of contemporary and historic data sets. These data sets include global meteorological analyses and assimilations, climatologies derived from research satellite measurements, and historical reference atmosphere circulation statistics. We also include comparisons with historical rocketsonde wind and temperature data, and with more recent lidar temperature measurements. The comparisons focus on a few basic circulation statistics, such as temperature, zonal wind, and eddy flux statistics. Special attention is focused on tropical winds and temperatures, where large differences exist among separate analyses. Assimilated data sets provide the most realistic tropical variability, but substantial differences exist among current schemes.
A brief introduction to probability.
Di Paola, Gioacchino; Bertani, Alessandro; De Monte, Lavinia; Tuzzolino, Fabio
2018-02-01
The theory of probability has been debated for centuries: back in the 1600s, French mathematicians used the rules of probability to place and win bets. Subsequently, the knowledge of probability has significantly evolved and is now an essential tool for statistics. In this paper, the basic theoretical principles of probability will be reviewed, with the aim of facilitating the comprehension of statistical inference. After a brief general introduction to probability, we will review the concept of the "probability distribution," a function providing the probabilities of occurrence of the different possible outcomes of a categorical or continuous variable. Specific attention will be focused on the normal distribution, the most relevant distribution applied in statistical analysis.
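The "probability distribution" idea can be made concrete with the normal case: probabilities of outcomes follow from the cumulative distribution function, computable here via the error function in Python's standard library.

```python
# Probabilities from the normal cumulative distribution function (CDF).
import math

def normal_cdf(x, mu=0.0, sigma=1.0):
    return 0.5 * (1 + math.erf((x - mu) / (sigma * math.sqrt(2))))

# probability that a standard normal value falls within one sigma of the mean
p_within_1sd = normal_cdf(1) - normal_cdf(-1)
print(round(p_within_1sd, 3))  # ≈ 0.683, the familiar 68% rule
```

The same two-CDF subtraction gives the probability of any interval, which is the operational meaning of a continuous distribution.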
Philosophers assess randomized clinical trials: the need for dialogue.
Miké, V
1989-09-01
In recent years a growing number of professional philosophers have joined in the controversy over ethical aspects of randomized clinical trials (RCTs). Morally questionable in their utilitarian approach, RCTs are claimed by some to be in direct violation of the second form of Kant's Categorical Imperative. But the arguments used in these critiques at times derive from a lack of insight into basic statistical procedures and the realities of the biomedical research process. Presented to physicians and other nonspecialists, including the lay public, such distortions can be harmful. Given the great complexity of statistical methodology and the anomalous nature of concepts of evidence, more sustained input into the interdisciplinary dialogue is needed from the statistical profession.
The Effect of Student-Driven Projects on the Development of Statistical Reasoning
ERIC Educational Resources Information Center
Sovak, Melissa M.
2010-01-01
Research has shown that even if students pass a standard introductory statistics course, they often still lack the ability to reason statistically. Many instructional techniques for enhancing the development of statistical reasoning have been discussed, although there is often little to no experimental evidence that they produce effective results…
Choi, Mankyu; Lee, Keon-Hyung
2008-01-01
In this study, the determinants of hospital profitability were evaluated using a sample of 142 hospitals that had undergone hospital standardization inspections by the South Korea Hospital Association over the 4-year period from 1998 to 2001. The measures of profitability used as dependent variables in this study were pretax return on assets, after-tax return on assets, basic earning power, pretax operating margin, and after-tax operating margin. Among those determinants, it was found that ownership type, teaching status, inventory turnover, and the average charge per adjusted inpatient day positively and statistically significantly affected all 5 of these profitability measures. However, the labor expenses per adjusted inpatient day and administrative expenses per adjusted inpatient day negatively and statistically significantly affected all 5 profitability measures. The debt ratio negatively and statistically significantly affected all 5 profitability measures, with the exception of basic earning power. None of the market factors assessed were shown to significantly affect profitability. In conclusion, the results of this study suggest that the profitability of hospitals can be improved despite deteriorating external environmental conditions by facilitating the formation of sound financial structures with optimal capital supplies, optimizing the management of total assets with special emphasis placed on inventory management, and introducing efficient control of fixed costs including labor and administrative expenses.
Finite Element Analysis of Reverberation Chambers
NASA Technical Reports Server (NTRS)
Bunting, Charles F.; Nguyen, Duc T.
2000-01-01
The primary motivating factor behind the initiation of this work was to provide a deterministic means of establishing the validity of the statistical methods that are recommended for the determination of fields that interact in an avionics system. The application of finite element analysis to reverberation chambers is the initial step required to establish a reasonable course of inquiry in this particularly data-intensive study. The use of computational electromagnetics provides a high degree of control over the "experimental" parameters that can be utilized in a simulation of reverberating structures. As the work evolved, there were four primary focus areas: 1. The eigenvalue problem for the source-free problem. 2. The development of an efficient complex eigensolver. 3. The application of a source for the TE and TM fields for statistical characterization. 4. The examination of shielding effectiveness in a reverberating environment. One early purpose of this work was to establish the utility of finite element techniques in the development of an extended low-frequency statistical model for reverberation phenomena. By employing finite element techniques, structures of arbitrary complexity can be analyzed due to the use of triangular shape functions in the spatial discretization. The effects of both frequency stirring and mechanical stirring are presented. It is suggested that for low-frequency operation the typical tuner size is inadequate to provide a sufficiently random field and that frequency stirring should be used. The results of the finite element analysis of the reverberation chamber illustrate the potential utility of a 2D representation for enhancing the basic statistical characteristics of the chamber when operating in a low-frequency regime. The basic field statistics are verified for frequency stirring over a wide range of frequencies. Mechanical stirring is shown to provide an effective frequency deviation.
Storytelling, statistics and hereditary thought: the narrative support of early statistics.
López-Beltrán, Carlos
2006-03-01
This paper's main contention is that some basically methodological developments in science which are apparently distant and unrelated can be seen as part of a sequential story. Focusing on general inferential and epistemological matters, the paper links occurrences separated in both time and space by formal and representational issues rather than social or disciplinary links. It focuses on a few limited aspects of several cognitive practices in medical and biological contexts separated by geography, disciplines and decades, but connected by long-term transdisciplinary representational and inferential structures and constraints. The paper intends to show that a given set of knowledge claims, based on organizing empirical data statistically, can be seen to have been underpinned by a previous, more familiar, and probably more natural, narrative handling of similar evidence. To achieve that, this paper moves from medicine in France in the late eighteenth and early nineteenth century to the second half of the nineteenth century in England among gentleman naturalists, following its subject: the shift from narrative depiction of hereditary transmission of physical peculiarities to posterior statistical articulations of the same phenomena. Some early defenders of heredity as an important (if not the most important) causal presence in the understanding of life adopted singular narratives, in the form of case stories from medical and natural history traditions, to flesh out a special kind of causality peculiar to heredity. This work tries to reconstruct historically the rationale that drove the use of such narratives. It then shows that when this rationale was methodologically challenged, its basic narrative and probabilistic underpinnings were transferred to the statistical quantificational tools that took their place.
A spring-block analogy for the dynamics of stock indexes
NASA Astrophysics Data System (ADS)
Sándor, Bulcsú; Néda, Zoltán
2015-06-01
A spring-block chain placed on a running conveyor belt is considered for modeling stylized facts observed in the dynamics of stock indexes. Individual stocks are modeled by the blocks, while the stock-stock correlations are introduced via simple elastic forces acting in the springs. The dragging effect of the moving belt corresponds to the expected economic growth. The spring-block system produces collective behavior and avalanche-like phenomena, similar to the ones observed in stock markets. An artificial index is defined for the spring-block chain, and its dynamics is compared with the one measured for the Dow Jones Industrial Average. For certain parameter regions the model reproduces qualitatively well the dynamics of the logarithmic index, the logarithmic returns, the distribution of the logarithmic returns, the avalanche-size distribution and the distribution of the investment horizons. A noticeable success of the model is that it is able to account for the gain-loss asymmetry observed in the inverse statistics. Our approach has mainly a pedagogical value, bridging between a complex socio-economic phenomenon and a basic (mechanical) model in physics.
Large Eddy Simulation of Turbulent Flow in a Ribbed Pipe
NASA Astrophysics Data System (ADS)
Kang, Changwoo; Yang, Kyung-Soo
2011-11-01
Turbulent flow in a pipe with periodically wall-mounted ribs has been investigated by large eddy simulation with a dynamic subgrid-scale model. The value of Re considered is 98,000, based on hydraulic diameter and mean bulk velocity. An immersed boundary method was employed to implement the ribs in the computational domain. The spacing of the ribs is the key parameter producing the d-type, intermediate and k-type roughness flows. The mean velocity profiles and turbulent intensities obtained from the present LES are in good agreement with the experimental measurements currently available. Turbulence statistics, including budgets of the Reynolds stresses, were computed and analyzed to elucidate turbulence structures, especially around the ribs. In particular, effects of the ribs are identified by comparing the turbulence structures with those of smooth pipe flow. The present investigation is relevant to the erosion/corrosion that often occurs around a protruding roughness in a pipe system. This research was supported by Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Education, Science and Technology (2010-0008457).
Simulation of root forms using cellular automata model
NASA Astrophysics Data System (ADS)
Winarno, Nanang; Prima, Eka Cahya; Afifah, Ratih Mega Ayu
2016-02-01
This research aims to produce a simulation program for root forms using a cellular automata model. In his book "A New Kind of Science," Stephen Wolfram discusses formation rules based on statistical analysis. In accordance with Wolfram's investigation, this research develops the basic idea into a computer program using the Delphi 7 programming language. To the best of our knowledge, there is no previous research developing a simulation that describes root forms using the cellular automata model and compares them with natural root forms in the presence of stones as a disturbance. The results show that (1) the simulation used four rules, the program's results were compared with photographs of natural roots, and each rule produced different root forms; and (2) stone disturbances that block root growth and the branching of root forms were successfully modeled. To this end, the research added stones with a size of 120 cells placed randomly in the soil. As in nature, the stones cannot be penetrated by plant roots. The results suggest that the root-form simulation program can be developed further with 50 variations.
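A toy cellular-automaton sketch of the root-growth rules described above can make the idea concrete. This is Python, not the Delphi 7 program, and the grid size, rules, and stone placement are invented for illustration: a root cell extends downward each step unless a stone blocks it, in which case it branches sideways.

```python
# Toy cellular automaton for root growth around an impenetrable stone.
# Grid layout and growth rules are invented for illustration.
EMPTY, ROOT, STONE = 0, 1, 2

def grow(grid, steps):
    rows, cols = len(grid), len(grid[0])
    for _ in range(steps):
        new_roots = []
        for r in range(rows - 1):
            for c in range(cols):
                if grid[r][c] == ROOT and grid[r + 1][c] == EMPTY:
                    new_roots.append((r + 1, c))          # extend downward
                elif grid[r][c] == ROOT and grid[r + 1][c] == STONE:
                    for dc in (-1, 1):                     # blocked: branch sideways
                        if 0 <= c + dc < cols and grid[r][c + dc] == EMPTY:
                            new_roots.append((r, c + dc))
        for r, c in new_roots:
            grid[r][c] = ROOT
    return grid

grid = [[EMPTY] * 5 for _ in range(6)]
grid[0][2] = ROOT          # seed at the surface
grid[2][2] = STONE         # a stone directly in the root's path
grow(grid, 4)
print(grid[2][2] == STONE)                        # stones are never penetrated
print(sum(row.count(ROOT) for row in grid) > 1)   # the root still grew around it
```

Richer rule sets (such as the four rules the abstract mentions) would vary the branching probabilities and directions, but the stone-blocking constraint works the same way.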