ERIC Educational Resources Information Center
Williams, Immanuel James; Williams, Kelley Kim
2016-01-01
Understanding summary statistics and graphical techniques is a building block for comprehending concepts beyond basic statistics. It's known that motivated students perform better in school. Using examples that students find engaging allows them to understand the concepts at a deeper level.
A Multidisciplinary Approach for Teaching Statistics and Probability
ERIC Educational Resources Information Center
Rao, C. Radhakrishna
1971-01-01
The author presents a syllabus for an introductory (first year after high school) course in statistics and probability and some methods of teaching statistical techniques. The description comes basically from the procedures used at the Indian Statistical Institute, Calcutta. (JG)
Statistics for nuclear engineers and scientists. Part 1. Basic statistical inference
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beggs, W.J.
1981-02-01
This report is intended for the use of engineers and scientists working in the nuclear industry, especially at the Bettis Atomic Power Laboratory. It serves as the basis for several Bettis in-house statistics courses. The objectives of the report are to introduce the reader to the language and concepts of statistics and to provide a basic set of techniques to apply to problems of the collection and analysis of data. Part 1 covers subjects of basic inference. The subjects include: descriptive statistics; probability; simple inference for normally distributed populations, and for non-normal populations as well; comparison of two populations; the analysis of variance; quality control procedures; and linear regression analysis.
Data Analysis Techniques for Physical Scientists
NASA Astrophysics Data System (ADS)
Pruneau, Claude A.
2017-10-01
Preface; How to read this book; 1. The scientific method; Part I. Foundation in Probability and Statistics: 2. Probability; 3. Probability models; 4. Classical inference I: estimators; 5. Classical inference II: optimization; 6. Classical inference III: confidence intervals and statistical tests; 7. Bayesian inference; Part II. Measurement Techniques: 8. Basic measurements; 9. Event reconstruction; 10. Correlation functions; 11. The multiple facets of correlation functions; 12. Data correction methods; Part III. Simulation Techniques: 13. Monte Carlo methods; 14. Collision and detector modeling; List of references; Index.
Analysis and Interpretation of Findings Using Multiple Regression Techniques
ERIC Educational Resources Information Center
Hoyt, William T.; Leierer, Stephen; Millington, Michael J.
2006-01-01
Multiple regression and correlation (MRC) methods form a flexible family of statistical techniques that can address a wide variety of different types of research questions of interest to rehabilitation professionals. In this article, we review basic concepts and terms, with an emphasis on interpretation of findings relevant to research questions…
Probability sampling in legal cases: Kansas cellphone users
NASA Astrophysics Data System (ADS)
Kadane, Joseph B.
2012-10-01
Probability sampling is a standard statistical technique. This article introduces the basic ideas of probability sampling, and shows in detail how probability sampling was used in a particular legal case.
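As a toy illustration of the basic idea, a simple random sample and a proportion estimate with an approximate 95% confidence interval, here is a minimal Python sketch; the population and rates are invented and unrelated to the actual case:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical population: 1 = cellphone user, 0 = not (illustrative only;
# the Kansas case data are not reproduced here).
population = rng.binomial(1, 0.83, size=200_000)

# Simple random sample without replacement: the basic probability sample.
n = 1_000
sample = rng.choice(population, size=n, replace=False)

p_hat = sample.mean()
se = np.sqrt(p_hat * (1 - p_hat) / n)          # standard error of a proportion
ci = (p_hat - 1.96 * se, p_hat + 1.96 * se)    # approximate 95% interval
print(f"estimate {p_hat:.3f}, 95% CI {ci[0]:.3f}-{ci[1]:.3f}")
```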
ERIC Educational Resources Information Center
Tighe, Elizabeth L.; Schatschneider, Christopher
2016-01-01
The purpose of this study was to investigate the joint and unique contributions of morphological awareness and vocabulary knowledge at five reading comprehension levels in adult basic education (ABE) students. We introduce the statistical technique of multiple quantile regression, which enabled us to assess the predictive utility of morphological…
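Multiple quantile regression of the kind introduced here can be sketched with statsmodels' QuantReg; the variable names and simulated data below are illustrative, not the study's:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data standing in for ABE reading scores; names are invented.
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "morph": rng.normal(size=300),
    "vocab": rng.normal(size=300),
})
df["comprehension"] = 0.4 * df.morph + 0.6 * df.vocab + rng.normal(size=300)

# Fit the same linear predictor at several quantiles of the outcome,
# which is what multiple quantile regression does.
for q in (0.1, 0.25, 0.5, 0.75, 0.9):
    fit = smf.quantreg("comprehension ~ morph + vocab", df).fit(q=q)
    print(f"q={q:.2f}  morph={fit.params['morph']:.3f}  vocab={fit.params['vocab']:.3f}")
```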
Statistical techniques for sampling and monitoring natural resources
Hans T. Schreuder; Richard Ernst; Hugo Ramirez-Maldonado
2004-01-01
We present the statistical theory of inventory and monitoring from a probabilistic point of view. We start with the basics and show the interrelationships between designs and estimators, illustrating the methods with a small artificial population as well as with a mapped realistic population. For such applications, useful open source software is given in Appendix 4....
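Design-based estimation of the kind covered in this handbook can be illustrated with the Horvitz-Thompson estimator, the workhorse of unequal-probability sampling. A minimal Python sketch (plot volumes and inclusion probabilities invented, not from the book):

```python
import numpy as np

# Hypothetical forest inventory: plot-level timber volumes and unequal
# inclusion probabilities (illustrative values).
volumes = np.array([12.0, 7.5, 30.2, 18.9])   # y_i for sampled plots
pi = np.array([0.05, 0.02, 0.10, 0.06])       # inclusion probabilities

# Horvitz-Thompson estimator of the population total: sum of y_i / pi_i.
total_hat = np.sum(volumes / pi)
print(f"estimated population total: {total_hat:.1f}")
```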
Antweiler, Ronald C.; Taylor, Howard E.
2008-01-01
The main classes of statistical treatment of below-detection-limit (left-censored) environmental data for the determination of basic statistics that have been used in the literature are substitution methods, maximum likelihood, regression on order statistics (ROS), and nonparametric techniques. These treatments, along with using all instrument-generated data (even those below detection), were evaluated by examining data sets in which the true values of the censored data were known. It was found that for data sets with less than 70% censored data, the best technique overall for determination of summary statistics was the nonparametric Kaplan-Meier technique. ROS and the two substitution methods of assigning one-half the detection limit value to censored data or assigning a random number between zero and the detection limit to censored data were adequate alternatives. The use of these two substitution methods, however, requires a thorough understanding of how the laboratory censored the data. The technique of employing all instrument-generated data, including numbers below the detection limit, was found to be less adequate than the above techniques. At high degrees of censoring (greater than 70% censored data), no technique provided good estimates of summary statistics. Maximum likelihood techniques were found to be far inferior to all other treatments except substitution of zero or of the detection limit value for censored data.
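The two substitution treatments described above are easy to sketch; the simulation below (invented lognormal data, detection limit 0.5) shows how each recovers the mean of a partially censored sample. Kaplan-Meier and ROS estimates would require a dedicated survival-analysis or environmental-statistics package:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated concentrations with a known truth, censored at a detection limit.
true = rng.lognormal(mean=0.0, sigma=1.0, size=500)
DL = 0.5
censored = true < DL

# Substitution methods discussed above (illustrative implementation).
half_dl = np.where(censored, DL / 2, true)
random_sub = np.where(censored, rng.uniform(0, DL, size=true.size), true)

print(f"true mean            {true.mean():.3f}")
print(f"DL/2 substitution    {half_dl.mean():.3f}")
print(f"uniform substitution {random_sub.mean():.3f}")
```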
Oral health literacy: awareness and practices among pediatric dentists.
Stowers, Megan E; Lee, Jessica Y; Majewski, Robert F; Estrella, Maria Regina P; Taylor, George W; Boynton, James R
2013-01-01
The purpose of this study was to examine pediatric dentists' awareness and experiences with oral health literacy and to identify communication techniques used with parents. Active North American members of the American Academy of Pediatric Dentistry were invited to participate in the survey. Descriptive statistical analyses were completed, and Pearson's chi-square crosstabs tests were used to compare categorical data between groups. Data were collected from 22 percent (N=1,059) of pediatric dentists; 68 to 87 percent use basic communication techniques routinely, while 36 to 79 percent routinely use enhanced communication techniques. Approximately 59 percent (N=620) reported having had an experience with health literacy miscommunication, while 11 percent (N=116) were aware of an error in patient care that resulted from oral health literacy miscommunication. Respondents who had experienced miscommunication were statistically significantly more likely to perceive barriers to effective communication as more significant than those without a history of miscommunication (P<.001). Most pediatric dentists have experienced situations in which a parent has misunderstood information. Basic communication techniques were most commonly used, while enhanced communication techniques were used less routinely. Those who have had experience with oral health literacy miscommunication events perceive barriers to effective communication as more significant.
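For readers unfamiliar with the chi-square crosstabs test used here, a minimal Python sketch with scipy (counts invented, not the study's data):

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical 2x2 crosstab: miscommunication experience (rows) versus
# perceiving barriers as significant (columns).
table = np.array([[410, 210],
                  [150, 289]])

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi-square={chi2:.2f}, dof={dof}, p={p:.4g}")
```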
[Introduction to Exploratory Factor Analysis (EFA)].
Martínez, Carolina Méndez; Sepúlveda, Martín Alonso Rondón
2012-03-01
Exploratory Factor Analysis (EFA) has become one of the most frequently used statistical techniques, especially in the medical and social sciences. Given its popularity, it is essential to understand the basic concepts necessary for its proper application and to take into consideration the main strengths and weaknesses of this technique. The aims are to present in a clear and concise manner the main applications of this technique, to determine the basic requirements for its use by providing a step-by-step description of its methodology, and to establish the elements that must be taken into account during its preparation in order to avoid erroneous results and interpretations. Narrative review. This review identifies the basic concepts and briefly describes the objectives, design, assumptions, and methodology to achieve factor derivation, global adjustment evaluation, and adequate interpretation of results.
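A minimal EFA-style sketch in Python, using scikit-learn's FactorAnalysis with varimax rotation (available in recent scikit-learn versions) on simulated data; the loading structure is invented, and the article itself is methodological and package-agnostic:

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(7)

# Simulated questionnaire: two latent factors driving six observed items.
latent = rng.normal(size=(400, 2))
loadings = np.array([[0.8, 0.0], [0.7, 0.1], [0.9, 0.0],
                     [0.0, 0.8], [0.1, 0.7], [0.0, 0.9]])
X = latent @ loadings.T + 0.4 * rng.normal(size=(400, 6))

fa = FactorAnalysis(n_components=2, rotation="varimax")
fa.fit(X)
print(np.round(fa.components_.T, 2))  # estimated loadings, items x factors
```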
Basic principles of Hasse diagram technique in chemistry.
Brüggemann, Rainer; Voigt, Kristina
2008-11-01
Principles of partial order applied to ranking are explained. The Hasse diagram technique (HDT) is the application of partial order theory based on a data matrix. In this paper, HDT is introduced in a stepwise procedure, and some elementary theorems are exemplified. The focus is to show how the multivariate character of a data matrix is realized by HDT and in which cases one should apply other mathematical or statistical methods. Many simple examples illustrate the basic theoretical ideas. Finally, it is shown that HDT is a useful alternative for the evaluation of antifouling agents, which was originally performed by amoeba diagrams.
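The order relation that HDT draws can be computed in a few lines; a minimal Python example on an invented data matrix where smaller values are "better":

```python
import numpy as np

# Toy data matrix: rows = chemicals, columns = attributes (values invented).
names = ["A", "B", "C", "D"]
X = np.array([[1, 2, 3],
              [2, 3, 4],
              [3, 1, 2],
              [4, 4, 5]])

def below(a, b):
    """a <= b componentwise with at least one strict inequality."""
    return np.all(a <= b) and np.any(a < b)

# The partial order underlying a Hasse diagram: comparabilities only;
# rows ordered in neither direction stay incomparable (e.g. A and C here).
for i, a in enumerate(X):
    for j, b in enumerate(X):
        if below(a, b):
            print(f"{names[i]} < {names[j]}")
```

A full Hasse diagram would additionally drop transitive edges (here A < D is implied by A < B < D) so that only the cover relation is drawn.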
Techniques of Differentiation and Integration, Mathematics (Experimental): 5297.27.
ERIC Educational Resources Information Center
Forrester, Gary B.
This guidebook on minimum course content was designed for students who have mastered the skills and concepts of analytic geometry. It is a short course in the basic techniques of calculus recommended for the student who has need of these skills in other courses such as beginning physics, economics or statistics. The course does not intend to teach…
Analysis of Variance in Statistical Image Processing
NASA Astrophysics Data System (ADS)
Kurz, Ludwik; Hafed Benteftifa, M.
1997-04-01
A key problem in practical image processing is the detection of specific features in a noisy image. Analysis of variance (ANOVA) techniques can be very effective in such situations, and this book gives a detailed account of the use of ANOVA in statistical image processing. The book begins by describing the statistical representation of images in the various ANOVA models. The authors present a number of computationally efficient algorithms and techniques to deal with such problems as line, edge, and object detection, as well as image restoration and enhancement. By describing the basic principles of these techniques, and showing their use in specific situations, the book will facilitate the design of new algorithms for particular applications. It will be of great interest to graduate students and engineers in the field of image processing and pattern recognition.
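As a toy stand-in for the ANOVA-based detection the book describes, a one-way ANOVA on three simulated pixel neighborhoods (scipy; data invented):

```python
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(3)

# Three pixel neighborhoods from a noisy image; one-way ANOVA asks whether
# their mean intensities differ (toy stand-in for edge/feature detection).
region_a = rng.normal(100, 10, size=64)
region_b = rng.normal(100, 10, size=64)
region_c = rng.normal(115, 10, size=64)   # brighter region, e.g. across an edge

F, p = f_oneway(region_a, region_b, region_c)
print(f"F={F:.2f}, p={p:.4g}")
```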
Experimental analysis of computer system dependability
NASA Technical Reports Server (NTRS)
Iyer, Ravishankar K.; Tang, Dong
1993-01-01
This paper reviews an area which has evolved over the past 15 years: experimental analysis of computer system dependability. Methodologies and advances are discussed for three basic approaches used in the area: simulated fault injection, physical fault injection, and measurement-based analysis. The three approaches are suited, respectively, to dependability evaluation in the three phases of a system's life: design phase, prototype phase, and operational phase. Before the discussion of these phases, several statistical techniques used in the area are introduced. For each phase, a classification of research methods or study topics is outlined, followed by discussion of these methods or topics as well as representative studies. The statistical techniques introduced include the estimation of parameters and confidence intervals, probability distribution characterization, and several multivariate analysis methods. Importance sampling, a statistical technique used to accelerate Monte Carlo simulation, is also introduced. The discussion of simulated fault injection covers electrical-level, logic-level, and function-level fault injection methods as well as representative simulation environments such as FOCUS and DEPEND. The discussion of physical fault injection covers hardware, software, and radiation fault injection methods as well as several software and hybrid tools including FIAT, FERRARI, HYBRID, and FINE. The discussion of measurement-based analysis covers measurement and data processing techniques, basic error characterization, dependency analysis, Markov reward modeling, software dependability, and fault diagnosis. The discussion involves several important issues studied in the area, including fault models, fast simulation techniques, workload/failure dependency, correlated failures, and software fault tolerance.
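Importance sampling, mentioned above, is easy to demonstrate on a rare-event probability; a minimal Python sketch (a standard normal tail, not a dependability model):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)

# Rare-event probability P(X > 6) for X ~ N(0, 1), the kind of quantity
# importance sampling accelerates.
n = 100_000
threshold = 6.0

# Naive Monte Carlo almost never sees the event.
naive = (rng.normal(size=n) > threshold).mean()

# Sample from N(6, 1) instead and reweight by the likelihood ratio.
y = rng.normal(loc=threshold, scale=1.0, size=n)
weights = stats.norm.pdf(y) / stats.norm.pdf(y, loc=threshold)
is_estimate = np.mean((y > threshold) * weights)

print(f"naive: {naive:.2e}, importance sampling: {is_estimate:.2e}, "
      f"exact: {stats.norm.sf(threshold):.2e}")
```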
A quantitative comparison of corrective and perfective maintenance
NASA Technical Reports Server (NTRS)
Henry, Joel; Cain, James
1994-01-01
This paper presents a quantitative comparison of corrective and perfective software maintenance activities. The comparison utilizes basic data collected throughout the maintenance process. The data collected are extensive and allow the impact of both types of maintenance to be quantitatively evaluated and compared. Basic statistical techniques test relationships between and among process and product data. The results show interesting similarities and important differences in both process and product characteristics.
Agundu, Prince Umor C
2003-01-01
Public health dispensaries in Nigeria in recent times have demonstrated the poise to boost corporate productivity in the new millennium and to drive the nation closer to concretising the lofty goal of health-for-all. This is very pronounced considering the face-lift given to the physical environment, the increase in the recruitment and development of professionals, and the upward review of financial subventions. However, there is little or no emphasis on basic statistical appreciation/application, which enhances the decision-making ability of corporate executives. This study used the responses from 120 senior public health officials in Nigeria and analyzed them with the chi-square statistical technique. The results established low statistical aptitude, inadequate statistical training programmes, and little or no emphasis on statistical literacy compared to computer literacy, amongst others. Consequently, it was recommended that these lapses be promptly addressed to enhance executive performance in these establishments. Basic statistical data presentation typologies have been articulated in this study to serve as first-aid instructions to the target group, as they represent the contributions of eminent scholars in this area of intellectualism.
NASA Technical Reports Server (NTRS)
Bunting, Charles F.; Yu, Shih-Pin
2006-01-01
This paper emphasizes the application of numerical methods to explore the ideas related to shielding effectiveness from a statistical view. An empty rectangular box is examined using a hybrid modal/moment method. The basic computational method is presented, followed by the results for single and multiple observation points within the over-moded empty structure. The statistics of the field are obtained by using frequency stirring, borrowed from the ideas connected with reverberation chamber techniques, and extend the ideas of shielding effectiveness well into the multiple resonance regions. The study presented in this paper addresses the average shielding effectiveness over a broad spatial sample within the enclosure as the frequency is varied.
Adaptive statistical pattern classifiers for remotely sensed data
NASA Technical Reports Server (NTRS)
Gonzalez, R. C.; Pace, M. O.; Raulston, H. S.
1975-01-01
A technique for the adaptive estimation of nonstationary statistics necessary for Bayesian classification is developed. The basic approach to the adaptive estimation procedure consists of two steps: (1) an optimal stochastic approximation of the parameters of interest and (2) a projection of the parameters in time or position. A divergence criterion is developed to monitor algorithm performance. Comparative results of adaptive and nonadaptive classifier tests are presented for simulated four dimensional spectral scan data.
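Step (1), the stochastic approximation of a drifting parameter, can be sketched with a Robbins-Monro style update; the constant gain below is an assumption suited to tracking a nonstationary mean, not the authors' exact scheme:

```python
import numpy as np

rng = np.random.default_rng(9)

# Nonstationary class mean drifting over time; a stochastic approximation
# update tracks it from one noisy spectral sample at a time.
mu_hat = 0.0
for k in range(1, 2001):
    true_mean = 0.002 * k            # slow drift
    x = rng.normal(true_mean, 1.0)   # new spectral sample
    gain = 0.05                      # constant gain preserves tracking ability
    mu_hat += gain * (x - mu_hat)

print(f"tracked mean {mu_hat:.2f}, true mean {0.002 * 2000:.2f}")
```

A decaying gain of the 1/k type would converge for a stationary parameter; the constant gain trades some variance for the ability to follow the drift.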
General Nature of Multicollinearity in Multiple Regression Analysis.
ERIC Educational Resources Information Center
Liu, Richard
1981-01-01
Discusses multiple regression, a very popular statistical technique in the field of education. One of the basic assumptions in regression analysis requires that independent variables in the equation should not be highly correlated. The problem of multicollinearity and some of the solutions to it are discussed. (Author)
[Statistical analysis of German radiologic periodicals: developmental trends in the last 10 years].
Golder, W
1999-09-01
To identify which statistical tests are applied in German radiological publications, to what extent their use has changed during the last decade, and which factors might be responsible for this development. The major articles published in "ROFO" and "DER RADIOLOGE" during 1988, 1993 and 1998 were reviewed for statistical content. The contributions were classified by principal focus and radiological subspecialty. The methods used were assigned to descriptive, basic and advanced statistics. Sample size, significance level and power were established. The use of experts' assistance was monitored. Finally, we calculated the so-called cumulative accessibility of the publications. 525 contributions were found to be eligible. In 1988, 87% used descriptive statistics only, 12.5% basic, and 0.5% advanced statistics. The corresponding figures in 1993 and 1998 are 62 and 49%, 32 and 41%, and 6 and 10%, respectively. Statistical techniques were most likely to be used in research on musculoskeletal imaging and articles dedicated to MRI. Six basic categories of statistical methods account for the complete statistical analysis appearing in 90% of the articles. ROC analysis is the single most common advanced technique. Authors make increasing use of statistical experts' opinion and programs. During the last decade, the use of statistical methods in German radiological journals has fundamentally improved, both quantitatively and qualitatively. Presently, advanced techniques account for 20% of the pertinent statistical tests. This development seems to be promoted by the increasing availability of statistical analysis software.
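Since ROC analysis is singled out as the most common advanced technique, a minimal sketch with scikit-learn (diagnostic scores invented):

```python
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(11)

# Toy diagnostic scores: diseased cases score higher on average.
disease = np.r_[np.zeros(100), np.ones(100)].astype(int)
score = np.r_[rng.normal(0, 1, 100), rng.normal(1.2, 1, 100)]

fpr, tpr, thresholds = roc_curve(disease, score)
print(f"AUC = {roc_auc_score(disease, score):.3f}")
```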
Stats for Scaredy-Cats: A How-To Guide for Rural Data Users.
ERIC Educational Resources Information Center
Center for Rural Pennsylvania, Harrisburg.
This guide provides some basic statistical techniques that may be used in writing grant proposals, analyzing reports, and evaluating programs. Examples focus on rural Pennsylvania. The first section, Understanding Data, discusses definitions, codes, and data limitations. Definitions of Census Bureau geographic, demographic, housing, and…
New Statistical Techniques for Evaluating Longitudinal Models.
ERIC Educational Resources Information Center
Murray, James R.; Wiley, David E.
A basic methodological approach in developmental studies is the collection of longitudinal data. Behavioral data can take at least two forms, qualitative (or discrete) and quantitative. Both types are fallible. Measurement errors can occur in quantitative data and measures of these are based on error variance. Qualitative or discrete data can…
Reinventing Biostatistics Education for Basic Scientists
Weissgerber, Tracey L.; Garovic, Vesna D.; Milin-Lazovic, Jelena S.; Winham, Stacey J.; Obradovic, Zoran; Trzeciakowski, Jerome P.; Milic, Natasa M.
2016-01-01
Numerous studies demonstrating that statistical errors are common in basic science publications have led to calls to improve statistical training for basic scientists. In this article, we sought to evaluate statistical requirements for PhD training and to identify opportunities for improving biostatistics education in the basic sciences. We provide recommendations for improving statistics training for basic biomedical scientists, including: 1. Encouraging departments to require statistics training, 2. Tailoring coursework to the students’ fields of research, and 3. Developing tools and strategies to promote education and dissemination of statistical knowledge. We also provide a list of statistical considerations that should be addressed in statistics education for basic scientists. PMID:27058055
Powerlaw: a Python package for analysis of heavy-tailed distributions.
Alstott, Jeff; Bullmore, Ed; Plenz, Dietmar
2014-01-01
Power laws are theoretically interesting probability distributions that are also frequently used to describe empirical data. In recent years, effective statistical methods for fitting power laws have been developed, but appropriate use of these techniques requires significant programming and statistical insight. In order to greatly decrease the barriers to using good statistical methods for fitting power law distributions, we developed the powerlaw Python package. This software package provides easy commands for basic fitting and statistical analysis of distributions. Notably, it also seeks to support a variety of user needs by being exhaustive in the options available to the user. The source code is publicly available and easily extensible.
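Basic usage of the package, as far as its documented interface goes, looks like this; the synthetic Pareto data are illustrative:

```python
import numpy as np
import powerlaw  # pip install powerlaw

rng = np.random.default_rng(13)

# Synthetic heavy-tailed sample (Pareto with exponent ~2.5, xmin = 1).
data = (1 - rng.random(10_000)) ** (-1 / 1.5)

fit = powerlaw.Fit(data)
print(f"alpha = {fit.power_law.alpha:.2f}, xmin = {fit.power_law.xmin:.2f}")

# Likelihood-ratio comparison against a lognormal alternative.
R, p = fit.distribution_compare("power_law", "lognormal")
print(f"R = {R:.2f}, p = {p:.3f}")
```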
NASA Astrophysics Data System (ADS)
Collados-Lara, Antonio-Juan; Pulido-Velazquez, David; Pardo-Iguzquiza, Eulogio
2017-04-01
Assessing the impacts of potential future climate change scenarios on precipitation and temperature is essential to designing adaptive strategies in water resources systems. The objective of this work is to analyze the possibilities of different statistical downscaling methods to generate future potential scenarios in an Alpine catchment from historical data and the available climate model simulations performed in the frame of the CORDEX EU project. The initial information employed to define these downscaling approaches is the historical climatic data (taken from the Spain02 project for the period 1971-2000, with a spatial resolution of 12.5 km) and the future series provided by climate models for the horizon period 2071-2100. We have used information coming from nine climate model simulations (obtained from five different regional climate models (RCMs) nested within four different global climate models (GCMs)) from the European CORDEX project. In our application we have focused on the Representative Concentration Pathway (RCP) 8.5 emissions scenario, which is the most unfavorable scenario considered in the Fifth Assessment Report (AR5) of the Intergovernmental Panel on Climate Change (IPCC). For each RCM we have generated future climate series for the period 2071-2100 by applying two different approaches, bias correction and delta change, and five different transformation techniques (first-moment correction; first- and second-moment correction; regression functions; quantile mapping using a distribution-derived transformation; and quantile mapping using empirical quantiles) for both of them. Ensembles of the obtained series are proposed to obtain more representative potential future climate scenarios to be employed in studying potential impacts. We propose a non-uniformly weighted combination of the future series, giving more weight to those coming from models (delta change approaches), or from combinations of models and techniques, that provide a better approximation to the basic and drought statistics of the historical data. A multi-objective analysis using basic statistics (mean, standard deviation and asymmetry coefficient) and drought statistics (duration, magnitude and intensity) has been performed to identify which models are better in terms of goodness of fit in reproducing the historical series. The drought statistics have been obtained from the Standardized Precipitation Index (SPI) series using the theory of runs. This analysis allows us to discriminate the best RCM and, for the bias-correction method, the best combination of model and correction technique. We have also analyzed the possibilities of using different stochastic weather generators to approximate the basic and drought statistics of the historical series. These analyses have been performed for our case study in both a lumped and a distributed way in order to assess sensitivity to the spatial scale. The statistics of the future temperature series obtained with the different ensemble options are quite homogeneous, but the precipitation shows a higher sensitivity to the adopted method and spatial scale. The global increments in the mean temperature values are 31.79%, 31.79%, 31.03% and 31.74% for the distributed bias-correction, distributed delta-change, lumped bias-correction and lumped delta-change ensembles, respectively, and in precipitation they are -25.48%, -28.49%, -26.42% and -27.35%, respectively. Acknowledgments: This research work has been partially supported by the GESINHIMPADAPT project (CGL2013-48424-C2-2-R) with Spanish MINECO funds.
We would also like to thank Spain02 and CORDEX projects for the data provided for this study and the R package qmap.
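Of the transformation techniques listed above, empirical quantile mapping is the easiest to sketch; a minimal Python implementation on invented gamma-distributed "precipitation" (not the Spain02/CORDEX data):

```python
import numpy as np

rng = np.random.default_rng(17)

# Toy daily precipitation: observations, a biased control run, and a
# future run from the same model.
obs = rng.gamma(2.0, 2.0, size=3000)
ctrl = rng.gamma(2.0, 2.6, size=3000)   # model wet bias
fut = rng.gamma(1.8, 2.6, size=3000)    # drier future from the same model

def quantile_map(x, model_ref, observed):
    """Empirical quantile mapping: replace each model value by the
    observed value at the same empirical quantile."""
    q = np.searchsorted(np.sort(model_ref), x) / len(model_ref)
    q = np.clip(q, 0.0, 1.0)
    return np.quantile(observed, q)

corrected = quantile_map(fut, ctrl, obs)
print(f"raw future mean {fut.mean():.2f}, corrected {corrected.mean():.2f}")
```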
Use of communication techniques by Maryland dentists.
Maybury, Catherine; Horowitz, Alice M; Wang, Min Qi; Kleinman, Dushanka V
2013-12-01
Health care providers' use of recommended communication techniques can increase patients' adherence to prevention and treatment regimens and improve patient health outcomes. The authors conducted a survey of Maryland dentists to determine the number and type of communication techniques they use on a routine basis. The authors mailed a 30-item questionnaire to a random sample of 1,393 general practice dentists and all 169 members of the Maryland chapter of the American Academy of Pediatric Dentistry. The overall response rate was 38.4 percent. Analysis included descriptive statistics, analysis of variance and ordinary least squares regression analysis to examine the association of dentists' characteristics with the number of communication techniques used. The significance level was set at P < .05. General dentists reported routinely using a mean of 7.9 of the 18 communication techniques and 3.6 of the seven basic techniques, whereas pediatric dentists reported using a mean of 8.4 and 3.8 of those techniques, respectively. General dentists who had taken a communication course outside of dental school were more likely than those who had not taken one to use the 18 techniques (P < .01) but not the seven basic techniques (P < .05). Pediatric dentists who had taken a communication course outside of dental school were more likely than those who had not taken one to use the 18 techniques (P < .05) and the seven basic techniques (P < .01). The number of communication techniques that dentists used routinely varied across the 18 techniques and was low for most techniques. Practical implications: Professional education is needed both in dental school curricula and continuing education courses to increase use of recommended communication techniques. Specifically, dentists and their team members should consider taking communication skills courses and conducting an overall evaluation of their practices for user friendliness.
Multilevel modelling: Beyond the basic applications.
Wright, Daniel B; London, Kamala
2009-05-01
Over the last 30 years statistical algorithms have been developed to analyse datasets that have a hierarchical/multilevel structure. Particularly within developmental and educational psychology these techniques have become common where the sample has an obvious hierarchical structure, like pupils nested within a classroom. We describe two areas beyond the basic applications of multilevel modelling that are important to psychology: modelling the covariance structure in longitudinal designs and using generalized linear multilevel modelling as an alternative to methods from signal detection theory (SDT). Detailed code for all analyses is described using packages for the freeware R.
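In Python rather than the chapter's R, a random-intercept model for pupils nested in classrooms can be fitted with statsmodels; the simulated data and variable names are invented:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(21)

# Pupils nested within classrooms: a random intercept per classroom.
n_class, n_pupil = 20, 25
classroom = np.repeat(np.arange(n_class), n_pupil)
class_effect = rng.normal(0, 2, n_class)[classroom]
hours = rng.uniform(0, 10, n_class * n_pupil)
score = 50 + 1.5 * hours + class_effect + rng.normal(0, 3, n_class * n_pupil)

df = pd.DataFrame({"score": score, "hours": hours, "classroom": classroom})
model = smf.mixedlm("score ~ hours", df, groups=df["classroom"]).fit()
print(model.summary())
```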
NASA Astrophysics Data System (ADS)
Wright, Robyn; Thornberg, Steven M.
SEDIDAT is a series of compiled IBM-BASIC (version 2.0) programs that direct the collection, statistical calculation, and graphic presentation of particle settling velocity and equivalent spherical diameter for samples analyzed using the settling tube technique. The programs follow a menu-driven format that is understood easily by students and scientists with little previous computer experience. Settling velocity is measured directly (cm/sec) and also converted into Chi units. Equivalent spherical diameter (reported in Phi units) is calculated using a modified Gibbs equation for different particle densities. Input parameters, such as water temperature, settling distance, particle density, run time, and Phi/Chi interval, are changed easily at operator discretion. Optional output to a dot-matrix printer includes a summary of moment and graphic statistical parameters, a tabulation of individual and cumulative weight percents, a listing of major distribution modes, and cumulative and histogram plots of raw time, settling velocity, Chi, and Phi data.
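The Chi and Phi conversions follow standard sedimentological definitions (negative base-2 logarithms of settling velocity in cm/s and grain diameter in mm, respectively); whether SEDIDAT used exactly these forms is an assumption:

```python
import numpy as np

def to_chi(settling_velocity_cm_s):
    """Chi scale: -log2 of settling velocity in cm/s (standard definition;
    assumed, not confirmed, to match SEDIDAT's convention)."""
    return -np.log2(settling_velocity_cm_s)

def to_phi(diameter_mm):
    """Phi scale: -log2 of grain diameter in millimetres."""
    return -np.log2(diameter_mm)

print(to_chi(2.0), to_phi(0.25))  # -1.0 chi, 2.0 phi
```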
Basic biostatistics for post-graduate students
Dakhale, Ganesh N.; Hiware, Sachin K.; Shinde, Abhijit T.; Mahatme, Mohini S.
2012-01-01
Statistical methods are important for drawing valid conclusions from the data obtained. This article provides background information on fundamental methods and techniques in biostatistics for the use of postgraduate students. The main focus is on types of data, measures of central tendency and variation, and basic tests, which are useful for the analysis of different types of observations. Several topics, such as the normal distribution, calculation of sample size, level of significance, the null hypothesis, indices of variability, and the different tests, are explained in detail with suitable examples. Using these guidelines, we are confident that postgraduate students will be able to classify the distribution of their data and apply the proper test. Information is also given regarding various free software programs and websites useful for statistical calculations. Thus, postgraduate students will benefit whether they opt for academia or industry. PMID:23087501
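One of the topics mentioned, sample size calculation, can be sketched with the standard normal-approximation formula for comparing two means (a textbook formula, not taken from the article):

```python
from scipy.stats import norm

def two_sample_n(delta, sigma, alpha=0.05, power=0.80):
    """Per-group n for a two-sample comparison of means:
    n = 2 * ((z_{alpha/2} + z_{power}) * sigma / delta)^2."""
    z_a = norm.ppf(1 - alpha / 2)
    z_b = norm.ppf(power)
    return 2 * ((z_a + z_b) * sigma / delta) ** 2

# Detect a 5-unit difference with SD 10 at alpha = 0.05 and 80% power.
print(round(two_sample_n(delta=5, sigma=10)))  # about 63 per group
```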
Descriptive and inferential statistical methods used in burns research.
Al-Benna, Sammy; Al-Ajam, Yazan; Way, Benjamin; Steinstraesser, Lars
2010-05-01
Burns research articles utilise a variety of descriptive and inferential methods to present and analyse data. The aim of this study was to determine the descriptive methods (e.g. mean, median, SD, range, etc.) and survey the use of inferential methods (statistical tests) used in articles in the journal Burns. This study defined its population as all original articles published in the journal Burns in 2007. Letters to the editor, brief reports, reviews, and case reports were excluded. Study characteristics, use of descriptive statistics and the number and types of statistical methods employed were evaluated. Of the 51 articles analysed, 11 (22%) were randomised controlled trials, 18 (35%) were cohort studies, 11 (22%) were case-control studies and 11 (22%) were case series. The study design and objectives were defined in all articles. All articles made use of continuous and descriptive data. Inferential statistics were used in 49 (96%) articles. Data dispersion was calculated by standard deviation in 30 (59%). Standard error of the mean was quoted in 19 (37%). The statistical software product was named in 33 (65%). Of the 49 articles that used inferential statistics, the tests were named in 47 (96%). The six most common tests used (Student's t-test (53%), analysis of variance/covariance (33%), chi-squared test (27%), Wilcoxon and Mann-Whitney tests (22%), Fisher's exact test (12%)) accounted for the majority (72%) of statistical methods employed. A specified significance level was named in 43 (88%) and the exact significance levels were reported in 28 (57%). Descriptive analysis and basic statistical techniques account for most of the statistical tests reported. This information should prove useful in deciding which tests should be emphasised in educating burn care professionals. These results highlight the need for burn care professionals to have a sound understanding of basic statistics, which is crucial in interpreting and reporting data. Advice should be sought from professionals in the fields of biostatistics and epidemiology when using more advanced statistical techniques.
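All six of the most common tests are available in scipy.stats; a minimal sketch on invented data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(23)
a = rng.normal(10, 2, 30)   # e.g. simulated length of stay, group A
b = rng.normal(12, 2, 30)   # group B

print(stats.ttest_ind(a, b))                           # Student's t-test
print(stats.mannwhitneyu(a, b))                        # Mann-Whitney U
print(stats.f_oneway(a, b, b + 1))                     # one-way ANOVA
print(stats.chi2_contingency([[20, 10], [8, 22]])[1])  # chi-squared p-value
print(stats.fisher_exact([[20, 10], [8, 22]])[1])      # Fisher's exact p-value
```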
Image analysis library software development
NASA Technical Reports Server (NTRS)
Guseman, L. F., Jr.; Bryant, J.
1977-01-01
The Image Analysis Library consists of a collection of general purpose mathematical/statistical routines and special purpose data analysis/pattern recognition routines basic to the development of image analysis techniques for support of current and future Earth Resources Programs. Work was done to provide a collection of computer routines and associated documentation which form a part of the Image Analysis Library.
An exploratory investigation of weight estimation techniques for hypersonic flight vehicles
NASA Technical Reports Server (NTRS)
Cook, E. L.
1981-01-01
The three basic methods of weight prediction (fixed-fraction, statistical correlation, and point stress analysis) and some of the computer programs that have been developed to implement them are discussed. A modified version of the WAATS (Weights Analysis of Advanced Transportation Systems) program is presented, along with input data forms and an example problem.
ERIC Educational Resources Information Center
Ammentorp, William
There is much to be gained by using systems analysis in educational administration. Most administrators, presently relying on classical statistical techniques restricted to problems having few variables, should be trained to use more sophisticated tools such as systems analysis. The systems analyst, interested in the basic processes of a group or…
Koo, Laura W.; Horowitz, Alice M.; Radice, Sarah D.; Wang, Min Q.; Kleinman, Dushanka V.
2016-01-01
Objectives: We examined nurse practitioners' use and opinions of recommended communication techniques for the promotion of oral health as part of a Maryland state-wide oral health literacy assessment. Use of recommended health-literate and patient-centered communication techniques has demonstrated improved health outcomes. Methods: A 27-item self-report survey, containing 17 communication technique items across 5 domains, was mailed to 1,410 licensed nurse practitioners (NPs) in Maryland in 2010. Use of communication techniques and opinions about their effectiveness were analyzed using descriptive statistics. General linear models explored provider and practice characteristics to predict differences in the total number and the mean number of communication techniques routinely used in a week. Results: More than 80% of NPs (N = 194) routinely used 3 of the 7 basic communication techniques: simple language, limiting teaching to 2-3 concepts, and speaking slowly. More than 75% of respondents believed that 6 of the 7 basic communication techniques are effective. Sociodemographic provider characteristics and practice characteristics were not significant predictors of the mean number or the total number of communication techniques routinely used by NPs in a week. Potential predictors for using more of the 7 basic communication techniques, demonstrating significance in one general linear model each, were: assessing the office for user-friendliness and ever taking a communication course in addition to nursing school. Conclusions: NPs in Maryland self-reported routinely using some recommended health-literate communication techniques, with belief in their effectiveness. Our findings suggest that having assessed the office for patient-friendliness or having taken a communication course beyond initial education may predict use of more of the 7 basic communication techniques. These self-reported findings should be validated with observational studies. Graduate and continuing education for NPs should increase emphasis on health-literate and patient-centered communication techniques to increase patient understanding of dental caries prevention. Non-dental healthcare providers, such as NPs, are uniquely positioned to contribute to preventing early childhood dental caries through health-literate and patient-centered communication. PMID:26766557
On the Optimization of Aerospace Plane Ascent Trajectory
NASA Astrophysics Data System (ADS)
Al-Garni, Ahmed; Kassem, Ayman Hamdy
A hybrid heuristic optimization technique based on genetic algorithms and particle swarm optimization has been developed and tested for trajectory optimization problems with multiple constraints and a multi-objective cost function. The technique is used to calculate control settings for two types of ascending trajectories (constant dynamic pressure and minimum-fuel-minimum-heat) for a two-dimensional model of an aerospace plane. A thorough statistical analysis is done on the hybrid technique to make comparisons with both basic genetic algorithms and particle swarm optimization techniques with respect to convergence and execution time. Genetic algorithm optimization showed better execution time performance while particle swarm optimization showed better convergence performance. The hybrid optimization technique, benefiting from both techniques, showed superior robust performance, compromising between convergence trends and execution time.
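The particle swarm component can be sketched in a few lines; the toy objective and coefficients below are illustrative, and the paper's hybrid GA/PSO and trajectory dynamics are not reproduced:

```python
import numpy as np

rng = np.random.default_rng(29)

def cost(x):
    return np.sum(x**2, axis=-1)   # stand-in objective, not the ascent model

n, dim, iters = 30, 2, 200
x = rng.uniform(-5, 5, (n, dim))   # particle positions
v = np.zeros((n, dim))             # particle velocities
pbest, pbest_f = x.copy(), cost(x)
gbest = pbest[np.argmin(pbest_f)]

for _ in range(iters):
    r1, r2 = rng.random((n, dim)), rng.random((n, dim))
    # Standard PSO update: inertia plus cognitive and social pulls.
    v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
    x = x + v
    f = cost(x)
    better = f < pbest_f
    pbest[better], pbest_f[better] = x[better], f[better]
    gbest = pbest[np.argmin(pbest_f)]

print(f"best cost {pbest_f.min():.2e} at {gbest}")
```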
Detector noise statistics in the non-linear regime
NASA Technical Reports Server (NTRS)
Shopbell, P. L.; Bland-Hawthorn, J.
1992-01-01
The statistical behavior of an idealized linear detector in the presence of threshold and saturation levels is examined. It is assumed that the noise is governed by the statistical fluctuations in the number of photons emitted by the source during an exposure. Since physical detectors cannot have infinite dynamic range, our model illustrates that all devices have non-linear regimes, particularly at high count rates. The primary effect is a decrease in the statistical variance about the mean signal due to a portion of the expected noise distribution being removed via clipping. Higher order statistical moments are also examined, in particular, skewness and kurtosis. In principle, the expected distortion in the detector noise characteristics can be calibrated using flatfield observations with count rates matched to the observations. For this purpose, some basic statistical methods that utilize Fourier analysis techniques are described.
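The variance-reducing effect of saturation clipping is easy to reproduce by simulation; a minimal Python sketch with Poisson photon counts and an invented full-well level:

```python
import numpy as np
from scipy.stats import kurtosis, skew

rng = np.random.default_rng(31)

# Photon-counting pixel near saturation: Poisson noise clipped at a
# full-well level just above the mean (toy model of the idealized detector).
counts = rng.poisson(lam=1000, size=100_000)
saturated = np.clip(counts, 0, 1040)

for name, x in [("linear", counts), ("clipped", saturated)]:
    print(f"{name:8s} var={x.var():7.1f} skew={skew(x):+.3f} "
          f"kurtosis={kurtosis(x):+.3f}")
```

The clipped sample shows the reduced variance and negative skew described above, since the upper tail of the expected noise distribution has been removed.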
Finite Element Analysis of Reverberation Chambers
NASA Technical Reports Server (NTRS)
Bunting, Charles F.; Nguyen, Duc T.
2000-01-01
The primary motivating factor behind the initiation of this work was to provide a deterministic means of establishing the validity of the statistical methods that are recommended for the determination of fields that interact in an avionics system. The application of finite element analysis to reverberation chambers is the initial step required to establish a reasonable course of inquiry in this particularly data-intensive study. The use of computational electromagnetics provides a high degree of control of the "experimental" parameters that can be utilized in a simulation of reverberating structures. As the work evolved, there were four primary focus areas: 1. The eigenvalue problem for the source-free problem. 2. The development of a complex efficient eigensolver. 3. The application of a source for the TE and TM fields for statistical characterization. 4. The examination of shielding effectiveness in a reverberating environment. One early purpose of this work was to establish the utility of finite element techniques in the development of an extended low frequency statistical model for reverberation phenomena. By employing finite element techniques, structures of arbitrary complexity can be analyzed due to the use of triangular shape functions in the spatial discretization. The effects of both frequency stirring and mechanical stirring are presented. It is suggested that for low frequency operation the typical tuner size is inadequate to provide a sufficiently random field and that frequency stirring should be used. The results of the finite element analysis of the reverberation chamber illustrate the potential utility of a 2D representation for enhancing the basic statistical characteristics of the chamber when operating in a low frequency regime. The basic field statistics are verified for frequency stirring over a wide range of frequencies. Mechanical stirring is shown to provide an effective frequency deviation.
Estimating the Latent Number of Types in Growing Corpora with Reduced Cost-Accuracy Trade-Off
ERIC Educational Resources Information Center
Hidaka, Shohei
2016-01-01
The number of unique words in children's speech is one of the most basic statistics indicating their language development. We may, however, face difficulties when trying to accurately evaluate the number of unique words in a child's growing corpus over time with a limited sample size. This study proposes a novel technique to estimate the latent number…
da Silva, R C V; de Sá, C C; Pascual-Vaca, Á O; de Souza Fontes, L H; Herbella Fernandes, F A M; Dib, R A; Blanco, C R; Queiroz, R A; Navarro-Rodriguez, T
2013-07-01
The treatment of gastroesophageal reflux disease may be clinical or surgical. The clinical treatment consists basically of the use of drugs; however, there are new techniques to complement this treatment, and osteopathic intervention in the diaphragmatic muscle is one of these. The objective of the study is to compare pressure values in the examination of esophageal manometry of the lower esophageal sphincter (LES) before and immediately after osteopathic intervention in the diaphragm muscle. Thirty-eight patients with gastroesophageal reflux disease, 16 submitted to a sham technique and 22 submitted to the osteopathic technique, were randomly selected. The average respiratory pressure (ARP) and the maximum expiratory pressure (MEP) of the LES were measured by manometry before and after the technique at the point of highest pressure. Statistical analysis was performed using Student's t-test and the Mann-Whitney test, and the magnitude of the effect of the proposed technique was measured using Cohen's d index. A statistically significant difference between the osteopathic and sham groups was found in three of four comparisons of LES pressure measures, including the ARP (P = 0.027). The MEP showed no statistical difference (P = 0.146). The values of Cohen's d for the same measures were: ARP, d = 0.80, and MEP, d = 0.52. The osteopathic manipulative technique produces a positive increment in the LES region soon after its performance.
NASA Technical Reports Server (NTRS)
Coggeshall, M. E.; Hoffer, R. M.
1973-01-01
Remote sensing equipment and automatic data processing techniques were employed as aids in the institution of improved forest resource management methods. On the basis of automatically calculated statistics derived from manually selected training samples, the feature selection processor of LARSYS selected, upon consideration of various groups of the four available spectral regions, a series of channel combinations whose automatic classification performances (for six cover types, including both deciduous and coniferous forest) were tested, analyzed, and further compared with automatic classification results obtained from digitized color infrared photography.
NASA Technical Reports Server (NTRS)
Sowers, J.; Mehrotra, R.; Sethi, I. K.
1989-01-01
A method for extracting road boundaries using the monochrome image of a visual road scene is presented. Statistical information regarding the intensity levels present in the image, along with some geometrical constraints concerning the road, forms the basis of this approach. Results are presented and the technique is compared with others. The major advantages of this technique, when compared to others, are its ability to process the image in only one pass, to limit the area searched in the image using only knowledge of the road geometry and previous boundary information, and to adjust dynamically for inconsistencies in the located boundary information, all of which helps to increase the efficacy of this technique.
ERIC Educational Resources Information Center
Noser, Thomas C.; Tanner, John R.; Shah, Situl
2008-01-01
The purpose of this study was to measure the comprehension of basic mathematical skills of students enrolled in statistics classes at a large regional university, to determine if the scores earned on a basic math skills test are useful in forecasting student performance in these statistics classes, and to determine if students' basic math…
NASA Technical Reports Server (NTRS)
Cull, R. C.; Eltimsahy, A. H.
1983-01-01
The present investigation is concerned with the formulation of energy management strategies for stand-alone photovoltaic (PV) systems, taking into account a basic control algorithm for a possible predictive (and adaptive) controller. The control system controls the flow of energy in the system according to the amount of energy available, and predicts the appropriate control set-points based on the energy (insolation) available by using an appropriate system model. Aspects of adaptation to the conditions of the system are also considered. Attention is given to a statistical analysis technique, the analysis inputs, the analysis procedure, and details regarding the basic control algorithm.
Simplified estimation of age-specific reference intervals for skewed data.
Wright, E M; Royston, P
1997-12-30
Age-specific reference intervals are commonly used in medical screening and clinical practice, where interest lies in the detection of extreme values. Many different statistical approaches have been published on this topic. The advantages of a parametric method are that it necessarily produces smooth centile curves, the entire density is estimated, and an explicit formula is available for the centiles. The method proposed here is a simplified version of a recent approach proposed by Royston and Wright. Basic transformations of the data and multiple regression techniques are combined to model the mean, standard deviation and skewness. Using these simple tools, which are implemented in almost all statistical computer packages, age-specific reference intervals may be obtained. The scope of the method is illustrated by fitting models to several real data sets and assessing each model using goodness-of-fit techniques.
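A rough sketch of the mean-and-SD regression idea, without the skewness component of the Royston-Wright family, on simulated data:

```python
import numpy as np
import statsmodels.api as sm
from scipy.stats import norm

rng = np.random.default_rng(37)

# Simulated measurement increasing with age, with age-dependent spread.
age = rng.uniform(20, 70, 400)
y = 1.0 + 0.05 * age + (0.2 + 0.01 * age) * rng.normal(size=400)

X = sm.add_constant(age)
mean_fit = sm.OLS(y, X).fit()                 # model the mean
resid = y - mean_fit.fittedvalues
# Model the SD by regressing scaled absolute residuals on age
# (a simple stand-in for the published approach).
sd_fit = sm.OLS(np.abs(resid) * np.sqrt(np.pi / 2), X).fit()

age_grid = np.array([25.0, 45.0, 65.0])
Xg = sm.add_constant(age_grid)
mu, sd = mean_fit.predict(Xg), sd_fit.predict(Xg)
z = norm.ppf(0.975)
for a, lo, hi in zip(age_grid, mu - z * sd, mu + z * sd):
    print(f"age {a:.0f}: 95% reference interval ({lo:.2f}, {hi:.2f})")
```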
CORSSA: Community Online Resource for Statistical Seismicity Analysis
NASA Astrophysics Data System (ADS)
Zechar, J. D.; Hardebeck, J. L.; Michael, A. J.; Naylor, M.; Steacy, S.; Wiemer, S.; Zhuang, J.
2011-12-01
Statistical seismology is critical to the understanding of seismicity, the evaluation of proposed earthquake prediction and forecasting methods, and the assessment of seismic hazard. Unfortunately, despite its importance to seismology, especially those aspects with great impact on public policy, statistical seismology is mostly ignored in the education of seismologists, and there is no central repository for the existing open-source software tools. To remedy these deficiencies, and with the broader goal to enhance the quality of statistical seismology research, we have begun building the Community Online Resource for Statistical Seismicity Analysis (CORSSA, www.corssa.org). We anticipate that the users of CORSSA will range from beginning graduate students to experienced researchers. More than 20 scientists from around the world met for a week in Zurich in May 2010 to kick-start the creation of CORSSA: the format and initial table of contents were defined; a governing structure was organized; and workshop participants began drafting articles. CORSSA materials are organized with respect to six themes, each of which will contain between four and eight articles. CORSSA now includes seven articles, with an additional six in draft form, along with forums for discussion, a glossary, and news about upcoming meetings, special issues, and recent papers. Each article is peer-reviewed and presents a balanced discussion, including illustrative examples and code snippets. Topics in the initial set of articles include: introductions to both CORSSA and statistical seismology; basic statistical tests and their role in seismology; understanding seismicity catalogs and their problems; basic techniques for modeling seismicity; and methods for testing earthquake predictability hypotheses. We have also begun curating a collection of statistical seismology software packages.
Predicting Macroscale Effects Through Nanoscale Features
2012-01-01
errors become incorrectly computed by the basic OLS technique. To test for the presence of heteroscedasticity, the Breusch-Pagan/Cook-Weisberg test … is employed, with the test statistic distributed as χ² with degrees of freedom equal to the number of regressors. The Breusch-Pagan/Cook… between shock sensitivity and Sm does not exhibit any heteroscedasticity. The Breusch-Pagan/Cook-Weisberg test provides χ²(1) = 1.73, which
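The same test is available in Python via statsmodels; a minimal sketch on invented heteroscedastic data:

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.diagnostic import het_breuschpagan

rng = np.random.default_rng(41)

# Toy regression whose error variance grows with x.
x = rng.uniform(0, 10, 200)
y = 2 + 0.5 * x + rng.normal(scale=0.2 + 0.3 * x)

X = sm.add_constant(x)
ols = sm.OLS(y, X).fit()

# Breusch-Pagan LM statistic, chi-squared under homoscedasticity.
lm_stat, lm_pvalue, f_stat, f_pvalue = het_breuschpagan(ols.resid, X)
print(f"LM chi2 = {lm_stat:.2f}, p = {lm_pvalue:.4g}")
```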
Current genetic methodologies in the identification of disaster victims and in forensic analysis.
Ziętkiewicz, Ewa; Witt, Magdalena; Daca, Patrycja; Zebracka-Gala, Jadwiga; Goniewicz, Mariusz; Jarząb, Barbara; Witt, Michał
2012-02-01
This review presents the basic problems and currently available molecular techniques used for genetic profiling in disaster victim identification (DVI). The environmental conditions of a mass disaster often result in severe fragmentation, decomposition and intermixing of the remains of victims. In such cases, traditional identification based on the anthropological and physical characteristics of the victims is frequently inconclusive. This is the reason why DNA profiling became the gold standard for victim identification in mass-casualty incidents (MCIs) or any forensic cases where human remains are highly fragmented and/or degraded beyond recognition. The review provides general information about the sources of genetic material for DNA profiling, the genetic markers routinely used during genetic profiling (STR markers, mtDNA and single-nucleotide polymorphisms [SNP]) and the basic statistical approaches used in DNA-based disaster victim identification. Automated technological platforms that allow the simultaneous analysis of a multitude of genetic markers used in genetic identification (oligonucleotide microarray techniques and next-generation sequencing) are also presented. Forensic and population databases containing information on human variability, routinely used for statistical analyses, are discussed. The final part of this review is focused on recent developments, which offer particularly promising tools for forensic applications (mRNA analysis, transcriptome variation in individuals/populations and genetic profiling of specific cells separated from mixtures).
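The most basic statistic in STR-based identification, the random match probability under Hardy-Weinberg assumptions, can be sketched directly; locus names are real STR loci but the allele frequencies are invented for illustration:

```python
# Random match probability across independent STR loci under
# Hardy-Weinberg equilibrium: 2pq per heterozygous locus, p^2 per
# homozygous locus, multiplied across loci.
loci = [
    ("D3S1358", 0.21, 0.25),   # heterozygous
    ("vWA",     0.11, 0.11),   # homozygous
    ("FGA",     0.09, 0.17),
]

rmp = 1.0
for name, p, q in loci:
    rmp *= p * p if p == q else 2 * p * q

print(f"random match probability ~ 1 in {1 / rmp:,.0f}")
```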
NASA Astrophysics Data System (ADS)
Walz, Michael; Leckebusch, Gregor C.
2016-04-01
Extratropical wind storms pose one of the most dangerous and loss-intensive natural hazards for Europe. However, with only 50 years of high-quality observational data, it is difficult to assess the statistical uncertainty of these sparse events based on observations alone. Over the last decade seasonal ensemble forecasts have become indispensable in quantifying the uncertainty of weather prediction on seasonal timescales. In this study seasonal forecasts are used in a climatological context: by making use of the up to 51 ensemble members, a broad and physically consistent statistical base can be created. This base can then be used to assess the statistical uncertainty of extreme wind storm occurrence more accurately. In order to determine the statistical uncertainty of storms with different paths of progression, a probabilistic clustering approach using regression mixture models is used to objectively assign storm tracks (either based on core pressure or on extreme wind speeds) to different clusters. The advantage of this technique is that the entire lifetime of a storm is considered in the clustering algorithm. Quadratic curves are found to describe the storm tracks most accurately. Three main clusters (diagonal, horizontal or vertical progression of the storm track) can be identified, each of which has its own particular features. Basic storm features like average velocity and duration are calculated and compared for each cluster. The main benefit of this clustering technique, however, is to evaluate whether the clusters show different degrees of uncertainty, e.g. more (less) spread for tracks approaching Europe horizontally (diagonally). This statistical uncertainty is compared for different seasonal forecast products.
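A simplified stand-in for the probabilistic clustering described, fitting a quadratic to each track and clustering the coefficients with a Gaussian mixture, can be sketched as follows; the tracks are toy data, and the paper's regression mixture model fits curves and cluster memberships jointly rather than in two stages:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(43)

# Toy storm tracks: latitude as a quadratic in longitude, three regimes.
coeff_sets = ([0.02, 0.5, 40], [0.0, 0.0, 55], [-0.02, 0.8, 45])
tracks = []
for _ in range(90):
    a, b, c = coeff_sets[rng.integers(3)]
    lon = np.linspace(-30, 20, 25)
    lat = a * lon**2 + b * lon + c + rng.normal(0, 0.5, lon.size)
    tracks.append(np.polyfit(lon, lat, 2))   # quadratic fit per track

gm = GaussianMixture(n_components=3, random_state=0).fit(np.array(tracks))
print(np.round(gm.means_, 2))   # recovered cluster-mean quadratic coefficients
```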
Software for Data Analysis with Graphical Models
NASA Technical Reports Server (NTRS)
Buntine, Wray L.; Roy, H. Scott
1994-01-01
Probabilistic graphical models are being used widely in artificial intelligence and statistics, for instance, in diagnosis and expert systems, as a framework for representing and reasoning with probabilities and independencies. They come with corresponding algorithms for performing statistical inference. This offers a unifying framework for prototyping and/or generating data analysis algorithms from graphical specifications. This paper illustrates the framework with an example and then presents some basic techniques for the task: problem decomposition and the calculation of exact Bayes factors. Other tools already developed, such as automatic differentiation, Gibbs sampling, and use of the EM algorithm, make this a broad basis for the generation of data analysis software.
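The "exact Bayes factors" mentioned can be illustrated on the simplest graphical-model fragment, a single Bernoulli node; a minimal Python sketch with toy data, not drawn from the paper:

```python
import numpy as np
from scipy.special import betaln

# Exact Bayes factor for k successes in n Bernoulli trials:
# H1: p ~ Beta(1, 1) versus H0: p = 0.5. The binomial coefficient
# cancels in the ratio of marginal likelihoods.
k, n = 62, 100
log_m1 = betaln(k + 1, n - k + 1) - betaln(1, 1)   # marginal likelihood, H1
log_m0 = n * np.log(0.5)                           # likelihood under H0
bf10 = np.exp(log_m1 - log_m0)
print(f"Bayes factor BF10 = {bf10:.2f}")
```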
Evaluation of wind field statistics near and inside clouds using a coherent Doppler lidar
NASA Astrophysics Data System (ADS)
Lottman, Brian Todd
1998-09-01
This work proposes advanced techniques for measuring the spatial wind field statistics near and inside clouds using a vertically pointing solid-state coherent Doppler lidar on a fixed ground-based platform. The coherent Doppler lidar is an ideal instrument for high spatial and temporal resolution velocity estimates. The basic parameters of lidar are discussed, including a complete statistical description of the Doppler lidar signal. This description is extended to cases with simple functional forms for aerosol backscatter and velocity. An estimate for the mean velocity over a sensing volume is produced by estimating the mean spectra. There are many traditional spectral estimators, which are useful for conditions with slowly varying velocity and backscatter. A new class of estimators, termed novel estimators, is introduced that produces reliable velocity estimates for conditions with large variations in aerosol backscatter and velocity with range, such as cloud conditions. Performance of traditional and novel estimators is computed for a variety of deterministic atmospheric conditions using computer-simulated data. Wind field statistics are produced from actual data for a cloud deck and for multi-layer clouds. Unique results include detection of possible spectral signatures for rain, estimates for the structure function inside a cloud deck, reliable velocity estimation techniques near and inside thin clouds, and estimates for simple wind field statistics between cloud layers.
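A basic mean-velocity estimator of the traditional kind, the first spectral moment of the periodogram, can be sketched as follows; the sample rate and wavelength are assumed values, and the thesis's novel estimators are not reproduced:

```python
import numpy as np

rng = np.random.default_rng(47)

# Mean Doppler velocity from the first spectral moment of a periodogram.
fs, n = 50e6, 256                 # sample rate and samples per range gate (assumed)
wavelength = 2.05e-6              # solid-state 2-micron lidar, assumed value
f_true = 2 * 5.0 / wavelength     # Doppler shift for a 5 m/s radial velocity

t = np.arange(n) / fs
signal = np.exp(2j * np.pi * f_true * t) + 0.1 * (
    rng.normal(size=n) + 1j * rng.normal(size=n))

spec = np.abs(np.fft.fft(signal)) ** 2
freqs = np.fft.fftfreq(n, d=1 / fs)
f_hat = np.sum(freqs * spec) / np.sum(spec)   # first spectral moment
print(f"estimated velocity {f_hat * wavelength / 2:.2f} m/s")
```

In practice the noise floor would be estimated and subtracted before taking the moment, since broadband noise biases the estimate toward zero at low signal-to-noise ratio.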
Hayat, Matthew J.; Powell, Amanda; Johnson, Tessa; Cadwell, Betsy L.
2017-01-01
Statistical literacy and knowledge are needed to read and understand the public health literature. The purpose of this study was to quantify basic and advanced statistical methods used in public health research. We randomly sampled 216 published articles from seven top-tier general public health journals. Studies were reviewed by two readers, and a standardized data collection form was completed for each article. Data were analyzed with descriptive statistics and frequency distributions. Results were summarized for statistical methods used in the literature, including descriptive and inferential statistics, modeling, advanced statistical techniques, and statistical software used. Approximately 81.9% of articles reported an observational study design and 93.1% of articles were substantively focused. Descriptive statistics in table or graphical form were reported in more than 95% of the articles, and statistical inference was reported in more than 76% of the studies reviewed. These results reveal the types of statistical methods currently used in the public health literature. Although this study did not obtain information on what should be taught, information on statistical methods being used is useful for curriculum development in graduate health sciences education, as well as for making informed decisions about continuing education for public health professionals. PMID:28591190
Building the Community Online Resource for Statistical Seismicity Analysis (CORSSA)
NASA Astrophysics Data System (ADS)
Michael, A. J.; Wiemer, S.; Zechar, J. D.; Hardebeck, J. L.; Naylor, M.; Zhuang, J.; Steacy, S.; Corssa Executive Committee
2010-12-01
Statistical seismology is critical to the understanding of seismicity, the testing of proposed earthquake prediction and forecasting methods, and the assessment of seismic hazard. Unfortunately, despite its importance to seismology - especially to those aspects with great impact on public policy - statistical seismology is mostly ignored in the education of seismologists, and there is no central repository for the existing open-source software tools. To remedy these deficiencies, and with the broader goal to enhance the quality of statistical seismology research, we have begun building the Community Online Resource for Statistical Seismicity Analysis (CORSSA). CORSSA is a web-based educational platform that is authoritative, up-to-date, prominent, and user-friendly. We anticipate that the users of CORSSA will range from beginning graduate students to experienced researchers. More than 20 scientists from around the world met for a week in Zurich in May 2010 to kick-start the creation of CORSSA: the format and initial table of contents were defined; a governing structure was organized; and workshop participants began drafting articles. CORSSA materials are organized with respect to six themes, each containing between four and eight articles. The CORSSA web page, www.corssa.org, officially unveiled on September 6, 2010, debuts with an initial set of approximately 10 to 15 articles available online for viewing and commenting, with additional articles to be added over the coming months. Each article will be peer-reviewed and will present a balanced discussion, including illustrative examples and code snippets. Topics in the initial set of articles will include: introductions to both CORSSA and statistical seismology; basic statistical tests and their role in seismology; understanding seismicity catalogs and their problems; basic techniques for modeling seismicity; and methods for testing earthquake predictability hypotheses. A special article will compare and review available statistical seismology software packages.
The Use of Recommended Communication Techniques by Maryland Family Physicians and Pediatricians
Weatherspoon, Darien J.; Horowitz, Alice M.; Kleinman, Dushanka V.; Wang, Min Qi
2015-01-01
Background: Health literacy experts and the American Medical Association have developed recommended communication techniques for healthcare providers, given that effective communication has been shown to greatly improve health outcomes. The purpose of this study was to determine the number and types of communication techniques routinely used by Maryland physicians. Methods: In 2010, a 30-item survey was mailed to a random sample of 1,472 Maryland family physicians and pediatricians, with 294 surveys returned and usable. The survey contained questions about provider and practice characteristics and 17 items related to communication techniques, including seven basic communication techniques. Physicians' use of recommended communication techniques was analyzed using descriptive statistics, analysis of variance, and ordinary least squares regression. Results: Family physicians routinely used an average of 6.6 of the 17 total techniques and 3.3 of the seven basic techniques, whereas pediatricians routinely used 6.4 and 3.2 techniques, respectively. The use of simple language was the only technique that nearly all physicians routinely utilized (family physicians, 91%; pediatricians, 93%). Physicians who had taken a communications course used significantly more techniques than those who had not. Physicians with a low percentage of patients on Medicaid were significantly less likely to use the recommended communication techniques than providers with a high proportion of their patient population on Medicaid. Conclusions: Overall, the use of recommended communication techniques was low. Additionally, many physicians were unsure of the effectiveness of several of the recommended techniques, which could suggest that physicians are unaware of valuable skills that could enhance their communication. The findings of this study suggest that communications training should be given a higher priority in the medical training process in the United States. PMID:25856371
Soft errors in commercial off-the-shelf static random access memories
NASA Astrophysics Data System (ADS)
Dilillo, L.; Tsiligiannis, G.; Gupta, V.; Bosser, A.; Saigne, F.; Wrobel, F.
2017-01-01
This article reviews state-of-the-art techniques for the evaluation of the effect of radiation on static random access memory (SRAM). We detail irradiation test techniques and report results from irradiation experiments with several types of particles. Two commercial SRAMs, in 90 and 65 nm technology nodes, were considered as case studies. Besides the basic static and dynamic test modes, advanced stimuli for the irradiation tests were introduced, as well as statistical post-processing techniques allowing for deeper analysis of the correlations between bit-flip cross-sections and design/architectural characteristics of the memory device. Further insight is provided on the response of irradiated stacked-layer devices and on the use of characterized SRAM devices as particle detectors.
Using SERVQUAL and Kano research techniques in a patient service quality survey.
Christoglou, Konstantinos; Vassiliadis, Chris; Sigalas, Ioakim
2006-01-01
This article presents the results of a service quality study. After an introduction to the SERVQUAL and Kano research techniques, a Kano analysis of 75 patients from the General Hospital of Katerini in Greece is presented. Service quality was assessed using satisfaction and dissatisfaction indices. The results of the Kano statistical analysis strengthened the hypothesis of previous research regarding the importance of personal knowledge, the courtesy of the hospital employees, and their ability to convey trust and confidence (the assurance dimension). Managerial suggestions are made regarding the best way of acting toward and approaching hospital patients based on the basic SERVQUAL model.
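The satisfaction and dissatisfaction indices mentioned above are commonly computed as the "better/worse" coefficients of Berger et al.; the sketch below uses that standard formula with hypothetical response counts, not the Katerini data:

```python
# Minimal sketch: Kano satisfaction (SI) and dissatisfaction (DI) indices
# from counts of attractive (A), one-dimensional (O), must-be (M) and
# indifferent (I) classifications for one service attribute.
def kano_indices(attractive, one_dimensional, must_be, indifferent):
    total = attractive + one_dimensional + must_be + indifferent
    si = (attractive + one_dimensional) / total    # "better" coefficient
    di = -(one_dimensional + must_be) / total      # "worse" coefficient
    return si, di

si, di = kano_indices(attractive=20, one_dimensional=30, must_be=18, indifferent=7)
print(f"SI = {si:.2f}, DI = {di:.2f}")
```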
Statistics for wildlifers: how much and what kind?
Johnson, D.H.; Shaffer, T.L.; Newton, W.E.
2001-01-01
Quantitative methods are playing increasingly important roles in wildlife ecology and, ultimately, management. This change poses a challenge for wildlife practitioners and students who are not well-educated in mathematics and statistics. Here we give our opinions on what wildlife biologists should know about statistics, while recognizing that not everyone is inclined mathematically. For those who are, we recommend that they take mathematics coursework at least through calculus and linear algebra. They should take statistics courses that are focused conceptually, stressing the Why rather than the How of doing statistics. For less mathematically oriented wildlifers, introductory classes in statistical techniques will furnish some useful background in basic methods but may provide little appreciation of when the methods are appropriate. These wildlifers will have to rely much more on advice from statisticians. Far more important than knowing how to analyze data is an understanding of how to obtain and recognize good data. Regardless of the statistical education they receive, all wildlife biologists should appreciate the importance of controls, replication, and randomization in studies they conduct. Understanding these concepts requires little mathematical sophistication, but is critical to advancing the science of wildlife ecology.
Applied learning-based color tone mapping for face recognition in video surveillance system
NASA Astrophysics Data System (ADS)
Yew, Chuu Tian; Suandi, Shahrel Azmin
2012-04-01
In this paper, we present an applied learning-based color tone mapping technique for video surveillance systems. This technique can be applied to both color and grayscale surveillance images. The basic idea is to learn the color or intensity statistics from a training dataset of photorealistic images of the candidates appearing in the surveillance images, and to remap the color or intensity of the input image so that its color or intensity statistics match those of the training dataset. It is well known that differences in commercial surveillance camera models and in the signal-processing chipsets used by different manufacturers cause the color and intensity of the images to differ from one another, creating additional challenges for face recognition in video surveillance systems. Using multi-class support vector machines as the classifier on a publicly available video surveillance camera database, the SCface database, this approach is validated and compared to the results of using a holistic approach on grayscale images. The results show that this technique is suitable for improving the color or intensity quality of video surveillance systems for face recognition.
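The core remapping idea can be illustrated with a simple per-channel mean/variance match (Reinhard-style statistics transfer); the paper's learning-based mapping is more elaborate, so treat this only as a sketch of the principle:

```python
# Minimal sketch: remap an image so its per-channel statistics match a
# reference set (H x W x 3 arrays, 0-255 value range assumed).
import numpy as np

def match_statistics(image, reference):
    out = image.astype(np.float64).copy()
    ref = reference.astype(np.float64)
    for c in range(out.shape[-1]):
        mu_i, sd_i = out[..., c].mean(), out[..., c].std()
        mu_r, sd_r = ref[..., c].mean(), ref[..., c].std()
        out[..., c] = (out[..., c] - mu_i) * (sd_r / (sd_i + 1e-12)) + mu_r
    return np.clip(out, 0, 255)
```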
NASA standard: Trend analysis techniques
NASA Technical Reports Server (NTRS)
1990-01-01
Descriptive and analytical techniques for NASA trend analysis applications are presented in this standard. Trend analysis is applicable in all organizational elements of NASA connected with, or supporting, developmental/operational programs. This document should be consulted for any data analysis activity requiring the identification or interpretation of trends. Trend analysis is neither a precise term nor a circumscribed methodology: it generally connotes quantitative analysis of time-series data. For NASA activities, the appropriate and applicable techniques include descriptive and graphical statistics, and the fitting or modeling of data by linear, quadratic, and exponential models. Usually, but not always, the data is time-series in nature. Concepts such as autocorrelation and techniques such as Box-Jenkins time-series analysis would only rarely apply and are not included in this document. The basic ideas needed for qualitative and quantitative assessment of trends along with relevant examples are presented.
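To make the model family concrete, the sketch below fits the three trend models the standard names (linear, quadratic, exponential) to a hypothetical time series; the exponential model is fitted in log space, which assumes positive data. This is an illustration of the model family, not code from the standard itself:

```python
# Minimal sketch: fit linear, quadratic, and exponential trend models.
import numpy as np

t = np.arange(20, dtype=float)
y = 5.0 * np.exp(0.08 * t) + np.random.normal(0.0, 0.3, t.size)  # hypothetical

linear = np.polyfit(t, y, 1)              # y ~ a*t + b
quadratic = np.polyfit(t, y, 2)           # y ~ a*t^2 + b*t + c
k, log_A = np.polyfit(t, np.log(y), 1)    # log y ~ k*t + log A
exponential = (np.exp(log_A), k)          # y ~ A * exp(k*t)
print(linear, quadratic, exponential)
```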
Stellar photometry with the Wide Field/Planetary Camera of the Hubble Space Telescope
NASA Astrophysics Data System (ADS)
Holtzman, Jon A.
1990-07-01
Simulations of Wide Field/Planetary Camera (WF/PC) images are analyzed in order to discover the most effective techniques for stellar photometry and to evaluate the accuracy and limitations of these techniques. The capabilities and operation of the WF/PC and the simulations employed in the study are described. The basic techniques of stellar photometry and methods to improve these techniques for the WF/PC are discussed. The correct parameters for star detection, aperture photometry, and point-spread function (PSF) fitting with the DAOPHOT software of Stetson (1987) are determined. Consideration is given to undersampling of the stellar images by the detector; variations in the PSF; and the crowding of the stellar images. It is noted that, with some changes, DAOPHOT is able to generate photometry almost at the level set by photon statistics.
Fundamentals of nuclear medicine
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alazraki, N.P.; Mishkin, F.S.
1988-01-01
The book begins with basic science and statistics relevant to nuclear medicine, and specific organ systems are addressed in separate chapters. A section of the text also covers imaging of groups of disease processes (eg, trauma, cancer). The authors present a comparison between nuclear medicine techniques and other diagnostic imaging studies. A table is given which comments on sensitivities and specificities of common nuclear medicine studies. The sensitivities and specificities are categorized as very high, high, moderate, and so forth.
Teaching Computational Geophysics Classes using Active Learning Techniques
NASA Astrophysics Data System (ADS)
Keers, H.; Rondenay, S.; Harlap, Y.; Nordmo, I.
2016-12-01
We give an overview of our experience in teaching two computational geophysics classes at the undergraduate level. The first class is, for most students, their first programming class, and it assumes that the students have had an introductory course in geophysics. In this class the students are introduced to basic Matlab skills: use of variables, basic array and matrix definition and manipulation, basic statistics, 1D integration, plotting of lines and surfaces, writing .m files, and basic debugging techniques. All of these concepts are applied to elementary but important problems in earthquake and exploration geophysics, including epicentre location, computation of travel-time curves for simple layered media, and plotting of 1D and 2D velocity models. It is important to integrate the geophysics with the programming concepts: we found that this enhances students' understanding. Moreover, as this is a 3-year Bachelor program and the class is taught in the 2nd semester, there is little time for a class that focuses only on programming. The second class, which is optional and can be taken in the 4th or 6th semester but is often also taken by Master students, extends the Matlab programming to include signal processing and ordinary and partial differential equations, again with emphasis on geophysics (such as ray tracing and solving the acoustic wave equation). This class also contains a project in which the students write a brief paper on a topic in computational geophysics, preferably with programming examples. When teaching these classes we found that active learning techniques, in which the students actively participate in the class, individually, in pairs, or in groups, are indispensable. We give a brief overview of the various activities that we have developed when teaching these classes.
Space shuttle solid rocket booster recovery system definition, volume 1
NASA Technical Reports Server (NTRS)
1973-01-01
The performance requirements, preliminary designs, and development program plans for an airborne recovery system for the space shuttle solid rocket booster are discussed. The analyses performed during the study phase of the program are presented. The basic considerations which established the system configuration are defined. A Monte Carlo statistical technique using random sampling of the probability distributions for the critical water impact parameters was used to determine the failure probability of each solid rocket booster component as a function of impact velocity and component strength capability.
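The Monte Carlo step described above can be sketched directly: sample impact velocity and component strength from their distributions and count exceedances. The distributions and numbers below are hypothetical placeholders, not the study's values:

```python
# Minimal sketch: Monte Carlo failure probability for one booster component.
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
impact_velocity = rng.normal(25.0, 3.0, n)   # m/s, assumed distribution
strength_limit = rng.normal(30.0, 2.5, n)    # capacity in equivalent velocity

p_fail = np.mean(impact_velocity > strength_limit)  # fraction of failures
print(f"estimated failure probability: {p_fail:.4f}")
```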
Compressing random microstructures via stochastic Wang tilings.
Novák, Jan; Kučerová, Anna; Zeman, Jan
2012-10-01
This Rapid Communication presents a stochastic Wang tiling-based technique to compress or reconstruct disordered microstructures on the basis of given spatial statistics. Unlike existing approaches based on a single unit cell, it utilizes a finite set of tiles assembled by a stochastic tiling algorithm, thereby making it possible to reproduce long-range orientation orders accurately and in a computationally efficient manner. Although the basic features of the method are demonstrated for a two-dimensional particulate suspension, the present framework is fully extensible to generic multidimensional media.
NASA standard: Trend analysis techniques
NASA Technical Reports Server (NTRS)
1988-01-01
This Standard presents descriptive and analytical techniques for NASA trend analysis applications. Trend analysis is applicable in all organizational elements of NASA connected with, or supporting, developmental/operational programs. Use of this Standard is not mandatory; however, it should be consulted for any data analysis activity requiring the identification or interpretation of trends. Trend Analysis is neither a precise term nor a circumscribed methodology, but rather connotes, generally, quantitative analysis of time-series data. For NASA activities, the appropriate and applicable techniques include descriptive and graphical statistics, and the fitting or modeling of data by linear, quadratic, and exponential models. Usually, but not always, the data is time-series in nature. Concepts such as autocorrelation and techniques such as Box-Jenkins time-series analysis would only rarely apply and are not included in this Standard. The document presents the basic ideas needed for qualitative and quantitative assessment of trends, together with relevant examples. A list of references provides additional sources of information.
Teaching Basic Probability in Undergraduate Statistics or Management Science Courses
ERIC Educational Resources Information Center
Naidu, Jaideep T.; Sanford, John F.
2017-01-01
Standard textbooks in core Statistics and Management Science classes present various examples to introduce basic probability concepts to undergraduate business students. These include tossing of a coin, throwing a die, and examples of that nature. While these are good examples to introduce basic probability, we use improvised versions of Russian…
On prognostic models, artificial intelligence and censored observations.
Anand, S S; Hamilton, P W; Hughes, J G; Bell, D A
2001-03-01
The development of prognostic models for assisting medical practitioners with decision making is not a trivial task. Models need to possess a number of desirable characteristics, and few, if any, current modelling approaches based on statistics or artificial intelligence can produce models that display all these characteristics. The inability of modelling techniques to provide truly useful models has led to interest in these models being purely academic in nature. This in turn has resulted in only a very small percentage of the models that have been developed being deployed in practice. On the other hand, new modelling paradigms are being proposed continuously within the machine learning and statistical communities, and claims, often based on inadequate evaluation, are made about their superiority over traditional modelling methods. We believe that for new modelling approaches to deliver true net benefits over traditional techniques, an evaluation-centric approach to their development is essential. In this paper we present such an evaluation-centric approach to developing extensions to the basic k-nearest neighbour (k-NN) paradigm. We use standard statistical techniques to enhance the distance metric used and a framework based on evidence theory to obtain a prediction for the target example from the outcomes of the retrieved exemplars. We refer to this new k-NN algorithm as Censored k-NN (Ck-NN). This reflects the enhancements made to k-NN that are aimed at providing a means for handling censored observations within k-NN.
Specialized data analysis of SSME and advanced propulsion system vibration measurements
NASA Technical Reports Server (NTRS)
Coffin, Thomas; Swanson, Wayne L.; Jong, Yen-Yi
1993-01-01
The basic objectives of this contract were to perform detailed analysis and evaluation of dynamic data obtained during Space Shuttle Main Engine (SSME) test and flight operations, including analytical/statistical assessment of component dynamic performance, and to continue the development and implementation of analytical/statistical models to effectively define nominal component dynamic characteristics, detect anomalous behavior, and assess machinery operational conditions. This study was to provide timely assessment of engine component operational status, identify probable causes of malfunction, and define feasible engineering solutions. The work was performed under three broad tasks: (1) Analysis, Evaluation, and Documentation of SSME Dynamic Test Results; (2) Data Base and Analytical Model Development and Application; and (3) Development and Application of Vibration Signature Analysis Techniques.
Morphometrical study on senile larynx.
Zieliński, R
2001-01-01
The aim of the study was a morphometrical macroscopic evaluation of senile larynges with respect to its usefulness for ORL diagnostic and operative methods. Larynx preparations were taken from cadavers of both sexes, aged 65 and over, about 24 hours after death. Clinically important laryngeal diameters were collected using common morphometrical methods. A few body features were also gathered. Computer statistical methods were used in the data assessment, including basic statistics and linear correlations between diameters and between diameters and body features. The data presented in the study may be very helpful in the evaluation of diagnostic methods. It may also help in the selection of the right sizes of operative tools, the choice of the most appropriate operative technique, preoperative preparation, and the design and building of virtual and plastic models for physicians' training.
Treated cabin acoustic prediction using statistical energy analysis
NASA Technical Reports Server (NTRS)
Yoerkie, Charles A.; Ingraham, Steven T.; Moore, James A.
1987-01-01
The application of statistical energy analysis (SEA) to the modeling and design of helicopter cabin interior noise control treatment is demonstrated. The information presented here is obtained from work sponsored at NASA Langley for the development of analytic modeling techniques and the basic understanding of cabin noise. Utility and executive interior models are developed directly from existing S-76 aircraft designs. The relative importance of panel transmission loss (TL), acoustic leakage, and absorption to the control of cabin noise is shown using the SEA modeling parameters. It is shown that the major cabin noise improvement below 1000 Hz comes from increased panel TL, while above 1000 Hz it comes from reduced acoustic leakage and increased absorption in the cabin and overhead cavities.
Non-Markovian near-infrared Q branch of HCl diluted in liquid Ar.
Padilla, Antonio; Pérez, Justo
2013-08-28
By using a non-Markovian spectral theory based on the Kubo cumulant expansion technique, we have qualitatively studied the infrared Q branch observed in the fundamental absorption band of HCl diluted in liquid Ar. The statistical parameters of the anisotropic interaction present in this spectral theory were calculated by means of molecular dynamics techniques, and the values of the anisotropic correlation times were found to be significantly greater (by a factor of two) than those previously obtained by fitting procedures or microscopic cell models. This fact is decisive for the observation in the theoretical spectral band of a central Q resonance, which is absent in the many previous studies carried out with the usual theories based on Kubo cumulant expansion techniques. Although the theory used in this work only allows a qualitative study of the Q branch, we can employ it to study the unknown characteristics of the Q resonance, which are difficult to obtain with the recently developed quantum simulation techniques. For example, in this study we have found that the Q branch is basically a non-Markovian (or memory) effect produced by spectral line interferences, where the PR interferential profile basically determines the Q branch spectral shape. Furthermore, we have found that the Q resonance is principally generated by the first rotational states of the first two vibrational levels, those most affected by the action of the solvent.
A Comparison of Techniques for Camera Selection and Hand-Off in a Video Network
NASA Astrophysics Data System (ADS)
Li, Yiming; Bhanu, Bir
Video networks are becoming increasingly important for solving many real-world problems. Multiple video sensors require collaboration when performing various tasks. One of the most basic tasks is the tracking of objects, which requires mechanisms to select a camera for a certain object and to hand this object off from one camera to another so as to accomplish seamless tracking. In this chapter, we provide a comprehensive comparison of current and emerging camera selection and hand-off techniques. We consider geometry-, statistics-, and game theory-based approaches and provide both theoretical and experimental comparisons using centralized and distributed computational models. We provide simulation and experimental results using real data for various scenarios with a large number of cameras and objects for in-depth understanding of the strengths and weaknesses of these techniques.
de Castro, Alessandra Maia; de Oliveira, Fabiana Sodré; de Paiva Novaes, Myrian Stella; Araújo Ferreira, Danielly Cunha
2013-01-01
This study compared parental acceptance of pediatric behavior guidance techniques (BGT). Forty parents of children without disabilities (Group A) and another 40 parents of children with disabilities (Group B) were selected. Each BGT was explained by a single examiner and presented together with a photograph album. Parents then rated their acceptance as totally unacceptable, somewhat acceptable, acceptable, or totally acceptable. Results indicated that in Group A, the BGT based on communicative guidance were accepted by most participants. In Group B, just one mother considered the voice control method totally unacceptable, and another two rated tell-show-do the same way. For both groups, general anesthesia was the least accepted BGT. There was a statistically significant difference in acceptance for protective stabilization with a restrictive device in Group B. Parents of children both with and without disabilities accepted behavior guidance techniques, but basic techniques showed higher rates of acceptance than advanced techniques. ©2013 Special Care Dentistry Association and Wiley Periodicals, Inc.
Network meta-analysis: a technique to gather evidence from direct and indirect comparisons
2017-01-01
Systematic reviews and pairwise meta-analyses of randomized controlled trials, at the intersection of clinical medicine, epidemiology, and statistics, are positioned at the top of the evidence-based practice hierarchy. They are important tools for supporting drug approval, formulating clinical protocols and guidelines, and decision-making. However, this traditional technique only partially yields the information that clinicians, patients, and policy-makers need to make informed decisions, since it usually compares only two interventions at a time. Regardless of the clinical condition under evaluation, many interventions are usually available on the market, and few of them have been studied in head-to-head trials. This scenario precludes conclusions from being drawn about the comparative profiles (e.g. efficacy and safety) of all interventions. The recent development and introduction of a new technique, usually referred to as network meta-analysis, indirect meta-analysis, or multiple or mixed treatment comparisons, has allowed the estimation of metrics for all possible comparisons in the same model, simultaneously gathering direct and indirect evidence. Over the last years this statistical tool has matured as a technique, with models available for all types of raw data, producing different pooled effect measures, using both Frequentist and Bayesian frameworks, with different software packages. However, the conduct, reporting, and interpretation of network meta-analyses still pose multiple challenges that should be carefully considered, especially because this technique inherits all assumptions from pairwise meta-analysis but with increased complexity. Thus, we aim to provide a basic explanation of how a network meta-analysis is conducted, highlighting its risks and benefits for evidence-based practice, including information on the evolution of the statistical methods, assumptions, and steps for performing the analysis. PMID:28503228
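The simplest building block of the indirect evidence described above is the anchored (Bucher) comparison: two treatments A and C that were each compared with a common comparator B can be compared indirectly, with the variances adding. The numbers below are hypothetical:

```python
# Minimal sketch: Bucher indirect comparison on, e.g., a log odds ratio scale.
import math

d_ab, se_ab = -0.40, 0.15   # A vs B estimate and standard error
d_cb, se_cb = -0.10, 0.20   # C vs B estimate and standard error

d_ac = d_ab - d_cb                        # indirect A vs C effect
se_ac = math.sqrt(se_ab**2 + se_cb**2)    # variances of independent trials add
ci = (d_ac - 1.96 * se_ac, d_ac + 1.96 * se_ac)
print(d_ac, se_ac, ci)
```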
Nuvvula, S; Alahari, S; Kamatham, R; Challa, R R
2015-02-01
To determine the effect of three-dimensional (3D) audiovisual (AV) distraction in reducing the dental anxiety of children. A randomised clinical trial with a parallel design was carried out on 90 children (49 boys and 41 girls) aged between 7 and 10 years (mean age 8.4 years) to ascertain the comparative efficacy of audio (music) and AV (3D video glasses) distraction in reducing the dental anxiety of children during local analgesia (LA) administration. The 90 children were randomly divided into three groups: control (basic behaviour guidance techniques without distraction), audio (basic techniques plus music), and AV (basic techniques plus 3D AV) distraction groups. All the children experienced LA administration with or without distraction, and anxiety was assessed using a combination of measures: MCDAS(f) (self-report), pulse rate (physiological), behaviour (using Wright's modification of the Frankl behaviour rating scale and the Houpt scale), and the children's preferences. All 90 children completed the study. A highly significant reduction in the anxiety of the audiovisual group was observed, as shown by the MCDAS(f) values (p<0.001) and the Houpt scale (p=0.003), whereas pulse rate showed a statistically significant increase (p<0.001) in all three groups irrespective of distraction. The children's preferences also supported the use of the 3D video glasses. LA administration with music or 3D video glasses distraction had an added advantage in a majority of children, with the 3D video glasses being superior to music. High levels of satisfaction were also observed in children who experienced treatment with the 3D video glasses.
Advanced Bode Plot Techniques for Ultrasonic Transducers
NASA Astrophysics Data System (ADS)
DeAngelis, D. A.; Schulze, G. W.
The Bode plot, displayed as either impedance or admittance versus frequency, is the most basic test used by ultrasonic transducer designers. With simplicity and ease of use, Bode plots are ideal for baseline comparisons such as spacing of parasitic modes or impedance, but quite often the subtleties that manifest as poor process control are hard to interpret or are nonexistent. In-process testing of transducers is time-consuming for quantifying statistical aberrations, and assessments made indirectly via the workpiece are difficult. This research investigates the use of advanced Bode plot techniques to compare ultrasonic transducers with known "good" and known "bad" process performance, with the goal of a priori process assessment. These advanced techniques expand from the basic constant-voltage frequency sweep to include constant-current and constant-velocity sweeps interrogated locally on the transducer or tool; they also include up and down directional frequency sweeps to quantify hysteresis effects such as jumping and dropping phenomena. The investigation focuses solely on the common PZT8 piezoelectric material used in welding transducers for semiconductor wire bonding. Several metrics are investigated, such as impedance, displacement/current gain, velocity/current gain, displacement/voltage gain, and velocity/voltage gain. The experimental and theoretical research methods include Bode plots, admittance loops, laser vibrometry, and coupled-field finite element analysis.
Investigation of advanced phase-shifting projected fringe profilometry techniques
NASA Astrophysics Data System (ADS)
Liu, Hongyu
1999-11-01
The phase-shifting projected fringe profilometry (PSPFP) technique is a powerful tool in the profile measurement of rough engineering surfaces. Compared with other competing techniques, this technique is notable for its full-field measurement capability, system simplicity, high measurement speed, and low environmental vulnerability. The main purpose of this dissertation is to tackle three important problems, which severely limit the capability and the accuracy of the PSPFP technique, with some new approaches. Chapter 1 briefly introduces background information on the PSPFP technique, including the measurement principles, basic features, and related techniques; the objectives and organization of the thesis are also outlined. Chapter 2 gives a theoretical treatment of absolute PSPFP measurement. The mathematical formulations and basic requirements of absolute PSPFP measurement and its supporting techniques are discussed in detail. Chapter 3 introduces the experimental verification of the proposed absolute PSPFP technique. Some design details of a prototype system are discussed as supplements to the previous theoretical analysis. Various fundamental experiments performed for concept verification and accuracy evaluation are introduced together with some brief comments. Chapter 4 presents a theoretical study of speckle-induced phase measurement errors. In this analysis, the expression for speckle-induced phase errors is first derived based on the multiplicative noise model of image-plane speckles. The statistics and the system dependence of speckle-induced phase errors are then thoroughly studied through numerical simulations and analytical derivations. Based on the analysis, some suggestions on the system design are given to improve measurement accuracy. Chapter 5 discusses a new technique for combating surface reflectivity variations. The formula used for error compensation is first derived based on a simplified model of the detection process. The techniques coping with two major effects of surface reflectivity variations are then introduced. Some fundamental problems in the proposed technique are studied through simulations. Chapter 6 briefly summarizes the major contributions of the current work and provides some suggestions for future research.
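The basic step that all of this builds on is the phase-shifting calculation itself; for the common four-step variant with 90-degree shifts, the wrapped phase follows from a closed-form arctangent. The sketch below shows that textbook formula (the dissertation's absolute-measurement and error-compensation techniques operate on top of it):

```python
# Minimal sketch: four-step phase-shifting. With fringe images
# I_k = A + B*cos(phi + (k-1)*pi/2), k = 1..4, the wrapped phase is recovered
# per pixel from two differences.
import numpy as np

def wrapped_phase(i1, i2, i3, i4):
    # I4 - I2 = 2B sin(phi), I1 - I3 = 2B cos(phi)
    return np.arctan2(i4 - i2, i1 - i3)
```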
Detailed gravity anomalies from GEOS-3 satellite altimetry data
NASA Technical Reports Server (NTRS)
Gopalapillai, G. S.; Mourad, A. G.
1978-01-01
A technique for deriving mean gravity anomalies from dense altimetry data was developed. A combination of both deterministic and statistical techniques was used. The basic mathematical model was based on the Stokes equation, which describes the analytical relationship between mean gravity anomalies and the geoid undulation at a point; this undulation is a linear function of the altimetry data at that point. The overdetermined problem resulting from the excess of available altimetry data was solved using least-squares principles. These principles enable the simultaneous estimation of the associated standard deviations reflecting the internal consistency, based on the accuracy estimates provided for the altimetry data as well as for the terrestrial anomaly data. Several test computations were made of the anomalies and their accuracy estimates using GEOS-3 data.
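For reference, the textbook form of the Stokes integral underlying the model is shown below (standard geodesy notation, not the report's own equations):

```latex
% Geoid undulation N from mean gravity anomalies \Delta g via Stokes' integral:
% R is the mean Earth radius, \gamma normal gravity, S(\psi) the Stokes kernel
% of spherical distance \psi, integrated over the unit sphere \sigma.
N = \frac{R}{4\pi\gamma} \iint_{\sigma} \Delta g \, S(\psi) \, d\sigma
```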
ERIC Educational Resources Information Center
Primi, Caterina; Donati, Maria Anna; Chiesi, Francesca
2016-01-01
Among the wide range of factors related to the acquisition of statistical knowledge, competence in basic mathematics, including basic probability, has received much attention. In this study, a mediation model was estimated to derive the total, direct, and indirect effects of mathematical competence on statistics achievement taking into account…
Garg, Rakesh
2016-09-01
The conduct of research requires a systematic approach involving diligent planning and its execution as planned. It comprises various essential predefined components such as aims, population, conduct/technique, outcome and statistical considerations. These need to be objective, reliable and in a repeatable format. Hence, the understanding of the basic aspects of methodology is essential for any researcher. This is a narrative review and focuses on various aspects of the methodology for conduct of a clinical research. The relevant keywords were used for literature search from various databases and from bibliographies of the articles.
James, Peter; Marko-Varga, György A
2011-08-05
One of the most critical functions of the various Proteomics organizations is the training of young scientists and the dissemination of information to the general scientific community. The education committees of the Human Proteome Organisation (HUPO) and the European Proteomics Association (EuPA) together with the other local proteomics associations are therefore launching a joint Tutorial Program to meet these needs. The level is aimed at Masters/PhD level students with good basic training in biology, biochemistry, mathematics and statistics. The Tutorials will consist of a review/teaching article with an accompanying talk slide presentation for classroom teaching. The Tutorial Program will cover core techniques and basics as an introduction to scientists new to the field. The entire series of articles and slides will be made freely available for teaching use at the Journals and Organizations homepages.
Spriestersbach, Albert; Röhrig, Bernd; du Prel, Jean-Baptist; Gerhold-Ay, Aslihan; Blettner, Maria
2009-09-01
Descriptive statistics are an essential part of biometric analysis and a prerequisite for the understanding of further statistical evaluations, including the drawing of inferences. When data are well presented, it is usually obvious whether the author has collected and evaluated them correctly and in keeping with accepted practice in the field. Statistical variables in medicine may be of either the metric (continuous, quantitative) or categorical (nominal, ordinal) type. Easily understandable examples are given. Basic techniques for the statistical description of collected data are presented and illustrated with examples. The goal of a scientific study must always be clearly defined. The definition of the target value or clinical endpoint determines the level of measurement of the variables in question. Nearly all variables, whatever their level of measurement, can be usefully presented graphically and numerically. The level of measurement determines what types of diagrams and statistical values are appropriate. There are also different ways of presenting combinations of two independent variables graphically and numerically. The description of collected data is indispensable. If the data are of good quality, valid and important conclusions can already be drawn when they are properly described. Furthermore, data description provides a basis for inferential statistics.
Gelau, Christhard; Henning, Matthias J; Krems, Josef F
2009-03-01
In recent years considerable effort has been spent on the development of the occlusion technique as a procedure for assessing the human-machine interface of in-vehicle information and communication systems (IVIS) designed to be used by the driver while driving. The importance and significance of the findings resulting from the application of this procedure depend essentially on its reliability. Because there is a lack of evidence as to whether this basic criterion of measurement is met by this procedure, and because questionable reliability can lead to doubts about its validity, our project strove to clarify this issue. This paper reports on a statistical reanalysis of data obtained from previous experiments. To summarise, the characteristic values found for internal consistency were almost all in the range of .90 for the occlusion technique, which can be considered satisfactory.
Double-row vs single-row rotator cuff repair: a review of the biomechanical evidence.
Wall, Lindley B; Keener, Jay D; Brophy, Robert H
2009-01-01
A review of the current literature will show a difference between the biomechanical properties of double-row and single-row rotator cuff repairs. Rotator cuff tears commonly necessitate surgical repair; however, the optimal technique for repair continues to be investigated. Recently, double-row repairs have been considered an alternative to single-row repair, allowing a greater coverage area for healing and a possibly stronger repair. We reviewed the literature of all biomechanical studies comparing double-row vs single-row repair techniques. Inclusion criteria were studies using cadaveric, animal, or human models that directly compared double-row vs single-row repair techniques, written in the English language, and published in peer-reviewed journals. Identified articles were reviewed to provide a comprehensive conclusion on the biomechanical strength and integrity of the repair techniques. Fifteen studies were identified and reviewed. Nine studies showed a statistically significant advantage for double-row repair with regard to biomechanical strength, failure, and gap formation. Three studies produced results that did not show any statistical advantage. Five studies that directly compared footprint reconstruction all demonstrated that double-row repair was superior to single-row repair in restoring anatomy. The current literature reveals that the biomechanical properties of a double-row rotator cuff repair are superior to those of a single-row repair. Basic science study: single-row vs double-row rotator cuff repair.
Techniques for generation of control and guidance signals derived from optical fields, part 2
NASA Technical Reports Server (NTRS)
Hemami, H.; Mcghee, R. B.; Gardner, S. R.
1971-01-01
The development is reported of a high resolution technique for the detection and identification of landmarks from spacecraft optical fields. By making use of nonlinear regression analysis, a method is presented whereby a sequence of synthetic images produced by a digital computer can be automatically adjusted to provide a least squares approximation to a real image. The convergence of the method is demonstrated by means of a computer simulation for both elliptical and rectangular patterns. Statistical simulation studies with elliptical and rectangular patterns show that the computational techniques developed are able to at least match human pattern recognition capabilities, even in the presence of large amounts of noise. Unlike most pattern recognition techniques, this ability is unaffected by arbitrary pattern rotation, translation, and scale change. Further development of the basic approach may eventually allow a spacecraft or robot vehicle to be provided with an ability to very accurately determine its spatial relationship to arbitrary known objects within its optical field of view.
NASA Astrophysics Data System (ADS)
Svirina, Anna; Shindor, Olga; Tatmyshevsky, Konstantin
2014-12-01
The paper deals with the main problems of Russian energy system development, which make it necessary to provide educational programs in the field of renewable and alternative energy. The process of curriculum development and the definition of teaching techniques on the basis of expert opinion evaluation are described, and a competence model for master's students in renewable and alternative energy processing is suggested. Data for statistical analysis were obtained from a distributed questionnaire and in-depth interviews. On the basis of these data, the curriculum structure was optimized, and three models for optimizing the structure of teaching techniques were developed. The suggested educational program structure, which was adopted by employers, is presented in the paper. The findings include a quantitative estimate of the importance of systemic thinking and of professional skills and knowledge as basic competences of a master's program graduate; a statistical demonstration of the need for a practice-based learning approach; and optimization models for structuring curricula in renewable and alternative energy processing. These findings provide a platform for the development of educational programs.
Analysis of Acoustic Emission Parameters from Corrosion of AST Bottom Plate in Field Testing
NASA Astrophysics Data System (ADS)
Jomdecha, C.; Jirarungsatian, C.; Suwansin, W.
Field testing of aboveground storage tanks (AST) to monitor corrosion of the bottom plate is presented in this chapter. AE testing data from ten ASTs of different sizes, materials, and products were employed to monitor the bottom plate condition. AE sensors at 30 and 150 kHz, on up to 24 channels including guard sensors, were used to monitor corrosion activity. Acoustic emission (AE) parameters were analyzed to explore the AE parameter patterns of ongoing corrosion in comparison with laboratory results. Amplitude, count, duration, and energy were the main parameters of the analysis. A pattern recognition technique combined with statistical analysis was implemented to eliminate electrical and environmental noise. The results showed specific AE patterns of corrosion activity consistent with the empirical results. In addition, a planar location algorithm was utilized to locate the significant AE events from corrosion. Both the parameter patterns and the AE event locations can be used to interpret and locate corrosion activity. Finally, a basic statistical grading technique was used to evaluate the bottom plate condition of the AST.
Importance of nasal clipping in screening investigations of flow volume curve.
Yanev, I
1992-01-01
Comparative analysis of some basic lung indices obtained from a screening investigation of the flow volume curve by using two techniques, with a nose clip and without a nose clip, was made on a cohort of 86 workers in a factory shop for the production of bearings. We found no statistically significant differences between the indices obtained by the two techniques. Our study showed that the FVC and FEV1 obtained in workers without using nose clips were equal to or better than those obtained using nose clips in 60% of the workers. The reproducibility of the two methods was similar. The analysis of the data has shown that the flow volume curve investigation gives better results when performed without a nose clip, especially in industrial conditions.
Variational Bayesian Parameter Estimation Techniques for the General Linear Model
Starke, Ludger; Ostwald, Dirk
2017-01-01
Variational Bayes (VB), variational maximum likelihood (VML), restricted maximum likelihood (ReML), and maximum likelihood (ML) are cornerstone parametric statistical estimation techniques in the analysis of functional neuroimaging data. However, the theoretical underpinnings of these model parameter estimation techniques are rarely covered in introductory statistical texts. Because of the widespread practical use of VB, VML, ReML, and ML in the neuroimaging community, we reasoned that a theoretical treatment of their relationships and their application in a basic modeling scenario may be helpful for both neuroimaging novices and practitioners alike. In this technical study, we thus revisit the conceptual and formal underpinnings of VB, VML, ReML, and ML and provide a detailed account of their mathematical relationships and implementational details. We further apply VB, VML, ReML, and ML to the general linear model (GLM) with non-spherical error covariance as commonly encountered in the first-level analysis of fMRI data. To this end, we explicitly derive the corresponding free energy objective functions and ensuing iterative algorithms. Finally, in the applied part of our study, we evaluate the parameter and model recovery properties of VB, VML, ReML, and ML, first in an exemplary setting and then in the analysis of experimental fMRI data acquired from a single participant under visual stimulation. PMID:28966572
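Of the four estimation techniques reviewed, the ML point estimate for the GLM with known non-spherical error covariance has a simple closed form (generalized least squares). The sketch below shows just that estimator under the assumption that the covariance V is given; the paper's VB/VML/ReML machinery, which estimates covariance parameters via free-energy objectives, is not reproduced here.

```python
# Minimal sketch: ML/GLS estimate for y = X @ beta + e, e ~ N(0, V), V known.
import numpy as np

def gls_beta(X, y, V):
    Vi = np.linalg.inv(V)
    # beta_hat = (X' V^-1 X)^-1 X' V^-1 y
    return np.linalg.solve(X.T @ Vi @ X, X.T @ Vi @ y)

# With V = identity this reduces to ordinary least squares.
rng = np.random.default_rng(3)
X = np.column_stack([np.ones(50), rng.standard_normal(50)])
y = X @ np.array([1.0, 2.0]) + rng.standard_normal(50)
print(gls_beta(X, y, np.eye(50)))
```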
Attitude of teaching faculty towards statistics at a medical university in Karachi, Pakistan.
Khan, Nazeer; Mumtaz, Yasmin
2009-01-01
Statistics is mainly used in biological research to verify clinicians' and researchers' findings and impressions, and it gives scientific validity to their inferences. In Pakistan, the educational curriculum is structured in such a way that students who are interested in entering the field of biological sciences do not study mathematics after grade 10. Therefore, owing to their fragile background in mathematical skills, Pakistani medical professionals feel that they do not have an adequate base for understanding the basic concepts of statistical techniques when they try to use them in their research or read a scientific article. The aim of the study was to assess the attitude of medical faculty towards statistics. A questionnaire containing 42 close-ended and 4 open-ended questions, related to attitudes towards and knowledge of statistics, was distributed among the teaching faculty of Dow University of Health Sciences (DUHS). One hundred and sixty-seven completed questionnaires were returned by 374 faculty members (response rate 44.7%). Forty-three percent of the respondents claimed that they had taken only an introductory-level statistics course; 63% strongly agreed that a good researcher must have some training in statistics; and 82% of the faculty were in favour (strongly agreed or agreed) of the view that statistics is really useful for research. Only 17% correctly stated that statistics is the science of uncertainty. Half of the respondents admitted to having problems writing the statistical section of an article. Sixty-four percent of the subjects indicated that the way statistics is taught is the main reason it is perceived as difficult. Fifty-three percent of the faculty indicated that co-authorship for a statistician should depend upon his or her contribution to the study. Gender did not show any significant difference in responses. However, senior faculty rated the importance of using statistics, and the difficulty of writing the results section of articles, higher than junior faculty did. The study showed a low level of knowledge but a high level of awareness of the use of statistical techniques in research, and it revealed a good level of motivation for further training.
Seo, Seongho; Kim, Su Jin; Lee, Dong Soo; Lee, Jae Sung
2014-10-01
Tracer kinetic modeling in dynamic positron emission tomography (PET) has been widely used to investigate the characteristic distribution patterns or dysfunctions of neuroreceptors in brain diseases. Its practical goal has progressed from regional data quantification to parametric mapping, which produces images of kinetic-model parameters by fully exploiting the spatiotemporal information in dynamic PET data. Graphical analysis (GA) is a major parametric mapping technique that is independent of any compartmental model configuration, robust to noise, and computationally efficient. In this paper, we provide an overview of recent advances in the parametric mapping of neuroreceptor binding based on GA methods. The associated basic concepts in tracer kinetic modeling are presented, including commonly used compartment models and the major parameters of interest. Technical details of GA approaches for reversible and irreversible radioligands are described, considering both plasma input and reference tissue input models. Their statistical properties are discussed in view of parametric imaging.
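Among the GA methods for reversible radioligands, the Logan plot with a reference-tissue input is one of the most widely used: after some time t*, plotting the running integral of the target curve (normalized by the target curve) against the analogous reference-tissue term becomes linear, with slope equal to the distribution volume ratio (DVR). The sketch below implements this simplified form (it omits the k2-prime correction term) on hypothetical time-activity curves:

```python
# Minimal sketch: reference-tissue Logan graphical analysis (simplified,
# no k2' term). t, ct (target TAC) and cref (reference TAC) share one grid.
import numpy as np

def logan_dvr(t, ct, cref, t_star):
    int_ct = np.cumsum(ct * np.gradient(t))    # running integral of target TAC
    int_cr = np.cumsum(cref * np.gradient(t))  # running integral of reference TAC
    m = t >= t_star                            # use only the linear late phase
    slope, intercept = np.polyfit(int_cr[m] / ct[m], int_ct[m] / ct[m], 1)
    return slope                               # slope estimates the DVR
```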
Modeling a maintenance simulation of the geosynchronous platform
NASA Technical Reports Server (NTRS)
Kleiner, A. F., Jr.
1980-01-01
A modeling technique used to conduct a simulation study comparing various maintenance routines for a space platform is discussed. A system model is described and illustrated, the basic concepts of a simulation pass are detailed, and sections on failures and maintenance are included. The operation of the system across time is best modeled by a discrete event approach with two basic events: failure and maintenance of the system. Each overall simulation run consists of introducing a particular model of the physical system, together with a maintenance policy, demand function, and mission lifetime. The system is then run through many passes, each pass corresponding to one mission, and the model is re-initialized before each pass. Statistics are compiled at the end of each pass, and after the last pass a report is printed. Items of interest typically include the time to first maintenance, total number of maintenance trips for each pass, average capability of the system, etc.
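A single pass of such a model reduces to alternating failure and maintenance events until the mission lifetime is reached. The sketch below is one hypothetical realization of that loop (exponential failure times, fixed maintenance delay), not the paper's platform model:

```python
# Minimal sketch: one discrete-event pass with failure/maintenance events.
import numpy as np

def one_pass(mission_life=10.0, mtbf=2.0, repair_delay=0.5, rng=None):
    if rng is None:
        rng = np.random.default_rng()
    t, trips, t_first = 0.0, 0, None
    while True:
        t += rng.exponential(mtbf)      # time of next failure event
        if t >= mission_life:
            return trips, t_first       # per-pass statistics
        if t_first is None:
            t_first = t                 # time to first maintenance
        t += repair_delay               # maintenance event restores the system
        trips += 1

# Many passes, re-initialized each time, as in the simulation runs described.
results = [one_pass(rng=np.random.default_rng(i)) for i in range(1000)]
```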
An uncertainty analysis of the flood-stage upstream from a bridge.
Sowiński, M
2006-01-01
The paper begins with the formulation of the problem in the form of a general performance function. Next the Latin hypercube sampling (LHS) technique, a modified version of the Monte Carlo method, is briefly described. The actual uncertainty analysis of the flood stage upstream from the bridge starts with a description of the hydraulic model. The model concept is based on the HEC-RAS model developed for subcritical flow under a bridge without piers, in which the energy equation is applied. The next section characterizes the basic variables, including a specification of their statistics (means and variances). Next the problem of correlated variables is discussed and assumptions concerning correlation among the basic variables are formulated. The analysis of results is based on LHS ranking lists obtained from the computer package UNCSAM. Results for two examples are given: one for independent and the other for correlated variables.
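The LHS step itself can be sketched compactly: each variable's unit interval is split into n equal strata, one draw is taken per stratum, and the strata are randomly paired across variables. The code below shows this basic construction (UNCSAM and the hydraulic model are not reproduced; uniform marginals stand in for the basic variables):

```python
# Minimal sketch: Latin hypercube sample of n points in d dimensions on [0,1]^d.
import numpy as np

def latin_hypercube(n, d, rng=None):
    if rng is None:
        rng = np.random.default_rng()
    # One uniform draw inside each of the n strata, per variable.
    u = (np.arange(n)[:, None] + rng.random((n, d))) / n
    for j in range(d):
        rng.shuffle(u[:, j])   # random pairing of strata across variables
    return u                   # map through each variable's inverse CDF next

samples = latin_hypercube(1000, 3)
```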
Interpretation of the results of statistical measurements. [search for basic probability model
NASA Technical Reports Server (NTRS)
Olshevskiy, V. V.
1973-01-01
For random processes, the calculated probability characteristic and the measured statistical estimate are used in a quality functional, which defines the difference between the two functions. Based on the assumption that the statistical measurement procedure is organized so that the parameters of a selected model are optimized, it is shown that the interpretation of experimental research is a search for a basic probability model.
Bruni, Aline Thaís; Velho, Jesus Antonio; Ferreira, Arthur Serra Lopes; Tasso, Maria Júlia; Ferrari, Raíssa Santos; Yoshida, Ricardo Luís; Dias, Marcos Salvador; Leite, Vitor Barbanti Pereira
2014-08-01
This study uses statistical techniques to evaluate reports on suicide scenes; it utilizes 80 reports from different locations in Brazil, randomly collected from both federal and state jurisdictions. We aimed to assess a heterogeneous group of cases in order to obtain an overall perspective on the problem. We evaluated variables regarding the characteristics of the crime scene, such as the traces that were detected (blood, instruments, and clothes), and we addressed the methodology employed by the experts. A qualitative approach using basic statistics revealed a wide variation in how the issue was addressed in the documents. We examined a quantitative approach involving an empirical equation, and we used multivariate procedures to validate the quantitative methodology proposed for this empirical equation. The methodology successfully identified the main differences in the information presented in the reports, showing that there is no standardized method of analyzing evidence. Copyright © 2014 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.
Defining the best quality-control systems by design and inspection.
Hinckley, C M
1997-05-01
Not all of the many approaches to quality control are equally effective. Nonconformities in laboratory testing are caused basically by excessive process variation and mistakes. Statistical quality control can effectively control process variation, but it cannot detect or prevent most mistakes. Because mistakes or blunders are frequently the dominant source of nonconformities, we conclude that statistical quality control by itself is not effective. I explore the 100% inspection methods essential for controlling mistakes. Unlike the inspection techniques that Deming described as ineffective, the new "source" inspection methods can detect mistakes and enable corrections before nonconformities are generated, achieving the highest degree of quality at a fraction of the cost of traditional methods. Key relationships between task complexity and nonconformity rates are also described, along with cultural changes that are essential for implementing the best quality-control practices.
Utilization of satellite data for inventorying prairie ponds and lakes
NASA Technical Reports Server (NTRS)
Work, E. A., Jr.; Gilmer, D. S.
1976-01-01
ERTS-1 data were used in mapping open surface water features in the glaciated prairies. Emphasis was placed on the recognition of these features based upon water's uniquely low radiance in a single near-infrared waveband. On the basis of these results, thematic maps and statistics relating to open surface water were obtained. In a related effort, the added information content of multiple spectral wavebands was used for discriminating surface water at a level of detail finer than the virtual resolution of the data. The basic theory of this technique and some preliminary results are described.
ERIC Educational Resources Information Center
Zetterqvist, Lena
2017-01-01
Researchers and teachers often recommend motivating exercises and use of mathematics or statistics software for the teaching of basic courses in probability and statistics. Our courses are given to large groups of engineering students at Lund Institute of Technology. We found that the mere existence of real-life data and technology in a course…
A basic introduction to statistics for the orthopaedic surgeon.
Bertrand, Catherine; Van Riet, Roger; Verstreken, Frederik; Michielsen, Jef
2012-02-01
Orthopaedic surgeons should review the orthopaedic literature in order to keep pace with the latest insights and practices. A good understanding of basic statistical principles is of crucial importance to the ability to read articles critically, to interpret results and to arrive at correct conclusions. This paper explains some of the key concepts in statistics, including hypothesis testing, Type I and Type II errors, testing of normality, sample size and p values.
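To make the Type I error concept named above concrete, this small simulation (not from the paper) repeatedly tests two samples drawn from the same population and shows the false-positive rate landing near the nominal 5%.

```python
# With a true null hypothesis and alpha = 0.05, about 5% of repeated
# two-sample t-tests reject by chance -- the Type I error rate.
import random
from statistics import mean, stdev
from math import sqrt

def t_stat(a, b):
    se = sqrt(stdev(a) ** 2 / len(a) + stdev(b) ** 2 / len(b))
    return (mean(a) - mean(b)) / se

random.seed(0)
trials, rejections = 2000, 0
for _ in range(trials):
    a = [random.gauss(0, 1) for _ in range(30)]
    b = [random.gauss(0, 1) for _ in range(30)]  # same population: H0 is true
    if abs(t_stat(a, b)) > 2.0:                  # approx. 5% two-sided cutoff
        rejections += 1
print("empirical Type I error rate:", rejections / trials)
```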
Validation and Improvement of SRTM Performance over Rugged Terrain
NASA Technical Reports Server (NTRS)
Zebker, Howard A.
2004-01-01
We have previously reported work related to basic technique development in phase unwrapping and generation of digital elevation models (DEMs). In the final year of this work we applied our technique work to the improvement of DEMs produced by SRTM. In particular, we developed a rigorous mathematical algorithm and means to fill in missing data over rough terrain from other data sets. We illustrate this method by using a higher resolution, but globally less accurate, DEM produced by the TOPSAR airborne instrument over the Galapagos Islands to augment the SRTM data set in this area. We combine this data set with SRTM so that each set fills in holes left by the other imaging system. The infilling is done by first interpolating each data set using a prediction error filter that reproduces the same statistical characterization as exhibited by the entire data set within the interpolated region. After this procedure is implemented on each data set, the two are combined on a point-by-point basis with weights that reflect the accuracy of each data point in its original image. In areas that are better covered by SRTM, TOPSAR data are weighted down but still retain TOPSAR statistics. The reverse is true for regions better covered by TOPSAR. The resulting DEM passes statistical tests and appears plausible to the eye, but as this DEM is the best available for the region we cannot fully verify its accuracy. Spot checks with GPS points show that locally the technique results in a more comprehensive and accurate map than either data set alone.
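The point-by-point weighted combination can be sketched as follows. The arrays and the scalar accuracies sigma_a and sigma_b are illustrative stand-ins; the actual work weights each point by its accuracy in its original image and first interpolates holes with a prediction error filter, which this sketch does not attempt.

```python
# Schematic inverse-variance merge of two DEMs with mutual hole filling.
import numpy as np

def merge_dems(dem_a, dem_b, sigma_a, sigma_b):
    """Combine two DEMs with inverse-variance weights; where one DEM
    has a hole (NaN), fall back to the other."""
    wa, wb = 1.0 / sigma_a**2, 1.0 / sigma_b**2
    merged = (wa * dem_a + wb * dem_b) / (wa + wb)
    merged = np.where(np.isnan(dem_a), dem_b, merged)  # hole in A -> use B
    merged = np.where(np.isnan(dem_b), dem_a, merged)  # hole in B -> use A
    return merged

a = np.array([[10.0, np.nan], [12.0, 13.0]])   # toy heights, not SRTM data
b = np.array([[11.0, 11.5], [np.nan, 14.0]])   # toy heights, not TOPSAR data
print(merge_dems(a, b, sigma_a=5.0, sigma_b=10.0))
```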
Statistical Extremes of Turbulence and a Cascade Generalisation of Euler's Gyroscope Equation
NASA Astrophysics Data System (ADS)
Tchiguirinskaia, Ioulia; Scherzer, Daniel
2016-04-01
Turbulence refers to a rather well defined hydrodynamical phenomenon uncovered by Reynolds. Nowadays, the word turbulence is used to designate the loss of order in many different geophysical fields and the related fundamental extreme variability of environmental data over a wide range of scales. Classical statistical techniques for estimating the extremes, being largely limited to statistical distributions, do not take into account the mechanisms generating such extreme variability. Alternative approaches to nonlinear variability are based on a fundamental property of the nonlinear equations: scale invariance, which means that these equations are formally invariant under given scale transforms. Its specific framework is that of multifractals. In this framework extreme variability builds up scale by scale, leading to non-classical statistics. Although multifractals are increasingly understood as a basic framework for handling such variability, there is still a gap between their potential and their actual use. In this presentation we discuss how to deal with highly theoretical problems of mathematical physics alongside a wide range of geophysical applications. We use Euler's gyroscope equation as a basic element in constructing a complex deterministic system that preserves not only the scale symmetry of the Navier-Stokes equations, but several more of their symmetries. Euler's equation has not only been the object of many theoretical investigations of the gyroscope device, but has also been generalised enough to become the basic equation of fluid mechanics. It is therefore no surprise that a cascade generalisation of this equation can be used to characterise the intermittency of turbulence and to better understand the links between the multifractal exponents and the structure of a simplified, but not simplistic, version of the Navier-Stokes equations. In a way, this approach is similar to that of Lorenz, who studied how the flap of a butterfly wing could generate a cyclone with the help of a 3D ordinary differential system. Being well supported by extensive numerical results, the cascade generalisation of Euler's gyroscope equation opens new horizons for predictability and prediction of processes having long-range dependence.
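For reference, the classical Euler gyroscope equations that the cascade construction generalises are the standard rigid-body equations (quoted in their textbook form; the authors' cascade generalisation itself is not reproduced here):

```latex
\begin{aligned}
I_1\,\dot{\omega}_1 &= (I_2 - I_3)\,\omega_2\,\omega_3,\\
I_2\,\dot{\omega}_2 &= (I_3 - I_1)\,\omega_3\,\omega_1,\\
I_3\,\dot{\omega}_3 &= (I_1 - I_2)\,\omega_1\,\omega_2,
\end{aligned}
```

where the I_i are the principal moments of inertia and the omega_i the angular velocity components. The quadratic, energy-conserving nonlinearity on the right-hand side is what the equation shares with the Navier-Stokes equations and what makes it a natural building block for a cascade model.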
Using Data Mining to Teach Applied Statistics and Correlation
ERIC Educational Resources Information Center
Hartnett, Jessica L.
2016-01-01
This article describes two class activities that introduce the concept of data mining and very basic data mining analyses. Assessment data suggest that students learned some of the conceptual basics of data mining, understood some of the ethical concerns related to the practice, and were able to perform correlations via the Statistical Package for…
Simple Data Sets for Distinct Basic Summary Statistics
ERIC Educational Resources Information Center
Lesser, Lawrence M.
2011-01-01
It is important to avoid ambiguity with numbers because unfortunate choices of numbers can inadvertently make it possible for students to form misconceptions or make it difficult for teachers to tell if students obtained the right answer for the right reason. Therefore, it is important to make sure when introducing basic summary statistics that…
Using Shakespeare's Sotto Voce to Determine True Identity From Text
Kernot, David; Bossomaier, Terry; Bradbury, Roger
2018-01-01
Little is known of the private life of William Shakespeare, but he is famous for his collection of plays and poems, even though many of the works attributed to him were published anonymously. Determining the identity of Shakespeare has fascinated scholars for 400 years, and four significant figures in English literary history have been suggested as likely alternatives to Shakespeare for some disputed works: Bacon, de Vere, Stanley, and Marlowe. A myriad of computational and statistical tools and techniques have been used to determine the true authorship of his works. Many of these techniques rely on basic statistical correlations, word counts, collocated word groups, or keyword density, but no one method has been decided on. We suggest that an alternative technique that uses word semantics to draw on personality can provide an accurate profile of a person. To test this claim, we analyse the works of Shakespeare, Christopher Marlowe, and Elizabeth Cary. We use Word Accumulation Curves, Hierarchical Clustering overlays, Principal Component Analysis, and Linear Discriminant Analysis techniques in combination with RPAS, a multi-faceted text analysis approach that draws on a writer's personality, or self to identify subtle characteristics within a person's writing style. Here we find that RPAS can separate the known authored works of Shakespeare from Marlowe and Cary. Further, it separates their contested works, works suspected of being written by others. While few authorship identification techniques identify self from the way a person writes, we demonstrate that these stylistic characteristics are as applicable 400 years ago as they are today and have the potential to be used within cyberspace for law enforcement purposes. PMID:29599734
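The PCA and LDA stages named above can be sketched generically with scikit-learn; the stylometric feature matrix below is synthetic, and RPAS itself (the authors' method) is not reproduced.

```python
# Sketch of a PCA -> LDA authorship pipeline on made-up style features.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 20))        # 60 documents x 20 style features (synthetic)
y = np.repeat([0, 1, 2], 20)         # three candidate authors
X[y == 1] += 0.8                     # give each author a distinct style offset
X[y == 2] -= 0.8

X_pca = PCA(n_components=5).fit_transform(X)       # compress the feature space
lda = LinearDiscriminantAnalysis().fit(X_pca, y)   # separate the authors
print("training accuracy:", lda.score(X_pca, y))
```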
Estimating the number of animals in wildlife populations
Lancia, R.A.; Kendall, W.L.; Pollock, K.H.; Nichols, J.D.; Braun, Clait E.
2005-01-01
INTRODUCTION In 1938, Howard M. Wight devoted 9 pages, which was an entire chapter in the first wildlife management techniques manual, to what he termed 'census' methods. As books and chapters such as this attest, the volume of literature on this subject has grown tremendously. Abundance estimation remains an active area of biometrical research, as reflected in the many differences between this chapter and the similar contribution in the previous manual. Our intent in this chapter is to present an overview of the basic and most widely used population estimation techniques and to provide an entree to the relevant literature. Several possible approaches could be taken in writing a chapter dealing with population estimation. For example, we could provide a detailed treatment focusing on statistical models and on derivation of estimators based on these models. Although a chapter using this approach might provide a valuable reference for quantitative biologists and biometricians, it would be of limited use to many field biologists and wildlife managers. Another approach would be to focus on details of actually applying different population estimation techniques. This approach would include both field application (e.g., how to set out a trapping grid or conduct an aerial survey) and detailed instructions on how to use the resulting data with appropriate estimation equations. We are reluctant to attempt such an approach, however, because of the tremendous diversity of real-world field situations defined by factors such as the animal being studied, habitat, available resources, and because of our resultant inability to provide detailed instructions for all possible cases. We believe it is more useful to provide the reader with the conceptual basis underlying estimation methods. Thus, we have tried to provide intuitive explanations for how basic methods work. In doing so, we present relevant estimation equations for many methods and provide citations of more detailed treatments covering both statistical considerations and field applications. We have chosen to present methods that are representative of classes of estimators, rather than address every available method. Our hope is that this chapter will provide the reader with enough background to make an informed decision about what general method(s) will likely perform well in any particular field situation. Readers with a more quantitative background may then be able to consult detailed references and tailor the selected method to suit their particular needs. Less quantitative readers should consult a biometrician, preferably one with experience in wildlife studies, for this 'tailoring,' with the hope they will be able to do so with a basic understanding of the general method, thereby permitting useful interaction and discussion with the biometrician. SUMMARY Estimating the abundance or density of animals in wild populations is not a trivial matter. Virtually all techniques involve the basic problem of estimating the probability of seeing, capturing, or otherwise detecting animals during some type of survey and, in many cases, sampling concerns as well. In the case of indices, the detection probability is assumed to be constant (but unknown). We caution against use of indices unless this assumption can be verified for the comparison(s) of interest. In the case of population estimation, many methods have been developed over the years to estimate the probability of detection associated with various kinds of count statistics. 
Techniques range from complete counts, where sampling concerns often dominate, to incomplete counts where detection probabilities are also important. Some examples of the latter are multiple observers, removal methods, and capture-recapture. Before embarking on a survey to estimate the size of a population, one must understand clearly what information is needed and for what purpose the information will be used. The key to derivin
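As a concrete instance of the capture-recapture class just mentioned, the classical Lincoln-Petersen estimator (a textbook result, not specific to this chapter): mark and release n_1 animals, later capture n_2 and count m_2 marked recaptures; then

```latex
\hat{N} = \frac{n_1\, n_2}{m_2},
\qquad
\hat{N}_{\mathrm{Chapman}} = \frac{(n_1 + 1)(n_2 + 1)}{m_2 + 1} - 1,
```

where the Chapman form reduces the small-sample bias of the basic estimator. Both assume a closed population and equal catchability, which is exactly the kind of detection-probability assumption the chapter urges readers to scrutinize.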
Graham, Daniel J; Field, David J
2008-01-01
Two recent studies suggest that natural scenes and paintings show similar statistical properties. But does the content or region of origin of an artwork affect its statistical properties? We addressed this question by having judges place paintings from a large, diverse collection into one of three subject-matter categories using a forced-choice paradigm. Basic statistics for images whose categorization was agreed by all judges showed no significant differences between those judged to be 'landscape' and 'portrait/still-life', but these two classes differed from paintings judged to be 'abstract'. All categories showed basic spatial statistical regularities similar to those typical of natural scenes. A test of the full painting collection (140 images) with respect to the works' place of origin (provenance) showed significant differences between Eastern works and Western ones, differences which we find are likely related to the materials and the choice of background color. Although artists deviate slightly from reproducing natural statistics in abstract art (compared to representational art), the great majority of human art likely shares basic statistical limitations. We argue that statistical regularities in art are rooted in the need to make art visible to the eye, not in the inherent aesthetic value of natural-scene statistics, and we suggest that variability in spatial statistics may be generally imposed by manufacture.
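One widely used "basic spatial statistic" of this kind is the slope of an image's radially averaged log amplitude spectrum, which sits near -1 (a roughly 1/f spectrum) for natural scenes. The sketch below computes it for an arbitrary grayscale array; it is illustrative, not the authors' pipeline.

```python
# Slope of the radially averaged log amplitude spectrum of an image.
import numpy as np

def amplitude_spectrum_slope(img):
    f = np.fft.fftshift(np.fft.fft2(img - img.mean()))
    amp = np.abs(f)
    cy, cx = np.array(img.shape) // 2
    yy, xx = np.indices(img.shape)
    r = np.hypot(yy - cy, xx - cx).astype(int)          # radius of each pixel
    radial = (np.bincount(r.ravel(), weights=amp.ravel())
              / np.bincount(r.ravel()))                 # mean amplitude per radius
    freqs = np.arange(1, len(radial) // 2)              # skip DC, stay under Nyquist
    slope, _ = np.polyfit(np.log(freqs), np.log(radial[freqs]), 1)
    return slope

rng = np.random.default_rng(0)
print(amplitude_spectrum_slope(rng.random((128, 128))))  # white noise: slope ~ 0
```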
NASA Technical Reports Server (NTRS)
Bowles, Roland L.; Buck, Bill K.
2009-01-01
The objective of the research developed and presented in this document was to statistically assess turbulence hazard detection performance employing airborne pulse Doppler radar systems. The FAA certification methodology for forward looking airborne turbulence radars will require estimating the probabilities of missed and false hazard indications under operational conditions. Analytical approaches must be used due to the near impossibility of obtaining sufficient statistics experimentally. This report describes an end-to-end analytical technique for estimating these probabilities for Enhanced Turbulence (E-Turb) Radar systems under noise-limited conditions, for a variety of aircraft types, as defined in FAA TSO-C134. This technique provides for one means, but not the only means, by which an applicant can demonstrate compliance to the FAA directed ATDS Working Group performance requirements. Turbulence hazard algorithms were developed that derived predictive estimates of aircraft hazards from basic radar observables. These algorithms were designed to prevent false turbulence indications while accurately predicting areas of elevated turbulence risks to aircraft, passengers, and crew; and were successfully flight tested on a NASA B757-200 and a Delta Air Lines B737-800. Application of this defined methodology for calculating the probability of missed and false hazard indications taking into account the effect of the various algorithms used, is demonstrated for representative transport aircraft and radar performance characteristics.
NASA Astrophysics Data System (ADS)
Papadavid, G.; Hadjimitsis, D.
2014-08-01
Developments in remote sensing techniques have provided the opportunity to optimize yields in the agricultural procedure and, moreover, to predict the forthcoming yield. Yield prediction plays a vital role in Agricultural Policy and provides useful data to policy makers. In this context, crop and soil parameters along with the NDVI index, which are valuable sources of information, were elaborated statistically to test (a) whether Durum wheat yield can be predicted and (b) what the actual time window is for predicting yield in the district of Paphos, where Durum wheat is the basic cultivation and supports the rural economy of the area. Fifteen plots cultivated with Durum wheat by the Agricultural Research Institute of Cyprus for research purposes, in the area of interest, were under observation for three years to derive the necessary data. Statistical and remote sensing techniques were then applied to derive and map a model that can predict the yield of Durum wheat in this area. Indeed, the semi-empirical model developed for this purpose, with a very high correlation coefficient R2 = 0.886, has shown in practice that it can predict yields very well. Student's t-test revealed that predicted and actual yield values have no statistically significant difference. The developed model can and will be further elaborated with more parameters and applied to other crops in the near future.
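The kind of semi-empirical fit described, yield regressed on NDVI, can be illustrated as below; the NDVI and yield numbers are invented, and the paper's reported R2 = 0.886 comes from its own field data.

```python
# Illustrative linear fit of yield on NDVI with synthetic numbers.
import numpy as np

ndvi = np.array([0.35, 0.42, 0.50, 0.55, 0.61, 0.68])   # hypothetical plot NDVI
yield_t = np.array([1.9, 2.3, 2.8, 3.0, 3.4, 3.9])      # hypothetical yield, t/ha

slope, intercept = np.polyfit(ndvi, yield_t, 1)
pred = slope * ndvi + intercept
ss_res = np.sum((yield_t - pred) ** 2)
ss_tot = np.sum((yield_t - yield_t.mean()) ** 2)
print(f"yield = {slope:.2f}*NDVI + {intercept:.2f}, R2 = {1 - ss_res/ss_tot:.3f}")
```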
Basic statistics (the fundamental concepts).
Lim, Eric
2014-12-01
An appreciation and understanding of statistics is important to all practising clinicians, not simply researchers. This is because mathematics is the fundamental basis on which we base clinical decisions, usually with reference to benefit in relation to risk. Unless a clinician has a basic understanding of statistics, he or she will never be in a position to question healthcare management decisions that have been handed down from generation to generation, will not be able to conduct research effectively, and will not be able to evaluate the validity of published evidence (usually making the assumption that most published work is either all good or all bad). This article provides a brief introduction to basic statistical methods and illustrates their use in common clinical scenarios. In addition, pitfalls of incorrect usage are highlighted. However, it is not meant to be a substitute for formal training or consultation with a qualified and experienced medical statistician prior to starting any research project.
ERIC Educational Resources Information Center
Averitt, Sallie D.
This instructor guide, which was developed for use in a manufacturing firm's advanced technical preparation program, contains the materials required to present a learning module that is designed to prepare trainees for the program's statistical process control module by improving their basic math skills and instructing them in basic calculator…
From Research to Practice: Basic Mathematics Skills and Success in Introductory Statistics
ERIC Educational Resources Information Center
Lunsford, M. Leigh; Poplin, Phillip
2011-01-01
Based on previous research of Johnson and Kuennen (2006), we conducted a study to determine factors that would possibly predict student success in an introductory statistics course. Our results were similar to Johnson and Kuennen in that we found students' basic mathematical skills, as measured on a test created by Johnson and Kuennen, were a…
A reference web architecture and patterns for real-time visual analytics on large streaming data
NASA Astrophysics Data System (ADS)
Kandogan, Eser; Soroker, Danny; Rohall, Steven; Bak, Peter; van Ham, Frank; Lu, Jie; Ship, Harold-Jeffrey; Wang, Chun-Fu; Lai, Jennifer
2013-12-01
Monitoring and analysis of streaming data, such as social media, sensors, and news feeds, has become increasingly important for business and government. The volume and velocity of incoming data are key challenges. To effectively support monitoring and analysis, statistical and visual analytics techniques need to be seamlessly integrated; analytic techniques for a variety of data types (e.g., text, numerical) and scope (e.g., incremental, rolling-window, global) must be properly accommodated; interaction, collaboration, and coordination among several visualizations must be supported in an efficient manner; and the system should support the use of different analytics techniques in a pluggable manner. Especially in web-based environments, these requirements pose restrictions on the basic visual analytics architecture for streaming data. In this paper we report on our experience of building a reference web architecture for real-time visual analytics of streaming data, identify and discuss architectural patterns that address these challenges, and report on applying the reference architecture for real-time Twitter monitoring and analysis.
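Of the analytic scopes listed (incremental, rolling-window, global), the rolling-window case is the one that most constrains a streaming design. A minimal constant-time-per-update sketch, illustrative rather than drawn from the reference architecture itself:

```python
# Windowed mean/variance with O(1) work per incoming sample.
from collections import deque

class RollingStats:
    def __init__(self, window):
        self.buf = deque()
        self.window = window
        self.s = 0.0    # running sum
        self.s2 = 0.0   # running sum of squares

    def push(self, x):
        self.buf.append(x)
        self.s += x
        self.s2 += x * x
        if len(self.buf) > self.window:      # evict the oldest sample
            old = self.buf.popleft()
            self.s -= old
            self.s2 -= old * old

    def mean(self):
        return self.s / len(self.buf)

    def variance(self):
        n = len(self.buf)
        return max(self.s2 / n - self.mean() ** 2, 0.0)

rs = RollingStats(window=3)
for x in [1.0, 2.0, 4.0, 8.0]:
    rs.push(x)
print(rs.mean(), rs.variance())   # statistics over the last 3 values only
```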
A Monte Carlo Simulation Study of the Reliability of Intraindividual Variability
Estabrook, Ryne; Grimm, Kevin J.; Bowles, Ryan P.
2012-01-01
Recent research has seen intraindividual variability (IIV) become a useful technique to incorporate trial-to-trial variability into many types of psychological studies. IIV as measured by individual standard deviations (ISDs) has shown unique prediction to several types of positive and negative outcomes (Ram, Rabbitt, Stollery, & Nesselroade, 2005). One unanswered question regarding measuring intraindividual variability is its reliability and the conditions under which optimal reliability is achieved. Monte Carlo simulation studies were conducted to determine the reliability of the ISD compared to the intraindividual mean. The results indicate that ISDs generally have poor reliability and are sensitive to insufficient measurement occasions, poor test reliability, and unfavorable amounts and distributions of variability in the population. Secondary analysis of psychological data shows that use of individual standard deviations in unfavorable conditions leads to a marked reduction in statistical power, although careful adherence to underlying statistical assumptions allows their use as a basic research tool. PMID:22268793
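A toy version of such a Monte Carlo study, with invented parameter values: simulate subjects whose true within-person SDs are known, estimate ISDs from a handful of occasions, and see how faithfully the estimates track the truth.

```python
# Monte Carlo sketch of ISD recovery; requires Python 3.10+ for
# statistics.correlation. All numbers are illustrative.
import random
import statistics

def simulate(n_subjects=200, occasions=8, seed=0):
    rng = random.Random(seed)
    true_sd, est_sd = [], []
    for _ in range(n_subjects):
        sd = rng.uniform(0.5, 2.0)                       # subject's true IIV
        scores = [rng.gauss(0.0, sd) for _ in range(occasions)]
        true_sd.append(sd)
        est_sd.append(statistics.stdev(scores))          # estimated ISD
    return statistics.correlation(true_sd, est_sd)

# Few occasions -> noisy ISDs -> modest true-vs-estimated correlation.
print("true-vs-estimated ISD correlation:", round(simulate(), 3))
```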
Experiments on Adaptive Techniques for Host-Based Intrusion Detection
DOE Office of Scientific and Technical Information (OSTI.GOV)
DRAELOS, TIMOTHY J.; COLLINS, MICHAEL J.; DUGGAN, DAVID P.
2001-09-01
This research explores four experiments of adaptive host-based intrusion detection (ID) techniques in an attempt to develop systems that can detect novel exploits. The technique considered to have the most potential is adaptive critic designs (ACDs) because of their utilization of reinforcement learning, which allows learning exploits that are difficult to pinpoint in sensor data. Preliminary results of ID using an ACD, an Elman recurrent neural network, and a statistical anomaly detection technique demonstrate an ability to learn to distinguish between clean and exploit data. We used the Solaris Basic Security Module (BSM) as a data source and performed considerable preprocessing on the raw data. A detection approach called generalized signature-based ID is recommended as a middle ground between signature-based ID, which has an inability to detect novel exploits, and anomaly detection, which detects too many events including events that are not exploits. The primary results of the ID experiments demonstrate the use of custom data for generalized signature-based intrusion detection and the ability of neural network-based systems to learn in this application environment.
A persuasive concept of research-oriented teaching in Soil Biochemistry
NASA Astrophysics Data System (ADS)
Blagodatskaya, Evgenia; Kuzyakova, Irina
2013-04-01
One of the main problems of existing bachelor programs is the disconnection of basic and experimental education: even during practical training, the methods learned are not related to the characterization of soil field experiments and observed soil processes. We introduce a multi-level research-oriented teaching system involving Bachelor students in a four-semester active study that integrates basic knowledge, experimental techniques, statistical approaches, and project design and realization. The novelty of the research-oriented teaching system lies 1) in linking an ongoing experiment to the study of statistical methods and 2) in students' own responsibility for interpreting the soil chemical and biochemical characteristics they obtain at the very beginning of their study by analysing a set of soil samples allowing full-factorial data treatment. This experimental data set is related to a specific soil stand and is used as the backbone of the teaching system, stimulating the students' interest in soil studies and motivating them to apply basic knowledge from lecture courses. The multi-level system includes: 1) a basic lecture course on soil biochemistry with analysis of research questions; 2) a practical training course on laboratory analytics, where small groups of students are responsible for the analysis of soil samples related to a specific land use/forest type/forest age; 3) a training course on biotic (e.g. respiration) versus abiotic (e.g. temperature, moisture, fire) interactions in the same soil samples; 4) theoretical seminars where students present and make a first attempt to explain the soil characteristics of various soil stands as affected by abiotic factors (first semester); 5) a lecture and seminar course on soil statistics, where students apply newly learned statistical methods to support their conclusions and to find relationships between the soil characteristics obtained during the first semester; 6) a seminar course on project design, where students develop their scientific projects to study the uncertainties revealed in soil responses to abiotic factors (second and third semesters); 7) lecture, seminar and training courses on the estimation of active microbial biomass in soil, where students realize their projects, applying new knowledge to the soils from the stands they are responsible for (fourth semester). Thus, during four semesters the students continuously combine theoretical knowledge from the lectures with their own experimental experience, compare and discuss the results of the various groups during seminars, and acquire skills in project design. The successful application of the research-oriented teaching system at the University of Göttingen allowed each student to reveal knowledge gaps at an early stage, accelerated their involvement in ongoing research projects, and motivated them to begin their own scientific careers.
21 CFR 820.250 - Statistical techniques.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 21 Food and Drugs 8 2011-04-01 2011-04-01 false Statistical techniques. 820.250 Section 820.250...) MEDICAL DEVICES QUALITY SYSTEM REGULATION Statistical Techniques § 820.250 Statistical techniques. (a... statistical techniques required for establishing, controlling, and verifying the acceptability of process...
21 CFR 820.250 - Statistical techniques.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Statistical techniques. 820.250 Section 820.250...) MEDICAL DEVICES QUALITY SYSTEM REGULATION Statistical Techniques § 820.250 Statistical techniques. (a... statistical techniques required for establishing, controlling, and verifying the acceptability of process...
Awareness of basic life support among dental practitioners.
Baduni, Neha; Prakash, Prem; Srivastava, Dhirendra; Sanwal, Manoj Kumar; Singh, Bijender Pal
2014-01-01
It is important that every member of our community be trained in effective BLS techniques to save lives. At the least, doctors including dental practitioners, and medical and paramedical staff, should be trained in high-quality CPR, as it is a basic medical skill that can save many lives if implemented in a timely manner. Our aim was to study the awareness of Basic Life Support (BLS) among dental students and practitioners in New Delhi. This cross-sectional study was conducted by assessing responses to 20 selected questions pertaining to BLS among dental students, resident doctors/tutors, faculty members and private practitioners in New Delhi. All participants were given a printed questionnaire on which they had to state their qualifications and clinical experience, apart from answering the 20 questions. Data were collected and evaluated using the commercially available Statistical Package for the Social Sciences (SPSS, version 12). One hundred and four respondents were included. Sadly, none of our respondents had complete knowledge of BLS. The maximum mean score (9.19 ± 1.23) was obtained by dentists with clinical experience between 1 and 5 years. To ensure better and safer healthcare, it is essential for all dental practitioners to be well versed in BLS.
Quantitative Hyperspectral Reflectance Imaging
Klein, Marvin E.; Aalderink, Bernard J.; Padoan, Roberto; de Bruin, Gerrit; Steemers, Ted A.G.
2008-01-01
Hyperspectral imaging is a non-destructive optical analysis technique that can for instance be used to obtain information from cultural heritage objects unavailable with conventional colour or multi-spectral photography. This technique can be used to distinguish and recognize materials, to enhance the visibility of faint or obscured features, to detect signs of degradation and study the effect of environmental conditions on the object. We describe the basic concept, working principles, construction and performance of a laboratory instrument specifically developed for the analysis of historical documents. The instrument measures calibrated spectral reflectance images at 70 wavelengths ranging from 365 to 1100 nm (near-ultraviolet, visible and near-infrared). By using a wavelength tunable narrow-bandwidth light-source, the light energy used to illuminate the measured object is minimal, so that any light-induced degradation can be excluded. Basic analysis of the hyperspectral data includes a qualitative comparison of the spectral images and the extraction of quantitative data such as mean spectral reflectance curves and statistical information from user-defined regions-of-interest. More sophisticated mathematical feature extraction and classification techniques can be used to map areas on the document, where different types of ink had been applied or where one ink shows various degrees of degradation. The developed quantitative hyperspectral imager is currently in use by the Nationaal Archief (National Archives of The Netherlands) to study degradation effects of artificial samples and original documents, exposed in their permanent exhibition area or stored in their deposit rooms. PMID:27873831
AGENT-BASED MODELS IN EMPIRICAL SOCIAL RESEARCH*
Bruch, Elizabeth; Atwell, Jon
2014-01-01
Agent-based modeling has become increasingly popular in recent years, but there is still no codified set of recommendations or practices for how to use these models within a program of empirical research. This article provides ideas and practical guidelines drawn from sociology, biology, computer science, epidemiology, and statistics. We first discuss the motivations for using agent-based models in both basic science and policy-oriented social research. Next, we provide an overview of methods and strategies for incorporating data on behavior and populations into agent-based models, and review techniques for validating and testing the sensitivity of agent-based models. We close with suggested directions for future research. PMID:25983351
NASA Technical Reports Server (NTRS)
Hoffer, R. M. (Principal Investigator)
1980-01-01
Several possibilities were considered for defining the data set in which the same test areas could be used for each of the four different spatial resolutions being evaluated. The LARSYS CLUSTER was used to sort the vectors into spectral classes to reduce the within-spectral class variability in an effort to develop training statistics. A data quality test was written to determine the basic signal to noise characteristics within the data set being used. Because preliminary analysis of the LANDSAT MSS data revealed the presence of high cirrus clouds, other data sets are being sought.
General statistical considerations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eberhardt, L L; Gilbert, R O
From NAEG plutonium environmental studies program meeting; Las Vegas, Nevada, USA (2 Oct 1973). The high sampling variability encountered in environmental plutonium studies along with high analytical costs makes it very important that efficient soil sampling plans be used. However, efficient sampling depends on explicit and simple statements of the objectives of the study. When there are multiple objectives it may be difficult to devise a wholly suitable sampling scheme. Sampling for long-term changes in plutonium concentration in soils may also be complex and expensive. Further attention to problems associated with compositing samples is recommended, as is the consistent use of random sampling as a basic technique. (auth)
Rapid Vision Correction by Special Operations Forces.
Reynolds, Mark E
This report describes a rapid method of vision correction used by Special Operations Medics in multiple operational engagements. Between 2011 and 2015, Special Operations Medics used an algorithm-driven refraction technique. A standard block of instruction was provided to the medics, along with a packaged kit. The technique was used in multiple operational engagements with host nation military and civilians. Data collected for program evaluation were later analyzed to assess the utility of the technique. Glasses were distributed to 230 patients with complaints of either decreased distance or near (reading) vision. Most patients (84%) with distance complaints achieved corrected binocular vision of 20/40 or better, and 97% of patients with near-vision complaints achieved corrected near-binocular vision of 20/40 or better. There was no statistically significant difference between the percentages of patients achieving 20/40 when medics used the technique under direct supervision versus independent use. A basic refraction technique using a designed kit allows for meaningful improvement in distance and/or near vision at austere locations. Special Operations Medics can leverage this approach after specific training with minimal time commitment. It can serve as a rapid, effective intervention with multiple applications in diverse operational environments.
Thompson, Cheryl Bagley
2009-01-01
This 13th article of the Basics of Research series is the first in a short series on statistical analysis. These articles will discuss creating your statistical analysis plan, levels of measurement, descriptive statistics, probability theory, inferential statistics, and general considerations for interpretation of the results of a statistical analysis.
Cavalot, A L; Palonta, F; Preti, G; Nazionale, G; Ricci, E; Vione, N; Albera, R; Cortesina, G
2001-12-01
The insertion of a prosthesis and restoration with pectoralis major myocutaneous flaps for patients subjected to total pharyngolaryngectomy is a technique now universally accepted; however, the literature on the subject is lacking. Our study considers 10 patients subjected to total pharyngolaryngectomy and restoration with pectoralis major myocutaneous flaps who were fitted with vocal function prostheses, and a control group of 50 subjects treated with a total laryngectomy without pectoralis major myocutaneous flaps who were fitted with vocal function prostheses. Specific qualitative and quantitative parameters were compared. Differences in the quantitative measurements of voice intensity and in the harmonics-to-noise ratio were not statistically significant (p > 0.05) between the two study groups at either high- or low-volume speech. On the contrary, statistically significant differences were found (p < 0.05) for the fundamental frequency of both the low- and the high-volume voice. For the qualitative analysis, seven parameters were established for evaluation by trained and untrained listeners; on the basis of these parameters, the control group had statistically better voices.
Regression modeling of ground-water flow
Cooley, R.L.; Naff, R.L.
1985-01-01
Nonlinear multiple regression methods are developed to model and analyze groundwater flow systems. Complete descriptions of regression methodology as applied to groundwater flow models allow scientists and engineers engaged in flow modeling to apply the methods to a wide range of problems. Organization of the text proceeds from an introduction that discusses the general topic of groundwater flow modeling, to a review of basic statistics necessary to properly apply regression techniques, and then to the main topic: exposition and use of linear and nonlinear regression to model groundwater flow. Statistical procedures are given to analyze and use the regression models. A number of exercises and answers are included to give the student practice with nearly all the methods presented for modeling and statistical analysis. Three computer programs implement the more complex methods. These three are a general two-dimensional, steady-state regression model for flow in an anisotropic, heterogeneous porous medium; a program to calculate a measure of model nonlinearity with respect to the regression parameters; and a program to analyze model errors in computed dependent variables such as hydraulic head. (USGS)
Haircutting Guide for Cosmetology Students.
ERIC Educational Resources Information Center
Baker, Linda M.
Intended for use at any point in a beauty culture course, this student manual on haircutting implements and techniques focuses on two basic haircuts--page and short summer cut--to describe and illustrate basic cutting and shaping techniques. There are four major sections in the manual: (1) Hairshaping Implements and Techniques (Implements Used In…
Applications of statistics to medical science (1) Fundamental concepts.
Watanabe, Hiroshi
2011-01-01
The conceptual framework of statistical tests and statistical inference is discussed, and the epidemiological background of statistics is briefly reviewed. This study is one of a series in which we survey the basics of statistics and practical methods used in medical statistics. Arguments related to actual statistical analysis procedures will be made in subsequent papers.
Practical Computer Security through Cryptography
NASA Technical Reports Server (NTRS)
McNab, David; Twetev, David (Technical Monitor)
1998-01-01
The core protocols upon which the Internet was built are insecure. Weak authentication and the lack of low-level encryption services introduce vulnerabilities that propagate upwards in the network stack. Using statistics based on CERT/CC Internet security incident reports, the relative likelihood of attacks via these vulnerabilities is analyzed. The primary conclusion is that the standard UNIX BSD-based authentication system is by far the most commonly exploited weakness. Encryption of sensitive password data and the adoption of cryptographically based authentication protocols can greatly reduce these vulnerabilities. Basic cryptographic terminology and techniques are presented, with attention focused on the ways in which technology such as encryption and digital signatures can be used to protect against the most commonly exploited vulnerabilities. A survey of contemporary security software demonstrates that tools based on cryptographic techniques, such as Kerberos, ssh, and PGP, are readily available and effectively close many of the most serious security holes. Nine practical recommendations for improving security are described.
NASA Astrophysics Data System (ADS)
Gbedo, Yémalin Gabin; Mangin-Brinet, Mariane
2017-07-01
We present a new procedure to determine parton distribution functions (PDFs), based on Markov chain Monte Carlo (MCMC) methods. The aim of this paper is to show that we can replace the standard χ2 minimization by procedures grounded on statistical methods, and on Bayesian inference in particular, thus offering additional insight into the rich field of PDFs determination. After a basic introduction to these techniques, we introduce the algorithm we have chosen to implement—namely Hybrid (or Hamiltonian) Monte Carlo. This algorithm, initially developed for Lattice QCD, turns out to be very interesting when applied to PDFs determination by global analyses; we show that it allows us to circumvent the difficulties due to the high dimensionality of the problem, in particular concerning the acceptance. A first feasibility study is performed and presented, which indicates that Markov chain Monte Carlo can successfully be applied to the extraction of PDFs and of their uncertainties.
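A minimal Hamiltonian Monte Carlo step for a generic log-density (a standard Gaussian here) shows the leapfrog-plus-accept/reject structure; the paper applies the same machinery to the far higher-dimensional PDF chi-square, which this sketch does not attempt.

```python
# Hamiltonian Monte Carlo sketch: leapfrog integration of Hamilton's
# equations, then a Metropolis accept/reject on the energy change.
import numpy as np

rng = np.random.default_rng(0)

def hmc_step(q, logp, grad, eps=0.1, n_leap=20):
    p = rng.normal(size=q.shape)              # resample momentum
    q_new, p_new = q.copy(), p.copy()
    p_new += 0.5 * eps * grad(q_new)          # half kick
    for _ in range(n_leap - 1):
        q_new += eps * p_new                  # drift
        p_new += eps * grad(q_new)            # full kick
    q_new += eps * p_new                      # final drift
    p_new += 0.5 * eps * grad(q_new)          # final half kick
    dH = (logp(q_new) - 0.5 * p_new @ p_new) - (logp(q) - 0.5 * p @ p)
    return q_new if np.log(rng.random()) < dH else q

logp = lambda q: -0.5 * q @ q                 # standard normal target
grad = lambda q: -q                           # its gradient
q = np.zeros(2)
samples = []
for _ in range(2000):
    q = hmc_step(q, logp, grad)
    samples.append(q)
print("sample mean should be near 0:", np.mean(samples, axis=0))
```

The gradient-guided leapfrog trajectories are what keep the acceptance rate high in many dimensions, which is the property the paper exploits.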
Four-Dimensional Golden Search
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fenimore, Edward E.
2015-02-25
The Golden search technique is a method to search a multiple-dimension space to find the minimum. It basically subdivides the possible ranges of parameters until it brackets, to within an arbitrarily small distance, the minimum. It has the advantages that (1) the function to be minimized can be non-linear, (2) it does not require derivatives of the function, (3) the convergence criterion does not depend on the magnitude of the function. Thus, if the function is a goodness of fit parameter such as chi-square, the convergence does not depend on the noise being correctly estimated or the function correctly following the chi-square statistic. And, (4) the convergence criterion does not depend on the shape of the function. Thus, long shallow surfaces can be searched without the problem of premature convergence. As with many methods, the Golden search technique can be confused by surfaces with multiple minima.
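The one-dimensional building block is easy to state: a golden-section search over a bracket, whose stopping rule depends only on the bracket width, matching advantages (3) and (4) above. The four-dimensional version applies the same subdivision along each parameter; the sketch below is the standard 1D algorithm, not the report's code.

```python
# Golden-section search: shrink a bracket [a, b] around the minimum.
def golden_search(f, a, b, tol=1e-8):
    phi = (5 ** 0.5 - 1) / 2             # inverse golden ratio, ~0.618
    c, d = b - phi * (b - a), a + phi * (b - a)
    while abs(b - a) > tol:
        if f(c) < f(d):                  # minimum lies in [a, d]
            b, d = d, c
            c = b - phi * (b - a)
        else:                            # minimum lies in [c, b]
            a, c = c, d
            d = a + phi * (b - a)
    return (a + b) / 2

print(golden_search(lambda x: (x - 2.0) ** 2 + 1.0, 0.0, 5.0))  # ~2.0
```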
ERIC Educational Resources Information Center
Ragasa, Carmelita Y.
2008-01-01
The objective of the study is to determine if there is a significant difference in the effects of the treatment and control groups on achievement as well as on attitude as measured by the posttest. A class of 38 sophomore college students in the basic statistics taught with the use of computer-assisted instruction and another class of 15 students…
Back to basics: an introduction to statistics.
Halfens, R J G; Meijers, J M M
2013-05-01
In the second article in the series, Professor Ruud Halfens and Dr Judith Meijers give an overview of statistics, both descriptive and inferential. They describe the first principles of statistics, including some relevant inferential tests.
Factors related to student performance in statistics courses in Lebanon
NASA Astrophysics Data System (ADS)
Naccache, Hiba Salim
The purpose of the present study was to identify factors that may contribute to business students in Lebanese universities having difficulty in introductory and advanced statistics courses. Two statistics courses are required for business majors at Lebanese universities, and students are not required to enroll in any math courses before taking them. Drawing on recent educational research, this dissertation attempted to identify the relationships among (1) students' scores on Lebanese university math admissions tests; (2) students' scores on a test of very basic mathematical concepts; (3) students' scores on the Survey of Attitudes Toward Statistics (SATS); (4) course performance as measured by students' final scores in the course; and (5) their scores on the final exam. Data were collected from 561 students enrolled in multiple sections of two courses: 307 students in the introductory statistics course and 260 in the advanced statistics course, across seven campuses in Lebanon over one semester. The multiple regression results revealed four significant relationships at the introductory level: between students' scores on the math quiz and (1) their final exam scores and (2) their final averages, and between the Cognitive subscale of the SATS and (3) their final exam scores and (4) their final averages. These four significant relationships were also found at the advanced level. In addition, two more significant relationships were found between students' final averages and the Effort (5) and Affect (6) subscales. No relationship was found between students' scores on the Lebanese admissions math tests and either their final exam scores or their final averages, in either the introductory or the advanced course. Although these results were consistent across course formats and instructors, they may encourage Lebanese universities to assess the effectiveness of prerequisite math courses. Moreover, these findings may lead the Lebanese Ministry of Education to make changes to the admissions exams, course prerequisites, and course content. Finally, to enhance students' attitudes, new learning techniques such as group work during class meetings can be helpful, and future research should aim to test the effectiveness of these pedagogical techniques on students' attitudes toward statistics.
A Simulation of AI Programming Techniques in BASIC.
ERIC Educational Resources Information Center
Mandell, Alan
1986-01-01
Explains the functions of and the techniques employed in expert systems. Offers the program "The Periodic Table Expert," as a model for using artificial intelligence techniques in BASIC. Includes the program listing and directions for its use on: Tandy 1000, 1200, and 2000; IBM PC; PC Jr; TRS-80; and Apple computers. (ML)
ERIC Educational Resources Information Center
Demaray, Bryan
Five packets comprise the marine science component of an enrichment program for gifted elementary students. Considered in the introductory section are identification (pre/post measure) procedures. Remaining packets address the following topics (subtopics in parentheses): basic marine science laboratory techniques (microscope techniques and metric…
Selection vector filter framework
NASA Astrophysics Data System (ADS)
Lukac, Rastislav; Plataniotis, Konstantinos N.; Smolka, Bogdan; Venetsanopoulos, Anastasios N.
2003-10-01
We provide a unified framework of nonlinear vector techniques outputting the lowest ranked vector. The proposed framework constitutes a generalized filter class for multichannel signal processing. A new class of nonlinear selection filters is based on robust order-statistic theory and the minimization of the weighted distance function to other input samples. The proposed method can be designed to perform a variety of filtering operations, including previously developed techniques such as the vector median, basic vector directional filter, directional distance filter, weighted vector median filters and weighted directional filters. A wide range of filtering operations is guaranteed by the filter structure, with two independent weight vectors for the angular and distance domains of the vector space. In order to adapt the filter parameters to varying signal and noise statistics, we also provide generalized optimization algorithms that take advantage of weighted median filters and of the relationship between the standard median filter and the vector median filter. Thus, we can deal with both statistical and deterministic aspects of the filter design process. It is shown that the proposed method has the required properties: the capability of modelling the underlying system in the application at hand, robustness with respect to errors in the model of the underlying system, the availability of a training procedure and, finally, simplicity of filter representation, analysis, design and implementation. Simulation studies also indicate that the new filters are computationally attractive and have excellent performance in environments corrupted by bit errors and impulsive noise.
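As a concrete instance of the filter class discussed, here is the basic vector median filter over a window of multichannel samples; this is the classical unweighted operation, a sketch rather than the paper's generalized weighted framework.

```python
# Vector median filter: output the input vector that minimizes the
# aggregate L2 distance to all other vectors in the window.
import numpy as np

def vector_median(window):
    """window: (n, c) array of n color/multichannel vectors."""
    dists = np.linalg.norm(window[:, None, :] - window[None, :, :], axis=-1)
    return window[np.argmin(dists.sum(axis=1))]

w = np.array([[250, 10, 10],    # impulsive outlier pixel
              [20, 22, 19],
              [18, 25, 21],
              [22, 20, 20]], dtype=float)
print(vector_median(w))         # selects a central vector, not the outlier
```

Because the output is always one of the inputs, the filter rejects impulsive noise without inventing new colors, which is the property the selection-filter framework generalizes.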
Bayesian models: A statistical primer for ecologists
Hobbs, N. Thompson; Hooten, Mevin B.
2015-01-01
Bayesian modeling has become an indispensable tool for ecological research because it is uniquely suited to deal with complexity in a statistically coherent way. This textbook provides a comprehensive and accessible introduction to the latest Bayesian methods—in language ecologists can understand. Unlike other books on the subject, this one emphasizes the principles behind the computations, giving ecologists a big-picture understanding of how to implement this powerful statistical approach. Bayesian Models is an essential primer for non-statisticians. It begins with a definition of probability and develops a step-by-step sequence of connected ideas, including basic distribution theory, network diagrams, hierarchical models, Markov chain Monte Carlo, and inference from single and multiple models. This unique book places less emphasis on computer coding, favoring instead a concise presentation of the mathematical statistics needed to understand how and why Bayesian analysis works. It also explains how to write out properly formulated hierarchical Bayesian models and use them in computing, research papers, and proposals. This primer enables ecologists to understand the statistical principles behind Bayesian modeling and apply them to research, teaching, policy, and management. It presents the mathematical and statistical foundations of Bayesian modeling in language accessible to non-statisticians; covers basic distribution theory, network diagrams, hierarchical models, Markov chain Monte Carlo, and more; deemphasizes computer coding in favor of basic principles; and explains how to write out properly factored statistical expressions representing Bayesian models.
ERIC Educational Resources Information Center
Haas, Stephanie W.; Pattuelli, Maria Cristina; Brown, Ron T.
2003-01-01
Describes the Statistical Interactive Glossary (SIG), an enhanced glossary of statistical terms supported by the GovStat ontology of statistical concepts. Presents a conceptual framework whose components articulate different aspects of a term's basic explanation that can be manipulated to produce a variety of presentations. The overarching…
Probability, statistics, and computational science.
Beerenwinkel, Niko; Siebourg, Juliane
2012-01-01
In this chapter, we review basic concepts from probability theory and computational statistics that are fundamental to evolutionary genomics. We provide a very basic introduction to statistical modeling and discuss general principles, including maximum likelihood and Bayesian inference. Markov chains, hidden Markov models, and Bayesian network models are introduced in more detail as they occur frequently and in many variations in genomics applications. In particular, we discuss efficient inference algorithms and methods for learning these models from partially observed data. Several simple examples are given throughout the text, some of which point to models that are discussed in more detail in subsequent chapters.
A Streamflow Statistics (StreamStats) Web Application for Ohio
Koltun, G.F.; Kula, Stephanie P.; Puskas, Barry M.
2006-01-01
A StreamStats Web application was developed for Ohio that implements equations for estimating a variety of streamflow statistics including the 2-, 5-, 10-, 25-, 50-, 100-, and 500-year peak streamflows, mean annual streamflow, mean monthly streamflows, harmonic mean streamflow, and 25th-, 50th-, and 75th-percentile streamflows. StreamStats is a Web-based geographic information system application designed to facilitate the estimation of streamflow statistics at ungaged locations on streams. StreamStats can also serve precomputed streamflow statistics determined from streamflow-gaging station data. The basic structure, use, and limitations of StreamStats are described in this report. To facilitate the level of automation required for Ohio's StreamStats application, the technique used by Koltun (2003) for computing main-channel slope was replaced with a new computationally robust technique. The new channel-slope characteristic, referred to as SL10-85, differed from the National Hydrography Data based channel slope values (SL) reported by Koltun (2003) by an average of -28.3 percent, with the median change being -13.2 percent. In spite of the differences, the two slope measures are strongly correlated. The change in channel slope values resulting from the change in computational method necessitated revision of the full-model equations for flood-peak discharges originally presented by Koltun (2003). Average standard errors of prediction for the revised full-model equations presented in this report increased by a small amount over those reported by Koltun (2003), with increases ranging from 0.7 to 0.9 percent. Mean percentage changes in the revised regression and weighted flood-frequency estimates relative to regression and weighted estimates reported by Koltun (2003) were small, ranging from -0.72 to -0.25 percent and -0.22 to 0.07 percent, respectively.
NASA Astrophysics Data System (ADS)
Yin, Shengwen; Yu, Dejie; Yin, Hui; Lü, Hui; Xia, Baizhan
2017-09-01
Considering the epistemic uncertainties within the hybrid Finite Element/Statistical Energy Analysis (FE/SEA) model when it is used for the response analysis of built-up systems in the mid-frequency range, the hybrid Evidence Theory-based Finite Element/Statistical Energy Analysis (ETFE/SEA) model is established by introducing evidence theory. Based on the hybrid ETFE/SEA model and the sub-interval perturbation technique, the hybrid Sub-interval Perturbation and Evidence Theory-based Finite Element/Statistical Energy Analysis (SIP-ETFE/SEA) approach is proposed. In the hybrid ETFE/SEA model, the uncertainty in the SEA subsystem is modeled by a non-parametric ensemble, while the uncertainty in the FE subsystem is described by the focal element and basic probability assignment (BPA) and dealt with by evidence theory. Within the hybrid SIP-ETFE/SEA approach, the mid-frequency response of interest, such as the ensemble average of the energy response and the cross-spectrum response, is calculated analytically by using the conventional hybrid FE/SEA method. Inspired by probability theory, the intervals of the mean value, variance and cumulative distribution are used to describe the distribution characteristics of mid-frequency responses of built-up systems with epistemic uncertainties. In order to alleviate the computational burden of the extreme value analysis, the sub-interval perturbation technique based on the first-order Taylor series expansion is used in the ETFE/SEA model to acquire the lower and upper bounds of the mid-frequency responses over each focal element. Three numerical examples are given to illustrate the feasibility and effectiveness of the proposed method.
Fish: A New Computer Program for Friendly Introductory Statistics Help
ERIC Educational Resources Information Center
Brooks, Gordon P.; Raffle, Holly
2005-01-01
All introductory statistics students must master certain basic descriptive statistics, including means, standard deviations and correlations. Students must also gain insight into such complex concepts as the central limit theorem and standard error. This article introduces and describes the Friendly Introductory Statistics Help (FISH) computer…
Effectiveness of the Touch Math Technique in Teaching Basic Addition to Children with Autism
ERIC Educational Resources Information Center
Yikmis, Ahmet
2016-01-01
This study aims to reveal whether the touch math technique is effective in teaching basic addition to children with autism. The dependent variable of this study is the children's skills to solve addition problems correctly, whereas teaching with the touch math technique is the independent variable. Among the single-subject research models, a…
Application of artificial intelligence to the management of urological cancer.
Abbod, Maysam F; Catto, James W F; Linkens, Derek A; Hamdy, Freddie C
2007-10-01
Artificial intelligence techniques, such as artificial neural networks, Bayesian belief networks and neuro-fuzzy modeling systems, are complex mathematical models based on the human neuronal structure and thinking. Such tools are capable of generating data-driven models of biological systems without making assumptions based on statistical distributions. A large number of studies have reported on the use of artificial intelligence in urology. We reviewed the basic concepts behind artificial intelligence techniques and explored the applications of this new dynamic technology in various aspects of urological cancer management. A detailed and systematic review of the literature was performed using the MEDLINE and Inspec databases to discover reports using artificial intelligence in urological cancer. The characteristics of machine learning and their implementation were described, and reports of artificial intelligence use in urological cancer were reviewed. While most researchers in this field were found to focus on artificial neural networks to improve the diagnosis, staging and prognostic prediction of urological cancers, some groups are exploring other techniques, such as expert systems and neuro-fuzzy modeling systems. Compared to traditional regression statistics, artificial intelligence methods appear to be accurate and more explorative for analyzing large data cohorts. Furthermore, they allow individualized prediction of disease behavior. Each artificial intelligence method has characteristics that make it suitable for different tasks. The lack of transparency of artificial neural networks hinders global scientific community acceptance of this method, but this can be overcome by neuro-fuzzy modeling systems.
The structural properties of PbF2 by molecular dynamics
NASA Astrophysics Data System (ADS)
Chergui, Y.; Nehaoua, N.; Telghemti, B.; Guemid, S.; Deraddji, N. E.; Belkhir, H.; Mekki, D. E.
2010-08-01
This work presents the use of molecular dynamics (MD) and the DL_POLY code to study the structure of fluoride glass after melting and quenching. We simulated the liquid-to-glass processing phase, applying rapid quenching at different rates to examine the effect of the quenching rate on devitrification. This simulation technique has become a powerful tool for investigating the microscopic behaviour of matter as well as for calculating macroscopic observable quantities. As basic results, we calculated the interatomic distances, angles and statistics, which help us determine the geometric form and the structure of PbF2. These results are in good agreement with experimental results reported in the literature.
Correlation Functions and Glass Structure
NASA Astrophysics Data System (ADS)
Chergui, Y.; Nehaoua, N.; Telghemti, B.; Guemid, S.; Deraddji, N. E.; Belkhir, H.; Mekki, D. E.
2011-04-01
This work presents the use of molecular dynamics (MD) and the DL_POLY code to study the structure of fluoride glass after melting and quenching. We simulated the liquid-to-glass processing phase, applying rapid quenching at different rates to examine the effect of the quenching rate on devitrification. This simulation technique has become a powerful tool for investigating the microscopic behaviour of matter as well as for calculating macroscopic observable quantities. As basic results, we calculated the interatomic distances, angles and statistics, which help us determine the geometric form and the structure of PbF2. These results are in good agreement with experimental results reported in the literature.
Comparison of three artificial intelligence techniques for discharge routing
NASA Astrophysics Data System (ADS)
Khatibi, Rahman; Ghorbani, Mohammad Ali; Kashani, Mahsa Hasanpour; Kisi, Ozgur
2011-06-01
The inter-comparison of three artificial intelligence (AI) techniques is presented using the results of river flow/stage time series that are otherwise handled by traditional discharge routing techniques. These models comprise an Artificial Neural Network (ANN), an Adaptive Neuro-Fuzzy Inference System (ANFIS) and Genetic Programming (GP), which are applied to discharge routing of the Kizilirmak River, Turkey. The daily mean river discharge data for the period between 1999 and 2003 were used for training and testing the models. The comparison includes both visual and parametric approaches using such statistics as the Coefficient of Correlation (CC), Mean Absolute Error (MAE) and Mean Square Relative Error (MSRE), as well as a basic scoring system. Overall, the results indicate that ANN and ANFIS have mixed fortunes in discharge routing, and both have different abilities in capturing and reproducing some of the observed information. However, the performance of GP displays a better edge over the other two modelling approaches in most respects. Attention is given to the information content of recorded time series in terms of their peak values and timings, where one performance measure may capture some of the information content but be ineffective on others. This makes a case for compiling a knowledge base for various modelling techniques.
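The three comparison statistics are standard; a small sketch using their usual definitions (the paper's exact formulas may differ in detail, and the discharge values are hypothetical):

```python
import numpy as np

def routing_scores(obs, pred):
    """The three comparison statistics named in the abstract, as commonly
    defined (the paper's exact formulas may differ slightly)."""
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    cc = np.corrcoef(obs, pred)[0, 1]              # Coefficient of Correlation
    mae = np.mean(np.abs(obs - pred))              # Mean Absolute Error
    msre = np.mean(((obs - pred) / obs) ** 2)      # Mean Square Relative Error
    return {"CC": cc, "MAE": mae, "MSRE": msre}

# Hypothetical daily discharges (m^3/s): observed vs. a model's routed output.
obs = [120.0, 135.0, 160.0, 210.0, 180.0, 150.0]
pred = [118.0, 140.0, 150.0, 220.0, 170.0, 155.0]
print(routing_scores(obs, pred))
```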
Virtual lab demonstrations improve students' mastery of basic biology laboratory techniques.
Maldarelli, Grace A; Hartmann, Erica M; Cummings, Patrick J; Horner, Robert D; Obom, Kristina M; Shingles, Richard; Pearlman, Rebecca S
2009-01-01
Biology laboratory classes are designed to teach concepts and techniques through experiential learning. Students who have never performed a technique must be guided through the process, which is often difficult to standardize across multiple lab sections. Visual demonstration of laboratory procedures is a key element in teaching pedagogy. The main goals of the study were to create videos explaining and demonstrating a variety of lab techniques that would serve as teaching tools for undergraduate and graduate lab courses and to assess the impact of these videos on student learning. Demonstrations of individual laboratory procedures were videotaped and then edited with iMovie. Narration for the videos was edited with Audacity. Undergraduate students were surveyed anonymously prior to and following screening to assess the impact of the videos on student lab performance by completion of two Participant Perception Indicator surveys. A total of 203 and 171 students completed the pre- and posttesting surveys, respectively. Statistical analyses were performed to compare student perceptions of knowledge of, confidence in, and experience with the lab techniques before and after viewing the videos. Eleven demonstrations were recorded. Chi-square analysis revealed a significant increase in the number of students reporting increased knowledge of, confidence in, and experience with the lab techniques after viewing the videos. Incorporation of instructional videos as prelaboratory exercises has the potential to standardize techniques and to promote successful experimental outcomes.
Kamath, Padmaja; Fernandez, Alberto; Giralt, Francesc; Rallo, Robert
2015-01-01
Nanoparticles are likely to interact in real-case application scenarios with mixtures of proteins and biomolecules that will adsorb onto their surface, forming the so-called protein corona. Information related to the composition of the protein corona and net cell association was collected from the literature for a library of surface-modified gold and silver nanoparticles. For each protein in the corona, sequence information was extracted and used to calculate physicochemical properties and statistical descriptors. Data cleaning and preprocessing techniques, including statistical analysis and feature selection methods, were applied to remove highly correlated, redundant and non-significant features. A weighting technique was applied to construct specific signatures that represent the corona composition for each nanoparticle. Using this basic set of protein descriptors, a new Protein Corona Structure-Activity Relationship (PCSAR) that relates net cell association with the physicochemical descriptors of the proteins that form the corona was developed and validated. The features that resulted from the feature selection were in line with already published literature, and the computational model constructed on these features had a good accuracy (R²(LOO) = 0.76 and R²(LMO, 25%) = 0.72) and stability, with the advantage that the fingerprints based on physicochemical descriptors were independent of the specific proteins that form the corona.
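The abstract does not give the weighting formula, so the following is only one plausible reading of a weighted corona signature, namely an abundance-weighted average of per-protein descriptor vectors; the descriptor values and abundances are hypothetical:

```python
import numpy as np

def corona_signature(descriptors, abundances):
    """One plausible weighting scheme (the paper's scheme is not specified
    in the abstract): the nanoparticle signature is the abundance-weighted
    average of the descriptor vectors of its corona proteins."""
    D = np.asarray(descriptors, float)   # shape: (n_proteins, n_features)
    w = np.asarray(abundances, float)
    w = w / w.sum()                      # normalize to relative abundance
    return w @ D                         # weighted mean per feature

# Hypothetical corona: 3 proteins x 4 descriptors (e.g., pI, MW, GRAVY, charge).
D = [[6.1, 66.5, -0.4, -15.0],
     [5.5, 77.0, -0.5, -10.0],
     [9.3, 15.0,  0.1,  19.0]]
print(corona_signature(D, abundances=[0.6, 0.3, 0.1]))
```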
CADDIS Volume 4. Data Analysis: Basic Principles & Issues
Use of inferential statistics in causal analysis, an introduction to data independence and autocorrelation, methods for identifying and controlling for confounding variables, and references for the Basic Principles section of Data Analysis.
Are We Able to Pass the Mission of Statistics to Students?
ERIC Educational Resources Information Center
Hindls, Richard; Hronová, Stanislava
2015-01-01
The article illustrates our long-term experience in teaching statistics to non-statisticians, especially students of economics and the humanities. The article focuses on some problems of the basic course that can weaken interest in statistics or lead to false use of statistical methods.
Osaba, E; Carballedo, R; Diaz, F; Onieva, E; de la Iglesia, I; Perallos, A
2014-01-01
Since their first formulation, genetic algorithms (GAs) have been one of the most widely used techniques to solve combinatorial optimization problems. The basic structure of GAs is known by the scientific community, and thanks to their easy application and good performance, GAs are the focus of many research works each year. Although throughout history there have been many studies analyzing various concepts of GAs, few studies in the literature objectively analyze the influence of using blind crossover operators for combinatorial optimization problems. For this reason, in this paper a deep study on the influence of using them is conducted. The study is based on a comparison of nine techniques applied to four well-known combinatorial optimization problems. Six of the techniques are GAs with different configurations, and the remaining three are evolutionary algorithms that focus exclusively on the mutation process. Finally, to perform a reliable comparison of these results, a statistical study of them is made using the normal-distribution z-test.
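The normal-distribution z-test used for the final comparison is standard; a minimal sketch with hypothetical per-run fitness results:

```python
import numpy as np
from scipy.stats import norm

def z_test(a, b):
    """Two-sample z-test on the mean performance of two techniques, in the
    spirit of the normal-distribution z-test the abstract mentions."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    se = np.sqrt(a.var(ddof=1) / len(a) + b.var(ddof=1) / len(b))
    z = (a.mean() - b.mean()) / se
    p = 2 * norm.sf(abs(z))          # two-sided p-value
    return z, p

# Hypothetical best-fitness results of two GA configurations over 30 runs.
rng = np.random.default_rng(1)
ga_blind = rng.normal(1000, 40, 30)
ga_classic = rng.normal(980, 40, 30)
print(z_test(ga_blind, ga_classic))
```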
Cognitive measure on different profiles.
Spindola, Marilda; Carra, Giovani; Balbinot, Alexandre; Zaro, Milton A
2010-01-01
Based on neurology and cognitive science, many studies have been developed to understand the human mental model and to learn how human cognition works, especially the learning processes that involve complex content and spatial-logical reasoning. The Event-Related Potential (ERP) is a basic and non-invasive method of electrophysiological investigation. It can be used to assess aspects of human cognitive processing, since changes in the rhythm of the brain's frequency bands indicate some type of processing or neuronal behavior. This paper focuses on the ERP technique to help understand the cognitive pathways of subjects from different areas of knowledge when they are exposed to an external visual stimulus. In the experiment we used 2D and 3D visual stimuli in the same picture. The signals were captured using a 10 (ten)-channel electroencephalogram (EEG) system developed for this project and interfaced to an analog-to-digital converter (ADC) board with the LabVIEW system (National Instruments). The research was performed using the design of experiments (DOE) technique. Signal processing (mathematical and statistical techniques) was then carried out, showing the relationship between cognitive pathways within and between groups.
Lee, Seul Gi; Shin, Yun Hee
2016-04-01
This study was done to verify the effects of self-directed feedback practice using smartphone videos on nursing students' basic nursing skills, confidence in performance, and learning satisfaction. An experimental post-test-only control-group design was used. Twenty-nine students were assigned to the experimental group and 29 to the control group. The experimental treatment was exchanging feedback on deficiencies through smartphone-recorded videos of the nursing practice process taken by peers during self-directed practice. Basic nursing skills scores were higher for all items in the experimental group compared to the control group, and the differences were statistically significant ["Measuring vital signs" (t=-2.10, p=.039); "Wearing protective equipment when entering and exiting the quarantine room and the management of waste materials" (t=-4.74, p<.001); "Gavage tube feeding" (t=-2.70, p=.009)]. Confidence in performance was higher in the experimental group compared to the control group, but the differences were not statistically significant. However, after the complete practice, there was a statistically significant difference in overall performance confidence (t=-3.07, p=.003). Learning satisfaction was higher in the experimental group compared to the control group, but the difference was not statistically significant (t=-1.67, p=.100). Results of this study indicate that self-directed feedback practice using smartphone videos can improve basic nursing skills. The significance is that it can help nursing students gain confidence in their nursing skills for the future through improvement of basic nursing skills and performance of quality care, thus providing patients with safer care.
ERIC Educational Resources Information Center
Ahmad, Saira Ijaz; Malik, Samina; Irum, Jamila; Zahid, Rabia
2011-01-01
The main objective of the study was to identify the instructional methods and techniques used by the secondary school teachers to transfer the instructions to the students and to explore the basic considerations of the teachers about the selection of these instructional methods and techniques. Participants of the study included were 442 teachers…
Provision of Pre-Primary Education as a Basic Right in Tanzania: Reflections from Policy Documents
ERIC Educational Resources Information Center
Mtahabwa, Lyabwene
2010-01-01
This study sought to assess provision of pre-primary education in Tanzania as a basic right through analyses of relevant policy documents. Documents which were published over the past decade were considered, including educational policies, action plans, national papers, the "Basic Education Statistics in Tanzania" documents, strategy…
Improving basic surgical skills for final year medical students: the value of a rural weekend.
House, A K; House, J
2000-05-01
Hospitals employing medical graduates often express concern at the inexperience of new interns in basic surgical skills. In self assessment questionnaires, our senior medical students reported little clinical procedural experience. A practical skills workshop was staged in order to set learning goals for the final study year. This gave the students an opportunity to learn, revise and practice basic surgical techniques. The Bruce Rock rural community sponsored a surgical camp at the beginning of the academic year. Ninety-five (80%) of the class registered at the workshop, which rotated them through teaching modules, with private study opportunities and the capacity to cater for varied skill levels. Eight teaching stations with multiple access points were provided, and ten mock trauma scenarios were staged to augment the learning process. The teaching weekend was rated by students on an evaluative entrance and exit questionnaire. Sixty-five (73%) students returned questionnaires. They recorded significant improvement (P < 0.05) in their ability to handle the teaching stations. All students had inserted intravenous lines in practice prior to the camp, so the rating change in intravenous line insertion ability was not statistically significant. The weekend retreat offers students a chance to focus on surgical skills, free from the pressures of a clinical setting or the classroom. The emphasis was on the value of practice and primary skills learning. Students endorsed the camp as relevant, practical and an enjoyable learning experience for basic surgical skills.
Progress in Turbulence Detection via GNSS Occultation Data
NASA Technical Reports Server (NTRS)
Cornman, L. B.; Goodrich, R. K.; Axelrad, P.; Barlow, E.
2012-01-01
The increased availability of radio occultation (RO) data offers the ability to detect and study turbulence in the Earth's atmosphere. An analysis of how RO data can be used to determine the strength and location of turbulent regions is presented. This includes the derivation of a model for the power spectrum of the log-amplitude and phase fluctuations of the permittivity (or index of refraction) field. The bulk of the paper is then concerned with the estimation of the model parameters. Parameter estimators are introduced and some of their statistical properties are studied. These estimators are then applied to simulated log-amplitude RO signals. This includes the analysis of global statistics derived from a large number of realizations, as well as case studies that illustrate various specific aspects of the problem. Improvements to the basic estimation methods are discussed, and their beneficial properties are illustrated. The estimation techniques are then applied to real occultation data. Only two cases are presented, but they illustrate some of the salient features inherent in real data.
Round-off errors in cutting plane algorithms based on the revised simplex procedure
NASA Technical Reports Server (NTRS)
Moore, J. E.
1973-01-01
This report statistically analyzes computational round-off errors associated with the cutting plane approach to solving linear integer programming problems. Cutting plane methods require that the inverse of a sequence of matrices be computed. The problem basically reduces to one of minimizing round-off errors in the sequence of inverses. Two procedures for minimizing this problem are presented, and their influence on error accumulation is statistically analyzed. One procedure employs a very small tolerance factor to round computed values to zero. The other procedure is a numerical analysis technique for reinverting or improving the approximate inverse of a matrix. The results indicated that round-off accumulation can be effectively minimized by employing a tolerance factor which reflects the number of significant digits carried for each calculation and by applying the reinversion procedure once to each computed inverse. If 18 significant digits plus an exponent are carried for each variable during computations, then a tolerance value of 0.1 × 10⁻¹² is reasonable.
Notes on power of normality tests of error terms in regression models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Střelec, Luboš
2015-03-10
Normality is one of the basic assumptions in applying statistical procedures. For example, in linear regression most of the inferential procedures are based on the assumption of normality, i.e. the disturbance vector is assumed to be normally distributed. Failure to assess non-normality of the error terms may lead to incorrect results of usual statistical inference techniques such as the t-test or F-test. Thus, error terms should be normally distributed in order to allow us to make exact inferences. As a consequence, normally distributed stochastic errors are necessary in order to draw inferences that are not misleading, which explains the necessity and importance of robust tests of normality. Therefore, the aim of this contribution is to discuss normality testing of error terms in regression models. In this contribution, we introduce the general RT class of robust tests for normality, and present and discuss the trade-off between power and robustness of selected classical and robust normality tests of error terms in regression models.
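The paper's RT class of robust tests is not reproduced here, but the basic workflow it argues for (fit the regression, then test the residuals for normality) can be sketched with two classical tests:

```python
import numpy as np
import statsmodels.api as sm
from scipy import stats

# Fit a linear regression on simulated data, then test the residuals for
# normality. Shapiro-Wilk and Jarque-Bera are common classical choices;
# the paper's RT class of robust tests is not implemented here.
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 200)
y = 2.0 + 0.8 * x + rng.standard_t(df=3, size=200)   # heavy-tailed errors

res = sm.OLS(y, sm.add_constant(x)).fit()
print("Shapiro-Wilk:", stats.shapiro(res.resid))
print("Jarque-Bera:", stats.jarque_bera(res.resid))
```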
A crash course on data analysis in asteroseismology
NASA Astrophysics Data System (ADS)
Appourchaux, Thierry
2014-02-01
In this course, I try to provide a few basics required for performing data analysis in asteroseismology. First, I address how one can properly treat time series: the sampling, the filtering effect, the use of the Fourier transform, and the associated statistics. Second, I address how one can apply statistics for decision making and for parameter estimation, either in a frequentist or a Bayesian framework. Last, I review how these basic principles have been applied (or not) in asteroseismology.
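As a companion to the time series part of such a course, a minimal periodogram sketch follows; it assumes evenly sampled data and a plain FFT, whereas real asteroseismic pipelines handle gaps and typically use Lomb-Scargle or similar methods:

```python
import numpy as np

# Power spectrum of an evenly sampled time series via the FFT: the basic
# tool for spotting oscillation frequencies in asteroseismic data.
dt = 60.0                               # cadence in seconds (hypothetical)
t = np.arange(0, 10 * 86400, dt)        # 10 days of observations
nu0 = 3.0e-3                            # a 3 mHz oscillation (solar-like)
signal = (np.sin(2 * np.pi * nu0 * t)
          + np.random.default_rng(0).normal(0, 2, t.size))

power = np.abs(np.fft.rfft(signal)) ** 2 / t.size
freq = np.fft.rfftfreq(t.size, d=dt)    # Hz; resolution ~1/T, Nyquist 1/(2*dt)
peak = freq[1:][np.argmax(power[1:])]   # skip the DC bin
print("peak at %.2e Hz (true %.2e Hz)" % (peak, nu0))
```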
NASA Technical Reports Server (NTRS)
Manning, Robert M.
2002-01-01
The work presented here formulates the rigorous statistical basis for the correct estimation of communication link SNR of a BPSK, QPSK, and for that matter, any M-ary phase-modulated digital signal from what is known about its statistical behavior at the output of the receiver demodulator. Many methods to accomplish this have been proposed and implemented in the past but all of them are based on tacit and unwarranted assumptions and are thus defective. However, the basic idea is well founded, i.e., the signal at the output of a communications demodulator has convolved within it the prevailing SNR characteristic of the link. The acquisition of the SNR characteristic is of the utmost importance to a communications system that must remain reliable in adverse propagation conditions. This work provides a correct and consistent mathematical basis for the proper statistical 'deconvolution' of the output of a demodulator to yield a measure of the SNR. The use of such techniques will alleviate the need and expense for a separate propagation link to assess the propagation conditions prevailing on the communications link. Furthermore, they are applicable for every situation involving the digital transmission of data over planetary and space communications links.
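The report's estimator is not reproduced in the abstract. For orientation only, a classical alternative that also recovers the SNR from demodulator-output statistics is the moment-based M2M4 estimator; this sketch implements that textbook technique, not the report's method:

```python
import numpy as np

def m2m4_snr(y):
    """Classical M2M4 moment-based SNR estimator for a constant-modulus
    signal (e.g., BPSK/QPSK) in circular complex Gaussian noise. This is a
    standard textbook technique, not the estimator derived in the report."""
    m2 = np.mean(np.abs(y) ** 2)
    m4 = np.mean(np.abs(y) ** 4)
    s = np.sqrt(max(2 * m2 ** 2 - m4, 0.0))   # signal power estimate
    n = m2 - s                                # noise power estimate
    return 10 * np.log10(s / n)

# Hypothetical QPSK symbols at 10 dB SNR.
rng = np.random.default_rng(0)
sym = np.exp(1j * (np.pi / 4 + np.pi / 2 * rng.integers(0, 4, 100_000)))
noise = (rng.normal(size=100_000) + 1j * rng.normal(size=100_000)) * np.sqrt(0.05)
print(f"estimated SNR: {m2m4_snr(sym + noise):.2f} dB")  # ~10 dB
```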
On Ruch's Principle of Decreasing Mixing Distance in classical statistical physics
NASA Astrophysics Data System (ADS)
Busch, Paul; Quadt, Ralf
1990-10-01
Ruch's Principle of Decreasing Mixing Distance is reviewed as a statistical physical principle, and its basic support and geometric interpretation, the Ruch-Schranner-Seligman theorem, is generalized to be applicable to a large representative class of classical statistical systems.
Computer Program For Linear Algebra
NASA Technical Reports Server (NTRS)
Krogh, F. T.; Hanson, R. J.
1987-01-01
A collection of routines provided for basic vector operations. The Basic Linear Algebra Subprogram (BLAS) library is a collection of FORTRAN-callable routines employing standard techniques to perform the basic operations of numerical linear algebra.
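The same BLAS routines remain standard today and are callable from Python through SciPy's FORTRAN wrappers, for example:

```python
import numpy as np
from scipy.linalg import blas

# ddot and dgemm are the classic Level-1 and Level-3 BLAS operations
# (dot product and matrix-matrix multiply), called via SciPy's wrappers.
x = np.array([1.0, 2.0, 3.0])
y = np.array([4.0, 5.0, 6.0])
print("ddot:", blas.ddot(x, y))                      # x . y = 32.0

A = np.array([[1.0, 2.0], [3.0, 4.0]], order="F")    # FORTRAN (column-major)
B = np.array([[5.0, 6.0], [7.0, 8.0]], order="F")
print("dgemm:\n", blas.dgemm(alpha=1.0, a=A, b=B))   # A @ B
```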
Reddy, Pramod P; Reddy, Trisha P; Roig-Francoli, Jennifer; Cone, Lois; Sivan, Bezalel; DeFoor, W Robert; Gaitonde, Krishnanath; Noh, Paul H
2011-10-01
One of the main ergonomic challenges during surgical procedures is surgeon posture. There have been reports of a high number of work related injuries in laparoscopic surgeons. The Alexander technique is a process of psychophysical reeducation of the body to improve postural balance and coordination, permitting movement with minimal strain and maximum ease. We evaluated the efficacy of the Alexander technique in improving posture and surgical ergonomics during minimally invasive surgery. We performed a prospective cohort study in which subjects served as their own controls. Informed consent was obtained. Before Alexander technique instruction/intervention subjects underwent assessment of postural coordination and basic laparoscopic skills. All subjects were educated about the Alexander technique and underwent post-instruction/intervention assessment of posture and laparoscopic skills. Subjective and objective data obtained before and after instruction/intervention were tabulated and analyzed for statistical significance. All 7 subjects completed the study. Subjects showed improved ergonomics and improved ability to complete FLS™ as well as subjective improvement in overall posture. The Alexander technique training program resulted in a significant improvement in posture. Improved surgical ergonomics, endurance and posture decrease surgical fatigue and the incidence of repetitive stress injuries to laparoscopic surgeons. Further studies of the influence of the Alexander technique on surgical posture, minimally invasive surgery ergonomics and open surgical techniques are warranted to explore and validate the benefits for surgeons. Copyright © 2011 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.
Basic Aerospace Education Library
ERIC Educational Resources Information Center
Journal of Aerospace Education, 1975
1975-01-01
Lists the most significant resource items on aerospace education which are presently available. Includes source books, bibliographies, directories, encyclopedias, dictionaries, audiovisuals, curriculum/planning guides, aerospace statistics, aerospace education statistics and newsletters. (BR)
Multiple-solution problems in a statistics classroom: an example
NASA Astrophysics Data System (ADS)
Chu, Chi Wing; Chan, Kevin L. T.; Chan, Wai-Sum; Kwong, Koon-Shing
2017-11-01
The mathematics education literature shows that encouraging students to develop multiple solutions for given problems has a positive effect on students' understanding and creativity. In this paper, we present an example of multiple-solution problems in statistics involving a set of non-traditional dice. In particular, we consider the exact probability mass distribution for the sum of face values. Four different ways of solving the problem are discussed. The solutions span various basic concepts in different mathematical disciplines (sample space in probability theory, the probability generating function in statistics, integer partition in basic combinatorics and the individual risk model in actuarial science) and thus promote upper-undergraduate students' awareness of knowledge connections between their courses. All solutions of the example are implemented using the R statistical software package.
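The generating-function solution is easy to sketch: multiplying the dice's probability generating functions amounts to a discrete convolution of their face-count vectors. The Sicherman pair below is a well-known non-traditional example, used here as a hypothetical stand-in for the article's dice:

```python
import numpy as np

def sum_pmf(*dice):
    """PMF of the sum of face values, computed by convolving face-probability
    vectors: the discrete equivalent of multiplying probability generating
    functions."""
    poly = np.array([1.0])
    for faces in dice:
        coeff = np.zeros(max(faces) + 1)
        for f in faces:
            coeff[f] += 1 / len(faces)     # coefficient of z^f in the PGF
        poly = np.convolve(poly, coeff)
    return {s: p for s, p in enumerate(poly) if p > 0}

# Sicherman dice: a non-traditional pair whose sum distribution matches two
# standard dice (a hypothetical stand-in for the dice in the article).
print(sum_pmf([1, 2, 2, 3, 3, 4], [1, 3, 4, 5, 6, 8]))
print(sum_pmf([1, 2, 3, 4, 5, 6], [1, 2, 3, 4, 5, 6]))  # identical PMF
```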
Maloney, Tim; Jiang, Nan; Putnam-Hornstein, Emily; Dalton, Erin; Vaithianathan, Rhema
2017-03-01
Introduction: Official statistics have confirmed that, relative to their presence in the population and relative to white children, black children have consistently higher rates of contact with child protective services (CPS). We used linked administrative data and statistical decomposition techniques to generate new insights into black-white differences in child maltreatment reports and foster care placements. Methods: Birth records for all children born in Allegheny County, Pennsylvania, between 2008 and 2010 were linked to administrative service records originating in multiple county data systems. Differences in rates of involvement with child protective services between black and white children by age 4 were decomposed using nonlinear regression techniques. Results: Black children had rates of CPS involvement that were 3 times higher than white children. Racial differences were explained solely by parental marital status (i.e., being unmarried) and age at birth (i.e., predominantly teenage mothers). Adding other covariates did not capture any further racial differences in maltreatment reporting or foster care placement rates; they simply shifted differences already explained by marital status and age to these other variables. Discussion: Racial differences in rates of maltreatment reports and foster care placements can be explained by a basic model that adjusts only for parental marital status and age at the time of birth. Increasing access to early prevention services for vulnerable families may reduce disparities in child protective service involvement. Using birth records linked to other administrative data sources provides an important means of developing population-based research.
The Statistical Power of Planned Comparisons.
ERIC Educational Resources Information Center
Benton, Roberta L.
Basic principles underlying statistical power are examined; and issues pertaining to effect size, sample size, error variance, and significance level are highlighted via the use of specific hypothetical examples. Analysis of variance (ANOVA) and related methods remain popular, although other procedures sometimes have more statistical power against…
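A worked example of the quantities discussed (power as a function of effect size, per-group sample size, and significance level for a planned two-group comparison, with hypothetical numbers):

```python
from statsmodels.stats.power import TTestIndPower

# Statistical power of a planned two-group comparison: how power moves with
# effect size, per-group sample size, and significance level.
analysis = TTestIndPower()
for d in (0.2, 0.5, 0.8):                 # Cohen's d effect sizes
    power = analysis.power(effect_size=d, nobs1=30, alpha=0.05)
    print(f"d={d}: power={power:.2f} with n=30 per group")

# Sample size needed per group for 80% power at a medium effect:
n = analysis.solve_power(effect_size=0.5, power=0.8, alpha=0.05)
print(f"n per group for 80% power at d=0.5: {n:.0f}")
```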
Operations analysis (study 2.1): Program manual and users guide for the LOVES computer code
NASA Technical Reports Server (NTRS)
Wray, S. T., Jr.
1975-01-01
Information is provided necessary to use the LOVES Computer Program in its existing state, or to modify the program to include studies not properly handled by the basic model. The Users Guide defines the basic elements assembled together to form the model for servicing satellites in orbit. As the program is a simulation, the method of attack is to disassemble the problem into a sequence of events, each occurring instantaneously and each creating one or more other events in the future. The main driving force of the simulation is the deterministic launch schedule of satellites and the subsequent failure of the various modules which make up the satellites. The LOVES Computer Program uses a random number generator to simulate the failure of module elements and therefore operates over a long span of time typically 10 to 15 years. The sequence of events is varied by making several runs in succession with different random numbers resulting in a Monte Carlo technique to determine statistical parameters of minimum value, average value, and maximum value.
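A toy sketch of the simulation pattern the abstract describes: an event queue driven by random module failures, repeated Monte Carlo fashion to obtain minimum/average/maximum statistics. It illustrates the approach only and is not the LOVES code:

```python
import heapq
import random

def simulate(n_modules=20, mtbf_years=3.0, horizon_years=15.0, seed=0):
    """Toy event-driven simulation in the LOVES spirit (not the LOVES code):
    modules fail at random exponential times, each failure schedules a
    replacement, and servicing events are counted over the mission horizon."""
    rng = random.Random(seed)
    events = [(rng.expovariate(1 / mtbf_years), m) for m in range(n_modules)]
    heapq.heapify(events)                 # event queue ordered by time
    failures = 0
    while events:
        t, module = heapq.heappop(events)
        if t > horizon_years:
            break                         # past the end of the mission
        failures += 1
        # Replacement restores the module; schedule its next failure.
        heapq.heappush(events, (t + rng.expovariate(1 / mtbf_years), module))
    return failures

# Monte Carlo over many random sequences -> min/average/max statistics.
runs = [simulate(seed=s) for s in range(500)]
print(min(runs), sum(runs) / len(runs), max(runs))
```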
Tighe, Elizabeth L.; Schatschneider, Christopher
2015-01-01
The purpose of this study was to investigate the joint and unique contributions of morphological awareness and vocabulary knowledge at five reading comprehension levels in Adult Basic Education (ABE) students. We introduce the statistical technique of multiple quantile regression, which enabled us to assess the predictive utility of morphological awareness and vocabulary knowledge at multiple points (quantiles) along the continuous distribution of reading comprehension. To demonstrate the efficacy of our multiple quantile regression analysis, we compared and contrasted our results with a traditional multiple regression analytic approach. Our results indicated that morphological awareness and vocabulary knowledge accounted for a large portion of the variance (82-95%) in reading comprehension skills across all quantiles. Morphological awareness exhibited the greatest unique predictive ability at lower levels of reading comprehension whereas vocabulary knowledge exhibited the greatest unique predictive ability at higher levels of reading comprehension. These results indicate the utility of using multiple quantile regression to assess trajectories of component skills across multiple levels of reading comprehension. The implications of our findings for ABE programs are discussed. PMID:25351773
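A minimal sketch of multiple quantile regression as implemented in statsmodels; the variable names mirror the study's constructs, but the data are simulated rather than the ABE sample:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Quantile regression at several points of the reading-comprehension
# distribution (variable names are illustrative, not the study's data).
rng = np.random.default_rng(0)
n = 300
df = pd.DataFrame({"morph": rng.normal(size=n), "vocab": rng.normal(size=n)})
df["comp"] = 0.6 * df.morph + 0.4 * df.vocab + rng.normal(scale=0.5, size=n)

for q in (0.1, 0.25, 0.5, 0.75, 0.9):
    fit = smf.quantreg("comp ~ morph + vocab", df).fit(q=q)
    print(f"q={q}: morph={fit.params['morph']:.2f}, "
          f"vocab={fit.params['vocab']:.2f}")
```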
29 CFR 1904.42 - Requests from the Bureau of Labor Statistics for data.
Code of Federal Regulations, 2010 CFR
2010-07-01
(a) Basic requirement. If you receive a Survey of Occupational Injuries and Illnesses Form from the Bureau of Labor Statistics (BLS), or a BLS designee, you must promptly complete the form...
78 FR 34101 - Agency Information Collection Activities: Proposed Collection; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-06
... and basic descriptive statistics on the quantity and type of consumer-reported patient safety events... conduct correlations, cross tabulations of responses and other statistical analysis. Estimated Annual...
Some Basic Techniques in Bioimpedance Research
NASA Astrophysics Data System (ADS)
Martinsen, Ørjan G.
2004-09-01
Any physiological or anatomical changes in a biological material will also change its electrical properties. Hence, bioimpedance measurements can be used for diagnosis or classification of tissue. Applications are numerous within medicine, biology, cosmetics, the food industry, sports, etc., and different basic approaches for the development of bioimpedance techniques are discussed in this paper.
NASA Astrophysics Data System (ADS)
Peresan, Antonella; Gentili, Stefania
2017-04-01
Identification and statistical characterization of seismic clusters may provide useful insights about the features of seismic energy release and their relation to physical properties of the crust within a given region. Moreover, a number of studies based on spatio-temporal analysis of main-shock occurrence require preliminary declustering of earthquake catalogs. Since various methods, relying on different physical/statistical assumptions, may lead to diverse classifications of earthquakes into main events and related events, we aim to investigate the classification differences among different declustering techniques. Accordingly, a formal selection and comparative analysis of earthquake clusters is carried out for the most relevant earthquakes in North-Eastern Italy, as reported in the local OGS-CRS bulletins, compiled at the National Institute of Oceanography and Experimental Geophysics since 1977. The comparison is then extended to selected earthquake sequences associated with a different seismotectonic setting, namely to events that occurred in the region struck by the recent destructive earthquakes in Central Italy, making use of INGV data. Various techniques, ranging from classical space-time window methods to ad hoc manual identification of aftershocks, are applied for the detection of earthquake clusters. In particular, a statistical method based on nearest-neighbor distances of events in the space-time-energy domain is considered. Results from cluster identification by the nearest-neighbor method turn out to be quite robust with respect to the time span of the input catalog, as well as to the minimum magnitude cutoff. The identified clusters for the largest events reported in North-Eastern Italy since 1977 are well consistent with those reported in earlier studies, which were aimed at detailed manual aftershock identification. The study shows that the data-driven approach, based on nearest-neighbor distances, can be satisfactorily applied to decompose the seismic catalog into background seismicity and individual sequences of earthquake clusters, also in areas characterized by moderate seismic activity, where standard declustering techniques may turn out to be rather gross approximations. With these results acquired, the main statistical features of seismic clusters are explored, including the complex interdependence of related events, with the aim of characterizing the space-time patterns of earthquake occurrence in North-Eastern Italy and capturing their basic differences from the Central Italy sequences.
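The nearest-neighbor method referred to is of the Zaliapin type; below is a bare-bones sketch of the space-time-magnitude distance it is built on. The constants are typical literature values rather than those calibrated in this study, and the mini-catalog is hypothetical:

```python
import numpy as np

def nearest_neighbor_eta(t_yr, x_km, y_km, mag, b=1.0, d_f=1.6):
    """Sketch of the nearest-neighbor space-time-magnitude distance used in
    Zaliapin-type cluster analysis: for each event j, the parent is the
    earlier event i minimizing eta = dt * dr^d_f * 10^(-b*m_i). The events
    must be time-sorted; b and d_f are typical literature values, not the
    constants calibrated in this study."""
    n = len(t_yr)
    parent = np.full(n, -1)
    eta = np.full(n, np.inf)
    for j in range(1, n):
        dt = t_yr[j] - t_yr[:j]                        # interevent times (yr)
        dr = np.hypot(x_km[j] - x_km[:j], y_km[j] - y_km[:j])
        d = dt * np.maximum(dr, 0.1) ** d_f * 10 ** (-b * mag[:j])
        parent[j], eta[j] = np.argmin(d), d.min()      # 0.1 km distance floor
    return parent, eta   # small eta -> clustered (aftershock-like) pairs

# Hypothetical mini-catalog: time (yr), position (km), magnitude.
t = np.array([0.0, 0.01, 0.02, 1.5])
x = np.array([0.0, 1.0, 0.5, 80.0])
y = np.array([0.0, 0.5, 1.0, 60.0])
m = np.array([5.5, 3.0, 3.2, 4.0])
print(nearest_neighbor_eta(t, x, y, m))
```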
Whitley, Heather P; Parton, Jason M
2014-09-15
To adapt a classroom assessment technique (CAT) from an anthropology course to a diabetes module in a clinical pharmacy skills laboratory and to determine student knowledge retention from baseline. Diabetes item stems, focused on module objectives, replaced anthropology terms. Answer choices, coded to Bloom's Taxonomy, were expanded to include higher-order thinking. Students completed the online 5-item probe 4 times: prelaboratory lecture, postlaboratory, and at 6 months and 12 months after laboratory. Statistical analyses utilized a single factor, repeated measures design using rank transformations of means with a Mann-Whitney-Wilcoxon test. The CAT revealed a significant increase in knowledge from prelaboratory compared to all postlaboratory measurements (p<0.0001). Significant knowledge retention was maintained with basic terms, but declined with complex terms between 6 and 12 months. The anthropology assessment tool was effectively adapted using Bloom's Taxonomy as a guide and, when used repeatedly, demonstrated knowledge retention. Minimal time was devoted to application of the probe making it an easily adaptable CAT.
Analysing attitude data through ridit schemes.
El-rouby, M G
1994-12-02
The attitudes of individuals and populations on various issues are usually assessed through sample surveys. Responses to survey questions are then scaled and combined into a meaningful whole which defines the measured attitude. The applied scales may be of a nominal, ordinal, interval, or ratio nature depending upon the degree of sophistication the researcher wants to introduce into the measurement. This paper discusses methods of analysis for categorical variables of the type used in attitude and human behavior research, and recommends the adoption of ridit analysis, a technique which has been successfully applied to epidemiological, clinical investigation, laboratory, and microbiological data. The ridit methodology is described after reviewing some general attitude scaling methods and problems of analysis related to them. The ridit method is then applied to a recent study conducted to assess health care service quality in North Carolina. This technique is conceptually and computationally simpler than other conventional statistical methods, and is also distribution-free. Basic requirements and limitations on its use are indicated.
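Ridit analysis is compact enough to sketch directly: ridits are computed from a reference distribution over the ordered categories, and a comparison group is summarized by its mean ridit (frequencies below are hypothetical):

```python
import numpy as np

def ridits(reference_counts):
    """Ridit scores for ordered categories: the proportion of the reference
    distribution below each category plus half the proportion within it."""
    p = np.asarray(reference_counts, float)
    p = p / p.sum()
    below = np.concatenate(([0.0], np.cumsum(p)[:-1]))
    return below + p / 2

# 5-point satisfaction scale; the reference group defines the ridits.
ref = [10, 25, 40, 60, 65]          # hypothetical reference frequencies
r = ridits(ref)
print("ridits:", r.round(3))

# Mean ridit of a comparison group: the probability that a random comparison
# member ranks above a random reference member (ties counted as 1/2);
# 0.5 means "no shift" relative to the reference.
comp = np.array([30, 40, 50, 40, 40], float)
print("mean ridit:", float(r @ (comp / comp.sum())))
```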
Saini, Harsh; Raicar, Gaurav; Dehzangi, Abdollah; Lal, Sunil; Sharma, Alok
2015-12-07
Protein subcellular localization is an important topic in proteomics since it is related to a protein's overall function, helps in the understanding of metabolic pathways, and in drug design and discovery. In this paper, a basic approximation technique from natural language processing called the linear interpolation smoothing model is applied for predicting protein subcellular localizations. The proposed approach extracts features from syntactical information in protein sequences to build probabilistic profiles using dependency models, which are used in linear interpolation to determine how likely is a sequence to belong to a particular subcellular location. This technique builds a statistical model based on maximum likelihood. It is able to deal effectively with high dimensionality that hinders other traditional classifiers such as Support Vector Machines or k-Nearest Neighbours without sacrificing performance. This approach has been evaluated by predicting subcellular localizations of Gram positive and Gram negative bacterial proteins. Copyright © 2015 Elsevier Ltd. All rights reserved.
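Linear interpolation smoothing is a standard language-modeling device; here is a toy sketch of the idea the paper borrows (mixing bigram and unigram maximum-likelihood estimates), with a hypothetical residue sequence and mixing weight:

```python
from collections import Counter

def interpolated_bigram(train_tokens, lam=0.7):
    """Linear interpolation smoothing as used in language modeling:
    P(w|v) = lam * P_ML(w|v) + (1 - lam) * P_ML(w). The paper applies the
    same idea to protein sequences; this sketch uses toy tokens."""
    uni = Counter(train_tokens)
    bi = Counter(zip(train_tokens, train_tokens[1:]))
    total = len(train_tokens)

    def prob(v, w):
        p_uni = uni[w] / total
        p_bi = bi[(v, w)] / uni[v] if uni[v] else 0.0
        return lam * p_bi + (1 - lam) * p_uni

    return prob

# Toy residue sequence; score how likely 'L' is to follow 'A'.
seq = list("MALWMRLLPLLALLALWGPDPAAA")
p = interpolated_bigram(seq)
print(p("A", "L"))   # nonzero even for unseen pairs, thanks to smoothing
```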
Multispectral processing based on groups of resolution elements
NASA Technical Reports Server (NTRS)
Richardson, W.; Gleason, J. M.
1975-01-01
Several nine-point rules are defined and compared with previously studied rules. One of the rules performed well in boundary areas, but with reduced efficiency in field interiors; another combined best performance on field interiors with good sensitivity to boundary detail. The basic threshold gradient and some modifications were investigated as a means of boundary point detection. The hypothesis testing methods of closed-boundary formation were also tested and evaluated. An analysis of the boundary detection problem was initiated, employing statistical signal detection and parameter estimation techniques to analyze various formulations of the problem. These formulations permit the atmospheric and sensor system effects on the data to be thoroughly analyzed. Various boundary features and necessary assumptions can also be investigated in this manner.
M.S.L.A.P. Modular Spectral Line Analysis Program documentation
NASA Technical Reports Server (NTRS)
Joseph, Charles L.; Jenkins, Edward B.
1991-01-01
MSLAP is software for analyzing spectra, providing the basic structure to identify spectral features, to make quantitative measurements of these features, and to store the measurements for convenient access. MSLAP can be used to measure not only the zeroth moment (equivalent width) of a profile, but also the first and second moments. Optical depths and the corresponding column densities across the profile can be measured as well for sufficiently high resolution data. The software was developed for an interactive, graphical analysis where the computer carries most of the computational and data organizational burden and the investigator is responsible only for the judgement decisions. It employs sophisticated statistical techniques for determining the best polynomial fit to the continuum and for calculating the uncertainties.
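The profile moments MSLAP measures have simple discrete forms; a sketch using a synthetic absorption line on a flat continuum, assuming a uniform wavelength grid:

```python
import numpy as np

def profile_moments(wl, flux, cont):
    """Zeroth, first, and second moments of an absorption profile:
    the equivalent width, the line centroid, and the profile width."""
    depth = 1.0 - flux / cont            # normalized absorption depth
    dwl = wl[1] - wl[0]                  # uniform grid spacing assumed
    ew = depth.sum() * dwl               # zeroth moment: equivalent width
    centroid = (wl * depth).sum() * dwl / ew               # first moment
    var = ((wl - centroid) ** 2 * depth).sum() * dwl / ew  # second moment
    return ew, centroid, np.sqrt(var)

# Hypothetical Gaussian absorption line on a flat continuum.
wl = np.linspace(1540.0, 1560.0, 400)    # wavelength (Angstroms)
flux = 1.0 - 0.6 * np.exp(-0.5 * ((wl - 1550.0) / 1.2) ** 2)
print(profile_moments(wl, flux, cont=np.ones_like(wl)))
```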
LANDSAT menhaden and thread herring resources investigation. [Gulf of Mexico
NASA Technical Reports Server (NTRS)
Kemmerer, A. J. (Principal Investigator); Brucks, J. T.; Butler, J. A.; Faller, K. H.; Holley, H. J.; Leming, T. D.; Savastano, K. J.; Vanselous, T. M.
1977-01-01
The author has identified the following significant results. The relationship between the distribution of menhaden and selected oceanographic parameters (water color, turbidity, and possibly chlorophyll concentrations) was established. Similar relationships for thread herring were not established nor were relationships relating to the abundance of either species. Use of aircraft and LANDSAT remote sensing instruments to measure or infer a set of basic oceanographic parameters was evaluated. Parameters which could be accurately inferred included surface water temperature, salinity, and color. Water turbidity (Secchi disk) was evaluated as marginally inferrable from the LANDSAT MSS data and chlorophyll-a concentrations as less than marginal. These evaluations considered the parameters only as experienced in the two test areas using available sensors and statistical techniques.
ERIC Educational Resources Information Center
Shihua, Peng; Rihui, Tan
2009-01-01
Employing statistical analysis, this study has made a preliminary exploration of promoting the equitable development of basic education in underdeveloped counties through the case study of Cili county. The unequally developed basic education in the county has been made clear, the reasons for the inequitable education have been analyzed, and,…
[Autopsies for anatomical teaching and training in clinical forensic medicine].
Hammer, U; Blaas, V; Büttner, A; Philipp, M
2015-12-01
Clinical forensic medicine entails not only the examination of patients after physical violence but also the option of clinical autopsies, e.g. after non-notifiable complications of medical interventions, after fatalities closely following medical interventions, or after fatalities resulting from injuries when the public prosecutor decides not to order a medicolegal autopsy. Based on this routine, the Institute of Forensic Medicine at the University of Rostock offers a training course in topographical anatomy to physicians in further training in interventional and surgical disciplines. At the beginning of autopsies, the participants can explore the approaches of interventional puncture techniques as well as surgical techniques and basic topographical anatomy in small groups of 2-4 persons under the supervision of forensic examiners. The format is essentially oriented to the early further-training period but fulfils the requirements for the exploration of complex operative techniques. The course was adapted for physicians and offered separately to students. The explorations are performed manually or with the support of autopsy instruments. The courses offer ideal room for individual, discipline-specific topics and provide great benefit to all participants. A statistical assessment can only be achieved with a larger number of participants. Making autopsy rooms available for teaching and further training adds a feature to the profile of clinical forensic medicine. Lessons in topographical anatomy provide a great benefit for patient safety. It seems important to offer the opportunity to address individual interests in a closed setting, to consolidate skills and abilities in a non-judgemental environment. The post-mortem examiners have to ensure that the autopsy is carried out lege artis. Basic ethical principles and all regulations of an accredited scope have to be adhered to.
Educating the Educator: U.S. Government Statistical Sources for Geographic Research and Teaching.
ERIC Educational Resources Information Center
Fryman, James F.; Wilkinson, Patrick J.
Appropriate for college geography students and researchers, this paper briefly introduces basic federal statistical publications and corresponding finding aids. General references include "Statistical Abstract of the United States," and three complementary publications: "County and City Data Book,""State and Metropolitan Area Data Book," and…
Statistical Cost Estimation in Higher Education: Some Alternatives.
ERIC Educational Resources Information Center
Brinkman, Paul T.; Niwa, Shelley
Recent developments in econometrics that are relevant to the task of estimating costs in higher education are reviewed. The relative effectiveness of alternative statistical procedures for estimating costs is also tested. Statistical cost estimation involves three basic parts: a model, a data set, and an estimation procedure. Actual data are used…
ERIC Educational Resources Information Center
Norris, John M.
2015-01-01
Traditions of statistical significance testing in second language (L2) quantitative research are strongly entrenched in how researchers design studies, select analyses, and interpret results. However, statistical significance tests using "p" values are commonly misinterpreted by researchers, reviewers, readers, and others, leading to…
Ethical Statistics and Statistical Ethics: Making an Interdisciplinary Module
ERIC Educational Resources Information Center
Lesser, Lawrence M.; Nordenhaug, Erik
2004-01-01
This article describes an innovative curriculum module the first author created on the two-way exchange between statistics and applied ethics. The module, having no particular mathematical prerequisites beyond high school algebra, is part of an undergraduate interdisciplinary ethics course which begins with a 3-week introduction to basic applied…
Computer aided fringe pattern analysis
NASA Astrophysics Data System (ADS)
Sciammarella, Cesar A.
The paper reviews the basic laws of fringe pattern interpretation. The different techniques that are currently utilized are presented using a common frame of reference, stressing the fact that these techniques are different variations of the same basic principle. Digital and analog techniques are discussed. Currently available hardware is presented, and the relationships between hardware and the operations of fringe pattern processing are pointed out. Examples are given to illustrate the ideas discussed in the paper.
ERIC Educational Resources Information Center
Orton, Larry
2009-01-01
This document outlines the definitions and the typology now used by Statistics Canada's Centre for Education Statistics to identify, classify and delineate the universities, colleges and other providers of postsecondary and adult education in Canada for which basic enrollments, graduates, professors and finance statistics are produced. These new…
ERIC Educational Resources Information Center
North, Delia; Gal, Iddo; Zewotir, Temesgen
2014-01-01
This paper aims to contribute to the emerging literature on capacity-building in statistics education by examining issues pertaining to the readiness of teachers in a developing country to teach basic statistical topics. The paper reflects on challenges and barriers to building statistics capacity at grass-roots level in a developing country,…
ERIC Educational Resources Information Center
Bott, Tina M.; Wan, Hayley
2013-01-01
Students sometimes have difficulty grasping the importance of when and how basic distillation techniques, column chromatography, TLC, and basic spectroscopy (IR and NMR) can be used to identify unknown compounds within a mixture. This two-part experiment uses mixtures of pleasant-smelling, readily available terpenoid compounds as unknowns to…
Multiple Uses of a Word Study Technique
ERIC Educational Resources Information Center
Joseph, Laurice M.; Orlins, Andrew
2005-01-01
This paper presents two case studies that illustrate the multiple uses of word sorts, a word study phonics technique. Case study children were Sara, a second grader, who had difficulty with reading basic words and John, a third grader, who had difficulty with spelling basic words. Multiple baseline designs were employed to study the effects of…
ERIC Educational Resources Information Center
Rouin, Carole
Presented are proceedings of a conference which focused on basic assessment and intervention techniques for use in the education and habilitation of lower functioning deaf blind and multihandicapped children. Following an introduction by D. Overbeck are papers with the following titles and authors: "Considerations in the Psychological Assessment…
Rodenbeck, Christopher T.; Tracey, Keith J.; Barkley, Keith R.; ...
2014-08-01
This paper introduces a technique for improving the sensitivity of RF subsamplers in radar and coherent receiver applications. The technique, referred to herein as “delta modulation” (DM), feeds the time-average output of a monobit analog-to-digital converter (ADC) back to the ADC input, but with opposite polarity. Assuming pseudo-stationary modulation statistics on the sampled RF waveform, the feedback signal corrects for aggregate DC offsets present in the ADC that otherwise degrade ADC sensitivity. Two RF integrated circuits (RFICs) are designed to demonstrate the approach. One uses analog DM to create the feedback signal; the other uses digital DM to achieve the same result. A series of tests validates the designs. The dynamic time-domain response confirms the feedback loop’s basic operation. Measured output quantization imbalance, under noise-only input drive, significantly improves with the use of the DM circuit, even for large, deliberately induced DC offsets and wide temperature variation from -55°C to +85°C. Examination of the corrected vs. uncorrected baseband spectrum under swept input signal-to-noise ratio (SNR) conditions demonstrates the effectiveness of this approach for realistic radar and coherent receiver applications. In conclusion, two-tone testing shows no impact of the DM technique on ADC linearity.
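The RFIC's circuit details are not in the abstract, so the following is only one plausible digital realization of the feedback idea (a sign-driven running average of the monobit output subtracted from the input), with hypothetical parameters:

```python
import numpy as np

def monobit_with_dm(x, mu=0.01):
    """One plausible digital realization of the delta-modulation feedback
    described in the abstract (not the RFIC's exact circuit): a running
    accumulation of the monobit output is fed back with opposite polarity,
    driving the comparator threshold toward the input's DC offset."""
    offset_est = 0.0
    out = np.empty_like(x)
    for n, sample in enumerate(x):
        out[n] = 1.0 if sample - offset_est >= 0 else -1.0
        offset_est += mu * out[n]        # sign-driven feedback accumulator
    return out, offset_est

# Noise-only drive with a deliberate DC offset: after convergence the +1/-1
# outputs should be balanced and the estimate should track the offset.
rng = np.random.default_rng(0)
x = rng.normal(0, 1.0, 200_000) + 0.3        # 0.3 = induced DC offset
out, est = monobit_with_dm(x)
print("output balance:", out[100_000:].mean())   # ~0 after convergence
print("offset estimate:", est)                   # ~0.3
```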
Teaching of clinical ultrasonography to undergraduates: students as mentors.
García de Casasola Sánchez, G; González Peinado, D; Sánchez Gollarte, A; Muñoz Aceituno, E; Peña Vázquez, I; Torres Macho, J
2015-05-01
Ultrasonography is a highly useful diagnostic technique that supplements traditional physical examinations. Our aim was to demonstrate that students previously trained in clinical ultrasonography are capable of instructing other students in a similar manner in a short period of time ("peer mentoring"). Five medical students in their 5th year, trained in abdominal and cardiac ultrasonography by experienced physicians, instructed 24 other students in the same procedure. The training consisted of an online theoretical course and practical training lasting about 12 hours, in which each student had to perform 6 basic abdominal planes and 4 basic cardiac planes on 20 healthy volunteers. Subsequently, the students underwent an objective assessment test on healthy models performed by expert physicians in clinical ultrasonography. The students managed to correctly identify 90.2% of the basic abdominal planes, except for the left coronal (spleen and left kidney) and subcostal (gallbladder) planes, with slightly lower success rates of 82.5% and 80%, respectively. Due to the greater difficulty of obtaining cardiac planes, the success rate was lower: 70.3% in the subxiphoid, short parasternal and four-chamber planes. The cardiac plane with the fewest errors in identification was the parasternal long plane (90% success). We observed no statistically significant differences between the results (teaching capacity) of the various mentors. Medical students are capable of instructing other colleagues (peer mentoring) on the basic aspects of abdominal and cardiac ultrasonography after a relatively short training period. Copyright © 2014 Elsevier España, S.L.U. y Sociedad Española de Medicina Interna (SEMI). All rights reserved.
Causality and headache triggers
Turner, Dana P.; Smitherman, Todd A.; Martin, Vincent T.; Penzien, Donald B.; Houle, Timothy T.
2013-01-01
Objective: The objective of this study was to explore the conditions necessary to assign causal status to headache triggers. Background: The term "headache trigger" is commonly used to label any stimulus that is assumed to cause headaches. However, the assumptions required for determining whether a given stimulus in fact has a causal-type relationship in eliciting headaches have not been explicated. Methods: Rubin's Causal Model is synthesized and applied to the context of headache causes. From this application, the conditions necessary to infer that one event (trigger) causes another (headache) are outlined using basic assumptions and examples from the relevant literature. Results: Although many conditions must be satisfied for a causal attribution, three basic assumptions are identified for determining causality in headache triggers: 1) constancy of the sufferer; 2) constancy of the trigger effect; and 3) constancy of the trigger presentation. A valid evaluation of a potential trigger's effect can only be undertaken once these three basic assumptions are satisfied during formal or informal studies of headache triggers. Conclusions: Evaluating these assumptions is extremely difficult or infeasible in clinical practice, and satisfying them during natural experimentation is unlikely. Researchers, practitioners, and headache sufferers are encouraged to avoid natural experimentation to determine the causal effects of headache triggers. Instead, formal experimental designs or retrospective diary studies using advanced statistical modeling techniques provide the best approaches to satisfy the required assumptions and inform causal statements about headache triggers. PMID:23534872
County-by-County Financial and Staffing I-M-P-A-C-T. FY 1994-95 Basic Education Program.
ERIC Educational Resources Information Center
North Carolina State Dept. of Public Instruction, Raleigh.
This publication provides the basic statistics needed to illustrate the impact of North Carolina's Basic Education Program (BEP), an educational reform effort begun in 1985. Over 85% of the positions in the BEP are directly related to teaching and student-related activities. The new BEP programs result in smaller class sizes in kindergartens and…
Minimum Information about a Genotyping Experiment (MIGEN)
Huang, Jie; Mirel, Daniel; Pugh, Elizabeth; Xing, Chao; Robinson, Peter N.; Pertsemlidis, Alexander; Ding, LiangHao; Kozlitina, Julia; Maher, Joseph; Rios, Jonathan; Story, Michael; Marthandan, Nishanth; Scheuermann, Richard H.
2011-01-01
Genotyping experiments are widely used in clinical and basic research laboratories to identify associations between genetic variations and normal/abnormal phenotypes. Genotyping assay techniques vary from single genomic regions that are interrogated using PCR reactions to high throughput assays examining genome-wide sequence and structural variation. The resulting genotype data may include millions of markers of thousands of individuals, requiring various statistical, modeling or other data analysis methodologies to interpret the results. To date, there are no standards for reporting genotyping experiments. Here we present the Minimum Information about a Genotyping Experiment (MIGen) standard, defining the minimum information required for reporting genotyping experiments. MIGen standard covers experimental design, subject description, genotyping procedure, quality control and data analysis. MIGen is a registered project under MIBBI (Minimum Information for Biological and Biomedical Investigations) and is being developed by an interdisciplinary group of experts in basic biomedical science, clinical science, biostatistics and bioinformatics. To accommodate the wide variety of techniques and methodologies applied in current and future genotyping experiment, MIGen leverages foundational concepts from the Ontology for Biomedical Investigations (OBI) for the description of the various types of planned processes and implements a hierarchical document structure. The adoption of MIGen by the research community will facilitate consistent genotyping data interpretation and independent data validation. MIGen can also serve as a framework for the development of data models for capturing and storing genotyping results and experiment metadata in a structured way, to facilitate the exchange of metadata. PMID:22180825
Basic Radar Altimetry Toolbox: Tools to Use Radar Altimetry for Geodesy
NASA Astrophysics Data System (ADS)
Rosmorduc, V.; Benveniste, J. J.; Bronner, E.; Niejmeier, S.
2010-12-01
Radar altimetry is a technique whose applications and uses keep expanding. Although considerable effort has been made for oceanography users (including easy-to-use data), using those data for geodesy, especially combined with ESA GOCE mission data, is still somewhat hard. ESA and CNES therefore had the Basic Radar Altimetry Toolbox developed (as well as, on the ESA side, the GOCE User Toolbox, the two being linked). The Basic Radar Altimetry Toolbox is an "all-altimeter" collection of tools, tutorials and documents designed to facilitate the use of radar altimetry data. The software is able: - to read most distributed radar altimetry data, from ERS-1 & 2, Topex/Poseidon, Geosat Follow-on, Jason-1, Envisat, Jason-2, CryoSat and the future Saral missions, - to perform some processing, data editing and statistics, - and to visualize the results. It can be used at several levels/several ways: - as a data reading tool, with APIs for C, Fortran, Matlab and IDL - as processing/extraction routines, through the on-line command mode - as an educational and a quick-look tool, with the graphical user interface As part of the Toolbox, a Radar Altimetry Tutorial gives general information about altimetry, the technique involved and its applications, as well as an overview of past, present and future missions, including information on how to access data and additional software and documentation. It also presents a series of data use cases, covering all uses of altimetry over ocean, cryosphere and land, showing the basic methods for some of the most frequent manners of using altimetry data. It is an opportunity to teach remote sensing with practical training. It has been available since April 2007 and has been demonstrated during training courses and scientific meetings. About 1200 people had downloaded it (Summer 2010), with many "newcomers" to altimetry among them. Users' feedback, developments in altimetry, and practice showed that new interesting features could be added. Some have been added and/or improved in version 2. Others are ongoing, and some are in discussion. Examples and data use cases on geodesy will be presented. BRAT is developed under contract with ESA and CNES.
Thyroid Radiofrequency Ablation: Updates on Innovative Devices and Techniques
Park, Hye Sun; Park, Auh Whan; Chung, Sae Rom; Choi, Young Jun; Lee, Jeong Hyun
2017-01-01
Radiofrequency ablation (RFA) is a well-known, effective, and safe method for treating benign thyroid nodules and recurrent thyroid cancers. Thyroid-dedicated devices and basic techniques for thyroid RFA were introduced by the Korean Society of Thyroid Radiology (KSThR) in 2012. Thyroid RFA has now been adopted worldwide, with subsequent advances in devices and techniques. To optimize the treatment efficacy and patient safety, understanding the basic and advanced RFA techniques and selecting the optimal treatment strategy are critical. The goal of this review is therefore to provide updates on and analysis of current devices and advanced techniques for RFA treatment of benign thyroid nodules and recurrent thyroid cancers. PMID:28670156
Witt-Enderby, Paula A.; Johnson, David A.; Anderson, Carl A.; Bricker, J. Douglas; Davis, Vicki L.; Firestine, Steven M.; Meng, Wilson S.
2006-01-01
To provide graduate students in pharmacology/toxicology exposure to, and cross-training in, a variety of relevant laboratory skills, the Duquesne University School of Pharmacy developed a “methods” course as part of the core curriculum. Because some of the participating departmental faculty are neuroscientists, this course often applied cutting-edge techniques to neuroscience-based systems, including experiments with brain G protein–coupled receptors. Techniques covered by the course include animal handling and behavioral testing, bacterial and mammalian cell culture, enzyme-linked immunosorbent assay, western blotting, receptor binding of radioligands, plasmid DNA amplification and purification, reverse transcriptase-polymerase chain reaction, gel electrophoresis, and UV-visible and fluorescence spectroscopy. The course also encompasses research aspects such as experimental design and record keeping, statistical analysis, and scientific writing. Students were evaluated via laboratory reports and examinations, and students in turn evaluated the course using a detailed exit survey. This course introduces the graduate student to many more techniques and approaches than can be provided by the traditional graduate “rotation” format alone and should serve as a template for graduate programs in many basic research disciplines. PMID:17012209
A Review of Correlated Noise in Exoplanet Light Curves
NASA Astrophysics Data System (ADS)
Cubillos, Patricio; Harrington, J.; Blecic, J.; Hardy, R. A.; Hardin, M.
2013-10-01
A number of the occultation light curves of exoplanets exhibit time-correlated residuals (a.k.a. correlated or red noise) in their model fits. The correlated noise might arise from inaccurate models or from unaccounted-for astrophysical or telescope systematics. A correct assessment of the correlated noise is important to determine true signal-to-noise ratios of a planet's physical parameters. Yet, there are no in-depth statistical studies in the literature for some of the techniques currently used (the RMS-vs-bin-size plot, prayer beads, and wavelet-based modeling). We subjected these correlated-noise assessment techniques to basic tests on synthetic data sets to characterize their features and limitations. Initial results indicate, for example, that the RMS-vs-bin-size plots sometimes present artifacts when the bin size is similar to the observation duration. Further, the prayer-bead method does not correctly increase the uncertainties to compensate for the lack of accuracy when there is correlated noise. We have applied these techniques to several Spitzer secondary-eclipse hot-Jupiter light curves and discuss their implications. This work was supported in part by NASA planetary atmospheres grant NNX13AF38G and Astrophysics Data Analysis Program NNX12AI69G.
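As a rough sketch of the RMS-versus-bin-size diagnostic mentioned above, the fragment below (assuming evenly sampled residuals, with trailing points dropped at each bin size) compares the RMS of binned residuals with the white-noise scaling expectation; correlated noise would make the measured curve flatten above the expected one.

```python
import numpy as np

rng = np.random.default_rng(0)

def rms_vs_binsize(residuals, bin_sizes):
    """RMS of bin-averaged residuals for each bin size."""
    out = []
    for m in bin_sizes:
        nbins = len(residuals) // m
        binned = residuals[: nbins * m].reshape(nbins, m).mean(axis=1)
        out.append(np.sqrt(np.mean(binned ** 2)))
    return np.array(out)

# White noise should follow the ~1/sqrt(bin size) law; time-correlated
# noise flattens above it, which is the plot's diagnostic signature.
resid = rng.normal(0.0, 1.0, 4096)          # stand-in fit residuals
sizes = np.array([1, 2, 4, 8, 16, 32, 64])
measured = rms_vs_binsize(resid, sizes)
expected = measured[0] / np.sqrt(sizes)     # white-noise expectation
for m, r, e in zip(sizes, measured, expected):
    print(f"bin={m:3d}  rms={r:.4f}  white-noise={e:.4f}")
```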
Pandey, Shilpa; Hakky, Michael; Kwak, Ellie; Jara, Hernan; Geyer, Carl A; Erbay, Sami H
2013-05-01
Neurovascular imaging studies are routinely used for the assessment of headaches and changes in mental status, stroke workup, and evaluation of the arteriovenous structures of the head and neck. These imaging studies are being performed with greater frequency as the aging population continues to increase. Magnetic resonance (MR) angiographic imaging techniques are helpful in this setting. However, mastering these techniques requires an in-depth understanding of the basic principles of physics, complex flow patterns, and the correlation of MR angiographic findings with conventional MR imaging findings. More than one imaging technique may be used to solve difficult cases, with each technique contributing unique information. Unfortunately, incorporating findings obtained with multiple imaging modalities may add to the diagnostic challenge. To ensure diagnostic accuracy, it is essential that the radiologist carefully evaluate the details provided by these modalities in light of basic physics principles, the fundamentals of various imaging techniques, and common neurovascular imaging pitfalls. ©RSNA, 2013.
NASA Astrophysics Data System (ADS)
Hartmann, Alexander K.; Weigt, Martin
2005-10-01
A concise, comprehensive introduction to the topic of statistical physics of combinatorial optimization, bringing together theoretical concepts and algorithms from computer science with analytical methods from physics. The result bridges the gap between statistical physics and combinatorial optimization, investigating problems taken from theoretical computing, such as the vertex-cover problem, with the concepts and methods of theoretical physics. The authors cover rapid developments and analytical methods that are both extremely complex and spread by word-of-mouth, providing all the necessary basics in required detail. Throughout, the algorithms are shown with examples and calculations, while the proofs are given in a way suitable for graduate students, post-docs, and researchers. Ideal for newcomers to this young, multidisciplinary field.
ERIC Educational Resources Information Center
Center for Education Statistics (ED/OERI), Washington, DC.
The Financial Statistics machine-readable data file (MRDF) is a subfile of the larger Higher Education General Information Survey (HEGIS). It contains basic financial statistics for over 3,000 institutions of higher education in the United States and its territories. The data are arranged sequentially by institution, with institutional…
The Greyhound Strike: Using a Labor Dispute to Teach Descriptive Statistics.
ERIC Educational Resources Information Center
Shatz, Mark A.
1985-01-01
A simulation exercise of a labor-management dispute is used to teach psychology students some of the basics of descriptive statistics. Using comparable data sets generated by the instructor, students work in small groups to develop a statistical presentation that supports their particular position in the dispute. (Author/RM)
Tzonev, Svilen
2018-01-01
Current commercially available digital PCR (dPCR) systems and assays are capable of detecting individual target molecules with considerable reliability. As tests are developed and validated for use on clinical samples, the need to understand and develop robust statistical analysis routines increases. This chapter covers the fundamental processes and limitations of detecting and reporting on single-molecule detection. We cover the basics of target quantification and sources of imprecision. We describe the basic test concepts: sensitivity, specificity, limit of blank, limit of detection, and limit of quantification in the context of dPCR. We provide basic guidelines on how to determine these, how to choose and interpret the operating point, and what factors may influence overall test performance in practice.
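The Poisson arithmetic behind dPCR quantification can be sketched in a few lines; the partition counts and droplet volume below are invented for illustration only.

```python
import math

def copies_per_partition(n_total: int, n_positive: int) -> float:
    """Mean copies per partition under the Poisson assumption:
    P(partition is negative) = exp(-lam)  =>  lam = -ln(1 - p_pos)."""
    p_pos = n_positive / n_total
    if p_pos >= 1.0:
        raise ValueError("all partitions positive: above quantifiable range")
    return -math.log(1.0 - p_pos)

# Illustrative run: 20,000 droplets of 0.85 nL each (assumed values).
lam = copies_per_partition(20_000, 4_150)
conc_per_ul = lam / 0.85 * 1_000.0        # copies per microliter
print(f"lambda = {lam:.4f} copies/partition, ~{conc_per_ul:.0f} copies/uL")
```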
Human-computer dialogue: Interaction tasks and techniques. Survey and categorization
NASA Technical Reports Server (NTRS)
Foley, J. D.
1983-01-01
Interaction techniques are described. Six basic interaction tasks, requirements for each task, requirements related to interaction techniques, and a technique's hardware prerequisites affecting device selection are discussed.
Nurses' foot care activities in home health care.
Stolt, Minna; Suhonen, Riitta; Puukka, Pauli; Viitanen, Matti; Voutilainen, Päivi; Leino-Kilpi, Helena
2013-01-01
This study described the basic foot care activities performed by nurses and the factors associated with these in the home care of older people. Data were collected from nurses (n=322) working in nine public home care agencies in Finland using the Nurses' Foot Care Activities Questionnaire (NFAQ). Data were analyzed statistically using descriptive statistics and multivariate linear models. Although some of the basic foot care activities nurses reported using were outdated, the majority of foot care activities were consistent with recommendations in the foot care literature. Longer working experience, referring patients with foot problems to a podiatrist and physiotherapist, and patient education in wart and nail care were associated with a high score for adequate foot care activities. Continuing education should focus on updating basic foot care activities and increasing the use of evidence-based foot care methods. Also, geriatric nursing research should focus on intervention research to improve the use of evidence-based basic foot care activities. Copyright © 2013 Mosby, Inc. All rights reserved.
Introduction to the physics and techniques of remote sensing
NASA Technical Reports Server (NTRS)
Elachi, Charles
1987-01-01
This book presents a comprehensive overview of the basics behind remote-sensing physics, techniques, and technology. The physics of wave/matter interactions, techniques of remote sensing across the electromagnetic spectrum, and the concepts behind remote sensing techniques now established and future ones under development are discussed. Applications of remote sensing are described for a wide variety of earth and planetary atmosphere and surface sciences. Solid surface sensing across the electromagnetic spectrum, ocean surface sensing, basic principles of atmospheric sensing and radiative transfer, and atmospheric remote sensing in the microwave, millimeter, submillimeter, and infrared regions are examined.
Regression: The Apple Does Not Fall Far From the Tree.
Vetter, Thomas R; Schober, Patrick
2018-05-15
Researchers and clinicians are frequently interested in either: (1) assessing whether there is a relationship or association between 2 or more variables and quantifying this association; or (2) determining whether 1 or more variables can predict another variable. The strength of such an association is mainly described by the correlation. However, regression analysis and regression models can be used not only to identify whether there is a significant relationship or association between variables but also to generate estimations of such a predictive relationship between variables. This basic statistical tutorial discusses the fundamental concepts and techniques related to the most common types of regression analysis and modeling, including simple linear regression, multiple regression, logistic regression, ordinal regression, and Poisson regression, as well as the common yet often underrecognized phenomenon of regression toward the mean. The various types of regression analysis are powerful statistical techniques, which when appropriately applied, can allow for the valid interpretation of complex, multifactorial data. Regression analysis and models can assess whether there is a relationship or association between 2 or more observed variables and estimate the strength of this association, as well as determine whether 1 or more variables can predict another variable. Regression is thus being applied more commonly in anesthesia, perioperative, critical care, and pain research. However, it is crucial to note that regression can identify plausible risk factors; it does not prove causation (a definitive cause and effect relationship). The results of a regression analysis instead identify independent (predictor) variable(s) associated with the dependent (outcome) variable. As with other statistical methods, applying regression requires that certain assumptions be met, which can be tested with specific diagnostics.
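A minimal sketch of the simplest case named above, simple linear regression, fitted by ordinary least squares on synthetic data (all numbers assumed); the closing comment points at why regression toward the mean occurs.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic data: outcome = 2.0 + 0.5 * predictor + noise (values assumed).
x = rng.normal(50.0, 10.0, 200)
y = 2.0 + 0.5 * x + rng.normal(0.0, 5.0, 200)

# Simple linear regression by ordinary least squares.
X = np.column_stack([np.ones_like(x), x])     # intercept-plus-slope design
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
r = np.corrcoef(x, y)[0, 1]                   # strength of the association

print(f"intercept={beta[0]:.2f}  slope={beta[1]:.2f}  r={r:.2f}")
# Regression toward the mean: in standardized units the fitted slope is r,
# and because |r| < 1, predictions sit closer to the mean than inputs do.
```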
Random noise effects in pulse-mode digital multilayer neural networks.
Kim, Y C; Shanblatt, M A
1995-01-01
A pulse-mode digital multilayer neural network (DMNN) based on stochastic computing techniques is implemented with simple logic gates as basic computing elements. The pulse-mode signal representation and the use of simple logic gates for neural operations lead to a massively parallel yet compact and flexible network architecture, well suited for VLSI implementation. Algebraic neural operations are replaced by stochastic processes using pseudorandom pulse sequences. The distributions of the results from the stochastic processes are approximated using the hypergeometric distribution. Synaptic weights and neuron states are represented as probabilities and estimated as average pulse occurrence rates in corresponding pulse sequences. A statistical model of the noise (error) is developed to estimate the relative accuracy associated with stochastic computing in terms of mean and variance. Computational differences are then explained by comparison to deterministic neural computations. DMNN feedforward architectures are modeled in VHDL using character recognition problems as testbeds. Computational accuracy is analyzed, and the results of the statistical model are compared with the actual simulation results. Experiments show that the calculations performed in the DMNN are more accurate than those anticipated when Bernoulli sequences are assumed, as is common in the literature. Furthermore, the statistical model successfully predicts the accuracy of the operations performed in the DMNN.
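A toy sketch of the pulse-mode idea (not the authors' VHDL models): with values encoded as pulse probabilities, a single AND gate multiplies two independent streams, and the estimate tightens as the sequence lengthens. Bernoulli sequences are assumed here, which, as the abstract notes, only approximates the DMNN's actual behavior.

```python
import numpy as np

rng = np.random.default_rng(7)

def pulse_stream(p: float, length: int) -> np.ndarray:
    """Bernoulli pulse sequence whose average occurrence rate encodes p."""
    return (rng.random(length) < p).astype(np.uint8)

# One AND gate per synaptic multiply: P(a AND b) = P(a) * P(b).
L = 10_000
a, b = pulse_stream(0.8, L), pulse_stream(0.4, L)
estimate = (a & b).mean()            # average pulse occurrence rate

# The estimator is binomial, so its standard deviation shrinks as
# 1/sqrt(L); this is the kind of mean/variance trade-off a statistical
# noise model for stochastic computing has to track.
sd = np.sqrt(0.32 * 0.68 / L)
print(f"estimate={estimate:.4f} (exact 0.32), predicted sd={sd:.4f}")
```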
Debecker, Damien P; Gaigneaux, Eric M; Busca, Guido
2009-01-01
Hydrotalcites offer unique basic properties that make them very attractive for catalytic applications. It is of primary interest to make use of accurate tools for probing the basicity of hydrotalcite-based catalysts, for the purpose of 1) fundamental understanding of base-catalysed processes with hydrotalcites and 2) optimisation of the catalytic performance achieved in reactions of industrial interest. Techniques based on probe molecules, titration techniques and test reactions, along with physicochemical characterisation, are overviewed in the first part of this review. The aim is to provide the tools for understanding how a series of parameters involved in the preparation of hydrotalcite-based catalytic materials can be employed to control and adapt the basic properties of the catalyst towards the basicity demanded by each target chemical reaction. An overview of recent and significant achievements from that perspective is presented in the second part of the paper.
Densely calculated facial soft tissue thickness for craniofacial reconstruction in Chinese adults.
Shui, Wuyang; Zhou, Mingquan; Deng, Qingqiong; Wu, Zhongke; Ji, Yuan; Li, Kang; He, Taiping; Jiang, Haiyan
2016-09-01
Craniofacial reconstruction (CFR) is used to recreate a likeness of the original facial appearance for an unidentified skull; the technique has been applied in both forensics and archeology. Many CFR techniques rely on the average facial soft tissue thickness (FSTT) at anatomical landmarks, which is related to ethnicity, age, sex, body mass index (BMI), etc. Previous studies typically employed FSTT at sparsely distributed anatomical landmarks, where differing landmark definitions may make results difficult to compare. In the present study, a total of 90,198 one-to-one correspondence skull vertices are established on 171 head CT-scans and the FSTT of each corresponding vertex is calculated (hereafter referred to as densely calculated FSTT) for statistical analysis and CFR. Basic descriptive statistics (i.e., mean and standard deviation) for densely calculated FSTT are reported separately according to sex and age. Results show that at 76.12% of all vertices the FSTT is greater in males than in females, with the exception of vertices around the zygoma, zygomatic arch and mid-lateral orbit. Sex-related significant differences are found at 55.12% of all vertices, and statistically significant age-related differences between the three age groups are found at a majority of all vertices (73.31% for males and 63.43% for females). Five non-overlapping categories are given and the descriptive statistics (i.e., mean, standard deviation, local standard deviation and percentage) are reported. Multiple appearances are produced using the densely calculated FSTT of various age and sex groups, and a quantitative assessment is provided to examine how relevant the choice of FSTT is to increasing the accuracy of CFR. In conclusion, this study provides a new perspective on understanding the distribution of FSTT and the construction of a new densely calculated FSTT database for craniofacial reconstruction. Copyright © 2016. Published by Elsevier Ireland Ltd.
Radio techniques for probing the terrestrial ionosphere.
NASA Astrophysics Data System (ADS)
Hunsucker, R. D.
The subject of the book is a description of the basic principles of operation, plus the capabilities and limitations of all generic radio techniques employed to investigate the terrestrial ionosphere. The purpose of this book is to present to the reader a balanced treatment of each technique so they can understand how to interpret ionospheric data and decide which techniques are most effective for studying specific phenomena. The first two chapters outline the basic theory underlying the techniques, and each following chapter discusses a separate technique. This monograph is entirely devoted to techniques in aeronomy and space physics. The approach is unique in its presentation of the principles, capabilities and limitations of the most important presently used radio techniques. Typical examples of data are shown for the various techniques, and a brief historical account of the technique development is presented. An extended annotated bibliography of the salient papers in the field is included.
Monte Carlo investigation of thrust imbalance of solid rocket motor pairs
NASA Technical Reports Server (NTRS)
Sforzini, R. H.; Foster, W. A., Jr.
1976-01-01
The Monte Carlo method of statistical analysis is used to investigate the theoretical thrust imbalance of pairs of solid rocket motors (SRMs) firing in parallel. Sets of the significant variables are selected using a random sampling technique and the imbalance calculated for a large number of motor pairs using a simplified, but comprehensive, model of the internal ballistics. The treatment of burning surface geometry allows for the variations in the ovality and alignment of the motor case and mandrel as well as those arising from differences in the basic size dimensions and propellant properties. The analysis is used to predict the thrust-time characteristics of 130 randomly selected pairs of Titan IIIC SRMs. A statistical comparison of the results with test data for 20 pairs shows the theory underpredicts the standard deviation in maximum thrust imbalance by 20% with variability in burning times matched within 2%. The range in thrust imbalance of Space Shuttle type SRM pairs is also estimated using applicable tolerances and variabilities and a correction factor based on the Titan IIIC analysis.
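A drastically simplified Monte Carlo sketch of this kind of study (a toy surrogate, not the paper's internal-ballistics model): ballistic parameters are drawn from assumed tolerance distributions for each motor of a pair, a surrogate maximum thrust is computed, and imbalance statistics accumulate over many random pairs.

```python
import numpy as np

rng = np.random.default_rng(3)

def motor_max_thrust() -> float:
    """Toy surrogate: thrust rises with burn rate and falls with throat
    area, both drawn from assumed manufacturing tolerances."""
    burn_rate = rng.normal(1.00, 0.010)     # relative burn rate, 1% sd
    throat_area = rng.normal(1.00, 0.005)   # relative throat area, 0.5% sd
    return 1.0e7 * burn_rate / throat_area  # newtons, nominal 10 MN

# "Fire" many random pairs in parallel and study the imbalance statistics.
n_pairs = 10_000
imbalance = np.array([abs(motor_max_thrust() - motor_max_thrust())
                      for _ in range(n_pairs)])
print(f"mean |imbalance| = {imbalance.mean():.3e} N")
print(f"  sd |imbalance| = {imbalance.std():.3e} N")
```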
[Basic concepts for network meta-analysis].
Catalá-López, Ferrán; Tobías, Aurelio; Roqué, Marta
2014-12-01
Systematic reviews and meta-analyses have long been fundamental tools for evidence-based clinical practice. Initially, meta-analyses were proposed as a technique that could improve the accuracy and the statistical power of previous research from individual studies with small sample sizes. However, one of their main limitations has been that no more than two treatments can be compared in one analysis, even when the clinical research question requires comparing multiple interventions. Network meta-analysis (NMA) uses novel statistical methods that incorporate information from both direct and indirect treatment comparisons in a network of studies examining the effects of various competing treatments, estimating comparisons between many treatments in a single analysis. Despite its potential limitations, NMA applications in clinical epidemiology can be of great value in situations where several treatments have been compared against a common comparator. Also, NMA can be relevant to a research or clinical question when many treatments must be considered or when there is a mix of both direct and indirect information in the body of evidence. Copyright © 2013 Elsevier España, S.L.U. All rights reserved.
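The elementary building block of indirect comparison can be shown with a Bucher-style calculation: treatments A and C are each compared against a common comparator B, and the indirect A-versus-C estimate is the difference of the two effects, with variances adding. The numbers below are invented; a full NMA generalizes this step to whole networks of treatments.

```python
import math

def indirect_comparison(d_ab, se_ab, d_cb, se_cb):
    """Adjusted indirect comparison of A vs C through common comparator B:
    d_AC = d_AB - d_CB, with the variances of the two estimates adding."""
    d_ac = d_ab - d_cb
    se_ac = math.sqrt(se_ab ** 2 + se_cb ** 2)
    return d_ac, se_ac

# Illustrative log odds ratios versus a shared placebo arm (values assumed).
d_ac, se_ac = indirect_comparison(d_ab=-0.50, se_ab=0.15,
                                  d_cb=-0.20, se_cb=0.20)
lo, hi = d_ac - 1.96 * se_ac, d_ac + 1.96 * se_ac
print(f"A vs C: log OR = {d_ac:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```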
A Predictive Approach to Network Reverse-Engineering
NASA Astrophysics Data System (ADS)
Wiggins, Chris
2005-03-01
A central challenge of systems biology is the ``reverse engineering" of transcriptional networks: inferring which genes exert regulatory control over which other genes. Attempting such inference at the genomic scale has only recently become feasible, via data-intensive biological innovations such as DNA microarrays (``DNA chips") and the sequencing of whole genomes. In this talk we present a predictive approach to network reverse-engineering, in which we integrate DNA chip data and sequence data to build a model of the transcriptional network of the yeast S. cerevisiae capable of predicting the response of genes in unseen experiments. The technique can also be used to extract ``motifs,'' sequence elements which act as binding sites for regulatory proteins. We validate by a number of approaches and present a comparison of theoretical predictions vs. experimental data, along with biological interpretations of the resulting model. En route, we will illustrate some basic notions in statistical learning theory (fitting vs. over-fitting; cross-validation; assessing statistical significance), highlighting ways in which physicists can make a unique contribution in data-driven approaches to reverse engineering.
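A minimal sketch of two of the statistical-learning notions listed above (fitting versus over-fitting, judged on a held-out split): raising the polynomial degree keeps lowering the training error while the test error eventually worsens. Data and degrees are arbitrary stand-ins.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: a smooth signal plus noise (assumed ground truth).
x = rng.uniform(-1.0, 1.0, 60)
y = np.sin(2.0 * x) + rng.normal(0.0, 0.2, 60)
train, test = np.arange(40), np.arange(40, 60)   # simple held-out split

for degree in (1, 3, 9):
    coef = np.polyfit(x[train], y[train], degree)
    mse_train = np.mean((np.polyval(coef, x[train]) - y[train]) ** 2)
    mse_test = np.mean((np.polyval(coef, x[test]) - y[test]) ** 2)
    print(f"degree={degree}  train MSE={mse_train:.3f}  "
          f"test MSE={mse_test:.3f}")
```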
Heartwood and sapwood in eucalyptus trees: non-conventional approach to wood quality.
Cherelli, Sabrina G; Sartori, Maria Márcia P; Próspero, André G; Ballarin, Adriano W
2018-01-01
This study evaluated the quality of heartwood and sapwood from mature trees of three species of Eucalyptus by quantifying their proportions, determining basic and apparent density using the non-destructive gamma-radiation attenuation technique, and calculating the density uniformity index. Six trees of each species (Eucalyptus grandis - 18 years old, Eucalyptus tereticornis - 35 years old and Corymbia citriodora - 28 years old) were used in the experimental program. The heartwood and sapwood were delimited by macroscopic analysis, and the areas and percentages of heartwood and sapwood were calculated using digital images. The uniformity index was calculated following a methodology that numerically quantifies the dispersion of punctual density values of the wood around the mean density along the radius. The percentage of heartwood was higher than that of sapwood in all species studied. The density results showed no statistical difference between heartwood and sapwood. In contrast to the density results, in all species studied there were statistical differences between the uniformity indexes for the heartwood and sapwood regions, justifying the inclusion of the density uniformity index as a quality parameter for Eucalyptus wood.
Developing Competency of Teachers in Basic Education Schools
ERIC Educational Resources Information Center
Yuayai, Rerngrit; Chansirisira, Pacharawit; Numnaphol, Kochaporn
2015-01-01
This study aims to develop competency of teachers in basic education schools. The research instruments included the semi-structured in-depth interview form, questionnaire, program developing competency, and evaluation competency form. The statistics used for data analysis were percentage, mean, and standard deviation. The research found that…
NASA Technical Reports Server (NTRS)
Haefner, L. E.
1975-01-01
Mathematical and philosophical approaches are presented for the evaluation and implementation of ground and air transportation systems. Basic decision processes used for cost analyses and planning (e.g., statistical decision theory, linear and dynamic programming, optimization, game theory) are examined. The effects that a transportation system may have on the environment and the community are discussed and modelled. Algorithmic structures are examined and selected bibliographic annotations are included. Transportation dynamic models were developed. Citizen participation in transportation projects (e.g., in Maryland and Massachusetts) is discussed. The relevance of the modelling and evaluation approaches to air transportation (e.g., airport planning) is examined in a case study in St. Louis, Missouri.
Linear combination reading program for capture gamma rays
Tanner, Allan B.
1971-01-01
This program computes a weighting function, Qj, which gives a scalar output value of unity when applied to the spectrum of a desired element and a minimum value (considering statistics) when applied to spectra of materials not containing the desired element. Intermediate values are obtained for materials containing the desired element, in proportion to the amount of the element they contain. The program is written in the BASIC language in a format specific to the Hewlett-Packard 2000A Time-Sharing System, and is an adaptation of an earlier program for linear combination reading for X-ray fluorescence analysis (Tanner and Brinkerhoff, 1971). Following the program is a sample run from a study of the application of the linear combination technique to capture-gamma-ray analysis for calcium (report in preparation).
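The linear-combination idea can be sketched as a small least-squares problem: choose channel weights Q that read unity on the desired element's spectrum and zero on interfering spectra, so any mixture reads out in proportion to its content of the element. The five-channel "spectra" below are invented, and this Python sketch merely stands in for the original HP 2000A BASIC program, which is not reproduced here.

```python
import numpy as np

# Toy 5-channel spectra (counts per channel); values are illustrative only.
desired = np.array([2.0, 8.0, 5.0, 1.0, 0.5])        # desired element
interferents = np.array([[6.0, 1.0, 0.5, 0.2, 0.1],  # background material 1
                         [0.5, 1.0, 4.0, 6.0, 2.0]]) # background material 2

# Solve for weights Q with Q.desired = 1 and Q.interferent = 0; the
# minimum-norm least-squares solution also keeps the weights small,
# which limits the statistical noise they amplify.
A = np.vstack([desired, interferents])
b = np.array([1.0, 0.0, 0.0])
Q, *_ = np.linalg.lstsq(A, b, rcond=None)

mixture = 0.7 * desired + 1.0 * interferents[0]      # 0.7x the element
print(f"readout: {Q @ mixture:.3f}")                 # ~0.700, proportional
```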
Basic technique for solid lesions: Cytology, core, or both?
Hébert-Magee, Shantel
2014-01-01
This chapter highlights key fundamentals relevant to post-procurement tissue handling of materials obtained by aspiration and/or biopsy and details the subtle techniques that can significantly impact patient management and practice patterns. A basic knowledge of tissue handling and processing is imperative for endosonographers who attempt to achieve a greater than 95% diagnostic accuracy with their tissue-acquisition procedures. PMID:24949408
ERIC Educational Resources Information Center
DETERLINE, WILLIAM A.
A programed course in methods and techniques of preparing programed instructional materials was presented in this document. An attempt was made to teach basic procedures well enough to produce an embryo programer and to provide him with references he would need in order to produce programs. Included were programed instructions on preparatory…
ERIC Educational Resources Information Center
Ministerio de Educacion Nacional, Bogota (Colombia). Instituto Colombiano de Pedagogia.
This document provides statistical data on the distribution and education of teaching personnel working in the elementary schools of Cordoba, Colombia, between 1958 and 1967. The statistics cover the number of men and women, public and private schools, urban and rural location, and the amount of education of the teachers. For overall statistics in…
ERIC Educational Resources Information Center
Ministerio de Educacion Nacional, Bogota (Colombia). Instituto Colombiano de Pedagogia.
This document provides statistical data on the distribution and education of teaching personnel working in the elementary schools of Narino, Colombia, between 1958 and 1967. The statistics cover the number of men and women, public and private schools, urban and rural location, and the amount of education of the teachers. For overall statistics in…
ERIC Educational Resources Information Center
Ministerio de Educacion Nacional, Bogota (Colombia). Instituto Colombiano de Pedagogia.
This document provides statistical data on the distribution and education of teaching personnel working in the elementary schools of Cauca, Colombia, between 1958 and 1967. The statistics cover the number of men and women, public and private schools, urban and rural location, and the amount of education of the teachers. For overall statistics in…
ERIC Educational Resources Information Center
Ministerio de Educacion Nacional, Bogota (Colombia). Instituto Colombiano de Pedagogia.
This document provides statistical data on the distribution and education of teaching personnel working in the elementary schools of Caldas, Colombia, between 1958 and 1967. The statistics cover the number of men and women, public and private schools, urban and rural location, and the amount of education of the teachers. For overall statistics in…
ERIC Educational Resources Information Center
Ministerio de Educacion Nacional, Bogota (Colombia). Instituto Colombiano de Pedagogia.
This document provides statistical data on the distribution and education of teaching personnel working in the elementary schools of Boyaca, Colombia, between 1958 and 1967. The statistics cover the number of men and women, public and private schools, urban and rural location, and the amount of education of the teachers. For overall statistics in…
ERIC Educational Resources Information Center
Ministerio de Educacion Nacional, Bogota (Colombia). Instituto Colombiano de Pedagogia.
This document provides statistical data on the distribution and education of teaching personnel working in the elementary schools of Huila, Colombia, between 1958 and 1967. The statistics cover the number of men and women, public and private schools, urban and rural location, and the amount of education of the teachers. For overall statistics in…
ERIC Educational Resources Information Center
National Center for Health Statistics (DHEW/PHS), Hyattsville, MD.
This report is a part of the program of the National Center for Health Statistics to provide current statistics as baseline data for the evaluation, planning, and administration of health programs. Part I presents data concerning the occupational fields: (1) administration, (2) anthropology and sociology, (3) data processing, (4) basic sciences,…
ERIC Educational Resources Information Center
Ministerio de Educacion Nacional, Bogota (Colombia). Instituto Colombiano de Pedagogia.
This document provides statistical data on the distribution and education of teacher personnel working in Colombian elementary schools between 1940 and 1968. The statistics cover the number of men and women, public and private schools, urban and rural location, and the amount of education of teachers. (VM)
Explorations in Statistics: Standard Deviations and Standard Errors
ERIC Educational Resources Information Center
Curran-Everett, Douglas
2008-01-01
Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This series in "Advances in Physiology Education" provides an opportunity to do just that: we will investigate basic concepts in statistics using the free software package R. Because this series uses R solely as a vehicle…
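The series itself works in R, but the core distinction it explores, that a standard deviation describes the spread of observations while a standard error describes the spread of a statistic such as the mean, can be sketched in a few lines of Python with assumed population values:

```python
import numpy as np

rng = np.random.default_rng(0)

# Many samples of size n from a population with sd = 10 (assumed values).
pop_sd, n, n_samples = 10.0, 25, 5_000
samples = rng.normal(100.0, pop_sd, size=(n_samples, n))

sample_sds = samples.std(axis=1, ddof=1)   # spread of raw observations
sample_means = samples.mean(axis=1)        # one mean per sample

# The SE of the mean is the sd of the sampling distribution of the mean,
# predicted analytically by sd / sqrt(n).
print(f"average sample SD : {sample_sds.mean():.2f} (population sd 10)")
print(f"SD of sample means: {sample_means.std(ddof=1):.2f}")
print(f"predicted SE      : {pop_sd / np.sqrt(n):.2f}")
```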
ERIC Educational Resources Information Center
Cassel, Russell N.
This paper relates educational and psychological statistics to certain "Research Statistical Tools" (RSTs) necessary to accomplish and understand general research in the behavioral sciences. Emphasis is placed on acquiring an effective understanding of the RSTs, and to this end they are ordered on a continuum scale in terms of individual…
Estimates of School Statistics, 1971-72.
ERIC Educational Resources Information Center
Flanigan, Jean M.
This report presents public school statistics for the 50 States, the District of Columbia, and the regions and outlying areas of the United States. The text presents national data for each of the past 10 years and defines the basic series of statistics. Tables present the revised estimates by State and region for 1970-71 and the preliminary…
Combining statistical inference and decisions in ecology
Williams, Perry J.; Hooten, Mevin B.
2016-01-01
Statistical decision theory (SDT) is a sub-field of decision theory that formally incorporates statistical investigation into a decision-theoretic framework to account for uncertainties in a decision problem. SDT provides a unifying analysis of three types of information: statistical results from a data set, knowledge of the consequences of potential choices (i.e., loss), and prior beliefs about a system. SDT links the theoretical development of a large body of statistical methods including point estimation, hypothesis testing, and confidence interval estimation. The theory and application of SDT have mainly been developed and published in the fields of mathematics, statistics, operations research, and other decision sciences, but have had limited exposure in ecology. Thus, we provide an introduction to SDT for ecologists and describe its utility for linking the conventionally separate tasks of statistical investigation and decision making in a single framework. We describe the basic framework of both Bayesian and frequentist SDT, its traditional use in statistics, and discuss its application to decision problems that occur in ecology. We demonstrate SDT with two types of decisions: Bayesian point estimation, and an applied management problem of selecting a prescribed fire rotation for managing a grassland bird species. Central to SDT, and decision theory in general, are loss functions. Thus, we also provide basic guidance and references for constructing loss functions for an SDT problem.
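A minimal sketch of the central role of loss functions: the Bayes point estimate minimizes posterior expected loss, so squared-error loss yields the posterior mean and absolute-error loss the posterior median. The gamma "posterior" below is an arbitrary stand-in, not the paper's grassland-bird example.

```python
import numpy as np

rng = np.random.default_rng(1)

# Skewed stand-in posterior sample for some ecological parameter (assumed).
posterior = rng.gamma(shape=2.0, scale=1.5, size=100_000)

# Scan candidate point estimates and evaluate posterior expected loss.
grid = np.linspace(0.01, 15.0, 400)
sq_loss = [np.mean((posterior - a) ** 2) for a in grid]    # squared error
abs_loss = [np.mean(np.abs(posterior - a)) for a in grid]  # absolute error

print(f"argmin squared loss : {grid[np.argmin(sq_loss)]:.2f} "
      f"(posterior mean   {posterior.mean():.2f})")
print(f"argmin absolute loss: {grid[np.argmin(abs_loss)]:.2f} "
      f"(posterior median {np.median(posterior):.2f})")
```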
Wu, Yazhou; Zhou, Liang; Li, Gaoming; Yi, Dali; Wu, Xiaojiao; Liu, Xiaoyu; Zhang, Yanqi; Liu, Ling; Yi, Dong
2015-01-01
Background Although a substantial number of studies focus on the teaching and application of medical statistics in China, few studies comprehensively evaluate the recognition of and demand for medical statistics. In addition, the results of these various studies differ and are insufficiently comprehensive and systematic. Objectives This investigation aimed to evaluate the general cognition of and demand for medical statistics by undergraduates, graduates, and medical staff in China. Methods We performed a comprehensive database search related to the cognition of and demand for medical statistics from January 2007 to July 2014 and conducted a meta-analysis of non-controlled studies with sub-group analysis for undergraduates, graduates, and medical staff. Results There are substantial differences with respect to the cognition of theory in medical statistics among undergraduates (73.5%), graduates (60.7%), and medical staff (39.6%). The demand for theory in medical statistics is high among graduates (94.6%), undergraduates (86.1%), and medical staff (88.3%). Regarding specific statistical methods, the cognition of basic statistical methods is higher than of advanced statistical methods. The demand for certain advanced statistical methods, including (but not limited to) multiple analysis of variance (ANOVA), multiple linear regression, and logistic regression, is higher than that for basic statistical methods. The use rates of the Statistical Package for the Social Sciences (SPSS) software and statistical analysis software (SAS) are only 55% and 15%, respectively. Conclusion The overall statistical competence of undergraduates, graduates, and medical staff is insufficient, and their ability to practically apply their statistical knowledge is limited, which constitutes an unsatisfactory state of affairs for medical statistics education. Because the demand for skills in this area is increasing, the need to reform medical statistics education in China has become urgent. PMID:26053876
Senior Computational Scientist | Center for Cancer Research
The Basic Science Program (BSP) pursues independent, multidisciplinary research in basic and applied molecular biology, immunology, retrovirology, cancer biology, and human genetics. Research efforts and support are an integral part of the Center for Cancer Research (CCR) at the Frederick National Laboratory for Cancer Research (FNLCR). The Cancer & Inflammation Program (CIP), Basic Science Program, HLA Immunogenetics Section, under the leadership of Dr. Mary Carrington, studies the influence of human leukocyte antigens (HLA) and specific KIR/HLA genotypes on risk of and outcomes to infection, cancer, autoimmune disease, and maternal-fetal disease. Recent studies have focused on the impact of HLA gene expression in disease, the molecular mechanism regulating expression levels, and the functional basis for the effect of differential expression on disease outcome. The lab’s further focus is on the genetic basis for resistance/susceptibility to disease conferred by immunogenetic variation. KEY ROLES/RESPONSIBILITIES The Senior Computational Scientist will provide research support to the CIP-BSP-HLA Immunogenetics Section, performing biostatistical design, analysis and reporting of research projects conducted in the lab. This individual will be involved in the implementation of statistical models and data preparation. The successful candidate should have 5 or more years of competent, innovative biostatistics/bioinformatics research experience beyond doctoral training; considerable experience with statistical software, such as SAS, R and S-Plus; and sound knowledge and demonstrated experience of theoretical and applied statistics. The role involves writing program code to analyze data using statistical analysis software and contributing to the interpretation and publication of research results.
Brennan, Jennifer Sousa
2010-01-01
This chapter is an introductory reference guide highlighting some of the most common statistical topics, broken down into both command-line syntax and graphical interface point-and-click commands. This chapter serves to supplement more formal statistics lessons and expedite using Stata to compute basic analyses.
System analysis for the Huntsville Operational Support Center distributed computer system
NASA Technical Reports Server (NTRS)
Ingels, E. M.
1983-01-01
A simulation model was developed and programmed in three languages: BASIC, PASCAL, and SLAM. Two of the programs are included in this report, the BASIC and the PASCAL language programs. SLAM is not supported by NASA/MSFC facilities and hence was not included. The statistical comparisons of simulations of the same HOSC system configurations are in good agreement with one another and with the operational statistics of HOSC that were obtained. Three variations of the most recent HOSC configuration were run, and some conclusions are drawn as to the system performance under these variations.
NASA Astrophysics Data System (ADS)
Haven, Emmanuel; Khrennikov, Andrei
2013-01-01
Preface; Part I. Physics Concepts in Social Science? A Discussion: 1. Classical, statistical and quantum mechanics: all in one; 2. Econophysics: statistical physics and social science; 3. Quantum social science: a non-mathematical motivation; Part II. Mathematics and Physics Preliminaries: 4. Vector calculus and other mathematical preliminaries; 5. Basic elements of quantum mechanics; 6. Basic elements of Bohmian mechanics; Part III. Quantum Probabilistic Effects in Psychology: Basic Questions and Answers: 7. A brief overview; 8. Interference effects in psychology - an introduction; 9. A quantum-like model of decision making; Part IV. Other Quantum Probabilistic Effects in Economics, Finance and Brain Sciences: 10. Financial/economic theory in crisis; 11. Bohmian mechanics in finance and economics; 12. The Bohm-Vigier Model and path simulation; 13. Other applications to economic/financial theory; 14. The neurophysiological sources of quantum-like processing in the brain; Conclusion; Glossary; Index.
NASA Astrophysics Data System (ADS)
Cardall, Christian Y.; Budiardja, Reuben D.
2017-05-01
GenASiS Basics provides Fortran 2003 classes furnishing extensible object-oriented utilitarian functionality for large-scale physics simulations on distributed memory supercomputers. This functionality includes physical units and constants; display to the screen or standard output device; message passing; I/O to disk; and runtime parameter management and usage statistics. This revision (Version 2 of Basics) makes mostly minor additions to functionality and includes some simplifying name changes.
Basic Facts and Figures about the Educational System in Japan.
ERIC Educational Resources Information Center
National Inst. for Educational Research, Tokyo (Japan).
Tables, charts, and graphs convey supporting data that accompany text on various aspects of the Japanese educational system presented in this booklet. There are seven chapters: (1) Fundamental principles of education; (2) Organization of the educational system; (3) Basic statistics of education; (4) Curricula, textbooks, and instructional aids;…
Tighe, Elizabeth L; Schatschneider, Christopher
2016-07-01
The purpose of this study was to investigate the joint and unique contributions of morphological awareness and vocabulary knowledge at five reading comprehension levels in adult basic education (ABE) students. We introduce the statistical technique of multiple quantile regression, which enabled us to assess the predictive utility of morphological awareness and vocabulary knowledge at multiple points (quantiles) along the continuous distribution of reading comprehension. To demonstrate the efficacy of our multiple quantile regression analysis, we compared and contrasted our results with a traditional multiple regression analytic approach. Our results indicated that morphological awareness and vocabulary knowledge accounted for a large portion of the variance (82%-95%) in reading comprehension skills across all quantiles. Morphological awareness exhibited the greatest unique predictive ability at lower levels of reading comprehension whereas vocabulary knowledge exhibited the greatest unique predictive ability at higher levels of reading comprehension. These results indicate the utility of using multiple quantile regression to assess trajectories of component skills across multiple levels of reading comprehension. The implications of our findings for ABE programs are discussed. © Hammill Institute on Disabilities 2014.
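A hedged sketch of the multiple quantile regression idea with synthetic stand-ins for the study's variables (statsmodels' QuantReg does the fitting; the data-generating values are invented so that one predictor's slope grows across quantiles, echoing the kind of pattern reported above):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)

# Synthetic stand-ins (all values assumed, for illustration only).
n = 500
ma = rng.normal(0.0, 1.0, n)                   # morphological awareness
vocab = rng.normal(0.0, 1.0, n)                # vocabulary knowledge
spread = 1.0 + 0.5 * (vocab - vocab.min())     # heteroscedastic noise scale
comprehension = ma + vocab + spread * rng.normal(0.0, 1.0, n)

# Fit the same linear model at several outcome quantiles; unlike OLS's
# single conditional-mean equation, slopes are free to vary by quantile.
X = sm.add_constant(np.column_stack([ma, vocab]))
for q in (0.10, 0.25, 0.50, 0.75, 0.90):
    res = sm.QuantReg(comprehension, X).fit(q=q)
    print(f"q={q:.2f}  ma={res.params[1]:+.2f}  vocab={res.params[2]:+.2f}")
```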
Basic Understanding of Earth Tunneling by Melting : Volume 1. Basic Physical Principles.
DOT National Transportation Integrated Search
1974-07-01
A novel technique, which employs the melting of rocks and soils as a means of excavating or tunneling while simultaneously generating a glass tunnel lining and/or primary support, was studied. The object of the study was to produce a good basic under...
Train the Trainer. Facilitator Guide Sample. Basic Blueprint Reading (Chapter One).
ERIC Educational Resources Information Center
Saint Louis Community Coll., MO.
This publication consists of three sections: facilitator's guide--train the trainer, facilitator's guide sample--Basic Blueprint Reading (Chapter 1), and participant's guide sample--basic blueprint reading (chapter 1). Section I addresses why the trainer should learn new classroom techniques; lecturing versus facilitating; learning styles…
Basic Casting from A to Z. Student's Instruction Booklet.
ERIC Educational Resources Information Center
Zebco, Tulsa, OK.
A profusely illustrated student instruction booklet contains step-by-step directions and diagrams for learning four basic casting techniques. Separate sections cover basic spin-casting, spinning, bait-casting, and fly-casting. Each section details recommended equipment (reel, rod, line, plug, tackle, lures, leaders, flies), describes specific…
Robb, N
2014-03-01
The basic techniques of conscious sedation have been found to be safe and effective for the management of anxiety in adult dental patients requiring sedation to allow them to undergo dental treatment. There remains great debate within the profession as to the role of the so-called advanced sedation techniques. This paper presents a series of nine patients who were managed with advanced sedation techniques where the basic techniques were either inappropriate or had previously failed to provide adequate relief of anxiety. In these cases, had advanced sedation techniques not been available, the most likely recourse would have been general anaesthesia--a treatment modality that current guidance indicates should not be used where there is an appropriate alternative. The sedation techniques used have provided that appropriate alternative management strategy.
ERIC Educational Resources Information Center
McCabe, Michael; Fafaran, Keita, Ed.
This instructional guide is designed for use by Peace Corps volunteers in teaching basic skills to rural residents of Mali through practical activities on school grounds. Four instructional units provide background information, definitions, illustrated descriptions of procedures, data tables, and suggested exercises for teaching in these areas:…
Basic principles of management for cervical spine trauma.
O'Dowd, J K
2010-03-01
This article reviews the basic principles of management of cervical trauma. The technique and critical importance of careful assessment are described. Instability is defined, and the incidence of a second injury is highlighted. The concept of spinal clearance is discussed. Early reduction and stabilisation techniques are described, and the indications and approach for surgery are reviewed. The importance of the role of post-injury rehabilitation is identified.
Survey of statistical techniques used in validation studies of air pollution prediction models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bornstein, R D; Anderson, S F
1979-03-01
Statistical techniques used by meteorologists to validate predictions made by air pollution models are surveyed. Techniques are divided into the following three groups: graphical, tabular, and summary statistics. Some of the practical problems associated with verification are also discussed. Characteristics desired in any validation program are listed and a suggested combination of techniques that possesses many of these characteristics is presented.
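A small sketch of summary statistics commonly used in such validations, namely mean bias, root-mean-square error, correlation, and fractional bias; the concentrations are made-up values, and the statistics shown are typical choices rather than the survey's exact list.

```python
import numpy as np

def validation_summary(obs, pred):
    """Common observation-versus-prediction summary statistics."""
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    bias = np.mean(pred - obs)                        # mean bias
    rmse = np.sqrt(np.mean((pred - obs) ** 2))        # RMS error
    r = np.corrcoef(obs, pred)[0, 1]                  # linear correlation
    fb = 2.0 * bias / (np.mean(pred) + np.mean(obs))  # fractional bias
    return {"bias": bias, "rmse": rmse, "r": r, "fractional_bias": fb}

# Illustrative hourly SO2 concentrations in ug/m^3 (made-up numbers).
observed = [31.0, 42.0, 28.0, 55.0, 49.0, 38.0]
predicted = [35.0, 40.0, 31.0, 60.0, 44.0, 41.0]
for name, value in validation_summary(observed, predicted).items():
    print(f"{name:16s} {value:7.3f}")
```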
Basic Radar Altimetry Toolbox: tools to teach altimetry for ocean
NASA Astrophysics Data System (ADS)
Rosmorduc, Vinca; Benveniste, Jerome; Bronner, Emilie; Niemeijer, Sander; Lucas, Bruno Manuel; Dinardo, Salvatore
2013-04-01
The Basic Radar Altimetry Toolbox is an "all-altimeter" collection of tools, tutorials and documents designed to facilitate the use of radar altimetry data, including data from the next mission to be launched, CryoSat. It has been available since April 2007 and has been demonstrated during training courses and scientific meetings. More than 2000 people had downloaded it (January 2013), with many "newcomers" to altimetry among them. Users' feedback, developments in altimetry, and practice showed that new interesting features could be added. Some have been added and/or improved in versions 2 and 3. Others are in discussion for the future, including the addition of the future Sentinel-3. The Basic Radar Altimetry Toolbox is able: - to read most distributed radar altimetry data, including data from future missions like Saral, - to perform some processing, data editing and statistics, - and to visualize the results. It can be used at several levels/several ways, including as an educational tool with the graphical user interface. As part of the Toolbox, a Radar Altimetry Tutorial gives general information about altimetry, the technique involved and its applications, as well as an overview of past, present and future missions, including information on how to access data and additional software and documentation. It also presents a series of data use cases, covering all uses of altimetry over ocean, cryosphere and land, showing the basic methods for some of the most frequent manners of using altimetry data. Examples from educational use will be presented, and feedback from those who used it as such will be most welcome. BRAT is developed under contract with ESA and CNES. It is available at http://www.altimetry.info and http://earth.esa.int/brat/
Petrova, Guenka; Clerfeuille, Fabrice; Vakrilova, Milena; Mitkov, Cvetomir; Poubanne, Yannick
2008-01-01
The objective of this work is to study the possibilities of the tetraclass model for evaluating changes over time in consumer satisfaction with the provided pharmacy services. Methods During the same 4-month period in 2004 and 2006, approximately 10 pharmacy consumers per working day were questioned. Every consumer evaluated the 34 service elements on a 5-point semantic-differential scale. Correspondence analysis was used for the categorisation of the services. Results Most of the services were categorized as basic ones. For the age group up to 40 years, access to the pharmacy became a key element and external aspects became a secondary element in 2006. For the group of patients who had been using the services of the pharmacy for more than 2 years, availability of a phone connection, quality of answers and product prices moved from plus to secondary elements. The ratio of quality to price moved from the group of basic to key services, and visibility of the prices and hygiene became basic elements instead of secondary ones. During the two-year period, all the service elements connected with the staff, such as availability, identification, appearance, confidence, dress, advice, technical competence, explanation, and time spent with clients, remained basic services. The confidentiality of the staff remained a key element throughout. Conclusion Our study shows that the tetraclass model allows more informed managerial decisions in pharmacies and provides information on the concrete areas of service and possible measures. If a simple statistical program were developed for quick processing of the survey data, the method would become applicable and affordable even for small pharmacies. PMID:25147588
Annual statistical report 2008 : based on data from CARE/EC
DOT National Transportation Integrated Search
2008-10-31
This Annual Statistical Report provides the basic characteristics of road accidents in 19 member states of : the European Union for the period 1997-2006, on the basis of data collected and processed in the CARE : database, the Community Road Accident...
Country Education Profiles: Algeria.
ERIC Educational Resources Information Center
International Bureau of Education, Geneva (Switzerland).
One of a series of profiles prepared by the Cooperative Educational Abstracting Service, this brief outline provides basic background information on educational principles, system of administration, structure and organization, curricula, and teacher training in Algeria. Statistics provided by the Unesco Office of Statistics show enrollment at all…
78 FR 23158 - Organization and Delegation of Duties
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-18
... management actions of major significance, such as those relating to changes in basic organization pattern... regard to rulemaking, enforcement, vehicle safety research and statistics and data analysis, provides... Administrator for the National Center for Statistics and Analysis, and the Associate Administrator for Vehicle...
Performance Data Gathering and Representation from Fixed-Size Statistical Data
NASA Technical Reports Server (NTRS)
Yan, Jerry C.; Jin, Haoqiang H.; Schmidt, Melisa A.; Kutler, Paul (Technical Monitor)
1997-01-01
The two commonly used performance data types in the supercomputing community, statistics and event traces, are discussed and compared. Statistical data are much more compact but lack the probative power event traces offer. Event traces, on the other hand, are unbounded and can easily fill up the entire file system during program execution. In this paper, we propose an innovative methodology for performance data gathering and representation that offers a middle ground. Two basic ideas are employed: the use of averages to replace recording data for each instance, and 'formulae' to represent sequences associated with communication and control flow. The user can trade off tracing overhead and trace data size against data quality incrementally. In other words, the user will be able to limit the amount of trace data collected and, at the same time, carry out some of the analysis event traces offer using space-time views. With the help of a few simple examples, we illustrate the use of these techniques in performance tuning and compare the quality of the traces we collected with event traces. We found that the trace files thus obtained are, indeed, small, bounded and predictable before program execution, and that the quality of the space-time views generated from these statistical data is excellent. Furthermore, experimental results showed that the formulae proposed were able to capture all the sequences associated with 11 of the 15 applications tested. The performance of the formulae can be incrementally improved by allocating more memory at runtime to learn longer sequences.
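The "use of averages to replace recording data for each instance" can be sketched with an online (Welford-style) accumulator, which keeps fixed-size statistics no matter how many events occur; this illustrates the general idea, not the authors' actual instrumentation.

```python
import math

class RunningStats:
    """Welford's online algorithm: mean and variance of a metric without
    storing one trace record per instance."""
    def __init__(self) -> None:
        self.n, self.mean, self.m2 = 0, 0.0, 0.0

    def push(self, x: float) -> None:
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)

    @property
    def stddev(self) -> float:
        return math.sqrt(self.m2 / (self.n - 1)) if self.n > 1 else 0.0

# Hypothetical per-message communication latencies in microseconds.
stats = RunningStats()
for latency in (120.0, 95.0, 130.0, 101.0, 87.0, 143.0):
    stats.push(latency)
print(f"n={stats.n}  mean={stats.mean:.1f} us  sd={stats.stddev:.1f} us")
```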
NASA Astrophysics Data System (ADS)
Stošić, Dušan; Auroux, Aline
The basic principles of calorimetry coupled with other techniques are introduced. These methods are used in heterogeneous catalysis for the characterization of the acidic, basic, and redox properties of solid catalysts. Estimation of these features is achieved by monitoring the interaction of various probe molecules with the surface of such materials. An overview of gas-phase as well as liquid-phase techniques is given. Special attention is devoted to the coupled calorimetry-volumetry method. Furthermore, the influence of different experimental parameters on the results of these techniques is discussed, since it is known that they can significantly influence the evaluation of the catalytic properties of the investigated materials.
Action Implications in Adult Basic Education Programs.
ERIC Educational Resources Information Center
Ohio State Dept. of Education, Columbus.
Eight articles on adult basic education are presented. The articles adapted from 1971 workshop presentations are: Action Implications for ABE Directors by Alan Knox; ABE Budget Development, by Donald G. Butcher; Competent ABE Instructors, by William D. Dowling; Interview Techniques and Training, by Norman Kagan; Reading: The Basic in Adult Basic…
ERIC Educational Resources Information Center
Hobden, Sally
2014-01-01
Information on the HIV/AIDS epidemic in Southern Africa is often interpreted through a veil of secrecy and shame and, I argue, with flawed understanding of basic statistics. This research determined the levels of statistical literacy evident in 316 future Mathematical Literacy teachers' explanations of the median in the context of HIV/AIDS…
Introduction to Statistics. Learning Packages in the Policy Sciences Series, PS-26. Revised Edition.
ERIC Educational Resources Information Center
Policy Studies Associates, Croton-on-Hudson, NY.
The primary objective of this booklet is to introduce students to basic statistical skills that are useful in the analysis of public policy data. A few, selected statistical methods are presented, and theory is not emphasized. Chapter 1 provides instruction for using tables, bar graphs, bar graphs with grouped data, trend lines, pie diagrams,…
Choi, Jin-Woo; Kim, Kack-Kyun; Lee, Jihyun; Choi, Dong-Ju; Kim, Kyung-Nyun
2017-01-01
In addition to dental education, a system for the evaluation and management of dental licensing and certification is required to meet the growing societal demand for more competent dentists. In this study, the Delphi technique was used to gather opinions from a variety of professionals on the problems of and remedies for the dental license management system in Korea. Delphi surveys were conducted from April 2016 to October 2016 in South Korea. A variety of dental professionals were included and categorized into 3 groups according to their expertise as follows: the basic dentistry group, the clinical dentistry group, and the policy group. The Delphi technique was conducted in 3 rounds of e-mail surveys, each with different questions that probed with increasing depth on the dental license management system. In each successive round, the responses were categorized, scored on a Likert scale, and statistically analyzed. After categorizing the results of the first survey and ranking the results of the second survey using the Delphi technique, regulation by a licensing authority was found to be the most critical issue. This was followed by the license renewal system, continuing education, a tiered licensure system, improvement of foreign license approval, and utilization of retirees, in decreasing order of importance. The third Delphi survey showed a similar ranking, with regulation by a licensing authority being the major concern. Opinions regarding the dental license management system were provided as open-ended responses. The responses of the 3 groups showed statistically significant differences in the scores for the issue of regulation by a licensing authority. After re-grouping into the dentistry group and the policy group, the issue received a significantly higher score in the dentistry group. The quality of dental treatment should be managed to protect patients and dental professionals. For this purpose, the establishment of an independent license regulation authority along with legislative changes is required.
A review of second law techniques applicable to basic thermal science research
NASA Astrophysics Data System (ADS)
Drost, M. Kevin; Zamorski, Joseph R.
1988-11-01
This paper reports the results of a review of second law analysis techniques that can contribute to basic research in the thermal sciences. The review demonstrated that second law analysis has a role in basic thermal science research. Unlike traditional techniques, second law analysis accurately identifies the sources and locations of thermodynamic losses. This allows the development of innovative solutions to thermal science problems by directing research to the key technical issues. Two classes of second law techniques were identified as being particularly useful. First, system and component investigations can provide information on the source and nature of irreversibilities on a macroscopic scale. This information helps to identify new research topics and supports the evaluation of current research efforts. Second, the differential approach can provide information on the causes and the spatial and temporal distribution of local irreversibilities. This information enhances the understanding of fluid mechanics, thermodynamics, and heat and mass transfer, and may suggest innovative methods for reducing irreversibilities.
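As a back-of-the-envelope illustration of the kind of loss accounting described above (all values invented): heat transfer across a finite temperature difference generates entropy, and the Gouy-Stodola relation converts that entropy generation into lost work.

```python
# Entropy generation for steady heat transfer across a temperature gap.
Q = 1000.0                      # W, heat transferred
T_hot, T_cold = 600.0, 300.0    # K, source and sink temperatures
T0 = 300.0                      # K, assumed dead-state (ambient) temperature

S_gen = Q / T_cold - Q / T_hot  # W/K, entropy generation rate
lost_work = T0 * S_gen          # Gouy-Stodola: W_lost = T0 * S_gen
print(f"S_gen = {S_gen:.2f} W/K, lost work = {lost_work:.0f} W")  # 1.67 W/K, 500 W
```

Locating where S_gen is largest is precisely what, per the review, distinguishes second law analysis from energy-balance methods.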
Geology orbiter comparison study
NASA Technical Reports Server (NTRS)
Cutts, J. A. J.; Blasius, K. R.; Davis, D. R.; Pang, K. D.; Shreve, D. C.
1977-01-01
Instrument requirements of planetary geology orbiters were examined with the objective of determining the feasibility of applying standard instrument designs to a host of terrestrial targets. Within the basic discipline area of geochemistry, gamma-ray, X-ray fluorescence, and atomic spectroscopy remote sensing techniques were considered. Within the discipline area of geophysics, the complementary techniques of gravimetry and radar were studied. Experiments using these techniques were analyzed for comparison at the Moon, Mercury, Mars and the Galilean satellites. On the basis of these comparative assessments, the adaptability of each sensing technique was judged as a basic technique for many targets, as a single instrument applied to many targets, as a single instrument used in different mission modes, and as an instrument capability for nongeoscience objectives.
Parton, Jason M.
2014-01-01
Objective. To adapt a classroom assessment technique (CAT) from an anthropology course to a diabetes module in a clinical pharmacy skills laboratory and to determine student knowledge retention from baseline. Design. Diabetes item stems, focused on module objectives, replaced anthropology terms. Answer choices, coded to Bloom’s Taxonomy, were expanded to include higher-order thinking. Students completed the online 5-item probe 4 times: prelaboratory lecture, postlaboratory, and at 6 months and 12 months after laboratory. Statistical analyses utilized a single factor, repeated measures design using rank transformations of means with a Mann-Whitney-Wilcoxon test. Assessment. The CAT revealed a significant increase in knowledge from prelaboratory compared to all postlaboratory measurements (p<0.0001). Significant knowledge retention was maintained with basic terms, but declined with complex terms between 6 and 12 months. Conclusion. The anthropology assessment tool was effectively adapted using Bloom’s Taxonomy as a guide and, when used repeatedly, demonstrated knowledge retention. Minimal time was devoted to application of the probe making it an easily adaptable CAT. PMID:25258445
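For readers unfamiliar with the rank-based comparison mentioned above, here is a hedged sketch using SciPy; the scores are invented, and the test shown (a Wilcoxon signed-rank test on paired pre/post scores) is a simple stand-in for the study's repeated-measures analysis.

```python
# Paired nonparametric comparison of pre- vs post-laboratory probe scores.
from scipy import stats

pre  = [2, 3, 1, 2, 2, 3, 1, 2, 3, 2]   # hypothetical 5-item probe scores
post = [4, 5, 3, 4, 5, 4, 3, 5, 4, 4]

stat, p = stats.wilcoxon(pre, post)
print(f"W = {stat:.1f}, p = {p:.4f}")    # a small p indicates knowledge gain
```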
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hudson, W.G.
Scapteriscus vicinus is the most important pest of turf and pasture grasses in Florida. This study develops a method of correlating sample results with true population density and provides the first quantitative information on the spatial distribution and movement patterns of mole crickets. Three basic techniques for sampling mole crickets were compared: soil flushes, a soil corer, and pitfall trapping. No statistical difference was found between the soil corer and soil flushing. Soil flushing was shown to be more sensitive to changes in population density than pitfall trapping. No technique was effective for sampling adults. Regression analysis provided a means of adjusting for the effects of soil moisture and showed soil temperature to be unimportant in predicting the efficiency of flush sampling. Cesium-137 was used to label females for subsequent location underground. Comparison of the mean distance to nearest neighbor with the distance predicted by a random distribution model showed that the observed distance in the spring was significantly greater than hypothesized (Student's t-test, p < 0.05). Fall adult nearest-neighbor distance did not differ from that predicted by the random distribution hypothesis.
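For readers who want the spatial test in miniature: the sketch below (synthetic coordinates, not the study's field data) compares the observed mean nearest-neighbor distance with the value expected under complete spatial randomness, E[d] = 1/(2*sqrt(density)), using a one-sample t-test.

```python
# Nearest-neighbor distance test against a random (Poisson) distribution.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
area = 100.0 * 100.0                      # plot size in m^2
pts = rng.uniform(0, 100, size=(40, 2))   # hypothetical cricket locations

# distance from each individual to its nearest neighbor
d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=2)
np.fill_diagonal(d, np.inf)
nn = d.min(axis=1)

expected = 1.0 / (2.0 * np.sqrt(len(pts) / area))
t, p = stats.ttest_1samp(nn, expected)
print(f"observed mean = {nn.mean():.2f} m, expected = {expected:.2f} m, p = {p:.3f}")
```

A mean significantly greater than expected indicates overdispersion (regular spacing), as the study reported for spring.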
Connected Text Reading and Differences in Text Reading Fluency in Adult Readers
Wallot, Sebastian; Hollis, Geoff; van Rooij, Marieke
2013-01-01
The process of connected text reading has received very little attention in contemporary cognitive psychology. This lack of attention is in part due to a research tradition that emphasizes the role of basic lexical constituents, which can be studied in isolated words or sentences. However, it is also in part due to the lack of statistical analysis techniques that accommodate interdependent time series. In this study, we investigate text reading performance with traditional and nonlinear analysis techniques and show how outcomes from multiple analyses can be used to create a more detailed picture of the process of text reading. Specifically, we investigate the reading performance of groups of literate adult readers who differ in reading fluency during a self-paced text reading task. Our results indicate that classical metrics of reading (such as word frequency) do not capture text reading very well, and that classical measures of reading fluency (such as average reading time) distinguish relatively poorly between participant groups. Nonlinear analyses of distribution tails and reading time fluctuations provide more fine-grained information about the reading process and reading fluency. PMID:23977177
Szabo, J.K.; Fedriani, E.M.; Segovia-Gonzalez, M. M.; Astheimer, L.B.; Hooper, M.J.
2010-01-01
This paper introduces a new technique in ecology to analyze spatial and temporal variability in environmental variables. Using simple statistics, we explore the relations between abiotic and biotic variables that influence animal distributions. However, spatial and temporal variability in rainfall, a key variable in ecological studies, can cause difficulties for any basic model that includes time evolution. The study was at a landscape scale (three million square kilometers in eastern Australia), mainly over the period 1998-2004. We simultaneously considered qualitative spatial (soil and habitat types) and quantitative temporal (rainfall) variables in a Geographical Information System environment. In addition to some techniques commonly used in ecology, we applied a new method, Functional Principal Component Analysis, which proved to be very suitable for this case, as it explained more than 97% of the total variance of the rainfall data, providing us with substitute variables that are easier to manage and are even able to explain rainfall patterns. The main variable came from a habitat classification that showed strong correlations with rainfall values and soil types. © 2010 World Scientific Publishing Company.
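As a rough, self-contained illustration of the FPCA idea (synthetic curves, not the Australian rainfall data): with densely sampled, smooth curves, functional PCA can be approximated by an SVD of the centered data matrix, and the singular values give the variance explained by each component.

```python
# Approximate FPCA on monthly "rainfall" curves via SVD.
import numpy as np

rng = np.random.default_rng(0)
months = np.arange(12)
# 50 hypothetical sites: a seasonal signal plus site-level amplitude and noise
base = 50 + 30 * np.sin(2 * np.pi * months / 12)
X = base + rng.normal(0, 5, (50, 12)) * rng.uniform(0.5, 2.0, (50, 1))

Xc = X - X.mean(axis=0)                  # center the curves
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = s**2 / np.sum(s**2)
print("variance explained by first 3 components:", explained[:3].sum())
scores = Xc @ Vt[:3].T                   # low-dimensional substitute variables
```

The `scores` array plays the role of the paper's "substitute variables": a few numbers per site that summarize each full rainfall curve.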
Regression analysis for solving diagnosis problem of children's health
NASA Astrophysics Data System (ADS)
Cherkashina, Yu A.; Gerget, O. M.
2016-04-01
This paper presents the results of research devoted to the application of statistical techniques, namely regression analysis, to assess the health status of children in the neonatal period, based on medical data (hemostatic parameters, blood test parameters, gestational age, vascular endothelial growth factor) measured at 3-5 days of life. A detailed description of the medical data studied is given, and a binary logistic regression procedure is discussed. The basic results of the research are presented: a classification table of predicted versus observed values is shown, and the overall percentage of correct recognition is determined. Regression equation coefficients are calculated, and the general regression equation is written from them. Based on the results of the logistic regression, ROC analysis was performed; the sensitivity and specificity of the model are calculated and ROC curves are constructed. These mathematical techniques allow the diagnosis of children's health with a high quality of recognition. The results make a significant contribution to the development of evidence-based medicine and have high practical importance in the professional activity of the author.
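A hedged sketch of the pipeline described above, using scikit-learn on placeholder data (the features and labels are invented, not the study's measurements): fit a binary logistic regression, then run ROC analysis.

```python
# Binary logistic regression followed by ROC analysis.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(42)
X = rng.normal(size=(200, 4))            # stand-ins for clinical covariates
w = np.array([1.0, -0.5, 0.8, 0.0])
y = (X @ w + rng.normal(0, 1, 200) > 0).astype(int)

model = LogisticRegression().fit(X, y)
prob = model.predict_proba(X)[:, 1]
fpr, tpr, thresholds = roc_curve(y, prob)
print("coefficients:", model.coef_.round(2))
print("AUC:", roc_auc_score(y, prob).round(3))
# at any chosen threshold: sensitivity = tpr, specificity = 1 - fpr
```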
Anomaly-based intrusion detection for SCADA systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yang, D.; Usynin, A.; Hines, J. W.
2006-07-01
Most critical infrastructure, such as chemical processing plants, electrical generation and distribution networks, and gas distribution, is monitored and controlled by Supervisory Control and Data Acquisition (SCADA) systems. These systems have been the focus of increased security attention, and there are concerns that they could be the target of international terrorists. With the constantly growing number of internet-related computer attacks, there is evidence that our critical infrastructure may also be vulnerable. Researchers estimated that malicious online actions could cause $75 billion in damage in 2007. One interesting countermeasure for enhancing information system security is intrusion detection. This paper briefly discusses the history of research in intrusion detection techniques and introduces the two basic detection approaches: signature detection and anomaly detection. Finally, it presents the application of techniques developed for monitoring critical process systems, such as nuclear power plants, to anomaly intrusion detection. The method uses an auto-associative kernel regression (AAKR) model coupled with the sequential probability ratio test (SPRT) and is applied to a simulated SCADA system. The results show that these methods can be generally used to detect a variety of common attacks. (authors)
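The SPRT half of the method is simple enough to sketch. Below is a minimal, generic implementation (not the authors' code, and without the AAKR residual model): it accumulates the Gaussian log-likelihood ratio over model residuals until a Wald decision boundary is crossed.

```python
# Sequential probability ratio test on model residuals:
# H0: residuals ~ N(0, s^2) (normal)  vs  H1: residuals ~ N(m, s^2) (anomaly).
import math

def sprt(residuals, m=1.0, s=1.0, alpha=0.01, beta=0.01):
    upper = math.log((1 - beta) / alpha)   # crossing -> accept H1 (anomaly)
    lower = math.log(beta / (1 - alpha))   # crossing -> accept H0 (normal)
    llr = 0.0
    for i, r in enumerate(residuals):
        # log-likelihood ratio increment for a Gaussian mean shift
        llr += (m / s**2) * (r - m / 2.0)
        if llr >= upper:
            return f"anomaly detected at sample {i}"
        if llr <= lower:
            return "normal behaviour accepted"
    return "undecided"

print(sprt([0.1, -0.2, 0.9, 1.3, 1.1, 1.4]))
```

The appeal of the SPRT in this setting is that the false-alarm and missed-alarm rates (alpha, beta) are design inputs rather than after-the-fact measurements.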
NASA Astrophysics Data System (ADS)
Aydogan, Selen
This dissertation considers the problem of process synthesis and design of life-support systems for manned space missions. A life-support system is a set of technologies to support human life for short and long-term spaceflights, via providing the basic life-support elements, such as oxygen, potable water, and food. The design of the system needs to meet the crewmember demand for the basic life-support elements (products of the system) and it must process the loads generated by the crewmembers. The system is subject to a myriad of uncertainties because most of the technologies involved are still under development. The result is high levels of uncertainties in the estimates of the model parameters, such as recovery rates or process efficiencies. Moreover, due to the high recycle rates within the system, the uncertainties are amplified and propagated within the system, resulting in a complex problem. In this dissertation, two algorithms have been successfully developed to help making design decisions for life-support systems. The algorithms utilize a simulation-based optimization approach that combines a stochastic discrete-event simulation and a deterministic mathematical programming approach to generate multiple, unique realizations of the controlled evolution of the system. The timelines are analyzed using time series data mining techniques and statistical tools to determine the necessary technologies, their deployment schedules and capacities, and the necessary basic life-support element amounts to support crew life and activities for the mission duration.
Schoenthaler, Martin; Avcil, Tuba; Sevcenco, Sabina; Nagele, Udo; Hermann, Thomas E W; Kuehhas, Franklin E; Shariat, Shahrokh F; Frankenschmidt, Alexander; Wetterauer, Ulrich; Miernik, Arkadiusz
2015-01-01
To evaluate the Single-Incision Transumbilical Surgery (SITUS) technique as compared to an established laparoendoscopic single-site surgery (LESS) technique (Single-Port Laparoscopic Surgery, SPLS) and conventional laparoscopy (CLS) in a surgical simulator model. Sixty-three medical students without previous laparoscopic experience were randomly assigned to one of the three groups (SITUS, SPLS and CLS). Subjects were asked to perform five standardized tasks of increasing difficulty adopted from the Fundamentals of Laparoscopic Surgery curriculum. Statistical evaluation included task completion times and accuracy. Overall performances of all tasks (except precision cutting) were significantly faster and of higher accuracy in the CLS and SITUS groups than in the SPLS group (p = 0.004 to p < 0.001). CLS and SITUS groups alone showed no significant difference in performance times and accuracy measurements for all tasks (p = 0.048 to p = 0.989). SITUS proved to be a simple, but highly effective technique to overcome restrictions of SPLS. In a surgical simulator model, novices were able to achieve task performances comparable to CLS and did significantly better than using a port-assisted LESS technique such as SPLS. The demonstrated advantages of SITUS may be attributed to a preservation of the basic principles of conventional laparoscopy, such as the use of straight instruments and an adequate degree of triangulation.
Bag-of-features approach for improvement of lung tissue classification in diffuse lung disease
NASA Astrophysics Data System (ADS)
Kato, Noriji; Fukui, Motofumi; Isozaki, Takashi
2009-02-01
Many automated techniques have been proposed to classify diffuse lung disease patterns. Most of the techniques utilize texture analysis approaches with second and higher order statistics, and show successful classification result among various lung tissue patterns. However, the approaches do not work well for the patterns with inhomogeneous texture distribution within a region of interest (ROI), such as reticular and honeycombing patterns, because the statistics can only capture averaged feature over the ROI. In this work, we have introduced the bag-of-features approach to overcome this difficulty. In the approach, texture images are represented as histograms or distributions of a few basic primitives, which are obtained by clustering local image features. The intensity descriptor and the Scale Invariant Feature Transformation (SIFT) descriptor are utilized to extract the local features, which have significant discriminatory power due to their specificity to a particular image class. In contrast, the drawback of the local features is lack of invariance under translation and rotation. We improved the invariance by sampling many local regions so that the distribution of the local features is unchanged. We evaluated the performance of our system in the classification task with 5 image classes (ground glass, reticular, honeycombing, emphysema, and normal) using 1109 ROIs from 211 patients. Our system achieved high classification accuracy of 92.8%, which is superior to that of the conventional system with the gray level co-occurrence matrix (GLCM) feature especially for inhomogeneous texture patterns.
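The bag-of-features representation itself is straightforward to prototype. The toy sketch below (random descriptors standing in for intensity or SIFT features) builds a k-means codebook and turns each ROI's descriptors into a codeword histogram for a downstream classifier.

```python
# Bag-of-features: cluster local descriptors into a codebook, then
# represent each ROI as a normalized histogram of codeword counts.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# pretend each ROI yields many local descriptors (rows = patches)
descriptors_per_roi = [rng.normal(c, 1.0, size=(100, 8)) for c in (0, 2, 4)]

codebook = KMeans(n_clusters=5, n_init=10, random_state=0)
codebook.fit(np.vstack(descriptors_per_roi))

def bof_histogram(desc):
    words = codebook.predict(desc)
    return np.bincount(words, minlength=5) / len(words)

for i, desc in enumerate(descriptors_per_roi):
    print(f"ROI {i} histogram:", bof_histogram(desc).round(2))
```

Because the histogram pools over many sampled patches, it tolerates inhomogeneous texture within an ROI, which is the failure mode of ROI-averaged statistics noted above.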
Choi, Kihwan; Li, Ruijiang; Nam, Haewon; Xing, Lei
2014-06-21
As a solution to iterative CT image reconstruction, first-order methods are prominent for their large-scale capability and fast convergence rate. In practice, the CT system matrix with a large condition number may lead to slow convergence speed despite the theoretically promising upper bound. The aim of this study is to develop a Fourier-based scaling technique to enhance the convergence speed of first-order methods applied to CT image reconstruction. Instead of working in the projection domain, we transform the projection data and construct a data fidelity model in Fourier space. Inspired by the filtered backprojection formalism, the data are appropriately weighted in Fourier space. We formulate an optimization problem based on weighted least-squares in Fourier space and total-variation (TV) regularization in image space for parallel-beam, fan-beam and cone-beam CT geometries. To achieve the maximum computational speed, the optimization problem is solved using a fast iterative shrinkage-thresholding algorithm with backtracking line search and a GPU implementation of projection/backprojection. The performance of the proposed algorithm is demonstrated through a series of digital simulation and experimental phantom studies. The results are compared with existing TV-regularized techniques based on statistics-based weighted least-squares as well as the basic algebraic reconstruction technique. The proposed Fourier-based compressed sensing (CS) method significantly improves both the image quality and the convergence rate compared to the existing CS techniques.
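To show the algorithmic skeleton only: the sketch below is a generic FISTA loop for l1-regularized least squares, not the paper's Fourier-weighted data term with TV regularization; it illustrates the momentum-plus-shrinkage structure that the proposed scaling accelerates.

```python
# Generic FISTA for min_x 0.5*||Ax - b||^2 + lam*||x||_1.
import numpy as np

def soft_threshold(x, t):
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def fista(A, b, lam, n_iter=200):
    L = np.linalg.norm(A, 2) ** 2           # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    z = x.copy()
    t = 1.0
    for _ in range(n_iter):
        x_new = soft_threshold(z - (A.T @ (A @ z - b)) / L, lam / L)
        t_new = (1 + np.sqrt(1 + 4 * t * t)) / 2
        z = x_new + ((t - 1) / t_new) * (x_new - x)   # momentum step
        x, t = x_new, t_new
    return x

rng = np.random.default_rng(0)
A = rng.normal(size=(60, 100))
x_true = np.zeros(100); x_true[:5] = 3.0
x_hat = fista(A, A @ x_true, lam=0.1)
print("recovered support:", np.flatnonzero(np.abs(x_hat) > 0.5))
```

The paper's contribution can be read as reshaping `A` (via Fourier weighting) so that the effective condition number, and hence the practical iteration count of exactly this kind of loop, drops.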
Use of chick embryo in screening for teratogenicity.
Kotwani, A
1998-04-01
A teratology screening system should detect agents hazardous to the conceptus before they can perturb embryonic development in humans. The backlog of untested chemicals and the rate at which new substances enter the market exceed the capacity of developmental effects testing by the standard in vivo method. Thus, cheaper, quicker in vitro systems afford a unique opportunity for investigating the direct interaction of substances with developing morphogenetic systems (MGSs), since maternal influences are excluded. As a carrier of a complete set of MGSs, the chick embryo in ovo has an advantage over those in vitro systems that employ isolated embryos or embryonic tissues, which have only limited survival. Under controlled experimental conditions, including standardization of subjects, administration technique, and mode of evaluation according to the basic principles of teratology, the chick embryo test has been demonstrated to be reliable and to afford quantifiable end points for evaluation. Individual compounds, mixtures of compounds, and agonists and antagonists can easily be administered and tested. The chick embryo possesses its own basic enzyme-catalyzed drug-transformation capacity and, moreover, can be used for screening specific human metabolites. Newer techniques, e.g., the chick embryotoxicity screening test (CHEST) and the chick embryo blastoderm model, are described in detail. The chick embryo fulfills all the criteria that a test should have at a lower tier of a tiered system in teratological studies, i.e., modest laboratory equipment, moderate skill, minimal expenditure of time and money, ease of access to the embryo, known embryological development, the possibility of experimenting on a large scale for statistically valid results, and no requirement for whole animals.
Linden, Ariel; Adams, John L
2011-12-01
Often, when conducting programme evaluations or studying the effects of policy changes, researchers may only have access to aggregated time series data, presented as observations spanning both the pre- and post-intervention periods. The most basic analytic model using these data requires only a single group and models the intervention effect using repeated measurements of the dependent variable. This model controls for regression to the mean and is likely to detect a treatment effect if it is sufficiently large. However, many potential sources of bias still remain. Adding one or more control groups to this model could strengthen causal inference if the groups are comparable on pre-intervention covariates and level and trend of the dependent variable. If this condition is not met, the validity of the study findings could be called into question. In this paper we describe a propensity score-based weighted regression model, which overcomes these limitations by weighting the control groups to represent the average outcome that the treatment group would have exhibited in the absence of the intervention. We illustrate this technique studying cigarette sales in California before and after the passage of Proposition 99 in California in 1989. While our results were similar to those of the Synthetic Control method, the weighting approach has the advantage of being technically less complicated, rooted in regression techniques familiar to most researchers, easy to implement using any basic statistical software, may accommodate any number of treatment units, and allows for greater flexibility in the choice of treatment effect estimators. © 2010 Blackwell Publishing Ltd.
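A stylized sketch of the weighting idea on synthetic data (the weights below are a crude pre-period matching heuristic, not the authors' propensity score model): reweight the control series toward the treated unit's pre-period level, then estimate the post-intervention effect with a standard interaction regression.

```python
# Weighted-control interrupted time series with a synthetic -8 effect.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
T = 20
post = (np.arange(T) >= 10).astype(float)
treated = 100 - 0.5 * np.arange(T) - 8 * post + rng.normal(0, 1, T)
controls = (np.array([95.0, 105.0, 110.0])[:, None]
            - 0.5 * np.arange(T) + rng.normal(0, 1, (3, T)))

# crude weights: controls whose pre-period mean is closer get more weight
pre_gap = np.abs(controls[:, :10].mean(axis=1) - treated[:10].mean())
w = 1.0 / (pre_gap + 1e-6)
w /= w.sum()
weighted_control = w @ controls

y = np.concatenate([treated, weighted_control])
group = np.repeat([1.0, 0.0], T)
post2 = np.tile(post, 2)
X = sm.add_constant(np.column_stack([group, post2, group * post2]))
fit = sm.OLS(y, X).fit()
print("estimated intervention effect:", round(fit.params[-1], 2))  # approx -8
```

As the abstract notes, the attraction of this family of estimators is exactly what the sketch shows: everything after the weighting step is ordinary regression, available in any basic statistical software.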
Applying dynamic Bayesian networks to perturbed gene expression data.
Dojer, Norbert; Gambin, Anna; Mizera, Andrzej; Wilczyński, Bartek; Tiuryn, Jerzy
2006-05-08
A central goal of molecular biology is to understand the regulatory mechanisms of gene transcription and protein synthesis. Because of their solid basis in statistics, which allows the stochastic aspects of gene expression and noisy measurements to be handled in a natural way, Bayesian networks appear attractive for inferring gene interaction structure from microarray experiment data. However, the basic formalism has some disadvantages; e.g., it is sometimes hard to distinguish between the origin and the target of an interaction. Two kinds of microarray experiments yield data particularly rich in information regarding the direction of interactions: time series and perturbation experiments. In order to handle them correctly, the basic formalism must be modified. For example, dynamic Bayesian networks (DBNs) apply to time series microarray data. To our knowledge, the DBN technique has not been applied in the context of perturbation experiments. We extend the framework of dynamic Bayesian networks in order to incorporate perturbations. Moreover, an exact algorithm for inferring an optimal network is proposed, and a discretization method specialized for time series data from perturbation experiments is introduced. We apply our procedure to realistic simulation data. The results are compared with those obtained by standard DBN learning techniques, and the advantages of using an exact learning algorithm instead of heuristic methods are analyzed. We show that the quality of inferred networks dramatically improves when using data from perturbation experiments. We also conclude that the exact algorithm should be used when possible, i.e., when the considered set of genes is small enough.
75 FR 33203 - Funding Formula for Grants to States
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-11
... as Social Security numbers, birth dates, and medical data. Docket: To read or download submissions or... Local Area Unemployment Statistics (LAUS), both of which are compiled by DOL's Bureau of Labor Statistics. Specifies how each State's basic JVSG allocation is calculated. Identifies the procedures...
Statistical Considerations for Establishing CBTE Cut-Off Scores.
ERIC Educational Resources Information Center
Trzasko, Joseph A.
This report gives the basic definition and purpose of competency-based teacher education (CBTE) cut-off scores. It describes the basic characteristics of CBTE as a yes-no dichotomous decision regarding the presence of a specific ability or knowledge, which necessitates the establishment of a cut-off point to designate competency vs. incompetency on…
ADULT BASIC EDUCATION. PROGRAM SUMMARY.
ERIC Educational Resources Information Center
Office of Education (DHEW), Washington, DC.
A BRIEF DESCRIPTION IS GIVEN OF THE FEDERAL ADULT BASIC EDUCATION PROGRAM, UNDER THE ADULT EDUCATION ACT OF 1966, AT THE NATIONAL AND STATE LEVELS (INCLUDING PUERTO RICO, GUAM, AMERICAN SAMOA, AND THE VIRGIN ISLANDS) AS PROVIDED BY STATE EDUCATION AGENCIES. STATISTICS FOR FISCAL YEARS 1965 AND 1966, AND ESTIMATES FOR FISCAL YEAR 1967, INDICATE…
Action Research of Computer-Assisted-Remediation of Basic Research Concepts.
ERIC Educational Resources Information Center
Packard, Abbot L.; And Others
This study investigated the possibility of creating a computer-assisted remediation program to assist students having difficulties in basic college research and statistics courses. A team approach involving instructors and students drove the research into and creation of the computer program. The effect of student use was reviewed by looking at…
Introduction to Probability, Part 1 - Basic Concepts. Student Text. Revised Edition.
ERIC Educational Resources Information Center
Blakeslee, David W.; And Others
This book is designed to introduce the reader to some fundamental ideas about probability. The mathematical theory of probability plays an increasingly important role in science, government, industry, business, and economics. An understanding of the basic concepts of probability is essential for the study of statistical methods that are widely…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-20
... DEPARTMENT OF HEALTH AND HUMAN SERVICES Food and Drug Administration [Docket No. FDA-2012-D-0419... who conduct studies using active controls and have a basic understanding of statistical principles... clinical investigators who conduct studies using active controls and have a basic understanding of...
Basic principles of management for cervical spine trauma
2009-01-01
This article reviews the basic principles of management of cervical trauma. The technique and critical importance of careful assessment is described. Instability is defined, and the incidence of a second injury is highlighted. The concept of spinal clearance is discussed. Early reduction and stabilisation techniques are described, and the indications, and approach for surgery reviewed. The importance of the role of post-injury rehabilitation is identified. PMID:19701655
Evaluation of Noncontact Power Collection Techniques
DOT National Transportation Integrated Search
1972-07-01
An evaluation is made of four basic noncontacting techniques of power collection which have possible applicability in future high speed ground transportation systems. The techniques considered include the electric arc, magnetic induction, electrostat...
Combining statistical inference and decisions in ecology.
Williams, Perry J; Hooten, Mevin B
2016-09-01
Statistical decision theory (SDT) is a sub-field of decision theory that formally incorporates statistical investigation into a decision-theoretic framework to account for uncertainties in a decision problem. SDT provides a unifying analysis of three types of information: statistical results from a data set, knowledge of the consequences of potential choices (i.e., loss), and prior beliefs about a system. SDT links the theoretical development of a large body of statistical methods, including point estimation, hypothesis testing, and confidence interval estimation. The theory and application of SDT have mainly been developed and published in the fields of mathematics, statistics, operations research, and other decision sciences, but have had limited exposure in ecology. Thus, we provide an introduction to SDT for ecologists and describe its utility for linking the conventionally separate tasks of statistical investigation and decision making in a single framework. We describe the basic framework of both Bayesian and frequentist SDT, its traditional use in statistics, and discuss its application to decision problems that occur in ecology. We demonstrate SDT with two types of decisions: Bayesian point estimation and an applied management problem of selecting a prescribed fire rotation for managing a grassland bird species. Central to SDT, and decision theory in general, are loss functions. Thus, we also provide basic guidance and references for constructing loss functions for an SDT problem. © 2016 by the Ecological Society of America.
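Since loss functions are central to SDT, a tiny illustration may help. The following sketch (simulated posterior draws, not the paper's fire-rotation example) shows how the Bayes point estimate changes with the loss function: the posterior mean under squared-error loss, the posterior median under absolute-error loss.

```python
# Bayes point estimates under two different loss functions.
import numpy as np

rng = np.random.default_rng(7)
posterior = rng.gamma(shape=2.0, scale=1.5, size=100_000)  # skewed posterior

print("Bayes estimate, squared-error loss :", posterior.mean().round(3))
print("Bayes estimate, absolute-error loss:", np.median(posterior).round(3))
# The skew pulls the mean above the median, so the "best" decision
# genuinely depends on which loss the analyst chooses.
```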
WE-A-304-01: Strategies and Technologies for Cranial Radiosurgery Planning: MLC-Based Linac
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, G.
2015-06-15
The high fractional doses, stringent requirements for accuracy and precision, and surgical perspective characteristic of intracranial radiosurgery create considerations for treatment planning which are distinct from most other radiotherapy procedures. This session will introduce treatment planning techniques specific to two popular intracranial SRS modalities: Gamma Knife and MLC-based Linac. The basic treatment delivery characteristics of each device will be reviewed with a focus on how those characteristics determine the paradigm used for treatment planning. Basic techniques for treatment planning will be discussed, including considerations such as isodose selection, target and organ-at-risk definition, quality indices, and protection of critical structures. Future directions for SRS treatment planning will also be discussed. Learning Objectives: Introduce the basic physical principles of intracranial radiosurgery and how they are realized in the treatment planning paradigms for Gamma Knife and Linac radiosurgery. Demonstrate basic treatment planning techniques. Discuss metrics for evaluating SRS treatment plan quality. Discuss recent and future advances in SRS treatment planning. D. Schlesinger receives research support from Elekta, AB.
WE-A-304-02: Strategies and Technologies for Cranial Radiosurgery Planning: Gamma Knife
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schlesinger, D.
2015-06-15
The high fractional doses, stringent requirements for accuracy and precision, and surgical perspective characteristic of intracranial radiosurgery create considerations for treatment planning which are distinct from most other radiotherapy procedures. This session will introduce treatment planning techniques specific to two popular intracranial SRS modalities: Gamma Knife and MLC-based Linac. The basic treatment delivery characteristics of each device will be reviewed with a focus on how those characteristics determine the paradigm used for treatment planning. Basic techniques for treatment planning will be discussed, including considerations such as isodose selection, target and organ-at-risk definition, quality indices, and protection of critical structures. Future directions for SRS treatment planning will also be discussed. Learning Objectives: Introduce the basic physical principles of intracranial radiosurgery and how they are realized in the treatment planning paradigms for Gamma Knife and Linac radiosurgery. Demonstrate basic treatment planning techniques. Discuss metrics for evaluating SRS treatment plan quality. Discuss recent and future advances in SRS treatment planning. D. Schlesinger receives research support from Elekta, AB.
Peers versus professional training of basic life support in Syria: a randomized controlled trial.
Abbas, Fatima; Sawaf, Bisher; Hanafi, Ibrahem; Hajeer, Mohammad Younis; Zakaria, Mhd Ismael; Abbas, Wafaa; Alabdeh, Fadi; Ibrahim, Nazir
2018-06-18
Peer training has been identified as a useful tool for delivering undergraduate training in basic life support (BLS), which is fundamental as an initial response in cases of emergency. This study aimed (1) to evaluate the efficacy of a peer-led model in basic life support training among medical students in their first three years of study, compared to professional-led training, and (2) to assess the efficacy of the course program and students' satisfaction with peer-led training. A randomized controlled trial with blinded assessors was conducted on 72 medical students from the pre-clinical years (1st to 3rd years in Syria) at the Syrian Private University. Students were randomly assigned to a peer-led or a professional-led training group for a one-day course in basic life support skills. Sixty-four students who underwent checklist-based assessment using an objective structured clinical examination (OSCE) design (practical assessment of BLS skills) and answered a BLS knowledge checkpoint questionnaire were included in the analysis. There was no statistically significant difference between the two groups in delivering BLS skills to medical students in the practical (P = 0.850) or BLS knowledge questionnaire outcomes (P = 0.900). Both groups showed statistically significant improvement from pre- to post-course assessment in both practical skills and theoretical knowledge (P-value < 0.001). Students were satisfied with the peer model of training. Peer-led training of basic life support for medical students was beneficial, providing education as effective as training conducted by professionals. This method is applicable and desirable, especially in resource-poor countries and in crisis situations.
ERIC Educational Resources Information Center
Owoh, Titus M.
2016-01-01
This study sought to find out the relationship between students perception of their teacher effectiveness and academic achievement in Basic Technology. Teacher's personality, teaching techniques/classroom management strategy and appearance, all integrate to make for teacher effectiveness. To carry out this research, two research questions and one…
Research-Based Reading Instruction in an Adult Basic Education Program
ERIC Educational Resources Information Center
Perin, Dolores; Greenberg, Daphne
2007-01-01
There is a growing emphasis in adult basic education on research-based reading instruction. Using Kruidenier's (2002) framework of principles and trends, we describe research-based techniques found during a visit to an adult basic education program. We also describe how the program moved to research-based instruction, and the factors that seem…
Survival Skills: A Basic Skills Program.
ERIC Educational Resources Information Center
Mahoney, Don
The guide describes an approach designed to promote the basic skills of hearing impaired students. Basic or survival skills are identified which cover the student's daily functioning at home, at school, and in the community. The guide is aimed at the 10-15 year old hearing impaired student, but the techniques are expected to be applicable to both…
An adaptive approach to the dynamic allocation of buffer storage. M.S. Thesis
NASA Technical Reports Server (NTRS)
Crooke, S. C.
1970-01-01
Several strategies for the dynamic allocation of buffer storage are simulated and compared. The basic algorithms investigated, using actual statistics observed in the Univac 1108 EXEC 8 System, include the buddy method and the first-fit method. Modifications are made to the basic methods in an effort to improve and to measure allocation performance. A simulation model of an adaptive strategy is developed which permits interchanging the two different methods, the buddy and the first-fit methods with some modifications. Using an adaptive strategy, each method may be employed in the statistical environment in which its performance is superior to the other method.
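As a concrete illustration of one of the two strategies compared (a toy sketch, not the thesis code or the Univac environment), here is a first-fit allocation step over a free list; the buddy method would instead split and recombine power-of-two blocks.

```python
# First-fit: place each request in the first hole large enough to hold it.
def first_fit(free_list, request):
    """free_list: list of (offset, size) holes; returns allocation offset or None."""
    for i, (off, size) in enumerate(free_list):
        if size >= request:
            if size == request:
                free_list.pop(i)             # hole consumed exactly
            else:
                free_list[i] = (off + request, size - request)
            return off
    return None                              # allocation fails

holes = [(0, 4), (10, 16), (40, 8)]
print(first_fit(holes, 8))   # 10 -> the first hole that fits
print(holes)                 # [(0, 4), (18, 8), (40, 8)]
```

An adaptive allocator of the kind the thesis simulates would monitor request statistics like these at runtime and switch between first-fit and buddy behavior accordingly.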
Ultrasound Dopplerography of abdomen pathology using statistical computer programs
NASA Astrophysics Data System (ADS)
Dmitrieva, Irina V.; Arakelian, Sergei M.; Wapota, Alberto R. W.
1998-04-01
Modern ultrasound Dopplerography offers great possibilities for investigating hemodynamic changes at all stages of abdominal pathology. Much research has been devoted to the use of noninvasive methods in practical medicine, and ultrasound Dopplerography is now one of the basic ones. We investigated 250 patients aged 30 to 77 years, including 149 men and 101 women. The primary diagnosis of all patients was ischaemic pancreatitis; secondary diagnoses included ischaemic heart disease, hypertension, atherosclerosis, diabetes, and vascular disease of the extremities. We examined the abdominal aorta and its branches: arteria mesenterica superior (AMS), truncus coeliacus (TC), arteria hepatica communis (AHC), and arteria lienalis (AL). The following equipment was used: ACUSON 128 XP/10c, BIOMEDIC, GENERAL ELECTRIC (USA, Japan). We analyzed the following components of hemodynamic change in the abdominal vessels: pulsatility index, resistance index, systolic/diastolic ratio, and blood flow velocity. The statistical software comprised a basic statistics package and an analytic program. In conclusion, we found that all hemodynamic components of the abdominal vessels showed considerably greater changes in abdominal ischaemia than in the normal situation. Using the computer program to quantify the degree of hemodynamic change, we can recommend an individual plan for diagnosis and treatment.
Resilience Among Students at the Basic Enlisted Submarine School
2016-12-01
reported resilience. The Hayes macro in the Statistical Package for the Social Sciences (SPSS) was used to uncover factors relevant to mediation analysis. Findings suggest that the encouragement of...
A Simple Statistical Thermodynamics Experiment
ERIC Educational Resources Information Center
LoPresto, Michael C.
2010-01-01
Comparing the predicted and actual rolls of combinations of both two and three dice can help to introduce many of the basic concepts of statistical thermodynamics, including multiplicity, probability, microstates, and macrostates, and demonstrate that entropy is indeed a measure of randomness, that disordered states (those of higher entropy) are…
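The dice experiment is easy to reproduce exactly. This short enumeration (an illustration consistent with, but not taken from, the article) counts microstates per total and shows that the most probable macrostate is the one with the largest multiplicity W, hence the largest ln W.

```python
# Enumerate all dice rolls: macrostate = total, multiplicity = microstate count.
from itertools import product
from collections import Counter
from math import log

for n_dice in (2, 3):
    multiplicity = Counter(sum(roll) for roll in product(range(1, 7), repeat=n_dice))
    total_states = 6 ** n_dice
    value, W = multiplicity.most_common(1)[0]
    print(f"{n_dice} dice: most probable total {value} "
          f"(W = {W}, p = {W / total_states:.3f}, ln W = {log(W):.2f})")
```

For two dice the peak is at 7 (W = 6 of 36 microstates); adding a third die sharpens the peak relative to the extremes, the same trend that makes macroscopic entropy maxima so overwhelmingly probable.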
76 FR 41756 - Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-15
... materials and supplies used in production. The economic census will produce basic statistics by kind of business on number of establishments, sales, payroll, employment, inventories, and operating expenses. It also will yield a variety of subject statistics, including sales by product line; sales by class of...
Vetter, Thomas R
2017-11-01
Descriptive statistics are specific methods basically used to calculate, describe, and summarize collected research data in a logical, meaningful, and efficient way. Descriptive statistics are reported numerically in the manuscript text and/or in its tables, or graphically in its figures. This basic statistical tutorial discusses a series of fundamental concepts about descriptive statistics and their reporting. The mean, median, and mode are 3 measures of the center or central tendency of a set of data. In addition to a measure of its central tendency (mean, median, or mode), another important characteristic of a research data set is its variability or dispersion (ie, spread). In simplest terms, variability is how much the individual recorded scores or observed values differ from one another. The range, standard deviation, and interquartile range are 3 measures of variability or dispersion. The standard deviation is typically reported for a mean, and the interquartile range for a median. Testing for statistical significance, along with calculating the observed treatment effect (or the strength of the association between an exposure and an outcome), and generating a corresponding confidence interval are 3 tools commonly used by researchers (and their collaborating biostatistician or epidemiologist) to validly make inferences and more generalized conclusions from their collected data and descriptive statistics. A number of journals, including Anesthesia & Analgesia, strongly encourage or require the reporting of pertinent confidence intervals. A confidence interval can be calculated for virtually any variable or outcome measure in an experimental, quasi-experimental, or observational research study design. Generally speaking, in a clinical trial, the confidence interval is the range of values within which the true treatment effect in the population likely resides. In an observational study, the confidence interval is the range of values within which the true strength of the association between the exposure and the outcome (eg, the risk ratio or odds ratio) in the population likely resides. There are many possible ways to graphically display or illustrate different types of data. While there is often latitude as to the choice of format, ultimately, the simplest and most comprehensible format is preferred. Common examples include a histogram, bar chart, line chart or line graph, pie chart, scatterplot, and box-and-whisker plot. Valid and reliable descriptive statistics can answer basic yet important questions about a research data set, namely: "Who, What, Why, When, Where, How, How Much?"
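A compact numerical companion to the tutorial's definitions may be useful. The sketch below, on a small invented sample, computes the measures of center and spread discussed above and a 95% confidence interval for the mean via the t distribution.

```python
# Descriptive statistics and a t-based confidence interval for the mean.
import numpy as np
from collections import Counter
from scipy import stats

x = np.array([12, 15, 11, 14, 13, 19, 12, 16, 14, 12], dtype=float)

q1, q3 = np.percentile(x, [25, 75])
mode_value, mode_count = Counter(x).most_common(1)[0]
print("mean:", x.mean(), " median:", np.median(x), " mode:", mode_value)
print("SD:", round(x.std(ddof=1), 2), " range:", x.max() - x.min(), " IQR:", q3 - q1)

# 95% confidence interval for the mean
se = x.std(ddof=1) / np.sqrt(len(x))
lo, hi = stats.t.interval(0.95, df=len(x) - 1, loc=x.mean(), scale=se)
print(f"95% CI for the mean: ({lo:.2f}, {hi:.2f})")
```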
ON THE DYNAMICAL DERIVATION OF EQUILIBRIUM STATISTICAL MECHANICS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Prigogine, I.; Balescu, R.; Henin, F.
1960-12-01
Work on nonequilibrium statistical mechanics, which allows an extension of the kinetic proof to all results of equilibrium statistical mechanics involving a finite number of degrees of freedom, is summarized. As an introduction to the general N-body problem, the scattering theory in classical mechanics is considered. The general N-body problem is considered for the case of classical mechanics, quantum mechanics with Boltzmann statistics, and quantum mechanics including quantum statistics. Six basic diagrams, which describe the elementary processes of the dynamics of correlations, were obtained. (M.C.G.)
`New insight into statistical hydrology' preface to the special issue
NASA Astrophysics Data System (ADS)
Kochanek, Krzysztof
2018-04-01
Statistical methods are still the basic tool for investigating random, extreme events occurring in the hydrosphere. On 21-22 September 2017, the international Statistical Hydrology (StaHy) 2017 workshop took place in Warsaw (Poland) under the auspices of the International Association of Hydrological Sciences. The authors of the presentations proposed publishing their research results in the Special Issue of Acta Geophysica, 'New Insight into Statistical Hydrology'. Five papers, touching on the most crucial issues of statistical methodology in hydrology, were selected for publication.
NASA Technical Reports Server (NTRS)
1984-01-01
Three mesoscale sounding data sets from the VISSR Atmospheric Sounder (VAS), produced using different retrieval techniques, were evaluated against corresponding ground-truth rawinsonde data for 6-7 March 1982. Means, standard deviations, and RMS differences between the satellite and rawinsonde parameters were calculated over gridded fields in central Texas and Oklahoma. Large differences exist between each satellite data set and the ground truth data. Biases in the satellite temperature and moisture profiles appear strongly dependent upon the three-dimensional structure of the atmosphere and range from 1 deg to 3 deg C for temperature and 3 deg to 6 deg C for dewpoint temperature. Atmospheric gradients of basic and derived parameters determined from the VAS data sets produced an adequate representation of the mesoscale environment, but their magnitudes were often reduced by 30 to 50%.
Hathaway, John C.
1971-01-01
The purpose of the data file presented below is twofold: first, to make available in printed form the basic data relating to the samples collected as part of the joint U.S. Geological Survey - Woods Hole Oceanographic Institution program of study of the Atlantic continental margin of the United States; second, to maintain these data in a form that is easily retrievable by modern computer methods. With the data in such form, repeated manual transcription for statistical or similar mathematical treatment becomes unnecessary. Manual plotting of the information, or of derivatives from it, may also be eliminated. Not only is handling of data by the computer considerably faster than manual techniques, but a fruitful source of errors, transcription mistakes, is eliminated.
Near Earth Asteroid Characteristics for Asteroid Threat Assessment
NASA Technical Reports Server (NTRS)
Dotson, Jessie
2015-01-01
Information about the physical characteristics of Near Earth Asteroids (NEAs) is needed to model behavior during atmospheric entry, to assess the risk of an impact, and to model possible mitigation techniques. The intrinsic properties of interest to entry and mitigation modelers, however, are rarely directly measurable. Instead, we measure other properties and infer the intrinsic physical properties, so determining the complete set of characteristics of interest is far from straightforward. In addition, for the majority of NEAs only basic measurements exist, so properties must often be inferred from statistics of the population of more completely characterized objects. We will provide an assessment of the current state of knowledge about the physical characteristics of importance to asteroid threat assessment. In addition, an ongoing effort to collate NEA characteristics into a readily accessible database for use by the planetary defense community will be discussed.
CAPSAS: Computer Assisted Program for the Selection of Appropriate Statistics.
ERIC Educational Resources Information Center
Shermis, Mark D.; Albert, Susan L.
A computer-assisted program has been developed for the selection of statistics or statistical techniques by both students and researchers. Based on Andrews, Klem, Davidson, O'Malley and Rodgers "A Guide for Selecting Statistical Techniques for Analyzing Social Science Data," this FORTRAN-compiled interactive computer program was…
Laser interferometric high-precision angle monitor for JASMINE
NASA Astrophysics Data System (ADS)
Niwa, Yoshito; Arai, Koji; Sakagami, Masaaki; Gouda, Naoteru; Kobayashi, Yukiyasu; Yamada, Yoshiyuki; Yano, Taihei
2006-06-01
The JASMINE instrument uses a beam combiner to observe two different fields of view, separated by 99.5 degrees, simultaneously. This angle is the so-called basic angle. The basic angle of JASMINE should be stabilized, and its fluctuations should be monitored with an accuracy of 10 microarcseconds root-mean-square over the satellite revolution period of 5 hours. For this purpose, a high-precision interferometric laser metrology system is employed. One of the available techniques for measuring the fluctuations of the basic angle is a method known as wave front sensing, using a Fabry-Perot type laser interferometer. This technique detects fluctuations of the basic angle as displacement of the optical axis in the Fabry-Perot cavity. One advantage of the technique is that the sensor is made sensitive only to the relative fluctuations of the basic angle, which are what JASMINE needs to know, and insensitive to common-mode fluctuations; to enhance the optical axis displacement caused by relative motion, the Fabry-Perot cavity is formed by two mirrors with a long radius of curvature. To verify the principle of this idea, an experiment was performed using a 0.1 m long Fabry-Perot cavity with a mirror curvature of 20 m. The mirrors of the cavity were artificially actuated in either a relative or a common way, and the resulting outputs from the sensor were compared.
NASA Technical Reports Server (NTRS)
Park, Steve
1990-01-01
A large and diverse number of computational techniques are routinely used to process and analyze remotely sensed data. These techniques include: univariate statistics; multivariate statistics; principal component analysis; pattern recognition and classification; other multivariate techniques; geometric correction; registration and resampling; radiometric correction; enhancement; restoration; Fourier analysis; and filtering. Each of these techniques will be considered, in order.
NASA Astrophysics Data System (ADS)
Joshi, D. M.
2017-09-01
Cryogenic technology is used for the liquefaction of many gases and has several applications in food process engineering. Temperatures below 123 K are considered to be in the field of cryogenics. Extremely low temperatures are a basic need for many industrial processes and have several applications, such as superconducting magnets and the space, medical, and gas industries. Several methods can be used to obtain the low temperatures required for the liquefaction of gases. The basic liquefaction process consists of cooling or refrigerating a gas to a temperature below its critical temperature so that liquid can be formed at some suitable pressure below the critical pressure. Different cryogenic cycle configurations are designed to obtain liquefied gases at different temperatures. Each cryogenic cycle, such as the Linde, Claude, Kapitza, or modified Claude cycle, has its own advantages and disadvantages. The placement of the heat exchangers, Joule-Thomson valve, and turboexpander determines the configuration of a cryogenic cycle, and each configuration has its own efficiency according to the application. Here, a nitrogen liquefaction plant is used for the analysis. The process modeling tool ASPEN HYSYS provides a software simulation approach before the actual implementation of the plant in the field. This paper presents the simulation and statistical analysis of the Claude cycle with ASPEN HYSYS, covering the technique used to optimize the liquefaction of the plant. The simulation results so obtained can be used as a reference for the design and optimization of nitrogen liquefaction plants. Efficient liquefaction will give the best performance and productivity to the plant.
Integrated analysis of remote sensing products from basic geological surveys. [Brazil
NASA Technical Reports Server (NTRS)
Dasilvafagundesfilho, E. (Principal Investigator)
1984-01-01
Recent advances in remote sensing have led to the development of several techniques for obtaining image information. These techniques are analyzed as effective tools in geological mapping. A strategy for optimizing the use of images in basic geological surveying is presented. It embraces an integrated analysis of spatial, spectral, and temporal data through photoptic (color additive viewer) and computer processing at different scales, allowing large areas to be surveyed in a fast, precise, and low-cost manner.
Basic techniques in mammalian cell tissue culture.
Phelan, Katy; May, Kristin M
2015-03-02
Cultured mammalian cells are used extensively in cell biology studies. Preserving the structure, function, behavior, and biology of cells in culture requires a number of special skills. This unit describes the basic skills required to maintain and preserve cell cultures: maintaining aseptic technique, preparing media with the appropriate characteristics, passaging, freezing and storage, recovering frozen stocks, and counting viable cells. Copyright © 2015 John Wiley & Sons, Inc.
Noh, Wonjung; Lim, Ji Young
2015-06-01
The purpose of this study was to identify the financial management educational needs of nurses in order to develop an educational program to strengthen their financial management competencies. Data were collected from two focus groups using the nominal group technique. The study consisted of three steps: a literature review, focus group discussion using the nominal group technique, and data synthesis. After analyzing the results, nine key components were selected: corporate management and accounting, introduction to financial management in hospitals, basic structure of accounting, basics of hospital accounting, basics of financial statements, understanding the accounts of financial statements, advanced analysis of financial statements, application of financial management, and capital financing of hospitals. The present findings can be used to develop a financial management education program to strengthen the financial management competencies of nurses. Copyright © 2015. Published by Elsevier B.V.
Moshtagh-Khorasani, Majid; Akbarzadeh-T, Mohammad-R; Jahangiri, Nader; Khoobdel, Mehdi
2009-01-01
BACKGROUND: Aphasia diagnosis is particularly challenging due to the linguistic uncertainty and vagueness, inconsistencies in the definition of aphasic syndromes, large number of measurements with imprecision, natural diversity and subjectivity in test objects as well as in opinions of experts who diagnose the disease. METHODS: Fuzzy probability is proposed here as the basic framework for handling the uncertainties in medical diagnosis and particularly aphasia diagnosis. To efficiently construct this fuzzy probabilistic mapping, statistical analysis is performed that constructs input membership functions as well as determines an effective set of input features. RESULTS: Considering the high sensitivity of performance measures to different distribution of testing/training sets, a statistical t-test of significance is applied to compare fuzzy approach results with NN results as well as author's earlier work using fuzzy logic. The proposed fuzzy probability estimator approach clearly provides better diagnosis for both classes of data sets. Specifically, for the first and second type of fuzzy probability classifiers, i.e. spontaneous speech and comprehensive model, P-values are 2.24E-08 and 0.0059, respectively, strongly rejecting the null hypothesis. CONCLUSIONS: The technique is applied and compared on both comprehensive and spontaneous speech test data for diagnosis of four Aphasia types: Anomic, Broca, Global and Wernicke. Statistical analysis confirms that the proposed approach can significantly improve accuracy using fewer Aphasia features. PMID:21772867
The Social Profile of Students in Basic General Education in Ecuador: A Data Analysis
ERIC Educational Resources Information Center
Buri, Olga Elizabeth Minchala; Stefos, Efstathios
2017-01-01
The objective of this study is to examine the social profile of students who are enrolled in Basic General Education in Ecuador. Both a descriptive and multidimensional statistical analysis was carried out based on the data provided by the National Survey of Employment, Unemployment and Underemployment in 2015. The descriptive analysis shows the…
Improving Attendance and Punctuality of FE Basic Skill Students through an Innovative Scheme
ERIC Educational Resources Information Center
Ade-Ojo, Gordon O.
2005-01-01
This paper reports the findings of a study set up to establish the impact of a particular scheme on the attendance and punctuality performance of a group of Basic Skills learners against the backdrop of various theoretical postulations on managing undesirable behavior. Data collected on learners' performance was subjected to statistical analysis…
ERIC Educational Resources Information Center
Applied Management Sciences, Inc., Silver Spring, MD.
The amount of misreporting of Veterans Administration (VA) benefits was assessed, along with the impact of misreporting on the Basic Educational Opportunity Grant (BEOG) program. Accurate financial information is needed to determine appropriate awards. The analysis revealed: over 97% of VA beneficiaries misreported benefits; the total net loss to…
ERIC Educational Resources Information Center
Yingxiu, Yang
2006-01-01
Using statistical data on the implementing conditions of China's educational expenditure published by the state, this paper studies the Gini coefficient of the budget educational public expenditure per student in order to examine the concentration degree of the educational expenditure for China's basic education and analyze its balanced…
Ernest J. Gebhart
1980-01-01
Other members of this panel are going to present the basic statistics about the coal strip mining industry in Ohio, so I will confine my remarks to the revegetation of the spoil banks. So that it does not appear that Ohio confined its tree-planting efforts to spoil banks alone, I will rely on a few statistics.
Idaho State University Statistical Portrait, Academic Year 1998-1999.
ERIC Educational Resources Information Center
Idaho State Univ., Pocatello. Office of Institutional Research.
This report provides basic statistical data for Idaho State University, and includes both point-of-time data as well as trend data. The information is divided into sections emphasizing students, programs, faculty and staff, finances, and physical facilities. Student data includes enrollment, geographical distribution, student/faculty ratios,…
Statistical Report. Fiscal Year 1995: September 1, 1994 - August 31, 1995.
ERIC Educational Resources Information Center
Texas Higher Education Coordinating Board, Austin.
This report provides statistical data on Texas public and independent higher education institutions for fiscal year 1995. An introductory section provides basic information on Texas higher education institutions, while nine major sections cover: (1) student enrollment, including 1990-94 headcount data; headcount by classification, ethnic origin,…
Statistical Report. Fiscal Year 1994: September 1, 1993 - August 31, 1994.
ERIC Educational Resources Information Center
Texas Higher Education Coordinating Board, Austin.
This report provides statistical data on Texas public and independent higher education institutions for fiscal year 1994. An introductory section provides basic information on Texas higher education institutions, while nine major sections cover: (1) student enrollment, including 1989-93 headcount data; headcount by classification, ethnic origin,…
29 CFR 1904.42 - Requests from the Bureau of Labor Statistics for data.
Code of Federal Regulations, 2011 CFR
2011-07-01
... ADMINISTRATION, DEPARTMENT OF LABOR RECORDING AND REPORTING OCCUPATIONAL INJURIES AND ILLNESSES Reporting Fatality, Injury and Illness Information to the Government § 1904.42 Requests from the Bureau of Labor Statistics for data. (a) Basic requirement. If you receive a Survey of Occupational Injuries and Illnesses...
29 CFR 1904.42 - Requests from the Bureau of Labor Statistics for data.
Code of Federal Regulations, 2013 CFR
2013-07-01
... ADMINISTRATION, DEPARTMENT OF LABOR RECORDING AND REPORTING OCCUPATIONAL INJURIES AND ILLNESSES Reporting Fatality, Injury and Illness Information to the Government § 1904.42 Requests from the Bureau of Labor Statistics for data. (a) Basic requirement. If you receive a Survey of Occupational Injuries and Illnesses...
29 CFR 1904.42 - Requests from the Bureau of Labor Statistics for data.
Code of Federal Regulations, 2014 CFR
2014-07-01
... ADMINISTRATION, DEPARTMENT OF LABOR RECORDING AND REPORTING OCCUPATIONAL INJURIES AND ILLNESSES Reporting Fatality, Injury and Illness Information to the Government § 1904.42 Requests from the Bureau of Labor Statistics for data. (a) Basic requirement. If you receive a Survey of Occupational Injuries and Illnesses...
29 CFR 1904.42 - Requests from the Bureau of Labor Statistics for data.
Code of Federal Regulations, 2012 CFR
2012-07-01
... ADMINISTRATION, DEPARTMENT OF LABOR RECORDING AND REPORTING OCCUPATIONAL INJURIES AND ILLNESSES Reporting Fatality, Injury and Illness Information to the Government § 1904.42 Requests from the Bureau of Labor Statistics for data. (a) Basic requirement. If you receive a Survey of Occupational Injuries and Illnesses...
Theoretical Frameworks for Math Fact Fluency
ERIC Educational Resources Information Center
Arnold, Katherine
2012-01-01
Recent education statistics indicate persistent low math scores for our nation's students. This drop in math proficiency includes deficits in basic number sense and automaticity of math facts. The decrease has been recorded across all grade levels with the elementary levels showing the greatest loss (National Center for Education Statistics,…
Basic Statistical Concepts and Methods for Earth Scientists
Olea, Ricardo A.
2008-01-01
INTRODUCTION Statistics is the science of collecting, analyzing, interpreting, modeling, and displaying masses of numerical data primarily for the characterization and understanding of incompletely known systems. Over the years, these objectives have led to a fair amount of analytical work to achieve, substantiate, and guide descriptions and inferences.
NASA Astrophysics Data System (ADS)
Saputra, K. V. I.; Cahyadi, L.; Sembiring, U. A.
2018-01-01
In this paper, we assess our traditional elementary statistics education and introduce elementary statistics with simulation-based inference. To assess our statistics class, we adapt the well-known CAOS (Comprehensive Assessment of Outcomes in Statistics) test, which serves as an external measure of students' basic statistical literacy and is a generally accepted measure of statistical literacy. We also introduce a new teaching method in the elementary statistics class. Unlike the traditional elementary statistics course, we introduce a simulation-based inference method for conducting hypothesis testing. The literature has shown that this new teaching method works very well in increasing students' understanding of statistics.
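As an illustration of the simulation-based inference such a course introduces, here is a minimal sketch of a randomization test for a difference in group means; the data and iteration count are invented for illustration.

```python
# A minimal sketch of simulation-based hypothesis testing: a randomization
# test for a difference in group means. Group data are invented.
import numpy as np

rng = np.random.default_rng(1)
group_a = np.array([72, 75, 68, 80, 77, 74], dtype=float)
group_b = np.array([65, 70, 66, 71, 69, 63], dtype=float)
observed = group_a.mean() - group_b.mean()

pooled = np.concatenate([group_a, group_b])
n_a = len(group_a)
diffs = np.empty(10_000)
for i in range(diffs.size):
    rng.shuffle(pooled)                       # re-label groups under the null
    diffs[i] = pooled[:n_a].mean() - pooled[n_a:].mean()

p_value = np.mean(np.abs(diffs) >= abs(observed))  # two-sided p-value
print(f"observed diff = {observed:.2f}, p = {p_value:.4f}")
```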
Basic Radar Altimetry Toolbox: Tools and Tutorial To Use Radar Altimetry For Cryosphere
NASA Astrophysics Data System (ADS)
Benveniste, J. J.; Bronner, E.; Dinardo, S.; Lucas, B. M.; Rosmorduc, V.; Earith, D.
2010-12-01
Radar altimetry is very much a technique with expanding applications. While quite a lot of effort has been made for oceanography users (including easy-to-use data), the use of these data for cryosphere applications, especially with the new ESA CryoSat-2 mission data, is still somewhat tedious, especially for users new to altimetry data products. ESA and CNES thus had the Basic Radar Altimetry Toolbox developed a few years ago, and are improving and upgrading it to fit new missions and the growing number of altimetry uses. The Basic Radar Altimetry Toolbox is an "all-altimeter" collection of tools, tutorials and documents designed to facilitate the use of radar altimetry data. The software is able: - to read most distributed radar altimetry data, from ERS-1 & 2, Topex/Poseidon, Geosat Follow-on, Jason-1, Envisat, Jason-2, CryoSat and the future Saral missions, - to perform some processing, data editing and statistics, - and to visualize the results. It can be used at several levels and in several ways: - as a data reading tool, with APIs for C, Fortran, Matlab and IDL, - as processing/extraction routines, through the on-line command mode, - as an educational and quick-look tool, with the graphical user interface. As part of the Toolbox, a Radar Altimetry Tutorial gives general information about altimetry, the technique involved and its applications, as well as an overview of past, present and future missions, including information on how to access data and additional software and documentation. It also presents a series of data use cases, covering all uses of altimetry over ocean, cryosphere and land, showing the basic methods for some of the most frequent ways of using altimetry data. It is an opportunity to teach remote sensing with practical training. The Toolbox has been available since April 2007 and has been demonstrated during training courses and scientific meetings. About 1200 people had downloaded it as of summer 2010, with many "newcomers" to altimetry among them, including teachers and professors. Users' feedback, developments in altimetry, and practice showed that new interesting features could be added. Some have been added and/or improved in version 2; others are under development, and some are under discussion for the future. Data use cases on cryosphere applications will be presented. BRAT is developed under contract with ESA and CNES. It is available at http://www.altimetry.info and http://earth.esa.int/brat/
Metamodels for Computer-Based Engineering Design: Survey and Recommendations
NASA Technical Reports Server (NTRS)
Simpson, Timothy W.; Peplinski, Jesse; Koch, Patrick N.; Allen, Janet K.
1997-01-01
The use of statistical techniques to build approximations of expensive computer analysis codes pervades much of today's engineering design. These statistical approximations, or metamodels, are used to replace the actual expensive computer analyses, facilitating multidisciplinary, multiobjective optimization and concept exploration. In this paper we review several of these techniques, including design of experiments, response surface methodology, Taguchi methods, neural networks, inductive learning, and kriging. We survey their existing applications in engineering design and then address the dangers of applying traditional statistical techniques to approximate deterministic computer analysis codes. We conclude with recommendations for the appropriate use of statistical approximation techniques in given situations and for how common pitfalls can be avoided.
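As a toy illustration of the metamodeling idea surveyed here, the following sketch fits a quadratic response-surface surrogate to a handful of runs of a stand-in "expensive" analysis function; everything in it is illustrative, not one of the paper's case studies.

```python
# A minimal response-surface metamodel sketch: sample an 'expensive'
# analysis code at a few design points, fit a quadratic polynomial
# surrogate, and predict at new points. The function is a cheap stand-in.
import numpy as np

def expensive_analysis(x):           # placeholder for a costly simulation
    return np.sin(3 * x) + 0.5 * x**2

x_train = np.linspace(-1.0, 1.0, 7)          # 1-D design of experiments
y_train = expensive_analysis(x_train)

coeffs = np.polyfit(x_train, y_train, deg=2)  # quadratic response surface
surrogate = np.poly1d(coeffs)

x_new = np.array([-0.25, 0.6])
print(surrogate(x_new), expensive_analysis(x_new))  # surrogate vs truth
```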
Contributions of Basic Sciences to Science of Education. Studies in Educational Administration.
ERIC Educational Resources Information Center
Lall, Bernard M.
The science of education has been influenced by the basic sciences to the extent that educational research now has been able to modernize its approach by accepting and using the basic scientific methodology and experimental techniques. Using primarily the same steps of scientific investigations, education today holds a place of much greater esteem…
Using a Thyroid Case Study and Error Plausibility to Introduce Basic Lab Skills
ERIC Educational Resources Information Center
Browning, Samantha; Urschler, Margaret; Meidl, Katherine; Peculis, Brenda; Milanick, Mark
2017-01-01
We describe a 3-hour session that provides students with the opportunity to review basic lab concepts and important techniques using real life scenarios. We began with two separate student-engaged discussions to remind/reinforce some basic concepts in physiology and review calculations with respect to chemical compounds. This was followed by…
Automatic 2D and 3D segmentation of liver from Computerised Tomography
NASA Astrophysics Data System (ADS)
Evans, Alun
As part of the diagnosis of liver disease, a Computerised Tomography (CT) scan is taken of the patient, which the clinician then uses for assistance in determining the presence and extent of the disease. This thesis presents the background, methodology, results and future work of a project that employs automated methods to segment liver tissue. The clinical motivation behind this work is the desire to facilitate the diagnosis of liver disease such as cirrhosis or cancer, assist in volume determination for liver transplantation, and possibly assist in measuring the effect of any treatment given to the liver. Previous attempts at automatic segmentation of liver tissue have relied on 2D, low-level segmentation techniques, such as thresholding and mathematical morphology, to obtain the basic liver structure; the derived boundary can then be smoothed or refined using more advanced methods. The 2D results presented in this thesis improve greatly on this previous work by using a topology-adaptive active contour model to accurately segment liver tissue from CT images. The use of conventional snakes for liver segmentation is difficult due to the presence of other organs closely surrounding the liver; the new technique avoids this problem by adding an inflationary force to the basic snake equation and initialising the snake inside the liver. The concepts underlying the 2D technique are extended to 3D, and results of full 3D segmentation of the liver are presented. The 3D technique makes use of an inflationary active surface model which is adaptively reparameterised, according to its size and local curvature, so that it may more accurately segment the organ. Statistical analysis of the accuracy of the segmentation is presented for 18 healthy liver datasets, and results of the segmentation of unhealthy livers are also shown. The novel work developed during the course of this project has possibilities for use in other areas of medical imaging research, for example the segmentation of internal liver structures and the segmentation and classification of unhealthy tissue. The possibilities of this future work are discussed towards the end of the report.
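For context, here is a minimal sketch of the low-level 2D baseline the thesis improves on (thresholding plus mathematical morphology); the intensity window and the synthetic slice are assumptions for illustration, not the thesis's parameters.

```python
# A minimal sketch of a low-level 2-D pre-segmentation: intensity
# thresholding plus mathematical morphology on a (here synthetic) CT slice.
# The HU window is an illustrative assumption.
import numpy as np
from scipy import ndimage

slice_hu = np.random.default_rng(2).normal(40, 60, size=(256, 256))  # fake slice

mask = (slice_hu > 0) & (slice_hu < 140)           # rough soft-tissue HU window
mask = ndimage.binary_opening(mask, iterations=2)  # break thin bridges to other organs
mask = ndimage.binary_fill_holes(mask)             # fill vessels/holes inside the organ

labels, n = ndimage.label(mask)                    # keep only the largest component
if n:
    sizes = ndimage.sum(mask, labels, index=range(1, n + 1))
    mask = labels == (1 + int(np.argmax(sizes)))
```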
Rebuilding Government Legitimacy in Post-conflict Societies: Case Studies of Nepal and Afghanistan
2015-09-09
administered via the verbal scales due to reduced time spent explaining the visual show cards. Statistical results corresponded with observations from...a three-step strategy for dealing with item non-response. First, basic descriptive statistics are calculated to determine the extent of item...descriptive statistics for all items in the survey), however this section of the report highlights just some of the findings. Thus, the results
Biostatistical and medical statistics graduate education
2014-01-01
The development of graduate education in biostatistics and medical statistics is discussed in the context of training within a medical center setting. The need for medical researchers to employ a wide variety of statistical designs in clinical, genetic, basic science and translational settings justifies the ongoing integration of biostatistical training into medical center educational settings and informs its content. The integration of large-data issues is a challenge. PMID:24472088
Use of satellite images in the evaluation of farmlands. [in Mexico
NASA Technical Reports Server (NTRS)
Lozano H., A. E.
1978-01-01
Remote sensing techniques in the evaluation of farmland in Mexico are discussed. Electronic analysis techniques and photointerpretation techniques are analyzed. Characteristics of the basic crops in Mexico as related to remote sensing are described.
Chen, Shi-Yi; Deng, Feilong; Huang, Ying; Li, Cao; Liu, Linhai; Jia, Xianbo; Lai, Song-Jia
2016-01-01
Although various computer tools have been elaborately developed to calculate a series of statistics in molecular population genetics for both small- and large-scale DNA data, there is no efficient and easy-to-use toolkit available yet for exclusively focusing on the steps of mathematical calculation. Here, we present PopSc, a bioinformatic toolkit for calculating 45 basic statistics in molecular population genetics, which could be categorized into three classes, including (i) genetic diversity of DNA sequences, (ii) statistical tests for neutral evolution, and (iii) measures of genetic differentiation among populations. In contrast to the existing computer tools, PopSc was designed to directly accept the intermediate metadata, such as allele frequencies, rather than the raw DNA sequences or genotyping results. PopSc is first implemented as the web-based calculator with user-friendly interface, which greatly facilitates the teaching of population genetics in class and also promotes the convenient and straightforward calculation of statistics in research. Additionally, we also provide the Python library and R package of PopSc, which can be flexibly integrated into other advanced bioinformatic packages of population genetics analysis. PMID:27792763
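As an illustration of the frequency-based calculations PopSc targets (this is a hypothetical sketch, not PopSc's actual API), the code below computes expected heterozygosity and a simple two-population Fst directly from allele frequencies.

```python
# A minimal sketch of allele-frequency-based population-genetics statistics:
# expected heterozygosity per population and a simple two-population Fst
# (Nei's Gst form). Frequencies are invented for one biallelic locus.
import numpy as np

p1 = np.array([0.70, 0.30])   # allele frequencies, population 1
p2 = np.array([0.40, 0.60])   # allele frequencies, population 2

def expected_heterozygosity(p):
    return 1.0 - np.sum(p**2)

hs = 0.5 * (expected_heterozygosity(p1) + expected_heterozygosity(p2))
ht = expected_heterozygosity(0.5 * (p1 + p2))   # pooled allele frequencies
fst = (ht - hs) / ht                            # differentiation among populations
print(f"Hs={hs:.3f}, Ht={ht:.3f}, Fst={fst:.3f}")
```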
Gambling market and individual patterns of gambling in Germany.
Albers, N; Hübl, L
1997-01-01
In this paper, individual patterns of gambling in Germany are estimated for the first time. The probit technique is used to test the influence of a set of individual characteristics on the probability of participating in each of the various legal games. A sample of 1,586 adults collected for the pool of German lotteries provides a reliable set of data. All disaggregated estimations of participation are statistically significant at least at the 5 percent level. The basic findings suggest that gambling is a widespread normal (superior) consumption good, because gambling participation tends to rise with income. Moreover, no demand anomaly can be found to justify assessing gambling as a social demerit. Only participation in gaming machines is higher for younger, unemployed and less educated adults. While a moral evaluation of gambling is beyond the scope of this paper, the legislator's preference for a highly taxed state monopoly in gambling markets is to be rejected, at least for Germany. Additional statistical findings suggest distinct consumer perceptions of the characteristics of the various games and may be used for market segmentation. The paper starts with a descriptive introduction to the German gambling market.
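A minimal sketch of the probit participation model described follows, using statsmodels on invented covariates rather than the paper's survey data.

```python
# A minimal probit sketch: probability of gambling participation as a
# function of income and age. All data below are simulated for illustration.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 1586
income = rng.normal(3.0, 1.0, n)             # e.g. log household income
age = rng.uniform(18, 80, n)
latent = -1.0 + 0.6 * income - 0.01 * age + rng.normal(size=n)
participates = (latent > 0).astype(int)

X = sm.add_constant(np.column_stack([income, age]))
result = sm.Probit(participates, X).fit(disp=0)
print(result.params)   # a positive income coefficient: 'normal good' pattern
```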
A review of the different techniques for solid surface acid-base characterization.
Sun, Chenhang; Berg, John C
2003-09-18
In this work, various techniques for solid surface acid-base (AB) characterization are reviewed. Different techniques employ different scales to rank acid-base properties. Based on results from the literature and the authors' own investigations of mineral oxides, these scales are compared. The comparison shows that the Isoelectric Point (IEP), the most commonly used AB scale, is not a description of the absolute basicity or acidity of a surface, but a description of their relative strength. That is, a high-IEP surface shows more basic functionality compared with its acidic functionality, whereas a low-IEP surface shows less basic functionality compared with its acidic functionality. The choice of technique and scale for AB characterization depends on the specific application. For cases in which the overall AB property is of interest, IEP (by electrokinetic titration) and H(0,max) (by indicator dye adsorption) are appropriate. For cases in which the absolute AB property is of interest, such as in the study of adhesion, it is more pertinent to use the chemical shift (by XPS) and the heat of adsorption of probe gases (by calorimetry or IGC).
Comparison of holographic setups used in heat and mass transfer measurement
NASA Astrophysics Data System (ADS)
Doleček, R.; Psota, P.; Lédl, V.; Vít, T.; Kopecký, V.
2014-03-01
The authors of this paper have dealt with the measurement of heat and mass transfer for several years and have frequently used several techniques for measuring refractive index distribution based on holographic interferometry. Some of the well-known techniques have been modified and some new ones developed. Each technique can be applied successfully to a different type of measurement, and each has a set of properties that makes it unique. We decided to summarize a few different basic techniques and describe their properties in this paper, with the aim of helping the reader select the proper one for their measurement. The list of techniques and their properties is not comprehensive but should serve as a basic orientation in the field.
Arthropod surveillance programs: Basic components, strategies, and analysis
USDA-ARS?s Scientific Manuscript database
Effective entomological surveillance planning stresses a careful consideration of methodology, trapping technologies, and analysis techniques. Herein, the basic principles and technological components of arthropod surveillance plans are described, as promoted in the symposium “Advancements in arthro...
Views of medical students: what, when and how do they want statistics taught?
Fielding, S; Poobalan, A; Prescott, G J; Marais, D; Aucott, L
2015-11-01
A key skill for a practising clinician is being able to do research, understand the statistical analyses and interpret results in the medical literature. Basic statistics has become essential within medical education, but when, what and in which format to teach it is uncertain. To inform curriculum design/development we undertook a quantitative survey of fifth-year medical students and followed them up with a series of focus groups to obtain their opinions as to what statistics teaching they want, when and how. A total of 145 students undertook the survey and five focus groups were held with between 3 and 9 participants each. Previous statistical training varied, and students recognised that their knowledge was inadequate and were keen to see additional training implemented. Students were aware of the importance of statistics to their future careers, but apprehensive about learning. Face-to-face teaching supported by online resources was popular. Focus groups indicated the need for statistical training early in the degree and highlighted students' lack of confidence and inconsistencies in support. The study found that the students see the importance of statistics training in the medical curriculum but that timing and mode of delivery are key. The findings have informed the design of a new course to be implemented in the third undergraduate year. Teaching will be based around published studies, aiming to equip students with the basics required, with additional resources available through a virtual learning environment. © The Author(s) 2015.
NASA Technical Reports Server (NTRS)
Thomas, J. M.; Hawk, J. D.
1975-01-01
A generalized concept for cost-effective structural design is introduced. It is assumed that decisions affecting the cost effectiveness of aerospace structures fall into three basic categories: design, verification, and operation. Within these basic categories, certain decisions concerning items such as design configuration, safety factors, testing methods, and operational constraints are to be made. All or some of the variables affecting these decisions may be treated probabilistically. Bayesian statistical decision theory is used as the tool for determining the cost optimum decisions. A special case of the general problem is derived herein, and some very useful parametric curves are developed and applied to several sample structures.
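As a toy illustration of the decision-theoretic idea, the sketch below picks the design option with the lowest expected cost under a prior over load conditions; the options, probabilities, and costs are invented, not the paper's formulation.

```python
# A minimal expected-cost decision sketch in the spirit of Bayesian
# statistical decision theory: choose the design whose expected cost
# (upfront cost plus failure risk times consequence) is lowest under a
# prior over load conditions. All numbers are illustrative assumptions.
p_severe = 0.2                         # prior probability of the severe case
options = {
    # name: (upfront cost, P(failure | nominal), P(failure | severe))
    "light structure, more testing": (1.0, 0.02, 0.15),
    "heavy structure, less testing": (1.6, 0.005, 0.03),
}
failure_cost = 50.0

for name, (upfront, pf_nom, pf_sev) in options.items():
    p_fail = (1 - p_severe) * pf_nom + p_severe * pf_sev
    expected = upfront + failure_cost * p_fail
    print(f"{name}: expected cost = {expected:.2f}")
# The cost-optimum decision is the option with minimum expected cost.
```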
On a Quantum Model of Brain Activities
NASA Astrophysics Data System (ADS)
Fichtner, K.-H.; Fichtner, L.; Freudenberg, W.; Ohya, M.
2010-01-01
One of the main activities of the brain is the recognition of signals. A first attempt to explain the process of recognition in terms of quantum statistics was given in [6]. Subsequently, details of the mathematical model were presented in a (still incomplete) series of papers (cf. [7, 2, 5, 10]). In the present note we want to give a general view of the principal ideas of this approach. We will introduce the basic spaces and justify the choice of spaces and operations. Further, we bring the model face to face with basic postulates any statistical model of the recognition process should fulfill. These postulates are in accordance with the opinion widely accepted in psychology and neurology.
Khadra, Ibrahim; Zhou, Zhou; Dunn, Claire; Wilson, Clive G; Halbert, Gavin
2015-01-25
A drug's solubility and dissolution behaviour within the gastrointestinal tract is a key property for successful administration by the oral route and one of the key factors in the biopharmaceutics classification system. This property can be determined by investigating drug solubility in human intestinal fluid (HIF), but HIF is difficult to obtain and highly variable, which has led to the development of multiple simulated intestinal fluid (SIF) recipes. Using a statistical design of experiments (DoE) technique, this paper investigates the effects and interactions on equilibrium drug solubility of seven typical SIF components (sodium taurocholate, lecithin, sodium phosphate, sodium chloride, pH, pancreatin and sodium oleate) within concentration ranges relevant to human intestinal fluid values. A range of poorly soluble drugs with acidic (naproxen, indomethacin, phenytoin, and piroxicam), basic (aprepitant, carvedilol, zafirlukast, tadalafil) or neutral (fenofibrate, griseofulvin, felodipine and probucol) properties was investigated. The equilibrium solubility results are comparable with literature studies of the drugs in either HIF or SIF, indicating that the DoE is operating in the correct space. With the exception of pancreatin, each factor individually had a statistically significant influence on equilibrium solubility, with the magnitude of effect varying between the acidic and the basic or neutral compounds, and drug-specific interactions were evident. Interestingly, for the neutral compounds pH was the factor with the second largest solubility effect. Around one third of all possible factor combinations showed a significant influence on equilibrium solubility, with interaction significance and magnitude of effect varying between the acidic and the basic or neutral compounds. The fewest significant media component interactions were noted for the acidic compounds (three) and the most for the neutral compounds (seven), with drug-specific effects again evident. This indicates that a drug's equilibrium solubility in SIF is influenced, depending on drug type, by between eight and fourteen individual media components or combinations of components, some of them drug specific. This illustrates the complex nature of these fluids and provides, for individual drugs, a visualisation of the possible solubility envelope within the gastrointestinal tract, which may be of importance for modelling in vivo behaviour. In addition, the results indicate that the design of experiments approach can be employed to provide greater detail of drug solubility behaviour, possible drug-specific interactions and the influence of variations in gastrointestinal media components due to disease. The approach is also feasible and amenable to adaptation for high-throughput screening of drug candidates. Copyright © 2014 Elsevier B.V. All rights reserved.
Football fever: goal distributions and non-Gaussian statistics
NASA Astrophysics Data System (ADS)
Bittner, E.; Nußbaumer, A.; Janke, W.; Weigel, M.
2009-02-01
Analyzing football score data with statistical techniques, we investigate how the not purely random, but highly co-operative nature of the game is reflected in averaged properties such as the probability distributions of goals scored by the home and away teams. As it turns out, the tails of the distributions, in particular, are not well described by the Poissonian or binomial model resulting from the assumption of uncorrelated random events. Instead, a good effective description of the data is provided by less basic distributions such as the negative binomial one or the probability densities of extreme value statistics. To understand this behavior from a microscopic point of view, however, no waiting-time problem or extremal process need be invoked. Instead, modifying the Bernoulli random process underlying the Poissonian model to include a simple component of self-affirmation seems to describe the data surprisingly well and allows one to understand the observed deviation from Gaussian statistics. The phenomenological distributions used before can be understood as special cases within this framework. We analyzed historical football score data from many leagues in Europe as well as from international tournaments, including data from all past tournaments of the "FIFA World Cup" series, and found the proposed models to be applicable rather universally. In particular, here we analyze the results of the German women's premier football league and consider the two separate German men's premier leagues in the East and West during the cold-war times as well as the unified league after 1990 to see how scoring in football and the component of self-affirmation depend on cultural and political circumstances.
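As a toy version of the self-affirmation mechanism described, the simulation below draws goals from Bernoulli chances whose success probability grows with each goal already scored; all parameter values are illustrative, not fitted to the paper's data.

```python
# A minimal 'self-affirmation' sketch: a match is a sequence of Bernoulli
# scoring chances whose success probability grows with each goal already
# scored, fattening the tail relative to a plain binomial/Poisson model.
import numpy as np

rng = np.random.default_rng(4)

def goals_in_match(n_chances=90, p0=0.016, kappa=0.25):
    goals = 0
    for _ in range(n_chances):
        p = min(1.0, p0 * (1 + kappa) ** goals)  # each goal boosts p
        goals += rng.random() < p
    return goals

sample = np.array([goals_in_match() for _ in range(20_000)])
print("mean", sample.mean(), "variance", sample.var())
# variance > mean signals the over-dispersion attributed to self-affirmation;
# kappa = 0 recovers the uncorrelated (binomial) baseline.
```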
Freeman, Jenny V; Collier, Steve; Staniforth, David; Smith, Kevin J
2008-01-01
Background Statistics is relevant to students and practitioners in medicine and health sciences and is increasingly taught as part of the medical curriculum. However, it is common for students to dislike and under-perform in statistics. We sought to address these issues by redesigning the way that statistics is taught. Methods The project brought together a statistician, clinician and educational experts to re-conceptualize the syllabus, and focused on developing different methods of delivery. New teaching materials, including videos, animations and contextualized workbooks were designed and produced, placing greater emphasis on applying statistics and interpreting data. Results Two cohorts of students were evaluated, one with old style and one with new style teaching. Both were similar with respect to age, gender and previous level of statistics. Students who were taught using the new approach could better define the key concepts of p-value and confidence interval (p < 0.001 for both). They were more likely to regard statistics as integral to medical practice (p = 0.03), and to expect to use it in their medical career (p = 0.003). There was no significant difference in the numbers who thought that statistics was essential to understand the literature (p = 0.28) and those who felt comfortable with the basics of statistics (p = 0.06). More than half the students in both cohorts felt that they were comfortable with the basics of medical statistics. Conclusion Using a variety of media, and placing emphasis on interpretation can help make teaching, learning and understanding of statistics more people-centred and relevant, resulting in better outcomes for students. PMID:18452599
ERIC Educational Resources Information Center
Cunningham, Phyllis M.
Intending to explore the interaction effects of self-esteem level and perceived program utility on the retention and cognitive achievement of adult basic education students, a self-esteem instrument, to be administered verbally, was constructed with content relevant items developed from and tested on a working class, undereducated, black, adult…
ERIC Educational Resources Information Center
Kotzé, Sanet Henriët; Mole, Calvin Gerald
2015-01-01
At Stellenbosch University, South Africa, basic histology is taught to a combination class of almost 400 first-year medical, physiotherapy, and dietetic students. Many students often find the amount of work in basic histology lectures overwhelming and consequently loose interest. The aim was to determine if a draw-along mapping activity would…
ERIC Educational Resources Information Center
International Business Machines Corp., Gaithersburg, MD. Federal Systems Div.
A study of computer-assisted instruction (CAI) for US Army basic electronics training at the US Army Signal Center and School establishes the feasibility of CAI as a training technique. Three aspects of CAI are considered: effectiveness, efficiency, and applicability of CAI to basic electronics training. The study explores the effectiveness of the…
Nageeb El-Helaly, Sara; Habib, Basant A; Abd El-Rahman, Mohamed K
2018-07-01
This study investigates factors affecting liposomal systems of weakly basic drugs. A resolution V fractional factorial design (2^(5-1)) is used as an example of the screening designs that are best employed as a wise step before proceeding to detailed factor-effect or optimization studies. Five factors likely to affect liposomal systems of weakly basic drugs were investigated using amisulpride as a model drug. The factors studied were A: preparation technique, B: phosphatidylcholine (PhC) amount (mg), C: cholesterol:PhC molar ratio, D: hydration volume (ml), and E: sonication type. The levels investigated were the ammonium sulphate-pH gradient technique or the transmembrane zinc chelation-pH gradient technique, 200 or 400 mg, 0 or 0.5, 10 or 20 ml, and bath or probe sonication for A, B, C, D and E, respectively. The responses measured were particle size (PS, nm), zeta potential (ZP, mV) and entrapment efficiency percent (EE%). An ion-selective electrode was used as a novel method for measuring unentrapped drug concentration and calculating entrapment efficiency without the need for liposomal separation. The factors mainly affecting the studied responses were cholesterol:PhC ratio and hydration volume for PS, preparation technique for ZP, and preparation technique and hydration volume for EE%. The applied 2^(5-1) design enabled the use of only 16 trial combinations for screening the influence of five factors on liposomal systems of weakly basic drugs. This clarifies the value of screening experiments before extensive investigation of selected factors in detailed optimization studies. Copyright © 2018 Elsevier B.V. All rights reserved.
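As an illustration of the experimental design used, the sketch below generates a two-level 2^(5-1) resolution V design from the generator E = ABCD; the generator is the standard resolution V choice and an assumption here, since the paper does not state it.

```python
# A minimal sketch of a two-level 2^(5-1) resolution V design: a full
# factorial in factors A-D with the fifth column generated as E = ABCD.
import itertools
import numpy as np

base = np.array(list(itertools.product([-1, 1], repeat=4)))   # A, B, C, D
design = np.hstack([base, base.prod(axis=1, keepdims=True)])  # E = ABCD

factors = ["preparation technique", "PhC amount", "chol:PhC ratio",
           "hydration volume", "sonication type"]
print(len(design), "runs")          # 16 trial combinations, as in the study
for run in design:
    print(dict(zip(factors, run)))  # -1 = low level, +1 = high level
```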
Summary Statistics of CPB-Qualified Public Radio Stations: Fiscal Year 1971.
ERIC Educational Resources Information Center
Lee, S. Young; Pedone, Ronald J.
Basic statistics on finance, employment, and broadcast and production activities of 103 Corporation for Public Broadcasting (CPB)--qualified radio stations in the United States and Puerto Rico for Fiscal Year 1971 are collected. The first section of the report deals with total funds, income, direct operating costs, capital expenditures, and other…
Using Statistics to Lie, Distort, and Abuse Data
ERIC Educational Resources Information Center
Bintz, William; Moore, Sara; Adams, Cheryll; Pierce, Rebecca
2009-01-01
Statistics is a branch of mathematics that involves organization, presentation, and interpretation of data, both quantitative and qualitative. Data do not lie, but people do. On the surface, quantitative data are basically inanimate objects, nothing more than lifeless and meaningless symbols that appear on a page, calculator, computer, or in one's…
What Software to Use in the Teaching of Mathematical Subjects?
ERIC Educational Resources Information Center
Berežný, Štefan
2015-01-01
We can consider two basic views, when using mathematical software in the teaching of mathematical subjects. First: How to learn to use specific software for the specific tasks, e. g., software Statistica for the subjects of Applied statistics, probability and mathematical statistics, or financial mathematics. Second: How to learn to use the…
Intrex Subject/Title Inverted-File Characteristics.
ERIC Educational Resources Information Center
Uemura, Syunsuke
The characteristics of the Intrex subject/title inverted file are analyzed. Basic statistics of the inverted file are presented including various distributions of the index words and terms from which the file was derived, and statistics on stems, the file growth process, and redundancy measurements. A study of stems both with extremely high and…
ERIC Educational Resources Information Center
Ramseyer, Gary C.; Tcheng, Tse-Kia
The present study was directed at determining the extent to which the Type I Error rate is affected by violations in the basic assumptions of the q statistic. Monte Carlo methods were employed, and a variety of departures from the assumptions were examined. (Author)
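A minimal sketch of such a Monte Carlo check follows: it estimates the empirical Type I error of the studentized-range (q) test when samples come from a skewed rather than normal population. It relies on scipy.stats.studentized_range (SciPy 1.7+), and the settings are illustrative, not those of the study.

```python
# Monte Carlo estimate of the Type I error of the studentized-range (q)
# test under a normality violation (exponential populations, equal means).
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
k, n, alpha, reps = 4, 10, 0.05, 5000
q_crit = stats.studentized_range.ppf(1 - alpha, k, k * (n - 1))

rejections = 0
for _ in range(reps):
    groups = rng.exponential(1.0, size=(k, n))   # skewed, null hypothesis true
    means = groups.mean(axis=1)
    mse = groups.var(axis=1, ddof=1).mean()      # pooled within-group variance
    q = (means.max() - means.min()) / np.sqrt(mse / n)
    rejections += q > q_crit

print(f"empirical Type I error: {rejections / reps:.3f} (nominal {alpha})")
```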
ERIC Educational Resources Information Center
Dexter, Franklin; Masursky, Danielle; Wachtel, Ruth E.; Nussmeier, Nancy A.
2010-01-01
Operating room (OR) management differs from clinical anesthesia in that statistical literacy is needed daily to make good decisions. Two of the authors teach a course in operations research for surgical services to anesthesiologists, anesthesia residents, OR nursing directors, hospital administration students, and analysts to provide them with the…
Statistics and Data Interpretation for Social Work
ERIC Educational Resources Information Center
Rosenthal, James A.
2011-01-01
Written by a social worker for social work students, this is a nuts and bolts guide to statistics that presents complex calculations and concepts in clear, easy-to-understand language. It includes numerous examples, data sets, and issues that students will encounter in social work practice. The first section introduces basic concepts and terms to…
Using Excel in Teacher Education for Sustainability
ERIC Educational Resources Information Center
Aydin, Serhat
2016-01-01
In this study, the feasibility of using Excel software in teaching whole Basic Statistics Course and its influence on the attitudes of pre-service science teachers towards statistics were investigated. One hundred and two pre-service science teachers in their second year participated in the study. The data were collected from the prospective…
Basic Math Skills and Performance in an Introductory Statistics Course
ERIC Educational Resources Information Center
Johnson, Marianne; Kuennen, Eric
2006-01-01
We identify the student characteristics most associated with success in an introductory business statistics class, placing special focus on the relationship between student math skills and course performance, as measured by student grade in the course. To determine which math skills are important for student success, we examine (1) whether the…
An Online Course of Business Statistics: The Proportion of Successful Students
ERIC Educational Resources Information Center
Pena-Sanchez, Rolando
2009-01-01
This article describes the students' academic progress in an online course of business statistics through interactive software assignments and diverse educational homework, which helps these students to build their own e-learning through basic competences; i.e. interpreting results and solving problems. Cross-tables were built for the categorical…
Schedl, Markus
2017-01-01
Recently, the LFM-1b dataset has been proposed to foster research and evaluation in music retrieval and music recommender systems, Schedl (Proceedings of the ACM International Conference on Multimedia Retrieval (ICMR). New York, 2016). It contains more than one billion music listening events created by more than 120,000 users of Last.fm. Each listening event is characterized by artist, album, and track name, and further includes a timestamp. Basic demographic information and a selection of more elaborate listener-specific descriptors are included as well, for anonymized users. In this article, we reveal information about LFM-1b's acquisition and content, and we compare it to existing datasets. We furthermore provide an extensive statistical analysis of the dataset, including basic properties of the item sets, demographic coverage, distribution of listening events (e.g., over artists and users), and aspects related to music preference and consumption behavior (e.g., temporal features and mainstreaminess of listeners). Exploiting country information of users and genre tags of artists, we also create taste profiles for populations and determine similar and dissimilar countries in terms of their populations' music preferences. Finally, we illustrate the dataset's usage in a simple artist recommendation task, whose results are intended to serve as a baseline against which more elaborate techniques can be assessed.
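As an example of the kind of distributional analysis described, here is a minimal sketch counting listening events per user and per artist from a table of events; the column names and rows are illustrative assumptions, not the dataset's actual schema.

```python
# A minimal sketch of per-user and per-artist listening-event counts from a
# toy table of events with illustrative column names.
import pandas as pd

events = pd.DataFrame({
    "user_id": [1, 1, 2, 2, 2, 3],
    "artist":  ["A", "B", "A", "A", "C", "B"],
    "timestamp": pd.to_datetime(["2014-01-01"] * 6),
})

print(events.groupby("user_id").size())   # listening events per user
print(events["artist"].value_counts())    # listening events per artist
```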
Gambarini, G; Di Nardo, D; Miccoli, G; Guerra, F; Di Giorgio, R; Di Giorgio, G; Glassman, G; Piasecki, L; Testarelli, L
2017-01-01
Previous studies showed that motor motions play an important role in determining apical extrusion of debris. Therefore, a new clinical motion (MIMERACI) has been proposed. The basic idea is to progress slowly (1 mm advancement) and, after each 1 mm, to remove the instrument from the canal, clean the flutes, and irrigate. The aim of the study was to determine whether the clinical use of the MIMERACI technique would influence postoperative pain. 100 teeth requiring endodontic treatment were selected for the study and divided into two similar groups based on anatomy, preoperative symptoms and vitality, and presence or absence of a periapical lesion. All teeth were shaped, cleaned and obturated by the same operator, using the same NiTi instruments. The only difference between the two groups was the instrumentation technique: traditional (group A) vs MIMERACI (group B). Assessment of postoperative pain was performed 3 days after treatment. Presence, absence and degree of pain were recorded with a visual analogue scale (VAS), validated in previous studies. Collected data were statistically analyzed using a one-way ANOVA with post hoc Tukey test. For VAS pain scores the MIMERACI technique showed significantly better results than group A (p=0.031). Overall, both incidence and intensity of symptoms were significantly lower. Flare-ups occurred in 3 patients, none of them treated with the MIMERACI technique. Since extruded debris can elicit more postoperative pain, the results obtained with the MIMERACI technique are probably due to several factors: better mechanical removal of debris, less production of debris, and more efficient irrigation during instrumentation.
Comparative Analysis Between Computed and Conventional Inferior Alveolar Nerve Block Techniques.
Araújo, Gabriela Madeira; Barbalho, Jimmy Charles Melo; Dias, Tasiana Guedes de Souza; Santos, Thiago de Santana; Vasconcellos, Ricardo José de Holanda; de Morais, Hécio Henrique Araújo
2015-11-01
The aim of this randomized, double-blind, controlled trial was to compare the computed and conventional inferior alveolar nerve block techniques in symmetrically positioned inferior third molars. Both computed and conventional anesthetic techniques were performed in 29 healthy patients (58 surgeries) aged between 18 and 40 years. The anesthetic of choice was 2% lidocaine with 1:200,000 epinephrine. A visual analogue scale assessed the pain variable after anesthetic infiltration. Patient satisfaction was evaluated using the Likert scale. Heart and respiratory rates, mean time to perform the technique, and the need for additional anesthesia were also evaluated. Pain variable means were higher for the conventional technique than for the computed technique, 3.45 ± 2.73 and 2.86 ± 1.96, respectively, but no statistically significant differences were found (P > 0.05). Patient satisfaction showed no statistically significant differences. The mean times to perform the computed and conventional techniques were 3.85 and 1.61 minutes, respectively, a statistically significant difference (P < 0.001). The computed anesthetic technique showed lower mean pain perception, but did not show statistically significant differences when contrasted with the conventional technique.
Health Literacy Impact on National Healthcare Utilization and Expenditure.
Rasu, Rafia S; Bawa, Walter Agbor; Suminski, Richard; Snella, Kathleen; Warady, Bradley
2015-08-17
Health literacy presents an enormous challenge in the delivery of effective healthcare and quality outcomes. We evaluated the impact of low health literacy (LHL) on healthcare utilization and healthcare expenditure. The database analysis used the Medical Expenditure Panel Survey (MEPS) from 2005-2008, which provides nationally representative estimates of healthcare utilization and expenditure. Health literacy scores (HLSs) were calculated based on a validated predictive model and scored according to the National Assessment of Adult Literacy (NAAL). HLSs ranged from 0-500, and health literacy levels (HLLs) were categorized into 2 groups: below basic or basic (HLS <226) and above basic (HLS ≥226). Healthcare utilization was expressed as physician, nonphysician, or emergency room (ER) visits, and healthcare spending was also measured. Expenditures were adjusted to 2010 rates using the Consumer Price Index (CPI). A P value of 0.05 or less was the criterion for statistical significance in all analyses. Multivariate regression models assessed the impact of the predicted HLLs on outpatient healthcare utilization and expenditures. All analyses were performed with SAS and STATA® 11.0 statistical software. The study evaluated 22 599 samples representing 503 374 648 weighted individuals nationally from 2005-2008. The cohort had an average age of 49 years and included more females (57%). Caucasians were the predominant racial/ethnic group (83%), and 37% of the cohort were from the South region of the United States. The proportion of the cohort with basic or below basic health literacy was 22.4%. Annual predicted values of physician visits, nonphysician visits, and ER visits were 6.6, 4.8, and 0.2, respectively, for basic or below basic health literacy, compared with 4.4, 2.6, and 0.1 for above basic. Predicted values of office and ER visit expenditures were $1284 and $151, respectively, for basic or below basic and $719 and $100 for above basic (P < .05). The extrapolated national estimates show that the annual costs for prescriptions alone for adults with basic and below basic health literacy could potentially reach about $172 billion. Health literacy is inversely associated with healthcare utilization and expenditure. Individuals with below basic or basic HLLs have greater healthcare utilization and expenditures, spending more on prescriptions, compared with individuals with above basic HLLs. Public health strategies promoting appropriate education among individuals with LHL may help to improve health outcomes and reduce unnecessary healthcare visits and costs. © 2015 by Kerman University of Medical Sciences.
Somaraj, Vinej; Shenoy, Rekha P; Panchmal, Ganesh Shenoy; Jodalli, Praveen S; Sonde, Laxminarayan; Karkal, Ravichandra
2017-01-01
This cross-sectional study aimed to assess the knowledge, attitude and anxiety pertaining to basic life support (BLS) and medical emergencies among interns in dental colleges of Mangalore city, Karnataka, India. The study subjects comprised interns who volunteered from the four dental colleges. The knowledge and attitude of interns were assessed using a 30-item questionnaire prepared on the basis of the Basic Life Support Manual from the American Heart Association, and the anxiety of interns pertaining to BLS and medical emergencies was assessed using the State-Trait Anxiety Inventory (STAI) questionnaire. A chi-square test was performed in SPSS 21.0 (IBM Statistics, 2012) to determine statistically significant differences (P < 0.05) between assessed knowledge and anxiety. Out of 183 interns, 39.89% had below-average knowledge. A total of 123 (67.21%) reported unavailability of professional training. The majority (180, 98.36%) felt the urgent need for training in basic life support procedures. Assessment of stress showed a total of 27.1% of participants to be above the high-stress level. The comparison of assessed knowledge and stress was found to be non-significant (P = 0.983). There was an evident lack of knowledge pertaining to the management of medical emergencies among the interns. Since dental interns move out into society as oral health care providers, a focus should be placed on training them in Basic Life Support procedures.
Dunn, Thomas M; Dalton, Alice; Dorfman, Todd; Dunn, William W
2004-01-01
To be a first step in determining whether emergency medical technician (EMT)-Basics are capable of using a protocol that allows for selective immobilization of the cervical spine. Such protocols are coming into use at the advanced life support level and could be beneficial when used by basic life support providers. A convenience sample of participants (n=95) from 11 emergency medical services agencies and one college class participated in the study. All participants evaluated six patients in written scenarios and decided which should be placed in spinal precautions according to a selective spinal immobilization protocol. Systems without an existing selective spinal immobilization protocol received a one-hour continuing education lecture on the topic. College students received a similar lecture written so that laypersons could understand the protocol. All participants showed proficiency when applying a selective immobilization protocol to patients in paper-based scenarios. Furthermore, EMT-Basics performed at the same level as paramedics when following the protocol. Statistical analysis revealed no significant differences between EMT-Basics and paramedics. A follow-up group of college students (added to provide a non-EMS comparison group) also performed as well as paramedics when making decisions to use spinal precautions. Differences between college students and paramedics were also statistically insignificant. The results suggest that EMT-Basics are as accurate as paramedics when making decisions regarding selective immobilization of the cervical spine during paper-based scenarios. That laypersons are also proficient when using the protocol could indicate that it is extremely simple to follow. This study is a first step toward the necessary additional studies evaluating the efficacy of EMT-Basics using selective immobilization as a regular practice.
Technique for Reduction of Environmental Pollution from Construction Wastes
NASA Astrophysics Data System (ADS)
Bakaeva, N. V.; Klimenko, M. Y.
2017-11-01
The results of research on the negative impact of construction wastes on the urban environment and on the ecological safety of construction are described. The results are based on statistical data and on indicators calculated using an environmental pollution assessment within the system for restoring the technical condition of urban buildings. The technique for reducing environmental pollution from construction wastes rests on an analytical summary of scientific and practical results on ensuring ecological safety during major overhauls and current repairs (reconstruction) of buildings and structures, and on the practical application of probability theory, system analysis, and disperse system theory. Implementing the technique involves several stages, from information collection to the formation of a system with optimum performance characteristics that is more resource-saving and energy-efficient in handling construction wastes from urban construction sites. The following tasks are solved: collection of basic data on construction waste accumulation; definition and comparison of technological combinations at each functional stage intended to reduce the discharge of construction wastes into the environment; calculation of assessment criteria for resource saving and energy efficiency; and derivation of optimum working parameters for each implementation stage. Application of the technique in urban construction shows resource-saving criteria from 55.22% to 88.84%; the recycling potential of construction wastes is 450 million tons of damaged construction elements.
DOT National Transportation Integrated Search
1982-04-01
A comprehensive review of existing basic diagnostic techniques applicable to the railcar roller bearing defect and failure problem was made. Of the potentially feasible diagnostic techniques identified, high frequency vibration was selected for exper...
Kramer, Christian; Fuchs, Julian E; Liedl, Klaus R
2015-03-23
Nonadditivity in protein-ligand affinity data represents highly instructive structure-activity relationship (SAR) features that indicate structural changes and have the potential to guide rational drug design. At the same time, nonadditivity is a challenge for both basic SAR analysis and many ligand-based data analysis techniques, such as Free-Wilson analysis and Matched Molecular Pair analysis, since linear substituent-contribution models inherently assume additivity and thus do not work in such cases. While structural causes of nonadditivity have been analyzed anecdotally, no systematic approaches to interpret and use nonadditivity prospectively have been developed yet. In this contribution, we lay the statistical framework for systematic analysis of nonadditivity in a SAR series. First, we develop a general metric to quantify nonadditivity. Then, we demonstrate the non-negligible impact of experimental uncertainty, which creates apparent nonadditivity, and we introduce techniques to handle experimental uncertainty. Finally, we analyze public SAR data sets for strong nonadditivity and refer to the original publications and available X-ray structures to find structural explanations for the nonadditivity observed. We find that all cases of strong nonadditivity (ΔΔpKi and ΔΔpIC50 > 2.0 log units) with sufficient structural information to generate a reasonable hypothesis involve changes in binding mode. With the appropriate statistical basis, nonadditivity analysis opens a variety of new avenues in computer-aided drug design, including the validation of scoring functions and free energy perturbation approaches, binding pocket classification, and novel features in SAR analysis tools.
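As an illustration of the double-transformation-cycle arithmetic that underlies such a nonadditivity metric (our sketch, not necessarily the paper's exact formula), consider four compounds forming a cycle, with invented pKi values:

```python
# A minimal nonadditivity sketch: under additivity, the effect of
# substitution 1 should not depend on whether substitution 2 is present.
# All pKi values are invented for illustration.
pki = {
    "parent": 6.0,      # R1 = H,  R2 = H
    "sub1": 7.2,        # R1 = X,  R2 = H
    "sub2": 6.5,        # R1 = H,  R2 = Y
    "both": 10.0,       # R1 = X,  R2 = Y
}

additive_prediction = pki["parent"] + (pki["sub1"] - pki["parent"]) \
                                    + (pki["sub2"] - pki["parent"])
nonadditivity = pki["both"] - additive_prediction
print(f"nonadditivity = {nonadditivity:+.2f} log units")
# A magnitude above ~2 log units, as in the paper's screen, suggests a
# binding-mode change rather than independent substituent contributions,
# provided it clearly exceeds the experimental uncertainty.
```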
Basic Techniques in Mammalian Cell Tissue Culture.
Phelan, Katy; May, Kristin M
2016-11-01
Cultured mammalian cells are used extensively in cell biology studies. Preserving the structure, function, behavior, and biology of cells in culture requires a number of special skills. This unit describes the basic skills required to maintain and preserve cell cultures: maintaining aseptic technique, preparing media with the appropriate characteristics, passaging, freezing and storage, recovering frozen stocks, and counting viable cells. Copyright © 2016 John Wiley & Sons, Inc.
Kim, Kiyeon; Omori, Ryosuke; Ito, Kimihito
2017-12-01
The estimation of the basic reproduction number is essential to understanding epidemic dynamics, and time series data on infected individuals are usually used for the estimation. However, such data are not always available. Methods to estimate the basic reproduction number using genealogies constructed from nucleotide sequences of pathogens have been proposed. Here, we propose a new method to estimate epidemiological parameters of outbreaks using the time-series change of Tajima's D statistic on the nucleotide sequences of pathogens. To relate the time evolution of Tajima's D to the number of infected individuals, we constructed a parsimonious mathematical model describing both the transmission process of pathogens among hosts and the evolutionary process of the pathogens. As a case study we applied this method to field data of nucleotide sequences of pandemic influenza A (H1N1) 2009 viruses collected in Argentina. The Tajima's D-based method estimated the basic reproduction number to be 1.55, with a 95% highest posterior density (HPD) between 1.31 and 2.05, and the date of the epidemic peak to be 10th July, with a 95% HPD between 22nd June and 9th August. The estimated basic reproduction number was consistent with estimates from a birth-death skyline plot and from the time series of the number of infected individuals. These results suggest that Tajima's D statistic on nucleotide sequences of pathogens can be useful for estimating epidemiological parameters of outbreaks. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
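For reference, a minimal implementation of the statistic the method builds on follows: Tajima's D computed from the standard summary quantities (number of segregating sites S and mean pairwise difference pi for n sequences), using Tajima's published formulas; the input values are illustrative.

```python
# A minimal sketch of Tajima's D from summary quantities: S segregating
# sites and mean pairwise difference pi among n sampled sequences.
import numpy as np

def tajimas_d(n, S, pi):
    i = np.arange(1, n)
    a1, a2 = np.sum(1.0 / i), np.sum(1.0 / i**2)
    b1 = (n + 1) / (3.0 * (n - 1))
    b2 = 2.0 * (n**2 + n + 3) / (9.0 * n * (n - 1))
    c1 = b1 - 1.0 / a1
    c2 = b2 - (n + 2) / (a1 * n) + a2 / a1**2
    e1, e2 = c1 / a1, c2 / (a1**2 + a2)
    return (pi - S / a1) / np.sqrt(e1 * S + e2 * S * (S - 1))

print(tajimas_d(n=20, S=35, pi=7.5))  # negative D: excess of rare variants,
                                      # as expected during epidemic growth
```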
Computer assisted screening, correction, and analysis of historical weather measurements
NASA Astrophysics Data System (ADS)
Burnette, Dorian J.; Stahle, David W.
2013-04-01
A computer program, Historical Observation Tools (HOB Tools), has been developed to facilitate many of the calculations used by historical climatologists to develop instrumental and documentary temperature and precipitation datasets, and to make those calculations readily accessible to other researchers. The primitive methodology used by the early weather observers makes the application of standard techniques difficult. HOB Tools provides a step-by-step framework to visually and statistically assess, adjust, and reconstruct historical temperature and precipitation datasets. The routines include the ability to check for undocumented discontinuities, adjust temperature data for poor thermometer exposures and diurnal averaging, and assess and adjust daily precipitation data for undercount. This paper provides an overview of the Visual Basic.NET program and a demonstration of how it can assist in the development of extended temperature and precipitation datasets using modern and early instrumental measurements from the United States.
Structural equation modeling: building and evaluating causal models: Chapter 8
Grace, James B.; Scheiner, Samuel M.; Schoolmaster, Donald R.
2015-01-01
Scientists frequently wish to study hypotheses about causal relationships, rather than just statistical associations. This chapter addresses the question of how scientists might approach this ambitious task. Here we describe structural equation modeling (SEM), a general modeling framework for the study of causal hypotheses. Our goals are to (a) concisely describe the methodology, (b) illustrate its utility for investigating ecological systems, and (c) provide guidance for its application. Throughout our presentation, we rely on a study of the effects of human activities on wetland ecosystems to make our description of methodology more tangible. We begin by presenting the fundamental principles of SEM, including both its distinguishing characteristics and the requirements for modeling hypotheses about causal networks. We then illustrate SEM procedures and offer guidelines for conducting SEM analyses. Our focus in this presentation is on basic modeling objectives and core techniques. Pointers to additional modeling options are also given.
A publication database for optical long baseline interferometry
NASA Astrophysics Data System (ADS)
Malbet, Fabien; Mella, Guillaume; Lawson, Peter; Taillifet, Esther; Lafrasse, Sylvain
2010-07-01
Optical long baseline interferometry is a technique that has generated almost 850 refereed papers to date. The targets span a large variety of objects, from planetary systems to extragalactic studies and all branches of stellar physics. We have created a database hosted by the JMMC and connected to the Optical Long Baseline Interferometry Newsletter (OLBIN) web site, using MySQL and a collection of XML and PHP scripts, in order to store and classify these publications. Each entry is defined by its ADS bibcode and includes basic ADS information and metadata. The metadata are specified by tags sorted in categories: interferometric facilities, instrumentation, wavelength of operation, spectral resolution, type of measurement, target type, and paper category, for example. The whole OLBIN publication list has been processed, and we present how the database is organized and can be accessed. We use this tool to generate statistical plots of interest for the community in optical long baseline interferometry.
High-pT Physics in the Heavy Ion Era
NASA Astrophysics Data System (ADS)
Rak, Jan; Tannenbaum, Michael J.
2013-04-01
1. Introduction and overview; 2. Basic observables; 3. Some experimental techniques; 4. The search for structure; 5. Origins of high pT physics - the search for the W boson; 6. Discovery of hard scattering in p-p collisions; 7. Direct single lepton production and the discovery of charm; 8. J/ψ, ϒ and Drell-Yan pair production; 9. Two particle correlations; 10. Direct photon production; 11. The search for jets; 12. QCD in hard scattering; 13. Heavy ion physics in the high pT era; 14. RHIC and LHC; Appendix A. Probability and statistics; Appendix B. Methods of Monte Carlo calculations; Appendix C. TAB and the Glauber Monte Carlo calculation; Appendix D. Fits including systematic errors; Appendix E. The shape of the xE distribution triggered by a jet fragment, for example, π0; Appendix F. kT phenomenology and Gaussian smearing; References; Index.
Utilization of satellite data for inventorying prairie ponds and lakes
Work, E.A.; Gilmer, D.S.
1976-01-01
Using data acquired by LANDSAT-1 (formerly ERTS-1), studies were conducted to extract information necessary for formulating management decisions relating to migratory waterfowl. Management decisions are based in part on an assessment of habitat characteristics, specifically the numbers, distribution, and quality of ponds and lakes in the prime breeding range. This paper reports on a study concerned with mapping open surface water features in the glaciated prairies. Emphasis was placed on the recognition of these features based upon water's uniquely low radiance in a single near-infrared waveband. The results of this recognition were thematic maps and statistics relating to open surface water. In a related effort, the added information content of multiple spectral wavebands was used to discriminate surface water at a level of detail finer than the virtual resolution of the data. The basic theory of this technique and some preliminary results are described.
Descriptive Statistical Techniques for Librarians. 2nd Edition.
ERIC Educational Resources Information Center
Hafner, Arthur W.
A thorough understanding of the uses and applications of statistical techniques is integral in gaining support for library funding or new initiatives. This resource is designed to help practitioners develop and manipulate descriptive statistical information in evaluating library services, tracking and controlling limited resources, and analyzing…
ERIC Educational Resources Information Center
Rubinson, Laura E.
2010-01-01
More than one third of American children cannot read at a basic level by fourth grade (Lee, Grigg, & Donahue, 2007) and those numbers are even higher for African American, Hispanic and poor White students (Boorman et al., 2007). These are alarming statistics given that the ability to read is the most basic and fundamental skill for academic…
ERIC Educational Resources Information Center
Chukwu, Leo C.; Eze, Thecla A. Y.; Agada, Fidelia Chinyelugo
2016-01-01
The study examined the availability of instructional materials at the basic education level in Enugu Education Zone of Enugu State, Nigeria. One research question and one hypothesis guided the study. The research question was answered using mean and grand mean ratings, while the hypothesis was tested using t-test statistics at 0.05 level of…
Compensation Techniques in Accelerator Physics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sayed, Hisham Kamal
2011-05-01
Accelerator physics is one of the most diverse multidisciplinary fields of physics, wherein the dynamics of particle beams is studied. It takes more than an understanding of basic electromagnetic interactions to be able to predict the beam dynamics and to develop new techniques to produce, maintain, and deliver high quality beams for different applications. In this work, some basic theory regarding particle beam dynamics in accelerators will be presented. This basic theory, along with state-of-the-art techniques in beam dynamics, will be used in this dissertation to study and solve accelerator physics problems. Two problems involving compensation are studied in the context of the MEIC (Medium Energy Electron Ion Collider) project at Jefferson Laboratory. Several chromaticity (the energy dependence of the particle tune) compensation methods are evaluated numerically and deployed in a figure-eight ring designed for the electrons in the collider. Furthermore, transverse coupling optics have been developed to compensate the coupling introduced by the spin rotators in the MEIC electron ring design.
Basic statistics with Microsoft Excel: a review.
Divisi, Duilio; Di Leonardo, Gabriella; Zaccagna, Gino; Crisci, Roberto
2017-06-01
The scientific world is enriched daily with new knowledge, due to new technologies and continuous discoveries. Mathematical functions explain the statistical concepts, particularly those of mean, median, and mode, along with those of frequency and frequency distribution associated with histograms and graphical representations, determining the elaborative processes that underlie spreadsheet operations. The aim of the study is to highlight the mathematical basis of the statistical models that govern the operation of spreadsheets in Microsoft Excel.
Basic statistics with Microsoft Excel: a review
Di Leonardo, Gabriella; Zaccagna, Gino; Crisci, Roberto
2017-01-01
The scientific world is enriched daily with new knowledge, due to new technologies and continuous discoveries. Mathematical functions explain the statistical concepts, particularly those of mean, median, and mode, along with those of frequency and frequency distribution associated with histograms and graphical representations, determining the elaborative processes that underlie spreadsheet operations. The aim of the study is to highlight the mathematical basis of the statistical models that govern the operation of spreadsheets in Microsoft Excel. PMID:28740690
Unsupervised learning of structure in spectroscopic cubes
NASA Astrophysics Data System (ADS)
Araya, M.; Mendoza, M.; Solar, M.; Mardones, D.; Bayo, A.
2018-07-01
We consider the problem of analyzing the structure of spectroscopic cubes using unsupervised machine learning techniques. We propose representing the target's signal as a homogeneous set of volumes through an iterative algorithm that separates the structured emission from the background while not overestimating the flux. Besides verifying some basic theoretical properties, the algorithm is designed to be tuned by domain experts, because its parameters have meaningful values in the astronomical context. Nevertheless, we propose a heuristic to automatically estimate the signal-to-noise ratio parameter of the algorithm directly from data. The resulting lightweight set of samples (≤1% of the original data) offers several advantages. For instance, it is statistically correct and computationally inexpensive to apply well-established techniques from the pattern recognition and machine learning domains, such as clustering and dimensionality reduction algorithms. We use ALMA science verification data to validate our method and present examples of the operations that can be performed by using the proposed representation. Even though this approach is focused on providing faster and better analysis tools for the end-user astronomer, it also opens the possibility of content-aware data discovery by applying our algorithm to big data.
Speech enhancement using the modified phase-opponency model.
Deshmukh, Om D; Espy-Wilson, Carol Y; Carney, Laurel H
2007-06-01
In this paper we present a model called the Modified Phase-Opponency (MPO) model for single-channel speech enhancement when the speech is corrupted by additive noise. The MPO model is based on the auditory PO model, proposed for detection of tones in noise. The PO model includes a physiologically realistic mechanism for processing the information in neural discharge times and exploits the frequency-dependent phase properties of the tuned filters in the auditory periphery by using a cross-auditory-nerve-fiber coincidence detection for extracting temporal cues. The MPO model alters the components of the PO model such that the basic functionality of the PO model is maintained but the properties of the model can be analyzed and modified independently. The MPO-based speech enhancement scheme does not need to estimate the noise characteristics nor does it assume that the noise satisfies any statistical model. The MPO technique leads to the lowest value of the LPC-based objective measures and the highest value of the perceptual evaluation of speech quality measure compared to other methods when the speech signals are corrupted by fluctuating noise. Combining the MPO speech enhancement technique with our aperiodicity, periodicity, and pitch detector further improves its performance.
Enamel Thickness before and after Orthodontic Treatment Analysed in Optical Coherence Tomography
Koprowski, Robert; Safranow, Krzysztof; Woźniak, Krzysztof
2017-01-01
Despite the continuous development of materials and techniques of adhesive bonding, the basic procedure remains relatively constant. The technique is based on three components: etching substance, adhesive system, and composite material. The use of etchants during the bonding of orthodontic brackets carries the risk of damage to the enamel. Therefore, the article examines the effect of the manner of enamel etching on enamel thickness before and after orthodontic treatment. The study was carried out in vitro on a group of 80 teeth, divided into two subgroups of 40 teeth each. The procedure of enamel etching was performed under laboratory conditions. In the first subgroup, the classic method of enamel etching and a fifth-generation bonding system were used. In the second subgroup, a seventh-generation (self-etching) bonding system was used. In both groups, metal orthodontic brackets were fixed, and after their removal the enamel was cleaned with a cutter mounted on a micromotor. Before and after the treatment, two-dimensional optical coherence tomography scans were performed, and enamel thickness was assessed on these scans. The difference in average enamel thickness between the two subgroups was not statistically significant. PMID:28243604
Propagation of angular errors in two-axis rotation systems
NASA Astrophysics Data System (ADS)
Torrington, Geoffrey K.
2003-10-01
Two-Axis Rotation Systems, or "goniometers," are used in diverse applications including telescope pointing, automotive headlamp testing, and display testing. There are three basic configurations in which a goniometer can be built depending on the orientation and order of the stages. Each configuration has a governing set of equations which convert motion between the system "native" coordinates to other base systems, such as direction cosines, optical field angles, or spherical-polar coordinates. In their simplest form, these equations neglect errors present in real systems. In this paper, a statistical treatment of error source propagation is developed which uses only tolerance data, such as can be obtained from the system mechanical drawings prior to fabrication. It is shown that certain error sources are fully correctable, partially correctable, or uncorrectable, depending upon the goniometer configuration and zeroing technique. The system error budget can be described by a root-sum-of-squares technique with weighting factors describing the sensitivity of each error source. This paper tabulates weighting factors at 67% (k=1) and 95% (k=2) confidence for various levels of maximum travel for each goniometer configuration. As a practical example, this paper works through an error budget used for the procurement of a system at Sandia National Laboratories.
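A minimal sketch of the root-sum-of-squares error budget with sensitivity weights and a coverage factor k, as described above; the tolerance values and weights below are invented placeholders, not the tabulated factors from the paper.

```python
import math

def rss_error_budget(tolerances, weights, k=2.0):
    """Combine independent angular error sources by root-sum-of-squares.

    tolerances: 1-sigma magnitude of each error source (e.g., arcsec)
    weights:    sensitivity of the pointing error to each source
    k:          coverage factor (k=1 ~ 67% confidence, k=2 ~ 95%)
    """
    return k * math.sqrt(sum((w * t) ** 2
                             for t, w in zip(tolerances, weights)))

# Hypothetical budget: axis wobble, axis orthogonality, encoder error
print(rss_error_budget([2.0, 5.0, 1.0], [1.0, 0.7, 1.0], k=2.0))
```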
Data re-arranging techniques leading to proper variable selections in high energy physics
NASA Astrophysics Data System (ADS)
Kůs, Václav; Bouř, Petr
2017-12-01
We introduce a new data-based approach to homogeneity testing and variable selection carried out in high energy physics experiments, where one of the basic tasks is to test the homogeneity of weighted samples, mainly the Monte Carlo simulations (weighted) and real data measurements (unweighted). This technique, called "data re-arranging", enables variable selection performed by means of classical statistical homogeneity tests such as the Kolmogorov-Smirnov, Anderson-Darling, or Pearson's chi-square divergence test. P-values of our variants of the homogeneity tests are investigated, and empirical verification on 46-dimensional high energy particle physics data sets is accomplished under the newly proposed (equiprobable) quantile binning. In particular, the procedure of homogeneity testing is applied to re-arranged Monte Carlo samples and real DATA sets measured at the Tevatron particle accelerator in Fermilab at the DØ experiment, originating from top-antitop quark pair production in two decay channels (electron, muon) with 2, 3, or 4+ jets detected. Finally, the variable selections in the electron and muon channels induced by the re-arranging procedure for homogeneity testing are provided for the Tevatron top-antitop quark data sets.
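A simplified sketch of a chi-square homogeneity test between a weighted Monte Carlo sample and unweighted data on equiprobable quantile bins, in the spirit of the procedure above; the handling of weights and the degrees of freedom are simplifying assumptions, not the authors' exact variants.

```python
import numpy as np
from scipy.stats import chi2

def homogeneity_chi2(mc_values, mc_weights, data_values, n_bins=10):
    """Pearson chi-square homogeneity test between a weighted MC sample
    and unweighted data, on equiprobable quantile bins of the data."""
    # Equiprobable bin edges from the data quantiles
    edges = np.quantile(data_values, np.linspace(0, 1, n_bins + 1))
    edges[0], edges[-1] = -np.inf, np.inf
    obs, _ = np.histogram(data_values, bins=edges)
    mc, _ = np.histogram(mc_values, bins=edges, weights=mc_weights)
    # Scale MC to the data normalization and compare bin by bin
    exp = mc * obs.sum() / mc.sum()
    stat = np.sum((obs - exp) ** 2 / exp)
    p_value = chi2.sf(stat, df=n_bins - 1)
    return stat, p_value

rng = np.random.default_rng(0)
data = rng.normal(size=5000)
mc = rng.normal(size=20000)
w = np.full(20000, 0.25)              # weighted Monte Carlo sample
print(homogeneity_chi2(mc, w, data))  # large p-value: samples agree
```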
Visualizing turbulent mixing of gases and particles
NASA Technical Reports Server (NTRS)
Ma, Kwan-Liu; Smith, Philip J.; Jain, Sandeep
1995-01-01
A physical model and interactive computer graphics techniques have been developed for the visualization of the basic physical process of stochastic dispersion and mixing from steady-state CFD calculations. The mixing of massless particles and inertial particles is visualized by transforming the vector field from a traditionally Eulerian reference frame into a Lagrangian reference frame. Groups of particles are traced through the vector field for the mean path as well as their statistical dispersion about the mean position by using added scalar information about the root mean square value of the vector field and its Lagrangian time scale. In this way, clouds of particles in a turbulent environment are traced, not just mean paths. In combustion simulations of many industrial processes, good mixing is required to achieve a sufficient degree of combustion efficiency. The ability to visualize this multiphase mixing can not only help identify poor mixing but also explain the mechanism for poor mixing. The information gained from the visualization can be used to improve the overall combustion efficiency in utility boilers or propulsion devices. We have used this technique to visualize steady-state simulations of the combustion performance in several furnace designs.
A. C. C. Fact Book: A Statistical Profile of Allegany Community College and the Community It Serves.
ERIC Educational Resources Information Center
Andersen, Roger C.
This document is intended to be an authoritative compilation of frequently referenced basic facts concerning Allegany Community College (ACC) in Maryland. It is a statistical profile of ACC and the community it serves, divided into six sections: enrollment, students, faculty, community, support services, and general college related information.…
Basic Mathematics Test Predicts Statistics Achievement and Overall First Year Academic Success
ERIC Educational Resources Information Center
Fonteyne, Lot; De Fruyt, Filip; Dewulf, Nele; Duyck, Wouter; Erauw, Kris; Goeminne, Katy; Lammertyn, Jan; Marchant, Thierry; Moerkerke, Beatrijs; Oosterlinck, Tom; Rosseel, Yves
2015-01-01
In the psychology and educational science programs at Ghent University, only 36.1% of the new incoming students in 2011 and 2012 passed all exams. Despite availability of information, many students underestimate the scientific character of social science programs. Statistics courses are a major obstacle in this matter. Not all enrolling students…
ERIC Educational Resources Information Center
Schweizer, Karl; Steinwascher, Merle; Moosbrugger, Helfried; Reiss, Siegbert
2011-01-01
The development of research methodology competency is a major aim of the psychology curriculum at universities. Usually, three courses concentrating on basic statistics, advanced statistics and experimental methods, respectively, serve the achievement of this aim. However, this traditional curriculum-based course structure gives rise to the…
ERIC Educational Resources Information Center
Maric, Marija; Wiers, Reinout W.; Prins, Pier J. M.
2012-01-01
Despite guidelines and repeated calls from the literature, statistical mediation analysis in youth treatment outcome research is rare. Even more concerning is that many studies that "have" reported mediation analyses do not fulfill basic requirements for mediation analysis, providing inconclusive data and clinical implications. As a result, after…
Statistical estimators for monitoring spotted owls in Oregon and Washington in 1987.
Tlmothy A. Max; Ray A. Souter; Kathleen A. O' Halloran
1990-01-01
Spotted owls (Strix occidentalis) were monitored on 11 National Forests in the Pacific Northwest Region of the USDA Forest Service between March and August of 1987. The basic intent of monitoring was to provide estimates of occupancy and reproduction rates for pairs of spotted owls. This paper documents the technical details of the statistical...
Peer-Assisted Learning in Research Methods and Statistics
ERIC Educational Resources Information Center
Stone, Anna; Meade, Claire; Watling, Rosamond
2012-01-01
Feedback from students on a Level 1 Research Methods and Statistics module, studied as a core part of a BSc Psychology programme, highlighted demand for additional tutorials to help them to understand basic concepts. Students in their final year of study commonly request work experience to enhance their employability. All students on the Level 1…
Adult Basic and Secondary Education Program Statistics. Fiscal Year 1976.
ERIC Educational Resources Information Center
Cain, Sylvester H.; Whalen, Barbara A.
Reports submitted to the National Center for Education Statistics provided data for this compilation and tabulation of data on adult participants in U.S. educational programs in fiscal year 1976. In the summary section introducing the charts, it is noted that adult education programs funded under P.L. 91-230 served over 1.6 million persons--an…
ERIC Educational Resources Information Center
Goodman, Leroy V., Ed.
This is the third edition of the Education Almanac, an assemblage of statistics, facts, commentary, and basic background information about the conduct of schools in the United States. Features of this variegated volume include an introductory section on "Education's Newsiest Developments," followed by some vital educational statistics, a set of…
Theory of Financial Risk and Derivative Pricing
NASA Astrophysics Data System (ADS)
Bouchaud, Jean-Philippe; Potters, Marc
2009-01-01
Foreword; Preface; 1. Probability theory: basic notions; 2. Maximum and addition of random variables; 3. Continuous time limit, Ito calculus and path integrals; 4. Analysis of empirical data; 5. Financial products and financial markets; 6. Statistics of real prices: basic results; 7. Non-linear correlations and volatility fluctuations; 8. Skewness and price-volatility correlations; 9. Cross-correlations; 10. Risk measures; 11. Extreme correlations and variety; 12. Optimal portfolios; 13. Futures and options: fundamental concepts; 14. Options: hedging and residual risk; 15. Options: the role of drift and correlations; 16. Options: the Black and Scholes model; 17. Options: some more specific problems; 18. Options: minimum variance Monte-Carlo; 19. The yield curve; 20. Simple mechanisms for anomalous price statistics; Index of most important symbols; Index.
Theory of Financial Risk and Derivative Pricing - 2nd Edition
NASA Astrophysics Data System (ADS)
Bouchaud, Jean-Philippe; Potters, Marc
2003-12-01
Foreword; Preface; 1. Probability theory: basic notions; 2. Maximum and addition of random variables; 3. Continuous time limit, Ito calculus and path integrals; 4. Analysis of empirical data; 5. Financial products and financial markets; 6. Statistics of real prices: basic results; 7. Non-linear correlations and volatility fluctuations; 8. Skewness and price-volatility correlations; 9. Cross-correlations; 10. Risk measures; 11. Extreme correlations and variety; 12. Optimal portfolios; 13. Futures and options: fundamental concepts; 14. Options: hedging and residual risk; 15. Options: the role of drift and correlations; 16. Options: the Black and Scholes model; 17. Options: some more specific problems; 18. Options: minimum variance Monte-Carlo; 19. The yield curve; 20. Simple mechanisms for anomalous price statistics; Index of most important symbols; Index.
Measurement techniques investigated for detection of hydrogen chloride gas in ambient air
NASA Technical Reports Server (NTRS)
Gregory, G. L.
1976-01-01
Nine basic techniques are discussed, ranging from concentration measurement (parts per million) to dosage-only measurement (parts per million-seconds). Data for each technique include lower detection limit, response time, instrument status, and, in some cases, specificity. Several of the techniques discussed can detect ambient hydrogen chloride concentrations below 1 part per million with a response time of seconds.
Application of redundancy in the Saturn 5 guidance and control system
NASA Technical Reports Server (NTRS)
Moore, F. B.; White, J. B.
1976-01-01
The Saturn launch vehicle's guidance and control system is so complex that the reliability of a simplex system is not adequate to fulfill mission requirements. Thus, to achieve the desired reliability, redundancy encompassing a wide range of types and levels was employed. At one extreme, the lowest level, basic components (resistors, capacitors, relays, etc.) are employed in series, parallel, or quadruplex arrangements to insure continued system operation in the presence of possible failure conditions. At the other extreme, the highest level, complete subsystem duplication is provided so that a backup subsystem can be employed in case the primary system malfunctions. In between these two extremes, many other redundancy schemes and techniques are employed at various levels. Basic redundancy concepts are covered to gain insight into the advantages obtained with various techniques. Points and methods of application of these techniques are included. The theoretical gain in reliability resulting from redundancy is assessed and compared to a simplex system. Problems and limitations encountered in the practical application of redundancy are discussed as well as techniques verifying proper operation of the redundant channels. As background for the redundancy application discussion, a basic description of the guidance and control system is included.
Application of Function-Failure Similarity Method to Rotorcraft Component Design
NASA Technical Reports Server (NTRS)
Roberts, Rory A.; Stone, Robert E.; Tumer, Irem Y.; Clancy, Daniel (Technical Monitor)
2002-01-01
Performance and safety are the top concerns of high-risk aerospace applications at NASA. Eliminating or reducing performance and safety problems can be achieved with a thorough understanding of potential failure modes in the designs that lead to these problems. The majority of techniques use prior knowledge and experience, as well as Failure Modes and Effects Analyses, to determine the potential failure modes of aircraft. During the design of aircraft, a general technique is needed to ensure that every potential failure mode is considered, while avoiding spending time on improbable failure modes. In this work, this is accomplished by mapping failure modes to specific components, which are described by their functionality. The failure modes are then linked to the basic functions that are carried out within the components of the aircraft. Using this technique, designers can examine the basic functions and select appropriate analyses to eliminate or design out the potential failure modes. The fundamentals of this method were previously introduced for a simple rotating machine test rig with basic functions that are common to a rotorcraft. In this paper, this technique is applied to the engine and power train of a rotorcraft, using failures and functions obtained from accident reports and engineering drawings.
Cho, Yunju; Ahmed, Arif; Islam, Annana; Kim, Sunghwan
2015-01-01
Because of the increasing importance of heavy and unconventional crude oil as an energy source, there is a growing need for petroleomics: the pursuit of more complete and detailed knowledge of the chemical compositions of crude oil. Crude oil has an extremely complex nature; hence, techniques with ultra-high resolving capabilities, such as Fourier transform ion cyclotron resonance mass spectrometry (FT-ICR MS), are necessary. FT-ICR MS has been successfully applied to the study of heavy and unconventional crude oils such as bitumen and shale oil. However, the analysis of crude oil with FT-ICR MS is not trivial, and it has pushed analysis to the limits of instrumental and methodological capabilities. For example, high-resolution mass spectra of crude oils may contain over 100,000 peaks that require interpretation. To visualize large data sets more effectively, data processing methods such as Kendrick mass defect analysis and statistical analyses have been developed. The successful application of FT-ICR MS to the study of crude oil has been critically dependent on key developments in FT-ICR MS instrumentation and data processing methods. This review offers an introduction to the basic principles, FT-ICR MS instrumentation development, ionization techniques, and data interpretation methods for petroleomics and is intended for readers having no prior experience in this field of study. © 2014 Wiley Periodicals, Inc.
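A minimal sketch of the Kendrick mass defect calculation mentioned above, used to group homologous series in FT-ICR MS spectra on the CH2-based Kendrick scale; the peak masses are invented, and the rounding convention shown is one of several in use.

```python
CH2_MASS = 14.01565  # exact (IUPAC) mass of a CH2 repeat unit

def kendrick_mass_defect(iupac_mass):
    """Kendrick mass defect: members of a homologous series differing
    only by CH2 units share (approximately) the same defect."""
    kendrick_mass = iupac_mass * 14.0 / CH2_MASS
    # One common convention: nominal (rounded) minus Kendrick mass
    return round(kendrick_mass) - kendrick_mass

# Two peaks one CH2 apart have nearly identical Kendrick mass defects,
# so they fall on the same horizontal line in a Kendrick plot:
print(kendrick_mass_defect(310.2897))
print(kendrick_mass_defect(310.2897 + CH2_MASS))
```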
A Split Forcing Technique to Reduce Log-layer Mismatch in Wall-modeled Turbulent Channel Flows
NASA Astrophysics Data System (ADS)
Deleon, Rey; Senocak, Inanc
2016-11-01
The conventional approach to sustaining a flow field in a periodic channel flow seems to be the culprit behind the log-law mismatch problem that has been reported in many studies hybridizing Reynolds-averaged Navier-Stokes (RANS) and large-eddy simulation (LES) techniques, commonly referred to as hybrid RANS-LES. To address this issue, we propose a split-forcing approach that relies only on the conservation of mass principle. We adopt a basic hybrid RANS-LES technique on a coarse mesh with wall-stress boundary conditions to simulate turbulent channel flows at friction Reynolds numbers of 2000 and 5200, and demonstrate good agreement with benchmark data. We also report a duality in velocity scale that is a specific consequence of the split-forcing framework applied to hybrid RANS-LES. The first scale is the friction velocity derived from the wall shear stress. The second scale arises in the core LES region, with a value different from that at the wall. Second-order turbulence statistics agree well with the benchmark data when normalized by the core friction velocity, whereas the friction velocity at the wall remains the appropriate scale for the mean velocity profile. Based on our findings, we suggest reevaluating more sophisticated hybrid RANS-LES approaches within the split-forcing framework. Work funded by the National Science Foundation under Grants No. 1056110 and 1229709. The first author acknowledges the University of Idaho President's Doctoral Scholars Award.
Predicting Success in Psychological Statistics Courses.
Lester, David
2016-06-01
Many students perform poorly in courses on psychological statistics, and it is useful to be able to predict which students will have difficulties. In a study of 93 undergraduates enrolled in Statistical Methods (18 men, 75 women; M age = 22.0 years, SD = 5.1), performance was significantly associated with sex (female students performed better) and proficiency in algebra in a linear regression analysis. Anxiety about statistics was not associated with course performance, indicating that basic mathematical skills are the best correlate of performance in statistics courses and could usefully be employed to stream students into classes by ability. © The Author(s) 2016.
Chi-squared and C statistic minimization for low count per bin data
NASA Astrophysics Data System (ADS)
Nousek, John A.; Shue, David R.
1989-07-01
Results are presented from a computer simulation comparing two statistical fitting techniques on data samples with large and small counts per bin; the results are then related specifically to X-ray astronomy. The Marquardt and Powell minimization techniques are compared by using both to minimize the chi-squared statistic. In addition, Cash's C statistic is applied, with Powell's method, and it is shown that the C statistic produces better fits in the low-count regime than chi-squared.
Chi-squared and C statistic minimization for low count per bin data. [sampling in X ray astronomy
NASA Technical Reports Server (NTRS)
Nousek, John A.; Shue, David R.
1989-01-01
Results are presented from a computer simulation comparing two statistical fitting techniques on data samples with large and small counts per bin; the results are then related specifically to X-ray astronomy. The Marquardt and Powell minimization techniques are compared by using both to minimize the chi-squared statistic. In addition, Cash's C statistic is applied, with Powell's method, and it is shown that the C statistic produces better fits in the low-count regime than chi-squared.
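A sketch contrasting the two fit statistics on low-count binned data, as compared in the study above; the constant-rate model and the toy counts are assumptions for illustration, not the paper's simulation.

```python
import numpy as np

def chi_squared(counts, model):
    """Pearson chi-squared; breaks down when counts per bin are small."""
    return np.sum((counts - model) ** 2 / model)

def cash_c(counts, model):
    """Cash's C statistic: twice the negative Poisson log-likelihood,
    dropping the model-independent ln(counts!) term. Valid for any
    counts per bin, including zero."""
    return 2.0 * np.sum(model - counts * np.log(model))

# Toy comparison: fit a constant rate to low-count data by scanning
counts = np.array([0, 1, 0, 2, 1, 0, 1, 3, 0, 1])
rates = np.linspace(0.1, 3.0, 300)
best = rates[np.argmin([cash_c(counts, np.full(10, r)) for r in rates])]
print(best)  # near the true mean 0.9, even with zero-count bins
print(chi_squared(counts, np.full(10, best)),
      cash_c(counts, np.full(10, best)))
```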
NASA Astrophysics Data System (ADS)
Jacobson, Gloria; Rella, Chris; Farinas, Alejandro
2014-05-01
Technological advancement of instrumentation in atmospheric and other geoscience disciplines over the past decade has led to a shift from discrete sample analysis to continuous, in-situ monitoring. Standard error analysis used for discrete measurements is not sufficient to assess and compare the error contribution of noise and drift from continuous-measurement instruments, and a different statistical analysis approach should be applied. The Allan standard deviation analysis technique, developed for atomic clock stability assessment by David W. Allan [1], can be effectively and gainfully applied to continuous measurement instruments. As an example, P. Werle et al. applied these techniques to signal averaging for atmospheric monitoring by Tunable Diode-Laser Absorption Spectroscopy (TDLAS) [2]. This presentation builds on, and translates, prior foundational publications to provide contextual definitions and guidelines for the practical application of this analysis technique to continuous scientific measurements. The specific example of a Picarro G2401 Cavity Ringdown Spectroscopy (CRDS) analyzer used for continuous atmospheric monitoring of CO2, CH4 and CO is used to define the basic features of the Allan deviation, assess factors affecting the analysis, and explore the translation from time series to Allan deviation plot for different types of instrument noise (white noise, linear drift, and interpolated data). In addition, the application of the Allan deviation to optimize and predict the performance of different calibration schemes is presented. Even though this presentation uses the specific example of the Picarro G2401 CRDS analyzer for atmospheric monitoring, the objective is to present the information such that it can be successfully applied to other instrument sets and disciplines. [1] D.W. Allan, "Statistics of Atomic Frequency Standards," Proc. IEEE, vol. 54, pp. 221-230, Feb. 1966. [2] P. Werle, R. Mücke, F. Slemr, "The Limits of Signal Averaging in Atmospheric Trace-Gas Monitoring by Tunable Diode-Laser Absorption Spectroscopy (TDLAS)," Applied Physics B, 57, pp. 131-139, April 1993.
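A minimal sketch of a non-overlapping Allan deviation computation for a regularly sampled series, consistent with the analysis described above; the noise model and sampling interval are placeholders.

```python
import numpy as np

def allan_deviation(y, dt, taus):
    """Non-overlapping Allan deviation of a regularly sampled series y
    (sample interval dt) at each averaging time in taus."""
    out = []
    for tau in taus:
        m = int(tau / dt)              # samples per averaging window
        k = len(y) // m                # number of complete windows
        means = y[:k * m].reshape(k, m).mean(axis=1)
        avar = 0.5 * np.mean(np.diff(means) ** 2)
        out.append(np.sqrt(avar))
    return np.array(out)

# For white noise the Allan deviation falls as 1/sqrt(tau);
# drift would instead make it rise again at long averaging times.
rng = np.random.default_rng(1)
y = rng.normal(scale=1.0, size=100_000)
print(allan_deviation(y, dt=1.0, taus=[1, 10, 100, 1000]))
```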
Solar Activity Heading for a Maunder Minimum?
NASA Astrophysics Data System (ADS)
Schatten, K. H.; Tobiska, W. K.
2003-05-01
Long-range (a few years to decades) solar activity prediction techniques vary greatly in their methods. They range from examining planetary orbits, to spectral methods (e.g., Fourier and wavelet analyses), to artificial intelligence methods, to general statistical techniques. Rather than concentrate on statistical/mathematical/numerical methods, we discuss a class of methods that appears to have a physical basis, rooted both in basic physics (dynamo theory) and in solar physics (Babcock dynamo theory). This class is referred to as "precursor methods," originally developed by Ohl, Brown and Williams and others using geomagnetic observations. My colleagues and I have developed some understanding of how these methods work and have expanded the prediction methods using "solar dynamo precursor" methods, notably a "SODA" index (SOlar Dynamo Amplitude). These methods are now based upon an understanding of the Sun's dynamo processes, explaining a connection between how the Sun's fields are generated and how the Sun broadcasts its future activity levels to Earth. This has led to better monitoring of the Sun's dynamo fields and is leading to more accurate prediction techniques. In terms of the Sun's polar and toroidal magnetic fields, we explain how these methods work, past predictions, the current cycle, and predictions of future solar activity levels for the next few solar cycles. The surprising result of these long-range predictions is a rapid decline in solar activity, starting with cycle 24. If this trend continues, we may see the Sun heading towards a "Maunder" type of solar activity minimum - an extensive period of reduced levels of solar activity. For the solar physicists who enjoy studying solar activity, we hope this isn't so, but for NASA, which must place and maintain satellites in low earth orbit (LEO), it may help with reboost problems. Space debris and other aspects of objects in LEO will also be affected. This research is supported by the NSF and NASA.
[Comment on] Statistical discrimination
NASA Astrophysics Data System (ADS)
Chinn, Douglas
In the December 8, 1981, issue of Eos, a news item reported the conclusion of a National Research Council study that sexual discrimination against women with Ph.D.'s exists in the field of geophysics. Basically, the item reported that even when allowances are made for motherhood the percentage of female Ph.D.'s holding high university and corporate positions is significantly lower than the percentage of male Ph.D.'s holding the same types of positions. The sexual discrimination conclusion, based only on these statistics, assumes that there are no basic psychological differences between men and women that might cause different populations in the employment group studied. Therefore, the reasoning goes, after taking into account possible effects from differences related to anatomy, such as women stopping their careers in order to bear and raise children, the statistical distributions of positions held by male and female Ph.D.'s ought to be very similar to one another. Any significant differences between the distributions must be caused primarily by sexual discrimination.
Basic Radar Altimetry Toolbox and Radar Altimetry Tutorial: Tools for all Altimetry Users
NASA Astrophysics Data System (ADS)
Rosmorduc, Vinca; Benveniste, J.; Breebaart, L.; Bronner, E.; Dinardo, S.; Earith, D.; Lucas, B. M.; Maheu, C.; Niejmeier, S.; Picot, N.
2013-09-01
The Basic Radar Altimetry Toolbox is an "all-altimeter" collection of tools, tutorials and documents designed to facilitate the use of radar altimetry data, including data from the next mission to be launched, Saral. It has been available since April 2007 and has been demonstrated during training courses and scientific meetings. More than 2000 people have downloaded it (as of end of September 2012), with many "newcomers" to altimetry among them, as well as teachers and students. Users' feedback, developments in altimetry, and practice showed that new interesting features could be added; some have been added and/or improved in versions 2 to 4, while others are under development or in discussion for the future. The Basic Radar Altimetry Toolbox is able to read most distributed radar altimetry data, including data from future missions like Saral and Jason-3, to perform some processing, data editing and statistics, and to visualize the results. It can be used at several levels and in several ways, including as an educational tool with the graphical user interface. As part of the Toolbox, a Radar Altimetry Tutorial gives general information about altimetry, the technique involved and its applications, as well as an overview of past, present and future missions, including information on how to access data and additional software and documentation. It also presents a series of data use cases, covering all uses of altimetry over ocean, cryosphere and land, showing the basic methods for some of the most frequent ways of using altimetry data. BRAT is developed under contract with ESA and CNES. It is available at http://www.altimetry.info and http://earth.esa.int/brat/
The Importance of Introductory Statistics Students Understanding Appropriate Sampling Techniques
ERIC Educational Resources Information Center
Menil, Violeta C.
2005-01-01
In this paper the author discusses the meaning of sampling, the reasons for sampling, the Central Limit Theorem, and the different techniques of sampling. Practical and relevant examples are given to make the appropriate sampling techniques understandable to students of Introductory Statistics courses. With a thorough knowledge of sampling…
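A small simulation in the spirit of the engaging examples the paper advocates, illustrating the Central Limit Theorem: sample means from a skewed population become approximately normal, with spread shrinking like sigma/sqrt(n). The population and sample sizes are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(42)
population = rng.exponential(scale=2.0, size=100_000)  # skewed population

for n in (2, 10, 50):
    # Draw 10,000 simple random samples of size n; record each mean
    idx = rng.integers(0, population.size, size=(10_000, n))
    means = population[idx].mean(axis=1)
    # Observed spread of the sample means vs. the CLT prediction 2/sqrt(n)
    print(n, round(means.mean(), 3), round(means.std(), 3),
          round(2.0 / n ** 0.5, 3))
```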
Typewriting Methodology 1977: Eight Basic Principles for Good Results
ERIC Educational Resources Information Center
Winger, Fred E.
1977-01-01
The eight basic principles of teaching methodology discussed are as follows: Stress position and technique, stress skill building, stress the pretest/practice/posttest method, stress action research, stress true production skills, stress good proofreading skills, stress performance goals, and stress individualized instruction. (TA)
Comment on Schuster's Technique for Focusing the Prism Spectrometer.
ERIC Educational Resources Information Center
Beynon, John
1991-01-01
Discussed is the physics that underpins Schuster's technique for obtaining a parallel light beam for use in various prism and grating experiments. Basic physics concepts using geometrical optics of prism, together with elementary differential calculus are explained as well as the mechanics of Schuster's technique. (KR)
Should I Pack My Umbrella? Clinical versus Statistical Prediction of Mental Health Decisions
ERIC Educational Resources Information Center
Aegisdottir, Stefania; Spengler, Paul M.; White, Michael J.
2006-01-01
In this rejoinder, the authors respond to the insightful commentary of Strohmer and Arm, Chwalisz, and Hilton, Harris, and Rice about the meta-analysis on statistical versus clinical prediction techniques for mental health judgments. The authors address issues including the availability of statistical prediction techniques for real-life psychology…
Change Detection in Rough Time Series
2014-09-01
…distribution that can present significant challenges to conventional statistical tracking techniques. To address this problem the proposed method applies hybrid fuzzy statistical techniques to series granules instead of to individual measures. Three examples demonstrated the robust nature of the…
Enhancing Students' Ability to Use Statistical Reasoning with Everyday Problems
ERIC Educational Resources Information Center
Lawson, Timothy J.; Schwiers, Michael; Doellman, Maureen; Grady, Greg; Kelnhofer, Robert
2003-01-01
We discuss a technique for teaching students everyday applications of statistical concepts. We used this technique with students (n = 50) enrolled in several sections of an introductory statistics course; students (n = 45) in other sections served as a comparison group. A class of introductory psychology students (n = 24) served as a second…
1987-08-01
…HVAC duct hanger system over an extensive frequency range. The finite element, component mode synthesis, and statistical energy analysis methods are… A high-frequency (800-5,000 Hz) analysis was conducted with Statistical Energy Analysis (SEA) coupled with a closed-form harmonic beam analysis program. These… resonances may be obtained by using a finer frequency increment. The basic assumption used in SEA analysis is that within each band…
Direct measurement of carbon-14 in carbon dioxide by liquid scintillation counting
NASA Technical Reports Server (NTRS)
Horrocks, D. L.
1969-01-01
The liquid scintillation counting technique is applied to the direct measurement of carbon-14 in carbon dioxide. This method has a high counting efficiency and eliminates many of the basic problems encountered with previous techniques. The technique can be used to achieve a percent substitution reaction and is of interest as an analytical technique.
ERIC Educational Resources Information Center
Taylor, Marjorie; And Others
Anodizing, Inc., Teamsters Local 162, and Mt. Hood Community College (Oregon) developed a workplace literacy program for workers at Anodizing. These workers did not have the basic skill competencies to benefit from company training efforts in statistical process control and quality assurance and were not able to advance to lead and supervisory…
ERIC Educational Resources Information Center
Vizenor, Gerald
Opportunities Unlimited is a State-wide program to provide adult basic education (ABE) and training for Indians on Minnesota reservations and in Indian communities. An administrative center in Bemidji serves communities on the Red Lake, White Earth, and Leech Lake Reservations, and a Duluth center provides ABE and training for communities on the…
ERIC Educational Resources Information Center
Joireman, Jeff; Abbott, Martin L.
This report examines the overlap between student test results on the Iowa Test of Basic Skills (ITBS) and the Washington Assessment of Student Learning (WASL). The two tests were compared and contrasted in terms of content and measurement philosophy, and analyses studied the statistical relationship between the ITBS and the WASL. The ITBS assesses…
Reduction of Helicopter Blade-Vortex Interaction Noise by Active Rotor Control Technology
NASA Technical Reports Server (NTRS)
Yu, Yung H.; Gmelin, Bernd; Splettstoesser, Wolf; Brooks, Thomas F.; Philippe, Jean J.; Prieur, Jean
1997-01-01
Helicopter blade-vortex interaction noise is one of the most severe noise sources and is very important both in community annoyance and military detection. Research over the decades has substantially improved basic physical understanding of the mechanisms generating rotor blade-vortex interaction noise and also of controlling techniques, particularly using active rotor control technology. This paper reviews active rotor control techniques currently available for rotor blade vortex interaction noise reduction, including higher harmonic pitch control, individual blade control, and on-blade control technologies. Basic physical mechanisms of each active control technique are reviewed in terms of noise reduction mechanism and controlling aerodynamic or structural parameters of a blade. Active rotor control techniques using smart structures/materials are discussed, including distributed smart actuators to induce local torsional or flapping deformations, Published by Elsevier Science Ltd.
1983-05-01
…in the presence of fillers or without it. The basic assumption made is that the heat of reaction is proportional to the extent of the reaction… a scanning mechanism will isolate the frequency range falling on the detector; in this manner, the spectrum… There are two basic…
Perceptual-cognitive skills and performance in orienteering.
Guzmán, José F; Pablos, Ana M; Pablos, Carlos
2008-08-01
The goal was to analyze the perceptual-cognitive skills associated with sport performance in orienteering in a sample of 22 elite and 17 nonelite runners. The variables considered were memory, basic orienteering techniques, map reading, symbol knowledge, map-terrain-map identification, and spatial organisation. A computerised questionnaire was developed to measure these variables. The reliability of the test (agreement between experts) was 90%. The findings suggested that competence in performing basic orienteering techniques efficiently was a key variable differentiating the elite from the nonelite athletes. The results are discussed in comparison with previous studies.
1994-03-01
evaluation of its anticipated value. If the program can be accomplished using conventional techniques , this should be seriously considered. Development or...the direct frequency generating principles such as, pulse tachos, turbine flowmeters, and encoders, also Doppler and laser techniques used for...CERAMIC BLOCK Figure 5.3. The basic concepts of the laser ring gyro (LRG). The principle depends upon the guidance of two beams of laser light around an
Fundamentals in Biostatistics for Research in Pediatric Dentistry: Part I - Basic Concepts.
Garrocho-Rangel, J A; Ruiz-Rodríguez, M S; Pozos-Guillén, A J
The purpose of this report was to provide the reader with some basic concepts in order to better understand the significance and reliability of the results of any article on Pediatric Dentistry. Currently, Pediatric Dentists need the best evidence available in the literature on which to base their diagnoses and treatment decisions for the children's oral care. Basic understanding of Biostatistics plays an important role during the entire Evidence-Based Dentistry (EBD) process. This report describes Biostatistics fundamentals in order to introduce the basic concepts used in statistics, such as summary measures, estimation, hypothesis testing, effect size, level of significance, p value, confidence intervals, etc., which are available to Pediatric Dentists interested in reading or designing original clinical or epidemiological studies.
A basic review on the inferior alveolar nerve block techniques.
Khalil, Hesham
2014-01-01
The inferior alveolar nerve block is the most common injection technique used in dentistry, and many modifications of the conventional nerve block have been recently described in the literature. The dentist's or surgeon's selection of the best technique depends on many factors, including the success rate and the complications related to the selected technique. Dentists should be aware of the available current modifications of the inferior alveolar nerve block techniques in order to choose effectively between them. Some operators may encounter difficulty in identifying the anatomical landmarks that are useful in applying the inferior alveolar nerve block and may rely instead on assumptions as to where the needle should be positioned. Such assumptions can lead to failure, and the failure rate of the inferior alveolar nerve block has been reported to be 20-25%, which is considered very high. In this basic review, the anatomical details of the inferior alveolar nerve will be given together with a description of both its conventional and modified blocking techniques; in addition, an overview of the complications which may result from the application of this important technique will be given.
A basic review on the inferior alveolar nerve block techniques
Khalil, Hesham
2014-01-01
The inferior alveolar nerve block is the most common injection technique used in dentistry, and many modifications of the conventional nerve block have been recently described in the literature. The dentist's or surgeon's selection of the best technique depends on many factors, including the success rate and the complications related to the selected technique. Dentists should be aware of the available current modifications of the inferior alveolar nerve block techniques in order to choose effectively between them. Some operators may encounter difficulty in identifying the anatomical landmarks that are useful in applying the inferior alveolar nerve block and may rely instead on assumptions as to where the needle should be positioned. Such assumptions can lead to failure, and the failure rate of the inferior alveolar nerve block has been reported to be 20-25%, which is considered very high. In this basic review, the anatomical details of the inferior alveolar nerve will be given together with a description of both its conventional and modified blocking techniques; in addition, an overview of the complications which may result from the application of this important technique will be given. PMID:25886095
Computer programs for computing particle-size statistics of fluvial sediments
Stevens, H.H.; Hubbell, D.W.
1986-01-01
Two versions of computer programs for inputting data and computing particle-size statistics of fluvial sediments are presented. The FORTRAN 77 versions are for use on the Prime computer, and the BASIC versions are for use on microcomputers. The size-statistics programs compute Inman, Trask, and Folk statistical parameters from phi values and sizes determined for 10 specified percent-finer values from the input size and percent-finer data. The programs also determine the percentage of gravel, sand, silt, and clay, and the Meyer-Peter effective diameter. Documentation and listings for both versions of the programs are included. (Author's abstract)
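A minimal sketch of the Inman, Trask, and Folk graphic statistics the programs compute from phi percentiles; the formulas follow the standard graphic-statistics definitions (an assumption, since the report's exact parameter set is not reproduced here), and the sample percentile values are made up.

```python
def graphic_size_stats(phi):
    """Graphic size statistics from a dict of phi values at the standard
    percent-finer points (5, 16, 25, 50, 75, 84, 95)."""
    inman_median = phi[50]
    inman_sorting = (phi[84] - phi[16]) / 2.0
    folk_mean = (phi[16] + phi[50] + phi[84]) / 3.0
    # Folk inclusive graphic standard deviation
    folk_sorting = (phi[84] - phi[16]) / 4.0 + (phi[95] - phi[5]) / 6.6
    # Trask sorting coefficient, quartile-based, expressed via phi
    trask_sorting = 2.0 ** ((phi[75] - phi[25]) / 2.0)
    return {"inman_median": inman_median, "inman_sorting": inman_sorting,
            "folk_mean": folk_mean, "folk_sorting": folk_sorting,
            "trask_sorting": trask_sorting}

# Hypothetical sand sample (phi = -log2 of diameter in mm)
sample = {5: 0.8, 16: 1.2, 25: 1.4, 50: 1.9, 75: 2.4, 84: 2.6, 95: 3.1}
print(graphic_size_stats(sample))
```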
Technical Note: The Initial Stages of Statistical Data Analysis
Tandy, Richard D.
1998-01-01
Objective: To provide an overview of several important data-related considerations in the design stage of a research project and to review the levels of measurement and their relationship to the statistical technique chosen for the data analysis. Background: When planning a study, the researcher must clearly define the research problem and narrow it down to specific, testable questions. The next steps are to identify the variables in the study, decide how to group and treat subjects, and determine how to measure, and the underlying level of measurement of, the dependent variables. Then the appropriate statistical technique can be selected for data analysis. Description: The four levels of measurement in increasing complexity are nominal, ordinal, interval, and ratio. Nominal data are categorical or “count” data, and the numbers are treated as labels. Ordinal data can be ranked in a meaningful order by magnitude. Interval data possess the characteristics of ordinal data and also have equal distances between levels. Ratio data have a natural zero point. Nominal and ordinal data are analyzed with nonparametric statistical techniques and interval and ratio data with parametric statistical techniques. Advantages: Understanding the four levels of measurement and when it is appropriate to use each is important in determining which statistical technique to use when analyzing data. PMID:16558489
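A toy helper reflecting the level-of-measurement rule summarized above (nominal and ordinal data call for nonparametric techniques; interval and ratio data for parametric ones); the function name and the example techniques listed are illustrative, not an exhaustive mapping.

```python
def analysis_family(level):
    """Map a dependent variable's level of measurement to the broad
    family of statistical techniques described in the text."""
    level = level.lower()
    if level in ("nominal", "ordinal"):
        return "nonparametric (e.g., chi-square, Mann-Whitney U)"
    if level in ("interval", "ratio"):
        return "parametric (e.g., t-test, ANOVA, linear regression)"
    raise ValueError(f"unknown level of measurement: {level}")

print(analysis_family("ordinal"))  # nonparametric ...
print(analysis_family("ratio"))    # parametric ...
```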
ERIC Educational Resources Information Center
Gadway, Charles J.; Wilson, H.A.
This document provides statistical data on the 1974 and 1975 Mini-Assessment of Functional Literacy, which was designed to determine the extent of functional literacy among seventeen year olds in America. Also presented are data from comparable test items from the 1971 assessment. Three standards are presented, to allow different methods of…
ERIC Educational Resources Information Center
Novak, Elena; Johnson, Tristan E.; Tenenbaum, Gershon; Shute, Valerie J.
2016-01-01
The study explored instructional benefits of a storyline gaming characteristic (GC) on learning effectiveness, efficiency, and engagement with the use of an online instructional simulation for graduate students in an introductory statistics course. A storyline is a game-design element that connects scenes with the educational content. In order to…
ERIC Educational Resources Information Center
Waesche, Jessica S. Brown; Schatschneider, Christopher; Maner, Jon K.; Ahmed, Yusra; Wagner, Richard K.
2011-01-01
Rates of agreement among alternative definitions of reading disability and their 1- and 2-year stabilities were examined using a new measure of agreement, the affected-status agreement statistic. Participants were 288,114 first through third grade students. Reading measures were "Dynamic Indicators of Basic Early Literacy Skills" Oral…
ERIC Educational Resources Information Center
Biehler, Rolf; Frischemeier, Daniel; Podworny, Susanne
2017-01-01
Connecting data and chance is fundamental in statistics curricula. The use of software like TinkerPlots can bridge both worlds because the TinkerPlots Sampler supports learners in expressive modeling. We conducted a study with elementary preservice teachers with a basic university education in statistics. They were asked to set up and evaluate…
ERIC Educational Resources Information Center
Averitt, Sallie D.
This instructor guide, which was developed for use in a manufacturing firm's advanced technical preparation program, contains the materials required to present a learning module that is designed to prepare trainees for the program's statistical process control module by improving their basic math skills in working with line graphs and teaching…
ERIC Educational Resources Information Center
Novak, Elena
2012-01-01
The study explored instructional benefits of a storyline gaming characteristic (GC) on learning effectiveness, efficiency, and engagement with the use of an online instructional simulation for graduate students in an introductory statistics course. In addition, the study focused on examining the effects of a storyline GC on specific learning…
A statistical mechanics approach to autopoietic immune networks
NASA Astrophysics Data System (ADS)
Barra, Adriano; Agliari, Elena
2010-07-01
In this work we aim to bridge theoretical immunology and disordered statistical mechanics. We introduce a model for the behavior of B-cells which naturally merges the clonal selection theory and the autopoietic network theory as a whole. From the analysis of its features we recover several basic phenomena such as low-dose tolerance, dynamical memory of antigens and self/non-self discrimination.
Constructing Space-Time Views from Fixed Size Statistical Data: Getting the Best of both Worlds
NASA Technical Reports Server (NTRS)
Schmidt, Melisa; Yan, Jerry C.
1997-01-01
Many performance monitoring tools are currently available to the super-computing community. The performance data gathered and analyzed by these tools fall under two categories: statistics and event traces. Statistical data is much more compact but lacks the probative power event traces offer. Event traces, on the other hand, can easily fill up the entire file system during execution such that the instrumented execution may have to be terminated halfway through. In this paper, we propose an innovative methodology for performance data gathering and representation that offers a middle ground. The user can incrementally trade off tracing overhead and trace data size against data quality. In other words, the user will be able to limit the amount of trace collected and, at the same time, carry out some of the analysis event traces offer using space-time views for the entire execution. Two basic ideas are employed: the use of averages to replace recording data for each instance and formulae to represent sequences associated with communication and control flow. With the help of a few simple examples, we illustrate the use of these techniques in performance tuning and compare the quality of the traces we collected vs. event traces. We found that the trace files thus obtained are, indeed, small, bounded and predictable before program execution and that the quality of the space-time views generated from these statistical data is excellent. Furthermore, experimental results showed that the formulae proposed were able to capture 100% of all the sequences associated with 11 of the 15 applications tested. The performance of the formulae can be incrementally improved by allocating more memory at run-time to learn longer sequences.
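The first of the two ideas, averages in place of per-instance records, can be sketched in a few lines of Python; this is an illustration of the principle under assumed instrumentation, not the authors' tool:

```python
class EventStat:
    """Fixed-size running statistics for one instrumentation point,
    standing in for an unbounded list of per-instance trace records."""

    def __init__(self):
        self.n, self.mean = 0, 0.0
        self.lo, self.hi = float("inf"), float("-inf")

    def add(self, duration):
        self.n += 1
        self.mean += (duration - self.mean) / self.n  # incremental mean
        self.lo = min(self.lo, duration)
        self.hi = max(self.hi, duration)

# One EventStat per (event type, processor) costs O(1) memory per event,
# so the "trace" size is bounded and predictable before execution.
```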
NASA Astrophysics Data System (ADS)
Hendikawati, P.; Arifudin, R.; Zahid, M. Z.
2018-03-01
This study aims to design an Android statistics data analysis application that can be accessed through mobile devices, making it easier for users to access. The application covers various basic statistical topics along with parametric statistical data analysis. The output of the system is parametric statistical analysis that can be used by students, lecturers, and other users who need statistical results quickly and in an easily understood form. The Android application is developed in the Java programming language. The server side uses PHP with the CodeIgniter framework, and the database is MySQL. The system development methodology is the Waterfall methodology, with stages of analysis, design, coding, testing, implementation, and system maintenance. This statistical data analysis application is expected to support statistics lecturing activities and make it easier for students to understand statistical analysis on mobile devices.
Drafting--Basic, Drafting--Intermediate: 9255.01.
ERIC Educational Resources Information Center
Dade County Public Schools, Miami, FL.
The course has no prerequisites, offers instruction in basic drafting room techniques and procedures, and also covers job opportunities and industrial methods in engineering. The student is introduced to and asked to perform fundamental drafting problems with working drawings, using multiview and auxiliary views and sections. The course also…
Teaching Individuals with Developmental Delays: Basic Intervention Techniques.
ERIC Educational Resources Information Center
Lovaas, O. Ivar
This teaching manual for treatment of children with developmental disabilities is divided into seven sections that address: (1) basic concepts; (2) transition into treatment; (3) early learning concepts; (4) expressive language; (5) strategies for visual learners; (6) programmatic considerations; and (7) organizational and legal issues. Among…
Basics of Desktop Publishing. Teacher Edition.
ERIC Educational Resources Information Center
Beeby, Ellen
This color-coded teacher's guide contains curriculum materials designed to give students an awareness of various desktop publishing techniques before they determine their computer hardware and software needs. The guide contains six units, each of which includes some or all of the following basic components: objective sheet, suggested activities…
A Study of Rubisco through Western Blotting and Tissue Printing Techniques
ERIC Educational Resources Information Center
Ma, Zhong; Cooper, Cynthia; Kim, Hyun-Joo; Janick-Buckner, Diane
2009-01-01
We describe a laboratory exercise developed for a cell biology course for second-year undergraduate biology majors. It was designed to introduce undergraduates to the basic molecular biology techniques of Western blotting and immunodetection coupled with the technique of tissue printing in detecting the presence, relative abundance, and…
Resiliency Techniques in School Practice
ERIC Educational Resources Information Center
Molony, Terry; Henwood, Maureen; Gilroy, Shawn
2010-01-01
School psychologists can help build resilience in youth in many ways. This article offers a list of some easy techniques to use when working with individuals or groups, most based on basic cognitive-behavior therapy (CBT) techniques. They include: (1) Emotional awareness; (2) Emotional Regulation; (3) Cognitive Flexibility; (4) Self-efficacy; and…
Optimizing Basic French Skills Utilizing Multiple Teaching Techniques.
ERIC Educational Resources Information Center
Skala, Carol
This action research project examined the impact of foreign language teaching techniques on the language acquisition and retention of 19 secondary level French I students, focusing on student perceptions of the effectiveness and ease of four teaching techniques: total physical response, total physical response storytelling, literature approach,…
NASA Technical Reports Server (NTRS)
Sherman, J. W., III
1975-01-01
The papers presented in the marine session may be broadly grouped into several classes: microwave region instruments compared to infrared and visible region sensors, satellite techniques compared to aircraft techniques, open ocean applications compared to coastal region applications, and basic research and understanding of ocean phenomena compared to research techniques that offer immediate applications.
Advanced techniques to prepare seed to sow
Robert P. Karrfalt
2013-01-01
This paper reviews research on improving the basic technique of cold stratification for tree and shrub seeds. Advanced stratification techniques include long stratification, stratification re-dry, or multiple cycles of warm-cold stratification. Research demonstrates that careful regulation of moisture levels and lengthening the stratification period have produced a...
Are Assumptions of Well-Known Statistical Techniques Checked, and Why (Not)?
Hoekstra, Rink; Kiers, Henk A. L.; Johnson, Addie
2012-01-01
A valid interpretation of most statistical techniques requires that one or more assumptions be met. In published articles, however, little information tends to be reported on whether the data satisfy the assumptions underlying the statistical techniques used. This could be due to self-selection: Only manuscripts with data fulfilling the assumptions are submitted. Another explanation could be that violations of assumptions are rarely checked for in the first place. We studied whether and how 30 researchers checked fictitious data for violations of assumptions in their own working environment. Participants were asked to analyze the data as they would their own data, for which often used and well-known techniques such as the t-procedure, ANOVA and regression (or non-parametric alternatives) were required. It was found that the assumptions of the techniques were rarely checked, and that if they were, it was regularly by means of a statistical test. Interviews afterward revealed a general lack of knowledge about assumptions, the robustness of the techniques with regard to the assumptions, and how (or whether) assumptions should be checked. These data suggest that checking for violations of assumptions is not a well-considered choice, and that the use of statistics can be described as opportunistic. PMID:22593746
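As an illustration of what such checks can look like in practice, a Python sketch using SciPy; test-based checks are shown for compactness, although the paper itself notes that checking assumptions via significance tests is contested (graphical diagnostics are often preferable):

```python
from scipy import stats

def two_sample_test(a, b, alpha=0.05):
    """Choose between Student's t, Welch's t, and Mann-Whitney U
    after explicit assumption checks (illustrative sketch only)."""
    # Normality of each sample (Shapiro-Wilk).
    normal = (stats.shapiro(a).pvalue > alpha
              and stats.shapiro(b).pvalue > alpha)
    if not normal:
        return "Mann-Whitney U", stats.mannwhitneyu(a, b)
    # Homogeneity of variance (Levene) decides Student vs. Welch.
    equal_var = stats.levene(a, b).pvalue > alpha
    name = "Student t" if equal_var else "Welch t"
    return name, stats.ttest_ind(a, b, equal_var=equal_var)
```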
Evaluation of Three Different Processing Techniques in the Fabrication of Complete Dentures
Chintalacheruvu, Vamsi Krishna; Balraj, Rajasekaran Uttukuli; Putchala, Lavanya Sireesha; Pachalla, Sreelekha
2017-01-01
Aims and Objectives: To compare the effectiveness of three different processing techniques and to assess their accuracy through the number of occlusal interferences and the increase in vertical dimension after denture processing. Materials and Methods: A cross-sectional study was conducted on a sample of 18 patients indicated for complete denture fabrication, divided into three subgroups. Three processing techniques (compression molding, and injection molding using prepolymerized and unpolymerized resin) were used to fabricate dentures for each group. After processing, laboratory-remounted dentures were evaluated for the number of occlusal interferences in centric and eccentric relations and for the change in vertical dimension through vertical pin rise in the articulator. Data were analyzed by one-way ANOVA using SPSS software version 19.0 (IBM). Results: Results with significant variations from the ANOVA were subjected to a post hoc test. The number of occlusal interferences with the compression molding technique was higher in both centric and eccentric positions than with the two injection molding techniques, with statistical significance in centric, protrusive, right lateral nonworking, and left lateral working positions (P < 0.05). Mean vertical pin rise (0.52 mm) was greater with the compression molding technique than with the injection molding techniques, which was statistically significant (P < 0.001). Conclusions: Within the limitations of this study, injection molding techniques exhibited fewer processing errors than the compression molding technique, with statistical significance. There was no statistically significant difference in processing errors between the two injection molding systems. PMID:28713763
In analyses supporting the development of numeric nutrient criteria, multiple statistical techniques can be used to extract critical values from stressor-response relationships. However, there is little guidance for choosing among techniques, and the extent to which log-transfor...
Incorporating Nonparametric Statistics into Delphi Studies in Library and Information Science
ERIC Educational Resources Information Center
Ju, Boryung; Jin, Tao
2013-01-01
Introduction: The Delphi technique is widely used in library and information science research. However, many researchers in the field fail to employ standard statistical tests when using this technique. This makes the technique vulnerable to criticisms of its reliability and validity. The general goal of this article is to explore how…
ERIC Educational Resources Information Center
Karadag, Engin
2010-01-01
To assess research methods and analysis of statistical techniques employed by educational researchers, this study surveyed unpublished doctoral dissertation from 2003 to 2007. Frequently used research methods consisted of experimental research; a survey; a correlational study; and a case study. Descriptive statistics, t-test, ANOVA, factor…
Statistics in the Workplace: A Survey of Use by Recent Graduates with Higher Degrees
ERIC Educational Resources Information Center
Harraway, John A.; Barker, Richard J.
2005-01-01
A postal survey was conducted regarding statistical techniques, research methods and software used in the workplace by 913 graduates with PhD and Masters degrees in the biological sciences, psychology, business, economics, and statistics. The study identified gaps between topics and techniques learned at university and those used in the workplace,…
Sawani, Shefali; Arora, Vipin; Jaiswal, Shikha; Nikhil, Vineeta
2014-01-01
Background: Evaluation of microleakage is important for assessing the success of new restorative materials and methods. Aim and Objectives: Comparative evaluation of microleakage in Class II restorations using open vs. closed centripetal build-up techniques with different lining materials. Materials and Methods: Standardized mesio-occlusal (MO) and disto-occlusal (DO) Class II tooth preparations were prepared on 53 molars, and samples were randomly divided into six experimental groups and one control group for restorations. Group 1: Open-Sandwich technique (OST) with flowable composite at the gingival seat. Group 2: OST with resin-modified glass ionomer cement (RMGIC) at the gingival seat. Group 3: Closed-Sandwich technique (CST) with flowable composite at the pulpal floor and axial wall. Group 4: CST with RMGIC at the pulpal floor and axial wall. Group 5: OST with flowable composite at the pulpal floor, axial wall, and gingival seat. Group 6: OST with RMGIC at the pulpal floor, axial wall, and gingival seat. Group 7: Control (no lining material, centripetal technique only). After restoration and thermocycling, apices were sealed and samples were immersed in 0.5% basic fuchsin dye. Sectioning was followed by stereomicroscopic evaluation. Results: Results were analyzed using the post hoc Bonferroni test. Cervical scores of the control group were higher than those of the experimental groups (P < 0.05). Less microleakage was observed with CST than with OST in all experimental groups (P < 0.05). However, differences among the occlusal scores of the different groups were insignificant (P > 0.05). Conclusion: Class II composite restorations with centripetal build-up alone, or placed with CST, reduce cervical microleakage compared to OST. PMID:25125847
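A sketch of the general analysis pattern this and the preceding study describe (one-way ANOVA, then Bonferroni-corrected pairwise comparisons), with synthetic group names and scores standing in for the published data:

```python
import numpy as np
from itertools import combinations
from scipy import stats

rng = np.random.default_rng(2)
# Hypothetical microleakage scores for three of the groups.
groups = {"OST-flowable": rng.normal(2.0, 0.5, 8),
          "OST-RMGIC": rng.normal(1.8, 0.5, 8),
          "CST-flowable": rng.normal(1.2, 0.5, 8)}

F, p = stats.f_oneway(*groups.values())   # omnibus one-way ANOVA
print(f"ANOVA: F = {F:.2f}, p = {p:.4f}")
if p < 0.05:
    pairs = list(combinations(groups, 2))
    for a, b in pairs:
        p_pair = stats.ttest_ind(groups[a], groups[b]).pvalue
        # Bonferroni: scale each pairwise p by the number of comparisons.
        print(a, "vs", b, "-> corrected p =",
              round(min(1.0, p_pair * len(pairs)), 4))
```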
du Prel, Jean-Baptist; Röhrig, Bernd; Blettner, Maria
2009-02-01
In the era of evidence-based medicine, one of the most important skills a physician needs is the ability to analyze scientific literature critically. This is necessary to keep medical knowledge up to date and to ensure optimal patient care. The aim of this paper is to present an accessible introduction to the critical appraisal of scientific articles. Using a selection of international literature, the reader is introduced to the principles of critical reading of scientific articles in medicine. For the sake of conciseness, detailed description of statistical methods is omitted. Widely accepted principles for critically appraising scientific articles are outlined. Basic knowledge of study design, the structure of an article, the role of its different sections and of statistical presentations, as well as sources of error and limitation, is presented. The reader does not require extensive methodological knowledge. As far as is necessary for the critical appraisal of scientific articles, differences between research areas such as epidemiology, clinical research, and basic research are outlined. Further useful references are presented. Basic methodological knowledge is required to select and interpret scientific articles correctly.
NASA Astrophysics Data System (ADS)
Li, Ziyi
2017-12-01
The generalized uncertainty principle (GUP), also known as the generalized uncertainty relation, is a modified form of the classical Heisenberg uncertainty principle. When quantum gravity theories such as string theory are applied, the theoretical results suggest that there should be a “minimum length of observation” of about the Planck scale (10⁻³⁵ m). Taking this basic scale of existence into account, a corrected common form of the Heisenberg uncertainty principle is needed for thermodynamic systems, with effective corrections to statistical-physics questions concerning the quantum density of states. Especially at high temperatures and high energy levels, generalized uncertainty calculations have a disruptive impact on classical statistical physics, yet the present theory of the femtosecond laser is still built on the classical Heisenberg uncertainty principle. In order to improve the detection accuracy and temporal resolution of the femtosecond laser, we applied the modified form of the generalized uncertainty principle to the wavelength, energy, and pulse time of the femtosecond laser in our work. We designed three typical systems, from micro to macro size, to assess the feasibility of our theoretical model and method, respectively in a chemical solution, a crystal lattice, and a nuclear fission reactor.
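The abstract does not state which modified form the author adopts; the commonly used one-parameter GUP, with the minimal length it implies, reads:

```latex
\Delta x\,\Delta p \;\ge\; \frac{\hbar}{2}\Bigl[1 + \beta\,(\Delta p)^{2}\Bigr],
\qquad
\Delta x_{\min} = \hbar\sqrt{\beta} \;\sim\; \ell_{\mathrm{Pl}} \approx 10^{-35}\,\mathrm{m}.
```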
Computer program for the calculation of grain size statistics by the method of moments
Sawyer, Michael B.
1977-01-01
A computer program is presented for a Hewlett-Packard Model 9830A desk-top calculator (1) which calculates statistics using weight or point count data from a grain-size analysis. The program uses the method of moments in contrast to the more commonly used but less inclusive graphic method of Folk and Ward (1957). The merits of the program are: (1) it is rapid; (2) it can accept data in either grouped or ungrouped format; (3) it allows direct comparison with grain-size data in the literature that have been calculated by the method of moments; (4) it utilizes all of the original data rather than percentiles from the cumulative curve as in the approximation technique used by the graphic method; (5) it is written in the computer language BASIC, which is easily modified and adapted to a wide variety of computers; and (6) when used in the HP-9830A, it does not require punching of data cards. The method of moments should be used only if the entire sample has been measured and the worker defines the measured grain-size range. (1) Use of brand names in this paper does not imply endorsement of these products by the U.S. Geological Survey.
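The method of moments computes the statistics directly from the grouped weight data rather than from cumulative-curve percentiles; a Python sketch of the standard formulas (not the HP-9830A BASIC program itself):

```python
import numpy as np

def moment_stats(phi_mid, weight_pct):
    """Grain-size statistics by the method of moments from grouped
    data: phi midpoints of the size classes and weight percent in each."""
    w = np.asarray(weight_pct, float)
    w = w / w.sum()                       # normalize weights
    phi = np.asarray(phi_mid, float)
    mean = np.sum(w * phi)                # 1st moment
    var = np.sum(w * (phi - mean) ** 2)   # 2nd central moment
    sd = np.sqrt(var)                     # standard deviation (sorting)
    skew = np.sum(w * (phi - mean) ** 3) / sd ** 3
    kurt = np.sum(w * (phi - mean) ** 4) / sd ** 4
    return mean, sd, skew, kurt

# e.g. moment_stats(phi_mid=[-1.5, -0.5, 0.5, 1.5, 2.5],
#                   weight_pct=[5, 20, 40, 25, 10])
```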
Rudnicki, Jacek; Boberski, Marek; Butrymowicz, Ewa; Niedbalski, Paweł; Ogniewski, Paweł; Niedbalski, Marek; Niedbalski, Zbigniew; Podraza, Wojciech; Podraza, Hanna
2012-08-01
Stimulation of the nervous system plays an important role in brain function and the psychomotor development of children. Massage can benefit premature infants, but has limitations. The authors conducted a study to verify the direct effects of massage on amplitude-integrated electroencephalography (aEEG), oxygen saturation (SaO(2)), and pulse, analyzed by color cerebral function monitor (CCFM), and on cerebral blood flow assessed by the Doppler technique. The amplitude of the aEEG trend significantly increased during massage. Massage also affected the dominant δ-wave frequency, which significantly increased during the massage and returned to baseline after treatment. SaO(2) significantly decreased during massage. In four premature infants, massage was discontinued due to desaturation below 85%. Pulse frequency decreased during the massage but remained within physiological limits of greater than 100 beats per minute in all infants. Doppler flow values in the anterior cerebral artery measured before and after massage did not show statistically significant changes. The resistance index decreased after massage, which might provide greater perfusion of the brain, but this difference was not statistically significant. Use of the CCFM device allows monitoring of three basic physiological functions, namely aEEG, SaO(2), and pulse, and increases the safety of massage in preterm infants. Copyright © 2012 by Thieme Medical Publishers.
Building integral projection models: a user's guide
Rees, Mark; Childs, Dylan Z; Ellner, Stephen P; Coulson, Tim
2014-01-01
In order to understand how changes in individual performance (growth, survival or reproduction) influence population dynamics and evolution, ecologists are increasingly using parameterized mathematical models. For continuously structured populations, where some continuous measure of individual state influences growth, survival or reproduction, integral projection models (IPMs) are commonly used. We provide a detailed description of the steps involved in constructing an IPM, explaining how to: (i) translate your study system into an IPM; (ii) implement your IPM; and (iii) diagnose potential problems with your IPM. We emphasize how the study organism's life cycle, and the timing of censuses, together determine the structure of the IPM kernel and important aspects of the statistical analysis used to parameterize an IPM using data on marked individuals. An IPM based on population studies of Soay sheep is used to illustrate the complete process of constructing, implementing and evaluating an IPM fitted to sample data. We then look at very general approaches to parameterizing an IPM, using a wide range of statistical techniques (e.g. maximum likelihood methods, generalized additive models, nonparametric kernel density estimators). Methods for selecting models for parameterizing IPMs are briefly discussed. We conclude with key recommendations and a brief overview of applications that extend the basic model. The online Supporting Information provides commented R code for all our analyses. PMID:24219157
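A toy discretization shows the basic mechanics the guide walks through: build the kernel on a size mesh by the midpoint rule and read off the asymptotic growth rate as the dominant eigenvalue. All vital-rate forms below are invented for illustration and are not taken from the paper or the Soay sheep fit:

```python
import numpy as np

# Kernel K(z', z) = s(z) G(z'|z) + f(z) C(z'), midpoint-rule mesh.
n, L, U = 100, 0.0, 10.0
h = (U - L) / n
z = L + h * (np.arange(n) + 0.5)              # size mesh midpoints

s = 1.0 / (1.0 + np.exp(-(0.5 * z - 1.0)))    # survival, logistic in size
mu_g = 0.8 + 0.95 * z                         # mean size next census
G = np.exp(-(z[:, None] - mu_g[None, :]) ** 2 / (2 * 0.4 ** 2))
G /= G.sum(axis=0, keepdims=True)             # growth density: columns sum to 1
f = 0.08 * z                                  # per-capita recruitment
C = np.exp(-(z - 1.5) ** 2 / (2 * 0.3 ** 2))
C /= C.sum()                                  # offspring size density

K = G * s[None, :] + C[:, None] * f[None, :]  # discretized kernel
lam = np.max(np.abs(np.linalg.eigvals(K)))    # population growth rate
print(round(lam, 3))
```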
Basic Radar Altimetry Toolbox: Tools and Tutorial to Use Cryosat Data
NASA Astrophysics Data System (ADS)
Benveniste, J.; Bronner, E.; Dinardo, S.; Lucas, B. M.; Rosmorduc, V.; Earith, D.; Niemeijer, S.
2011-12-01
Radar altimetry is very much a technique with expanding applications. Even if considerable effort has been invested for oceanography users, the use of altimetry data for cryosphere applications, especially data from the new ESA CryoSat-2 mission, is still somewhat tedious for new users of altimetry data products. ESA and CNES therefore developed the Basic Radar Altimetry Toolbox a few years ago, and are improving and upgrading it to fit new missions and the growing number of altimetry uses. The Basic Radar Altimetry Toolbox is an "all-altimeter" collection of tools, tutorials and documents designed to facilitate the use of radar altimetry data. The software is able: - to read most distributed radar altimetry data, from ERS-1 & 2, Topex/Poseidon, Geosat Follow-on, Jason-1, Envisat, Jason-2, CryoSat and the future SARAL mission, and is ready for adaptation to Sentinel-3 products; - to perform some processing, data editing and statistics; - and to visualize the results. It can be used at several levels and in several ways: - as a data reading tool, with APIs for C, Fortran, Matlab and IDL; - as processing/extraction routines, through the on-line command mode; - as an educational and quick-look tool, with the graphical user interface. As part of the Toolbox, a Radar Altimetry Tutorial gives general information about altimetry, the technique involved and its applications, as well as an overview of past, present and future missions, including information on how to access data and additional software and documentation. It also presents a series of data use cases, covering all uses of altimetry over ocean, cryosphere and land, showing the basic methods for some of the most frequent ways of using altimetry data. It is an opportunity to teach remote sensing with practical training. The Toolbox has been available since April 2007 and has been demonstrated during training courses and scientific meetings. About 2000 people had downloaded it as of summer 2011, with many "newcomers" to altimetry among them, including teachers and professors, worldwide. Users' feedback, developments in altimetry, and practice showed that new interesting features could be added. Some have been added and/or improved in the recent version release (v3.0.1); others are under discussion for future development. Data use cases on CryoSat data will be presented. BRAT is developed under contract with ESA and CNES. It is available at http://www.altimetry.info and http://earth.esa.int/brat/
Statistical approach for selection of biologically informative genes.
Das, Samarendra; Rai, Anil; Mishra, D C; Rai, Shesh N
2018-05-20
Selection of informative genes from high-dimensional gene expression data has emerged as an important research area in genomics. Many of the gene selection techniques proposed so far are based on either a relevancy or a redundancy measure. Further, the performance of these techniques has been adjudged through post-selection classification accuracy computed by a classifier using the selected genes. This performance metric may be statistically sound but may not be biologically relevant. A statistical approach, Boot-MRMR, was proposed based on a composite measure of maximum relevance and minimum redundancy, which is both statistically sound and biologically relevant for informative gene selection. For comparative evaluation of the proposed approach, we developed two biological sufficiency criteria, i.e. Gene Set Enrichment with QTL (GSEQ) and a biological similarity score based on Gene Ontology (GO). Further, a systematic and rigorous evaluation of the proposed technique against 12 existing gene selection techniques was carried out using five gene expression datasets. This evaluation was based on a broad spectrum of statistically sound (e.g. subject classification) and biologically relevant (based on QTL and GO) criteria under a multiple-criteria decision-making framework. The performance analysis showed that the proposed technique selects informative genes that are more biologically relevant. The proposed technique is also quite competitive with the existing techniques with respect to subject classification and computational time. Our results also showed that under the multiple-criteria decision-making setup, the proposed technique is the best choice for informative gene selection over the available alternatives. Based on the proposed approach, an R package, BootMRMR, has been developed and is available at https://cran.r-project.org/web/packages/BootMRMR. This study provides a practical guide for selecting statistical techniques to identify informative genes from high-dimensional expression data for breeding and systems biology studies. Published by Elsevier B.V.
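The core max-relevance/min-redundancy idea can be sketched as a greedy ranking (the paper's Boot-MRMR adds bootstrap resampling on top). Correlation is used below as a simple stand-in for whatever relevance and redundancy measures a given implementation adopts:

```python
import numpy as np

def mrmr_rank(X, y, k):
    """Greedy max-relevance/min-redundancy gene ranking (sketch).
    X: (samples, genes) expression matrix; y: numeric class labels.
    Relevance: |correlation with y|; redundancy: mean |correlation|
    with already-selected genes; score: relevance minus redundancy."""
    X = np.asarray(X, float)
    y = np.asarray(y, float)
    rel = np.abs([np.corrcoef(X[:, j], y)[0, 1] for j in range(X.shape[1])])
    selected = [int(np.argmax(rel))]
    while len(selected) < k:
        best, best_score = None, -np.inf
        for j in range(X.shape[1]):
            if j in selected:
                continue
            red = np.mean([abs(np.corrcoef(X[:, j], X[:, s])[0, 1])
                           for s in selected])
            score = rel[j] - red          # MRMR "difference" criterion
            if score > best_score:
                best, best_score = j, score
        selected.append(best)
    return selected
```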
A comprehensive neuropsychological mapping battery for functional magnetic resonance imaging.
Karakas, Sirel; Baran, Zeynel; Ceylan, Arzu Ozkan; Tileylioglu, Emre; Tali, Turgut; Karakas, Hakki Muammer
2013-11-01
Existing batteries for FMRI do not precisely meet the criteria for comprehensive mapping of cognitive functions within minimum data acquisition times using standard scanners and head coils. The goal was to develop a battery of neuropsychological paradigms for FMRI that can also be used in other brain imaging techniques and behavioural research. Participants were 61 healthy, young adult volunteers (48 females and 13 males, mean age: 22.25 ± 3.39 years) from the university community. The battery included 8 paradigms for basic (visual, auditory, sensory-motor, emotional arousal) and complex (language, working memory, inhibition/interference control, learning) cognitive functions. Imaging was performed using standard functional imaging capabilities (1.5-T MR scanner, standard head coil). Structural and functional data series were analysed using Brain Voyager QX2.9 and Statistical Parametric Mapping-8. For basic processes, activation centres for individuals were within a distance of 3-11 mm of the group centres of the target regions; for complex cognitive processes, within 7-15 mm. Based on fixed-effect and random-effects analyses, the distance between the activation centres was 0-4 mm. There was spatial variability between individual cases; however, as shown by the distances between the centres found with fixed-effect and random-effects analyses, the coordinates for individual cases can be used to represent those of the group. The findings show that the neuropsychological brain mapping battery described here can be used in basic science studies that investigate the relationship of the brain to the mind, and also as a functional localiser in clinical studies for diagnosis, follow-up and pre-surgical mapping. © 2013.
ERIC Educational Resources Information Center
Amado, Diana; Del Villar, Fernando; Sánchez-Miguel, Pedro Antonio; Leo, Francisco Miguel; García-Calvo, Tomás
2016-01-01
The aim of this study was to learn about the effectiveness of two dance teaching techniques, the creative examination technique and the direct instruction technique, on the satisfaction of basic psychological needs, the level of self-determination, the perception of usefulness, enjoyment and effort of physical education students. Likewise, it…
Consequences of common data analysis inaccuracies in CNS trauma injury basic research.
Burke, Darlene A; Whittemore, Scott R; Magnuson, David S K
2013-05-15
The development of successful treatments for humans after traumatic brain or spinal cord injuries (TBI and SCI, respectively) requires animal research. This effort can be hampered when promising experimental results cannot be replicated because of incorrect data analysis procedures. To identify, and hopefully avoid, these errors in future studies, the articles in the seven journals with the highest number of basic science central nervous system TBI and SCI animal research studies published in 2010 (N=125 articles) were reviewed for their data analysis procedures. After identifying the most common statistical errors, the implications of those findings were demonstrated by reanalyzing previously published data from our laboratories using the identified inappropriate statistical procedures and then comparing the two sets of results. Overall, 70% of the articles contained at least one type of inappropriate statistical procedure. The highest percentage involved incorrect post hoc t-tests (56.4%), followed by inappropriate parametric statistics (analysis of variance and t-test; 37.6%). Repeated-measures analysis was inappropriately missing in 52.0% of all articles and, among those with behavioral assessments, 58% were analyzed incorrectly. Reanalysis of our published data using the most common inappropriate statistical procedures resulted in a 14.1% average increase in significant effects compared to the original results. Specifically, an increase of 15.5% occurred with independent t-tests and 11.1% after incorrect post hoc t-tests. Utilizing proper statistical procedures allows more definitive conclusions, facilitates replicability of research results, and enables more accurate translation of those results to the clinic.
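The most common error found, uncorrected post hoc t-tests, is easy to demonstrate by simulation; a Python sketch showing the inflated family-wise false-positive rate when four identical groups are compared pairwise without correction:

```python
import numpy as np
from scipy import stats

# Four groups drawn from the SAME population: any "significant"
# pairwise difference is a false positive by construction.
rng = np.random.default_rng(0)
trials, hits = 2000, 0
for _ in range(trials):
    groups = [rng.normal(size=10) for _ in range(4)]
    ps = [stats.ttest_ind(a, b).pvalue
          for i, a in enumerate(groups) for b in groups[i + 1:]]
    hits += min(ps) < 0.05     # at least one uncorrected "hit"?
print(f"family-wise false-positive rate: {hits / trials:.2f}")
# roughly 0.2, i.e. about four times the nominal 5% level
```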
Interpretation of correlations in clinical research.
Hung, Man; Bounsanga, Jerry; Voss, Maren Wright
2017-11-01
Critically analyzing research is a key skill in evidence-based practice and requires knowledge of research methods, results interpretation, and applications, all of which rely on a foundation based in statistics. Evidence-based practice makes high demands on trained medical professionals to interpret an ever-expanding array of research evidence. As clinical training emphasizes medical care rather than statistics, it is useful to review the basics of statistical methods and what they mean for interpreting clinical studies. We reviewed the basic concepts of correlational associations, violations of normality, unobserved variable bias, sample size, and alpha inflation. The foundations of causal inference were discussed and sound statistical analyses were examined. We discuss four ways in which correlational analysis is misused, including causal inference overreach, over-reliance on significance, alpha inflation, and sample size bias. Recent published studies in the medical field provide evidence of causal assertion overreach drawn from correlational findings. The findings present a primer on the assumptions and nature of correlational methods of analysis and urge clinicians to exercise appropriate caution as they critically analyze the evidence before them and evaluate evidence that supports practice. Critically analyzing new evidence requires statistical knowledge in addition to clinical knowledge. Studies can overstate relationships, expressing causal assertions when only correlational evidence is available. Failure to account for the effect of sample size in the analyses tends to overstate the importance of predictive variables. It is important not to overemphasize the statistical significance without consideration of effect size and whether differences could be considered clinically meaningful.
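One of the review's cautions, over-reliance on significance without regard to effect size, is easy to demonstrate: with a large enough sample, a practically negligible correlation is nonetheless "statistically significant". A short Python sketch:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 100_000
x = rng.normal(size=n)
y = 0.01 * x + rng.normal(size=n)    # true correlation is about 0.01
r, p = stats.pearsonr(x, y)
print(f"r = {r:.3f}, p = {p:.3g}")   # tiny r, yet p is typically < 0.05
```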
Using Scientific and Industrial Films in Teaching Technical Communication.
ERIC Educational Resources Information Center
Veeder, Gerry
A film course especially designed for technical communication students can illustrate basic film concepts and techniques while showing how film effectively communicates ideas in an industrial and scientific communication system. After a basic introduction to film terms, the study of actual scientific and industrial films demonstrates the following…
Teaching Moderately Mentally Retarded Children Basic Reading Skills.
ERIC Educational Resources Information Center
Hoogeveen, Frans R.; And Others
1989-01-01
Four moderately mentally retarded students, aged 8-13, were instructed in a basic skills reading program which emphasized a phonemic alphabet, pictorial cueing, and stimulus manipulation techniques. The training improved the Dutch students' ability to read one- and two-syllable words, and was generalizable to untrained words of the same…
Small Engine Repair. Two-Stroke and Four-Stroke Cycle.
ERIC Educational Resources Information Center
Hires, Bill; And Others
This curriculum guide is intended to assist persons teaching a course in repairing two- and four-stroke cycle small engines. Addressed in the individual units of instruction are the following topics: safety, tools, fasteners, and measurement techniques; basic small engine theory (engine identification and inspection, basic engine principles and…
Girls' Touch Football, Physical Education: 5551.03.
ERIC Educational Resources Information Center
King, Kathy
This course outline is a guide for teaching basic understanding of fundamental skills and rules of girls' touch football in grades 7-12. The course format includes lectures, demonstrations, practice of basic skills, visual aids, lead-up games, presentation and practice of officiating techniques, tournaments, and written and skills tests. Course…
Using Every Pupil Response in Mathematics Instruction.
ERIC Educational Resources Information Center
Lauritzen, Carol
1985-01-01
Discusses the "Every Pupil Response" (EPR) strategy and its use in teaching basic facts, problem-solving, place value, and fractions. Basically, the technique involves children responding simultaneously to a question by holding up a card, using parts of their bodies, or stick figures. Advantages of EPR are noted. (JN)
Manual of Basic Techniques for a Health Laboratory.
ERIC Educational Resources Information Center
World Health Organization, Geneva (Switzerland).
Described are basic laboratory methods for diagnosing and investigating diseases of importance to developing countries. Intended primarily for the training of technicians who will work in peripheral laboratories, the manual is designed so that student laboratory assistants can be taught to use it with minimal supervision from a teacher. The…
Basic Laboratory Skills for Water and Wastewater Analysis. Report No. 125.
ERIC Educational Resources Information Center
Clark, Douglas W.
Designed for individuals wanting to acquire an introductory knowledge of basic skills necessary to function in a water or wastewater laboratory, this handbook emphasizes current use of routine equipment and proper procedures. Explanations and illustrations focus on underlying techniques and principles rather than processes for conducting specific…
Interferometric reflection moire
NASA Astrophysics Data System (ADS)
Sciammarella, Cesar A.; Combell, Olivier
1995-06-01
A new reflection moire technique is introduced in this paper. The basic equations that relate the measurement of slopes to the basic geometric and optical parameters of the system are derived. The sensitivity and accuracy of the method are discussed. Examples of application to the study of silicon wafers and electronic chips are given.
Behavior Modification: Basic Principles. Third Edition
ERIC Educational Resources Information Center
Lee, David L.; Axelrod, Saul
2005-01-01
This classic book presents the basic principles of behavior emphasizing the use of preventive techniques as well as consequences naturally available in the home, business, or school environment to change important behaviors. This book, and its companion piece, "Measurement of Behavior," represents more than 30 years of research and strategies in…
ERIC Educational Resources Information Center
Darley, Frederic L.
This text gives the student an outline of the basic principles of scientific methodology which underlie evaluative work in speech disorders. Rationale and assessment techniques are given for examination of the basic communication processes of symbolization, respiration, phonation, articulation-resonance, prosody, associated sensory and perceptual…
Instrumentation for Environmental Monitoring: Water, Volume 2.
ERIC Educational Resources Information Center
California Univ., Berkeley. Lawrence Berkeley Lab.
This volume is one of a series discussing instrumentation for environmental monitoring. Each volume contains an overview of the basic problems, comparisons among the basic methods of sensing and detection, and notes that summarize the characteristics of presently available instruments and techniques. The text of this survey discusses the…
Basic Understanding of Earth Tunneling by Melting : Volume 2. Earth Structure and Design Solutions.
DOT National Transportation Integrated Search
1974-07-01
A novel technique, which employs the melting of rocks and soils as a means of excavating or tunneling while simultaneously generating a glass tunnel lining and/or primary support, was studied. The object of the study was to produce a good basic under...
ERIC Educational Resources Information Center
Papaphotis, Georgios; Tsaparlis, Georgios
2008-01-01
Part 1 of the findings are presented of a quantitative study (n = 125) on basic quantum chemical concepts taught in the twelfth grade (age 17-18 years) in Greece. A paper-and-pencil test of fourteen questions was used. The study compared performance in five questions that tested recall of knowledge or application of algorithmic procedures (type-A…
Techniques in teaching statistics : linking research production and research use.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Martinez-Moyano, I .; Smith, A.; Univ. of Massachusetts at Boston)
In the spirit of closing the 'research-practice gap,' the authors extend evidence-based principles to statistics instruction in social science graduate education. The authors employ a Delphi method to survey experienced statistics instructors to identify teaching techniques to overcome the challenges inherent in teaching statistics to students enrolled in practitioner-oriented master's degree programs. Among the teaching techniques identified as essential are using real-life examples, requiring data collection exercises, and emphasizing interpretation rather than results. Building on existing research, preliminary interviews, and the findings from the study, the authors develop a model describing antecedents to the strength of the link between research and practice.
Genome wide approaches to identify protein-DNA interactions.
Ma, Tao; Ye, Zhenqing; Wang, Liguo
2018-05-29
Transcription factors are DNA-binding proteins that play key roles in many fundamental biological processes. Unraveling their interactions with DNA is essential to identify their target genes and understand the regulatory network. Genome-wide identification of their binding sites became feasible thanks to recent progress in experimental and computational approaches. ChIP-chip, ChIP-seq, and ChIP-exo are three widely used techniques to demarcate genome-wide transcription factor binding sites. This review aims to provide an overview of these three techniques, including their experimental procedures, computational approaches, and popular analytic tools. ChIP-chip, ChIP-seq, and ChIP-exo have been the major techniques for studying genome-wide in vivo protein-DNA interaction. Due to the rapid development of next-generation sequencing technology, array-based ChIP-chip is deprecated and ChIP-seq has become the most widely used technique to identify transcription factor binding sites genome-wide. The newly developed ChIP-exo further improves the spatial resolution to the single nucleotide. Numerous tools have been developed to analyze ChIP-chip, ChIP-seq, and ChIP-exo data. However, different programs may employ different mechanisms or underlying algorithms, and each will inherently include its own set of statistical assumptions and biases. Choosing the most appropriate analytic program for a given experiment therefore needs careful consideration. Moreover, most programs have only a command-line interface, so their installation and usage require basic computational expertise in Unix/Linux. Copyright © Bentham Science Publishers.
Recent developments of axial flow compressors under transonic flow conditions
NASA Astrophysics Data System (ADS)
Srinivas, G.; Raghunandana, K.; Satish Shenoy, B.
2017-05-01
The objective of this paper is to give a holistic view of the most advanced technology and procedures practiced in the field of turbomachinery design. The compressor flow solver relies on a turbulence model in CFD to solve viscous problems. Popular techniques such as Jameson's rotated difference scheme were used to solve the potential flow equation under transonic conditions, first for two-dimensional aerofoils and later for three-dimensional wings. The gradient-based method is also popular, especially for compressor blade shape optimization. Other available optimization techniques include evolutionary algorithms (EAs) and response surface methodology (RSM). It is observed that, to improve a compressor flow solver and obtain agreeable results, careful attention must be paid to viscous relations, grid resolution, turbulence modeling, and artificial viscosity in CFD. Advanced techniques like Jameson's rotated difference scheme had the most substantial impact on aerofoil and wing design. For compressor blade shape optimization, evolutionary algorithms are simpler than gradient-based techniques because they search from multiple points in the given design space simultaneously. RSM is used to build empirical models of observed responses and to study experimental data systematically; it analyses the relationship between expected responses (outputs) and design variables (inputs) through a series of mathematical and statistical procedures. RSM has recently been applied successfully to turbomachinery blade optimization. Well-designed high-performance axial flow compressors find application in virtually all air-breathing jet engines.
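In its simplest form, RSM fits a low-order polynomial response surface to sampled design points and interrogates the fitted model instead of the expensive simulation; a toy least-squares sketch (invented data, not a blade design):

```python
import numpy as np

# Fit y ~ b0 + b1*x1 + b2*x2 + b3*x1^2 + b4*x2^2 + b5*x1*x2 to a
# sampled "design of experiments" (two normalized design variables).
rng = np.random.default_rng(3)
x1, x2 = rng.uniform(-1, 1, 30), rng.uniform(-1, 1, 30)
y = 1 + 2 * x1 - x2 + 0.5 * x1**2 + x1 * x2 + rng.normal(0, 0.05, 30)

A = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print(np.round(coef, 2))   # recovers the underlying quadratic surface
```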
Detecting dark matter in the Milky Way with cosmic and gamma radiation
NASA Astrophysics Data System (ADS)
Carlson, Eric C.
Over the last decade, experiments in high-energy astroparticle physics have reached unprecedented precision and sensitivity, spanning the electromagnetic and cosmic-ray spectra. These advances have opened a new window onto the universe about which little was previously known. Such dramatic increases in sensitivity lead naturally to claims of excess emission, which call for either revised astrophysical models or the existence of exotic new sources such as particle dark matter. Here we stand firmly with Occam, sharpening his razor by (i) developing new techniques for discriminating astrophysical signatures from those of dark matter, and (ii) developing detailed foreground models which can explain excess signals and shed light on the underlying astrophysical processes at hand. We concentrate most directly on observations of Galactic gamma and cosmic rays, factoring the discussion into three related parts, each containing significant advancements from our cumulative works. In Part I we introduce concepts fundamental to the indirect detection of particle dark matter, including motivations, targets, experiments, production of Standard Model particles, and a variety of statistical techniques. In Part II we introduce basic and advanced modelling techniques for the propagation of cosmic rays through the Galaxy, describe astrophysical gamma-ray production, and present state-of-the-art propagation models of the Milky Way. Finally, in Part III, we employ these models and techniques to study several indirect detection signals, including the Fermi GeV excess at the Galactic center, the Fermi 135 GeV line, the 3.5 keV line, and the WMAP-Planck haze.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhao, Huaying, E-mail: zhaoh3@mail.nih.gov; Schuck, Peter
2015-01-01
Global multi-method analysis for protein interactions (GMMA) can increase the precision and complexity of binding studies for the determination of the stoichiometry, affinity and cooperativity of multi-site interactions. The principles and recent developments of biophysical solution methods implemented for GMMA in the software SEDPHAT are reviewed, their complementarity in GMMA is described and a new GMMA simulation tool set in SEDPHAT is presented. Reversible macromolecular interactions are ubiquitous in signal transduction pathways, often forming dynamic multi-protein complexes with three or more components. Multivalent binding and cooperativity in these complexes are often key motifs of their biological mechanisms. Traditional solution biophysical techniques for characterizing the binding and cooperativity are very limited in the number of states that can be resolved. A global multi-method analysis (GMMA) approach has recently been introduced that can leverage the strengths and the different observables of different techniques to improve the accuracy of the resulting binding parameters and to facilitate the study of multi-component systems and multi-site interactions. Here, GMMA is described in the software SEDPHAT for the analysis of data from isothermal titration calorimetry, surface plasmon resonance or other biosensing, analytical ultracentrifugation, fluorescence anisotropy and various other spectroscopic and thermodynamic techniques. The basic principles of these techniques are reviewed and recent advances in view of their particular strengths in the context of GMMA are described. Furthermore, a new feature in SEDPHAT is introduced for the simulation of multi-method data. In combination with specific statistical tools for GMMA in SEDPHAT, simulations can be a valuable step in the experimental design.
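The global-fit idea behind GMMA can be illustrated with a toy shared-parameter fit: two simulated "methods" observe the same 1:1 binding through different observables, and a single Kd is estimated from both residual vectors together. This sketches the principle only, not SEDPHAT's implementation:

```python
import numpy as np
from scipy.optimize import least_squares

def frac_bound(conc, kd):
    """Fraction bound for simple 1:1 binding at ligand excess."""
    return conc / (kd + conc)

conc = np.logspace(-2, 2, 12)            # titrant concentration (uM)
rng = np.random.default_rng(4)
kd_true, amp1, amp2 = 3.0, 1.0, 250.0
y1 = amp1 * frac_bound(conc, kd_true) + rng.normal(0, 0.02, 12)  # e.g. anisotropy
y2 = amp2 * frac_bound(conc, kd_true) + rng.normal(0, 5.0, 12)   # e.g. biosensor RU

def residuals(p):
    kd, a1, a2 = p
    # Weight each dataset by its noise so both contribute sensibly.
    return np.concatenate([(a1 * frac_bound(conc, kd) - y1) / 0.02,
                           (a2 * frac_bound(conc, kd) - y2) / 5.0])

fit = least_squares(residuals, x0=[1.0, 0.5, 100.0])
print("shared Kd estimate:", round(fit.x[0], 2))
```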
TMS-EEG: From basic research to clinical applications
NASA Astrophysics Data System (ADS)
Hernandez-Pavon, Julio C.; Sarvas, Jukka; Ilmoniemi, Risto J.
2014-11-01
Transcranial magnetic stimulation (TMS) combined with electroencephalography (EEG) is a powerful technique for non-invasively studying cortical excitability and connectivity. The combination of TMS and EEG has been widely used in basic research and has recently gained importance in various clinical applications. In this paper, we will describe the physical and biological principles of TMS-EEG and its applications in both basic research and the clinic. We will present methods based on independent component analysis (ICA) for studying TMS-evoked EEG responses. These methods can remove and suppress large artifacts, making it feasible, for instance, to study language areas with TMS-EEG. We will discuss the different clinical applications and limitations of TMS and TMS-EEG. Potential applications of TMS are presented, for instance in neurosurgical planning, depression and other neurological disorders. Advantages and disadvantages of TMS-EEG and its variants such as repetitive TMS (rTMS) are discussed in comparison to other brain stimulation and neuroimaging techniques. Finally, challenges that researchers face when using this technique will be summarized.
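A minimal sketch of the ICA-based cleanup idea for TMS-evoked responses, using scikit-learn's FastICA on synthetic data; the artifact-selection criterion here (largest amplitude swing) is a deliberately crude stand-in for the topography- and latency-based criteria real pipelines use:

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(5)
t = np.linspace(0, 1, 1000)
sources = np.vstack([
    np.sin(2 * np.pi * 10 * t),                 # neural-like rhythm
    np.sin(2 * np.pi * 6 * t + 1),              # neural-like rhythm
    50 * np.exp(-((t - 0.1) / 0.005) ** 2),     # large TMS-like artifact
])
mixing = rng.normal(size=(8, 3))
eeg = sources.T @ mixing.T                      # 8 synthetic "channels"

ica = FastICA(n_components=3, random_state=0)
S = ica.fit_transform(eeg)                      # independent components
bad = int(np.argmax(np.ptp(S, axis=0)))         # crude: largest swing
S[:, bad] = 0                                   # zero out the artifact
cleaned = ica.inverse_transform(S)              # back to channel space
```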
Allen, Peter J.; Dorozenko, Kate P.; Roberts, Lynne D.
2016-01-01
Quantitative research methods are essential to the development of professional competence in psychology. They are also an area of weakness for many students. In particular, students are known to struggle with the skill of selecting quantitative analytical strategies appropriate for common research questions, hypotheses and data types. To begin understanding this apparent deficit, we presented nine psychology undergraduates (who had all completed at least one quantitative methods course) with brief research vignettes, and asked them to explicate the process they would follow to identify an appropriate statistical technique for each. Thematic analysis revealed that all participants found this task challenging, and even those who had completed several research methods courses struggled to articulate how they would approach the vignettes on more than a very superficial and intuitive level. While some students recognized that there is a systematic decision making process that can be followed, none could describe it clearly or completely. We then presented the same vignettes to 10 psychology academics with particular expertise in conducting research and/or research methods instruction. Predictably, these “experts” were able to describe a far more systematic, comprehensive, flexible, and nuanced approach to statistical decision making, which begins early in the research process, and pays consideration to multiple contextual factors. They were sensitive to the challenges that students experience when making statistical decisions, which they attributed partially to how research methods and statistics are commonly taught. This sensitivity was reflected in their pedagogic practices. When asked to consider the format and features of an aid that could facilitate the statistical decision making process, both groups expressed a preference for an accessible, comprehensive and reputable resource that follows a basic decision tree logic. For the academics in particular, this aid should function as a teaching tool, which engages the user with each choice-point in the decision making process, rather than simply providing an “answer.” Based on these findings, we offer suggestions for tools and strategies that could be deployed in the research methods classroom to facilitate and strengthen students' statistical decision making abilities. PMID:26909064
Statistical Symbolic Execution with Informed Sampling
NASA Technical Reports Server (NTRS)
Filieri, Antonio; Pasareanu, Corina S.; Visser, Willem; Geldenhuys, Jaco
2014-01-01
Symbolic execution techniques have recently been proposed for the probabilistic analysis of programs. These techniques seek to quantify the likelihood of reaching program events of interest, e.g., assert violations. They have many promising applications but suffer from scalability issues due to their high computational demand. To address this challenge, we propose a statistical symbolic execution technique that performs Monte Carlo sampling of the symbolic program paths and uses the obtained information for Bayesian estimation and hypothesis testing with respect to the probability of reaching the target events. To speed up the convergence of the statistical analysis, we propose informed sampling, an iterative symbolic execution that first explores the paths that have high statistical significance, prunes them from the state space, and guides the execution towards less likely paths. The technique combines Bayesian estimation with a partial exact analysis for the pruned paths, leading to provably improved convergence of the statistical analysis. We have implemented statistical symbolic execution with informed sampling in the Symbolic PathFinder tool. We show experimentally that informed sampling obtains more precise results and converges faster than a purely statistical analysis, and may also be more efficient than an exact symbolic analysis. When the latter does not terminate, symbolic execution with informed sampling can give meaningful results under the same time and memory limits.
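The core statistical ingredient described here, Monte Carlo sampling of whether a run reaches a target event combined with Bayesian estimation of the hit probability, can be sketched in a few lines. In the sketch below the toy "program," the Beta(1, 1) prior, and the exactly analyzed pruned mass are illustrative assumptions; this is not the Symbolic PathFinder implementation.

```python
# Minimal sketch: Monte Carlo sampling of a target event with a Bayesian
# Beta-Binomial posterior over the hit probability, plus a combination with
# an exactly analyzed (pruned) probability mass. Toy program; placeholder
# numbers for the pruned mass.
import random
from statistics import NormalDist

def program_hits_target(rng: random.Random) -> bool:
    # Stand-in for executing one sampled program path.
    x = rng.uniform(0, 100)
    y = rng.uniform(0, 100)
    return x > 80 and y > 50   # "assert violation" region (true prob. 0.1)

rng = random.Random(42)
n = 10_000
hits = sum(program_hits_target(rng) for _ in range(n))

# Beta(1, 1) prior; posterior after n Bernoulli observations is
# Beta(1 + hits, 1 + n - hits). Report its mean and a normal-approximate
# 95% credible interval.
a, b = 1 + hits, 1 + n - hits
mean = a / (a + b)
var = a * b / ((a + b) ** 2 * (a + b + 1))
z = NormalDist().inv_cdf(0.975)
lo, hi = mean - z * var ** 0.5, mean + z * var ** 0.5

# "Informed sampling" flavor: if a pruned path set with exactly computed hit
# probability p_exact covers 2% of the input mass, combine it with the
# sampled estimate for the remaining 98% (placeholder values).
p_exact, remaining = 0.02, 0.98
print(f"sampled estimate: {mean:.4f}  95% CI: [{lo:.4f}, {hi:.4f}]")
print(f"combined with exact pruned mass: {p_exact + remaining * mean:.4f}")
```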
Statistical regularities of art images and natural scenes: spectra, sparseness and nonlinearities.
Graham, Daniel J; Field, David J
2007-01-01
Paintings are the product of a process that begins with ordinary vision in the natural world and ends with manipulation of pigments on canvas. Because artists must produce images that can be seen by a visual system that is thought to take advantage of statistical regularities in natural scenes, artists are likely to replicate many of these regularities in their painted art. We have tested this notion by computing basic statistical properties and modeled cell response properties for a large set of digitized paintings and natural scenes. We find that both representational and non-representational (abstract) paintings from our sample (124 images) show basic similarities to a sample of natural scenes in terms of their spatial frequency amplitude spectra, but the paintings and natural scenes show significantly different mean amplitude spectrum slopes. We also find that the intensity distributions of paintings show lower skewness and sparseness than natural scenes. We account for this by considering the range of luminances found in the environment compared with the range available in the medium of paint. A painting's range is limited by the reflective properties of its materials. We argue that artists do not simply scale the intensity range down but use a compressive nonlinearity. In our studies, modeled retinal and cortical filter responses to the images were less sparse for the paintings than for the natural scenes. But when a compressive nonlinearity was applied to the images, both the paintings' intensity distributions and the modeled responses to the paintings showed the same or greater sparseness compared with the natural scenes. This suggests that artists achieve some degree of nonlinear compression in their paintings. Because paintings have captivated humans for millennia, finding basic statistical regularities in paintings' spatial structure could grant insights into the range of spatial patterns that humans find compelling.
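The measurements reported above (amplitude spectrum slope, intensity skewness, and the effect of a compressive nonlinearity) can be approximated with standard tools. The sketch below operates on a synthetic 1/f-like image; the image synthesis and the square-root compression exponent are illustrative assumptions, not the authors' exact procedure.

```python
# Minimal sketch: radially averaged amplitude spectrum slope, intensity
# skewness, and the effect of a compressive nonlinearity, on a synthetic
# image with roughly 1/f amplitude structure (slope near -1).
import numpy as np

rng = np.random.default_rng(1)
n = 256

# Synthesize the image: 1/f amplitudes with random phases.
freqs = np.fft.fftfreq(n)
fx, fy = np.meshgrid(freqs, freqs)
radius = np.sqrt(fx**2 + fy**2)
radius[0, 0] = 1.0                       # avoid division by zero at DC
spectrum = (1.0 / radius) * np.exp(1j * rng.uniform(0, 2 * np.pi, (n, n)))
image = np.real(np.fft.ifft2(spectrum))
image = (image - image.min()) / (image.max() - image.min())  # to [0, 1]

def amplitude_slope(img: np.ndarray) -> float:
    """Slope of log amplitude vs. log spatial frequency (radially averaged)."""
    amp = np.abs(np.fft.fft2(img))
    r = np.round(radius * n).astype(int)         # integer frequency bins
    valid = (r > 0) & (r < n // 2)
    counts = np.bincount(r[valid])
    sums = np.bincount(r[valid], weights=amp[valid])
    bins = np.divide(sums, counts, out=np.zeros_like(sums), where=counts > 0)
    f = np.arange(len(bins))
    keep = (f > 0) & (bins > 0)
    return np.polyfit(np.log(f[keep]), np.log(bins[keep]), 1)[0]

def skewness(x: np.ndarray) -> float:
    z = (x - x.mean()) / x.std()
    return float((z**3).mean())

compressed = image ** 0.5                # illustrative compressive nonlinearity
print(f"amplitude spectrum slope: {amplitude_slope(image):.2f}")
print(f"skewness: {skewness(image):.2f} -> {skewness(compressed):.2f} after compression")
```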
Bello, Jibril Oyekunle
2013-11-14
Nigeria is one of the top three countries in Africa in terms of science research output, and Nigerian urologists' biomedical research contributes to this. Each year, urologists in Nigeria gather to present their recent research at the conference of the Nigerian Association of Urological Surgeons (NAUS). These abstracts are not as thoroughly vetted as full-length manuscripts published in peer-reviewed journals, but the information they disseminate may affect the clinical practice of attendees. This study aims to describe the characteristics of abstracts presented at the annual NAUS conferences, the quality of the abstracts as determined by the subsequent publication of full-length manuscripts in peer-reviewed, indexed journals, and the factors that influence such successful publication. Abstracts presented at the 2007 to 2010 NAUS conferences were identified through conference abstract books. Using a strict search protocol, publication in peer-reviewed journals was determined. The abstracts' characteristics were analyzed, and their quality was judged by the subsequent successful publication of full-length manuscripts. Statistical analysis was performed using SPSS 16.0 software to determine factors predictive of successful publication. Only 75 abstracts were presented at the NAUS 2007 to 2010 conferences; a quarter (24%) of the presented abstracts were subsequently published as full-length manuscripts. Median time to publication was 15 months (range 2-40 months). Manuscripts whose result data were analyzed with 'beyond basic' statistics (i.e., going beyond simple frequencies and averages) were more likely to be published than those using basic or no statistics. The quality of the abstracts, and thus subsequent publication success, is influenced by the use of 'beyond basic' statistics in the analysis of the result data presented. There is a need for improvement in the quality of urological research from Nigeria.
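The predictive analysis described was run in SPSS; an analogous open-source calculation is sketched below as a chi-square test on a 2x2 table of statistics use versus publication. The marginal totals are chosen to match the reported 75 abstracts and 24% publication rate, but the cell split is made up for illustration.

```python
# Illustrative re-creation of the kind of analysis described above: testing
# whether use of 'beyond basic' statistics is associated with subsequent
# publication. Cell counts are fabricated for demonstration only; the study's
# actual analysis was performed in SPSS 16.0.
from scipy.stats import chi2_contingency

#                     published  not published
table = [[10,  8],   # 'beyond basic' statistics
         [ 8, 49]]   # basic or no statistics

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi-square = {chi2:.2f}, dof = {dof}, p = {p:.3f}")
```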